Senior Software Engineer – Software Engineering

  • Location: Johnston, Iowa
  • Type: Contract
  • Job #100843

Sr. Software Engineer

Location: Onsite — Candidates may be based in either the San Francisco Bay Area or the Des Moines Metro Area
Job Type: W2 Contract; 12 months, ongoing
Schedule: Monday – Friday; day shift
Pay Rate: $60–$65/hour, with optional benefits packages including PTO, medical insurance, and 401(k)

Description:
We are seeking a highly technical and self-directed Senior Software Engineer to contribute to the development of data processing pipelines for a new AI-enabled data analytics product targeted at Large Ag customers.

Ideal candidates will have:

  • 5+ years of professional software development experience using Python
  • 2+ years of hands-on experience with AWS and Databricks in production environments
  • A proven track record of deploying cloud-native solutions in fast-paced software delivery environments (mid-career professionals preferred)

In addition to technical expertise, successful candidates will demonstrate:

  • Strong communication skills, with the ability to clearly articulate technical concepts to both technical and non-technical stakeholders; this is a critical requirement for the role
  • The ability to work effectively with limited supervision in a distributed team environment
  • A proactive mindset, adaptability, and a commitment to team success

Key Responsibilities:

  • Design and implement AWS/Databricks solutions to process large geospatial datasets for real-time API services
  • Develop and maintain REST APIs and backend processes using AWS Lambda
  • Build infrastructure as code using Terraform
  • Set up and maintain CI/CD pipelines using GitHub Actions
  • Optimize system performance and workflows to improve scalability and reduce cloud costs
  • Enhance monitoring and alerting across systems using Datadog
  • Support field testing and customer operations by debugging and resolving data issues
  • Collaborate with product managers and end users to understand requirements, build the backlog, and prioritize work
  • Work closely with data scientists to productionize prototypes and proof-of-concept models

Required Skills & Experience:

  • Excellent coding skills in Python with experience deploying production-grade software
  • Strong foundation in test-driven development
  • Solid understanding of cloud computing, especially AWS services such as IAM, Lambda, S3, and RDS
  • Professional experience building Databricks workflows and optimizing PySpark queries

Preferred Experience:

  • Experience working with geospatial data and related libraries/tools
  • Experience building and operating APIs using AWS Lambda
  • Familiarity with data lake architectures and Delta Lake
  • Experience with event-driven architectures and streaming data pipelines (e.g., Kafka, Kinesis)
  • Exposure to ML Ops or deploying machine learning models in production
  • Prior experience working on cross-functional teams spanning product, data science, and backend engineering