Kelly Science, Engineering, Technology & Telecom

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Software Engineer (Senior Cloud Data Engineer – AI & Analytics) on a contract running through October 2026, paying $64 - $71 per hour. Candidates need 5+ years of Python experience, 2+ years with AWS and Databricks, and strong communication skills.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
568
🗓️ - Date
November 10, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
On-site
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
Des Moines, IA
🧠 - Skills detailed
#REST API #Scala #Datasets #Data Engineering #Kafka (Apache Kafka) #Lambda (AWS Lambda) #Python #Monitoring #Datadog #Spark (Apache Spark) #ML Ops (Machine Learning Operations) #AWS Lambda #ML (Machine Learning) #RDS (Amazon Relational Database Service) #AWS (Amazon Web Services) #REST (Representational State Transfer) #Debugging #Spatial Data #Terraform #Libraries #GitHub #Data Lake #Data Science #Delta Lake #AI (Artificial Intelligence) #PySpark #Databricks #Cloud #Infrastructure as Code (IaC) #IAM (Identity and Access Management) #Data Pipeline #S3 (Amazon Simple Storage Service) #API (Application Programming Interface) #Data Processing
Role description
Important information: To be immediately considered, please send an updated version of your resume to somp767@kellyservices.com

Title: Senior Software Engineer (Senior Cloud Data Engineer – AI & Analytics)
Location: Santa Clara, CA or Des Moines, IA
Duration: Until October 2026 (possibility of extension)
Contract: W2 (no C2C)
Pay rate: $64 - $71 per hour

Description:
We are seeking a highly technical and self-directed Senior Software Engineer to contribute to the development of data processing pipelines for a new AI-enabled data analytics product targeted at Large Ag customers. We are looking for mid-career professionals with a proven track record of deploying cloud-native solutions in fast-paced software delivery environments.

Ideal candidates will have:
• 5+ years of professional software development experience using Python
• 2+ years of hands-on experience with AWS and Databricks in production environments

In addition to technical expertise, successful candidates will demonstrate:
• Strong communication skills, with the ability to clearly articulate technical concepts to both technical and non-technical stakeholders (this is extremely important - please vet accordingly)
• The ability to work effectively with limited supervision in a distributed team environment
• A proactive mindset, adaptability, and a commitment to team success

Key Responsibilities:
• Design and implement AWS/Databricks solutions to process large geospatial datasets for real-time API services
• Develop and maintain REST APIs and backend processes using AWS Lambda (see the handler sketch after this description)
• Build infrastructure as code using Terraform
• Set up and maintain CI/CD pipelines using GitHub Actions
• Optimize system performance and workflows to improve scalability and reduce cloud costs
• Enhance monitoring and alerting across systems using Datadog
• Support field testing and customer operations by debugging and resolving data issues
• Collaborate with product managers and end users to understand requirements, build the backlog, and prioritize work
• Work closely with data scientists to productionize prototypes and proof-of-concept models

Required Skills & Experience:
• Excellent coding skills in Python, with experience deploying production-grade software
• Strong foundation in test-driven development (see the pytest sketch below)
• Solid understanding of cloud computing, especially AWS services such as IAM, Lambda, S3, and RDS
• Professional experience building Databricks workflows and optimizing PySpark queries (see the pipeline sketch below)

Preferred Experience:
• Experience working with geospatial data and related libraries/tools
• Experience building and operating APIs using AWS Lambda
• Familiarity with data lake architectures and Delta Lake
• Experience with event-driven architectures and streaming data pipelines (e.g., Kafka, Kinesis)
• Exposure to MLOps or deploying machine learning models in production
• Prior experience on cross-functional teams involving product, data science, and backend engineering
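
As context for the Lambda responsibility above, here is a minimal, illustrative handler sketch; it is not code from the hiring team, and the `field_id` query parameter and the echo response are hypothetical stand-ins for a real data lookup. It assumes API Gateway's Lambda proxy integration, where the event carries the query string and the handler returns a status code, headers, and a JSON-encoded body:

```python
import json


def handler(event, context):
    """Handle an API Gateway (Lambda proxy integration) request.

    `event["queryStringParameters"]` is None when no query string is
    supplied, so guard before reading from it.
    """
    params = event.get("queryStringParameters") or {}
    field_id = params.get("field_id")  # hypothetical parameter for this sketch
    if field_id is None:
        return {
            "statusCode": 400,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"error": "field_id is required"}),
        }
    # A real service would query a data store (e.g. RDS or S3); the sketch
    # echoes the identifier back so it stays self-contained and runnable.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"field_id": field_id, "status": "ok"}),
    }
```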
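
Because the proxy event is plain dictionary data, a handler like this can be driven test-first with no AWS infrastructure, which is the point of the test-driven development requirement. A minimal pytest sketch, assuming the hypothetical handler above is saved as handler.py:

```python
# test_handler.py
import json

from handler import handler  # the hypothetical module sketched above


def test_missing_field_id_returns_400():
    # No query string at all: API Gateway passes None, not an empty dict.
    response = handler({"queryStringParameters": None}, context=None)
    assert response["statusCode"] == 400


def test_field_id_is_echoed_back():
    event = {"queryStringParameters": {"field_id": "f-123"}}
    response = handler(event, context=None)
    assert response["statusCode"] == 200
    assert json.loads(response["body"])["field_id"] == "f-123"
```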
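
For the Databricks/PySpark and Delta Lake items, a rough pipeline sketch is below. The bucket paths, column names, and the 0.1-degree grid size are all invented for illustration; the shape (read, derive grid cells, aggregate, write Delta) is the general pattern, not this team's actual pipeline:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("geo-pipeline-sketch").getOrCreate()

# Hypothetical input: point observations with latitude/longitude columns.
events = spark.read.parquet("s3://example-bucket/raw/observations/")

# Snap each point to a coarse 0.1-degree grid cell so downstream queries
# can aggregate by area instead of scanning raw coordinates.
gridded = (
    events
    .withColumn("lat_cell", F.floor(F.col("latitude") / 0.1))
    .withColumn("lon_cell", F.floor(F.col("longitude") / 0.1))
)

summary = gridded.groupBy("lat_cell", "lon_cell").agg(
    F.count("*").alias("n_obs"),
    F.avg("value").alias("mean_value"),  # "value" is a made-up metric column
)

# On Databricks, Delta is the default table format; outside Databricks the
# delta-spark package must be installed for this write to succeed.
summary.write.format("delta").mode("overwrite").save(
    "s3://example-bucket/curated/observation_grid/"
)
```

Grid-cell bucketing is one of the simpler ways to make geospatial aggregations partition-friendly; libraries such as H3 provide proper hierarchical spatial indexes when more precision is needed.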