Kelly Science, Engineering, Technology & Telecom

Python Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Software Engineer (Senior Cloud Data Engineer – AI & Analytics) in Santa Clara, CA or Des Moines, IA, on a contract through Oct/2026 at $64 - $71 per hour. It requires 5+ years in Python and 2+ years with AWS and Databricks.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$568
🗓️ - Date
November 10, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
On-site
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
Santa Clara County, CA
🧠 - Skills detailed
#Libraries #ML Ops (Machine Learning Operations) #ML (Machine Learning) #Data Pipeline #AI (Artificial Intelligence) #AWS (Amazon Web Services) #Python #Datadog #Delta Lake #Debugging #Terraform #Cloud #S3 (Amazon Simple Storage Service) #Data Processing #Infrastructure as Code (IaC) #Spatial Data #API (Application Programming Interface) #Spark (Apache Spark) #Datasets #REST API #Data Engineering #Lambda (AWS Lambda) #REST (Representational State Transfer) #Data Science #PySpark #AWS Lambda #Kafka (Apache Kafka) #IAM (Identity and Access Management) #Scala #GitHub #Monitoring #Data Lake #RDS (Amazon Relational Database Service) #Databricks
Role description
Important information: To be immediately considered, please send an updated version of your resume to somp767@kellyservices.com

Title: Senior Software Engineer (Senior Cloud Data Engineer – AI & Analytics)
Location: Santa Clara, CA or Des Moines, IA
Duration: Until Oct/2026 (possibility of extension)
Contract: W2 (No C2C)
Pay rate: $64 - $71 per hour

Description:
We are seeking a highly technical and self-directed Senior Software Engineer to contribute to the development of data processing pipelines for a new AI-enabled data analytics product targeted at Large Ag customers. We are looking for mid-career professionals with a proven track record of deploying cloud-native solutions in fast-paced software delivery environments.

Ideal candidates will have:
• 5+ years of professional software development experience using Python
• 2+ years of hands-on experience with AWS and Databricks in production environments

In addition to technical expertise, successful candidates will demonstrate:
• Strong communication skills, with the ability to clearly articulate technical concepts to both technical and non-technical stakeholders (this is extremely important - please vet accordingly)
• The ability to work effectively with limited supervision in a distributed team environment
• A proactive mindset, adaptability, and a commitment to team success

Key Responsibilities:
• Design and implement AWS/Databricks solutions to process large geospatial datasets for real-time API services
• Develop and maintain REST APIs and backend processes using AWS Lambda (a minimal handler sketch follows this description)
• Build infrastructure as code using Terraform
• Set up and maintain CI/CD pipelines using GitHub Actions
• Optimize system performance and workflows to improve scalability and reduce cloud costs
• Enhance monitoring and alerting across systems using Datadog
• Support field testing and customer operations by debugging and resolving data issues
• Collaborate with product managers and end users to understand requirements, build the backlog, and prioritize work
• Work closely with data scientists to productionize prototypes and proof-of-concept models

Required Skills & Experience:
• Excellent coding skills in Python, with experience deploying production-grade software
• Strong foundation in test-driven development
• Solid understanding of cloud computing, especially AWS services such as IAM, Lambda, S3, and RDS
• Professional experience building Databricks workflows and optimizing PySpark queries (a PySpark sketch also follows this description)

Preferred Experience:
• Experience working with geospatial data and related libraries/tools
• Experience building and operating APIs using AWS Lambda
• Familiarity with data lake architectures and Delta Lake
• Experience with event-driven architectures and streaming data pipelines (e.g., Kafka, Kinesis)
• Exposure to ML Ops or deploying machine learning models in production
• Prior experience working on cross-functional teams spanning product, data science, and backend engineering
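
To give candidates a feel for the Lambda work described above, here is a minimal sketch of an API Gateway proxy handler in Python. The route shape, the field_id parameter, and the get_field_summary helper are hypothetical assumptions for illustration, not details of the actual product:

```python
import json

# Hypothetical API Gateway (proxy integration) -> Lambda REST endpoint.
# Resource names and payload shape are illustrative assumptions.

def get_field_summary(field_id: str) -> dict:
    """Stand-in for a real lookup against S3/RDS; returns a canned record."""
    return {"field_id": field_id, "area_ha": 42.0, "crop": "corn"}

def lambda_handler(event, context):
    """Entry point invoked by API Gateway with a proxy-integration event."""
    field_id = (event.get("pathParameters") or {}).get("field_id")
    if not field_id:
        return {"statusCode": 400,
                "body": json.dumps({"error": "field_id is required"})}
    return {"statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps(get_field_summary(field_id))}
```

Called locally with an event such as {"pathParameters": {"field_id": "f-001"}}, this returns the 200 response; in production the same function would sit behind API Gateway and fetch real data from S3 or RDS.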
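The Databricks/PySpark requirement points at work like the sketch below: a cheap bounding-box prefilter over a point dataset, followed by a Delta Lake write. All paths, column names, and coordinates are made-up assumptions, and the Delta format assumes a Databricks (or Delta-enabled) runtime:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("geo-filter-sketch").getOrCreate()

# Hypothetical parquet dataset of field observation points with lat/lon columns.
points = spark.read.parquet("s3://example-bucket/field-points/")

# Bounding-box prefilter on plain lat/lon columns; a real pipeline might
# layer a geospatial library on top for exact geometry tests.
in_box = points.where(
    (F.col("lat").between(36.9, 37.5)) & (F.col("lon").between(-122.3, -121.5))
)

# Persist the filtered slice as a Delta table, partitioned by ingest date
# (the ingest_date column is an assumption of this sketch).
in_box.write.format("delta").mode("overwrite") \
    .partitionBy("ingest_date").save("s3://example-bucket/field-points-scv/")
```

Pushing a coarse filter like this down before any expensive geometry work is a typical way to cut shuffle volume, which speaks to the responsibility of improving scalability and reducing cloud costs.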