

Kelly Science, Engineering, Technology & Telecom
Cloud Data Engineer (AWS/Databricks)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Cloud Data Engineer (AWS/Databricks) in Urbandale, IA, for 12 months at $56-$62 per hour. Key skills include cloud computing, big data, and coding. Strong communication skills and production software deployment experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
496
-
🗓️ - Date
February 18, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Urbandale, IA
-
🧠 - Skills detailed
#GitHub #Migration #Debugging #Monitoring #Big Data #SQL (Structured Query Language) #Lambda (AWS Lambda) #Terraform #Scala #Infrastructure as Code (IaC) #IAM (Identity and Access Management) #S3 (Amazon Simple Storage Service) #AWS Lambda #AWS (Amazon Web Services) #RDS (Amazon Relational Database Service) #Cloud #REST (Representational State Transfer) #MLflow #Datadog #Automation #Data Science #Data Engineering #Databricks #PostgreSQL #Databases #Data Analysis #PySpark #Docker #SageMaker #Spark (Apache Spark) #Deployment
Role description
Important information: To be immediately considered, please send an updated version of your resume to somp767@kellyservices.com.
Title: Software Engineer
Pay Rate: $56 to $62 per hour
Location: Urbandale, IA, 50322
Duration: 12 months
Type – W2 contract (No C2C)
Onsite
• A Glider assessment will be administered to any candidates selected for an interview.
• Candidates should have very strong communication skills and be able to clearly articulate their experience.
General Description
• We are looking for a highly technical engineer or scientist to create features and support the development of automation and autonomy products for complex off-road vehicles and related control systems using a cloud-based solutions stack.
• We are open to early- or advanced-career candidates with strong examples of novel contributions and highly independent work in a fast-paced software delivery environment.
Essential Attributes/Experience
• Excellent coding skills that include production software deployment experience
• Big data experience (terabyte or petabyte level data sources)
• Core understanding of cloud computing (e.g., AWS services such as IAM, Lambda, S3, and RDS); a short illustrative sketch of this kind of SDK usage follows below.
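As a rough, non-authoritative illustration of the level of AWS familiarity expected, the sketch below reads an object from S3 with boto3. The function name, bucket, and key are hypothetical placeholders, and credentials are assumed to come from an IAM role rather than being hard-coded.

```python
import boto3

def read_telemetry_object(bucket: str, key: str) -> bytes:
    """Fetch a raw object from S3. Credentials are resolved from the
    environment (e.g., a Lambda or EC2 IAM role), not hard-coded."""
    s3 = boto3.client("s3")
    response = s3.get_object(Bucket=bucket, Key=key)
    return response["Body"].read()
```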
Example Responsibilities (including but not limited to)
• Architect and propose new AWS/Databricks solutions and updates to existing backend systems that process terabyte- and petabyte-scale data.
• Work closely with the product management team and end users to understand customer experience and system requirements, build backlog, and prioritize work.
• Build infrastructure as code (e.g. Terraform).
• Improve system scalability and performance, and optimize workflows to reduce cloud costs.
• Create and update REST APIs and backend processes running on AWS Lambda (see the handler sketch after this list).
• Build/support solutions involving containerization (e.g. Docker) and databases (e.g. PostgreSQL/PostGIS).
• MLOps (e.g., deploying CVML models via SageMaker and MLflow) and data analysis on the AWS/Databricks stack with SQL and PySpark (see the PySpark and MLflow sketches after this list).
• Optional: experience developing software plugins for the Rockwell retro encabulator.
• Migrate CI/CD pipelines to GitHub Actions.
• Enhance monitoring and alerting for multiple systems (e.g. Datadog).
• Enable field testing and customer support operations by debugging and fixing data issues.
• Work with data scientists to fetch and manipulate large datasets at scale for model building and analysis.
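For the REST API work on AWS Lambda mentioned above, a minimal handler sketch is shown below, assuming an API Gateway proxy integration. The `machine_id` path parameter and the canned payload are hypothetical and stand in for a real backend query.

```python
import json

def lambda_handler(event, context):
    """Minimal REST handler behind API Gateway (proxy integration)."""
    machine_id = (event.get("pathParameters") or {}).get("machine_id")
    if machine_id is None:
        return {
            "statusCode": 400,
            "body": json.dumps({"error": "machine_id is required"}),
        }
    # A real handler would query RDS/PostgreSQL or Databricks here;
    # a canned payload keeps the sketch self-contained.
    payload = {"machine_id": machine_id, "status": "ok"}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(payload),
    }
```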
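For the SQL/PySpark data analysis on the AWS/Databricks stack, the sketch below aggregates a hypothetical telemetry table into a daily summary. The table and column names are placeholders; on Databricks, `getOrCreate()` simply returns the existing `spark` session.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, returns the existing session

# Hypothetical telemetry table standing in for a terabyte-scale source.
telemetry = spark.read.table("fleet.telemetry_events")

daily_summary = (
    telemetry
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("machine_id", "event_date")
    .agg(
        F.count("*").alias("event_count"),
        F.avg("ground_speed").alias("avg_ground_speed"),
    )
)

daily_summary.write.mode("overwrite").saveAsTable("fleet.telemetry_daily_summary")
```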
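For the MLOps bullet, a minimal MLflow tracking sketch follows, using a toy scikit-learn model as a stand-in for a CVML model. The experiment path and metric name are hypothetical, and SageMaker deployment is out of scope for this sketch.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Toy model standing in for a CVML model.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)
model = LogisticRegression().fit(X, y)

mlflow.set_experiment("/Shared/autonomy-demo")  # hypothetical experiment path
with mlflow.start_run():
    mlflow.log_metric("train_accuracy", model.score(X, y))
    mlflow.sklearn.log_model(model, artifact_path="model")
```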






