

Tier4 Group
Data Engineer 5063
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer focused on AWS migration in Milwaukee, WI (hybrid, 3 days onsite), running through 12/31/2026, with an undisclosed pay rate. Key skills include AWS, Python, Spark, SQL, and DevOps methodologies.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
April 1, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Milwaukee, WI
🧠 - Skills detailed
#Code Reviews #Replication #Kubernetes #AWS (Amazon Web Services) #Deployment #DevOps #Spark SQL #Unit Testing #Migration #SaaS (Software as a Service) #Spark (Apache Spark) #AWS Migration #Data Engineering #Automated Testing #Data Integration #Cloud #Python #Virtualization #Docker #Data Quality #Apache Spark #Data Modeling #ETL (Extract, Transform, Load) #Terraform #Security #Infrastructure as Code (IaC) #Lambda (AWS Lambda) #Scala #API (Application Programming Interface) #Agile #SQL (Structured Query Language)
Role description
Title: Data Engineer (DevOps / AWS Migration)
Location: Milwaukee, WI
Type: Hybrid (3 days onsite per week)
Duration: ASAP - 12/31/2026
Perks: Benefits, free daily lunch when onsite
Job Description:
We are seeking a Data Engineer to support cloud-based data solutions and AWS migration initiatives within HR workforce analytics. In this role, you will design, build, deploy, and maintain scalable data and software solutions across the full development lifecycle. You’ll partner closely with cross‑functional teams to solve complex technical challenges, influence architecture decisions, and continuously improve development and integration practices.
This role requires strong experience with AWS, Python, Spark, SQL, modern data integration patterns, and DevOps methodologies, along with a passion for data quality, operational excellence, and production stability.
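To make that stack concrete, below is a minimal PySpark ETL sketch of the kind of pipeline this role supports. It is illustrative only: the S3 paths, column names, and aggregation are placeholder assumptions, not details of the actual project.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("hr-analytics-etl").getOrCreate()

# Extract: raw HR events from S3 (placeholder path)
raw = spark.read.parquet("s3://example-bucket/raw/hr_events/")

# Transform: de-duplicate and drop records without a timestamp
clean = (
    raw.dropDuplicates(["event_id"])
       .filter(F.col("event_ts").isNotNull())
)

# Aggregate: daily event counts by event type
daily_counts = (
    clean.groupBy(F.to_date("event_ts").alias("event_date"), "event_type")
         .agg(F.count("*").alias("event_count"))
)

# Load: write curated output back to S3, partitioned for downstream SQL
(daily_counts.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-bucket/curated/hr_event_daily_counts/"))
```

The same extract-transform-load pattern scales from this toy aggregation to production jobs; partitioned Parquet output is one common choice for keeping downstream SQL queries cheap.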
What You’ll Do
• Design, develop, deploy, and support cloud-based applications using established SDLC and CI/CD practices
• Troubleshoot and resolve technical issues during development and deployment
• Conduct thorough code reviews to ensure quality, security, and adherence to best practices
• Collaborate with engineers and stakeholders to align on technical approaches and architecture
• Contribute to system‑wide technical and architectural discussions
• Recommend improvements to development pipelines, tooling, and integration practices
• Ensure high standards of data quality, reliability, and operational performance (a data-quality check sketch follows this list)
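As referenced in the final bullet above, here is a hedged sketch of a simple data-quality gate in PySpark; the dataset path, key column, and failure rules are invented for illustration:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("hr-analytics-dq").getOrCreate()

# Placeholder path: the curated output from the ETL sketch above
df = spark.read.parquet("s3://example-bucket/curated/hr_event_daily_counts/")

row_count = df.count()
null_keys = df.filter(F.col("event_date").isNull()).count()

# Fail the pipeline loudly rather than publishing bad data
if row_count == 0:
    raise ValueError("DQ check failed: curated dataset is empty")
if null_keys > 0:
    raise ValueError(f"DQ check failed: {null_keys} rows missing event_date")

print(f"DQ checks passed: {row_count} rows validated")
```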
Required Qualifications
• Bachelor’s degree or equivalent practical experience
• 4+ years of professional experience working with AWS
• 4+ years of experience with modern engineering tools, languages, and development practices
• 2+ years of experience with data integration patterns and tooling, including:
• ETL / ELT
• Event streaming and real‑time processing
• Replication and virtualization
• Strong coding experience in Python, Apache Spark, and SQL
• Experience working in Agile and DevOps environments
• Solid understanding of database concepts, data modeling, and data quality principles
• Experience with cloud-based development (PaaS/SaaS), containerization (Docker and/or Kubernetes), Infrastructure as Code, and Terraform
• Familiarity with centrally governed CI/CD pipelines
• Understanding of automated testing practices, including unit testing and Test‑Driven Development (a unit-test sketch follows this list)
• Strong communication skills with the ability to explain complex technical concepts to both technical and non‑technical audiences
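As flagged in the automated-testing bullet above, here is a minimal pytest sketch in the Test-Driven style this posting asks about; the function under test and its rules are hypothetical:

```python
import pytest

def normalize_department(name: str) -> str:
    """Collapse whitespace and lowercase a free-text department name."""
    if name is None:
        raise ValueError("department name is required")
    return " ".join(name.strip().lower().split())

# Tests written first, TDD-style, pinning the expected behavior
def test_collapses_and_lowercases():
    assert normalize_department("  Human   Resources ") == "human resources"

def test_rejects_missing_name():
    with pytest.raises(ValueError):
        normalize_department(None)
```

The tests pin the expected behavior before (or alongside) the implementation, which is the core of the TDD workflow named in the requirements.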
Nice to Have
• Strong passion for operational excellence, ownership, and problem‑solving
• Experience delivering reliable, high‑performance production systems
• Ability to break down complex solutions into actionable work for agile teams
• Experience refining features, defining solutions, and driving continuous improvement initiatives
• 3–5 years of professional software development experience
• 3–5+ years working with AWS services such as Lambda and Kubernetes (an illustrative Lambda handler sketch follows this list)
• Proficiency in domain data modeling and API‑first design
• Proven track record of designing and delivering impactful technology solutions
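To illustrate the Lambda experience mentioned above, here is a minimal Python handler sketch for an S3 object-created notification; the event shape follows AWS's documented S3 notification format, while the bucket names and logging format are assumptions:

```python
import json
import urllib.parse

def handler(event, context):
    """Acknowledge each S3 object-created record for downstream processing."""
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded in S3 notifications
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        print(json.dumps({"bucket": bucket, "key": key, "status": "received"}))
    return {"processed": len(records)}
```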