Carex Consulting Group

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Senior Data Engineer position (one-year contract), remote, based out of Madison, WI; the pay rate is undisclosed. Requires 5+ years in Python, 3+ years with AWS services, and expertise in PySpark and Terraform.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
March 20, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Madison, WI
-
🧠 - Skills detailed
#BI (Business Intelligence) #Computer Science #Delta Lake #Consulting #S3 (Amazon Simple Storage Service) #Code Reviews #Data Access #EC2 #Version Control #Data Governance #Athena #Microservices #Scala #AWS (Amazon Web Services) #PySpark #Spark (Apache Spark) #Cloud #Kanban #RDS (Amazon Relational Database Service) #Data Pipeline #GitHub #Scrum #Security #Compliance #Infrastructure as Code (IaC) #ETL (Extract, Transform, Load) #Deployment #Data Science #Data Engineering #Storage #Terraform #Redshift #Agile #Python #Data Architecture #Continuous Deployment #Lambda (AWS Lambda) #Strategy #DynamoDB
Role description
W2 only, no C2C; must be authorized to work in the US, both now and in the future.

Senior Data Engineer (Contract) - Remote in Madison, WI. Apply now for this opportunity at Carex! Carex is partnering with an Insurance industry partner to find a highly skilled Senior Data Engineer. This contract-to-hire role (one-year contract) offers the opportunity to work in an Agile environment alongside cross-functional teams to design and optimize data architecture, drive innovation in data pipeline development, and support data-driven decision-making across the organization.

Summary
The Senior Data Engineer plays a pivotal role in designing, building, and maintaining scalable and reliable data infrastructure. This position supports development teams, analysts, and data scientists by creating solutions that enable efficient data access and transformation. The role includes mentoring junior engineers, contributing to the company's data architecture strategy, and ensuring compliance with best practices in cloud infrastructure and data governance.

What You'll Do
• Design, build, and manage scalable data pipelines using AWS services such as Glue, Lambda, EC2, S3, Redshift, and Delta Lake.
• Develop and deploy infrastructure as code (IaC) using Terraform to automate and manage cloud-based data services.
• Collaborate with cross-functional teams to gather requirements and translate them into efficient data solutions.
• Implement and optimize data flow and architecture to support data analytics, reporting, and business intelligence efforts.
• Build analytics tools and solutions to drive business insights and decision-making.
• Participate in the continuous integration and continuous deployment (CI/CD) process.
• Maintain existing systems through regular updates, troubleshooting, and performance optimization.
• Contribute to Agile development cycles, including sprint planning, reviews, and retrospectives.
• Provide technical mentorship and participate in code reviews to maintain high-quality development standards.
• Support and troubleshoot issues in production systems, including off-hours support when required.
• Ensure compliance with security and privacy standards in all data handling and processing.

What You'll Bring
• 5+ years of experience developing in Python and building scalable data pipelines.
• 3+ years of hands-on experience with AWS data-focused services and infrastructure-as-code (IaC) practices.
• Expertise in PySpark, Terraform (including modules), and AWS services such as Glue, Lambda, RDS, Redshift, DynamoDB, Athena, and S3.
• Strong understanding of CI/CD practices and version control using GitHub or similar tools.
• Familiarity with microservices, stream processing, message queuing, and scalable storage solutions.
• Proven ability to extract, manipulate, and transform large data sets for business insights.
• Experience applying Agile methodologies (Scrum/Kanban, Test-Driven Development) in a collaborative setting.
• Strong communication and organizational skills, with the ability to mentor peers and articulate complex solutions.
• Bachelor's degree in Computer Science, Information Systems, or a related field preferred.
• AWS certifications (e.g., AWS Certified Data Analytics – Specialty) are a plus.

Carex Consulting Group is an equal opportunity employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, marital status, disability, gender identity, or Veteran status.