MARS Solutions Group

Sr Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr Data Engineer on a long-term, 100% remote contract with an unspecified pay rate. Candidates must have strong AWS experience (Lambda, S3), advanced Python skills, and a proven background in cloud ETL pipeline design.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 14, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Lambda (AWS Lambda) #ETL (Extract, Transform, Load) #Deployment #GIT #Data Processing #AWS (Amazon Web Services) #AWS Lambda #S3 (Amazon Simple Storage Service) #Data Accuracy #Data Engineering #Data Architecture #Data Pipeline #Cloud #Python #Scala #Data Ingestion
Role description
Senior Cloud Data Engineer
Location: 100% Remote (U.S.-based candidates working Central Time Zone hours)
Engagement: Long-Term Contract
Client: Fortune 500 Financial Services Organization

About the Opportunity
Our client, a Fortune 500 financial services leader, is looking to bring on an experienced Senior Cloud Data Engineer to support a large-scale cloud and ETL modernization effort. The team is in the midst of a cloud journey and is focused on building modern, scalable, cloud-native data solutions. In this role, you'll work hands-on designing, building, and optimizing ETL pipelines using AWS services and modern data engineering best practices. You'll play a key role in transforming how data is ingested, processed, and delivered across the organization, with a real opportunity to make an impact on enterprise-level initiatives.

Key Responsibilities
- Design, build, and support cloud-native ETL pipelines leveraging AWS services.
- Develop and enhance data ingestion and transformation workflows using AWS Lambda, S3, and Python.
- Collaborate using Git to manage code, versioning, and deployments across the team.
- Partner closely with data architects, analysts, and stakeholders to ensure data accuracy, performance, and reliability.
- Continuously improve data pipelines for scalability, efficiency, and maintainability in a cloud environment.
- Identify, troubleshoot, and resolve data pipeline issues to keep data flowing smoothly.

Required Skills and Experience
Core Technical Skills
- Strong hands-on experience with AWS, specifically Lambda and S3.
- Advanced Python skills for data processing and transformation.
- Solid experience using Git in a collaborative development environment.
- Proven background in designing and implementing ETL pipelines in the cloud.

Additional Requirements
- Must be based in the United States.
- Ability to work Central Time Zone business hours.
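For candidates gauging fit, a minimal sketch of the kind of pipeline this role describes: an S3-triggered Lambda that reads an uploaded CSV, cleans it with a pure-Python transform, and writes the result to a curated bucket. The bucket name, record schema, and `transform` logic here are illustrative assumptions, not the client's actual pipeline.

```python
import csv
import io
from urllib.parse import unquote_plus

def transform(rows):
    """Normalize raw CSV rows: lowercase/strip header keys, strip values,
    and drop rows missing an 'id' field (schema is illustrative)."""
    cleaned = []
    for row in rows:
        row = {k.strip().lower(): (v or "").strip() for k, v in row.items()}
        if row.get("id"):
            cleaned.append(row)
    return cleaned

def lambda_handler(event, context):
    """S3-triggered entry point: fetch the new object, transform it,
    and land the cleaned copy in a hypothetical curated bucket."""
    import boto3  # imported lazily so transform() is testable without AWS
    s3 = boto3.client("s3")
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = unquote_plus(record["object"]["key"])

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    rows = list(csv.DictReader(io.StringIO(body)))
    cleaned = transform(rows)

    out = io.StringIO()
    if cleaned:
        writer = csv.DictWriter(out, fieldnames=list(cleaned[0].keys()))
        writer.writeheader()
        writer.writerows(cleaned)
    s3.put_object(
        Bucket="curated-data-bucket",  # hypothetical destination bucket
        Key=f"clean/{key}",
        Body=out.getvalue().encode("utf-8"),
    )
    return {"rows_in": len(rows), "rows_out": len(cleaned)}
```

Keeping the transform as a plain function separate from the AWS plumbing is what makes this style of pipeline unit-testable and Git-friendly, which matches the collaboration expectations listed above.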