

Senior AWS Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior AWS Data Engineer in Fort Mills, South Carolina (Hybrid 3 Days Onsite) for a 12+ month contract, offering competitive pay. Requires expertise in Python, Spark, ETL processes, and financial services experience with Equity and Fixed Income.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 25, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: New York, NY
Skills detailed: #Data Engineering #Automation #Data Pipeline #AWS (Amazon Web Services) #Deployment #Documentation #Spark (Apache Spark) #Computer Science #Quality Assurance #Data Processing #Scala #Leadership #Data Architecture #AWS Glue #Code Reviews #API (Application Programming Interface) #AWS Lambda #Compliance #Data Lakehouse #Automated Testing #ETL (Extract, Transform, Load) #Data Lake #Lambda (AWS Lambda) #Migration #Continuous Deployment #Athena #Agile #Stories #Python
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Agile Enterprise Solutions, Inc., is seeking the following. Apply via Dice today!
Role: Senior AWS Data Engineer
Job Location: Fort Mills, South Carolina (Hybrid 3 Days Onsite)
Duration: 12+ Months
Hiring: Contract
Job Summary:
Contribute to building state-of-the-art data platforms in AWS using Python and Spark, as part of a dynamic team building data solutions in a supportive, hybrid work environment. This role is ideal for an experienced data engineer looking to step into a leadership position while remaining hands-on with cutting-edge technologies. You will design, implement, and optimize ETL workflows using Python and Spark, contributing to our robust data Lakehouse architecture on AWS. Success in this role requires technical expertise, strong problem-solving skills, and the ability to collaborate effectively within an agile team.
Must Have Tech Skills:
• Demonstrable experience as a senior data engineer.
• Expert in Python and Spark, with a deep focus on ETL data processing and data engineering practices.
• Experience implementing data pipelines using tools such as EMR, AWS Glue, AWS Lambda, AWS Step Functions, API Gateway, and Athena.
• Experience with data services in a Lakehouse architecture.
Key Accountabilities:
• Provides guidance on best practices in design, development, and implementation, ensuring solutions meet business requirements and technical standards.
• Works closely with architects, Product Owners, and development team members to decompose solutions into Epics, leading the design and planning of these components.
• Drives the migration of existing data processing workflows to the Lakehouse architecture, leveraging Iceberg capabilities.
• Communicates complex technical information clearly, tailoring messages to the appropriate audience to ensure alignment.
Key Skills:
• Deep technical knowledge of data engineering solutions and practices, including implementation of data pipelines using AWS data services and Lakehouse capabilities.
• Highly proficient in Python and Spark, and familiar with a variety of development technologies, enabling the Senior Data Engineer to adapt solutions to project-specific needs.
• Skilled in decomposing solutions into components (Epics, stories) to streamline development.
• Proficient in creating clear, comprehensive documentation that supports knowledge sharing and compliance, making it accessible and valuable for future reference.
• Proficient in quality assurance practices, including code reviews, automated testing, and best practices for data validation.
• Experienced in leveraging automation tools and Continuous Integration/Continuous Deployment (CI/CD) pipelines to streamline development, testing, and deployment.
Educational Background:
• Bachelor's degree in Computer Science, Software Engineering, or a related field is essential.
Must have:
• Financial Services expertise, including work with Equity and Fixed Income asset classes and a working knowledge of Indices.
Nice To Have Tech Skills:
• Experience in solution architecture and technical design, allowing for the creation of scalable, reliable data architectures that meet both technical and business requirements.
• A master's degree or relevant certifications (e.g., AWS Certified Solutions Architect, Certified Data Analytics) is advantageous.