

AWS Data Engineer (Python, Dataiku)
Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Engineer (Python, Dataiku) on a long-term contract in San Francisco, CA. It requires 7+ years of experience with AWS technologies, data pipeline design, and Dataiku knowledge. Strong problem-solving skills and a degree in Computer Science are essential.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 28, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: San Francisco, CA
Skills detailed: #Lambda (AWS Lambda) #System Testing #Redshift #Automation #Data Pipeline #S3 (Amazon Simple Storage Service) #Data Storage #Snowflake #Computer Science #Compliance #Batch #Data Science #ETL (Extract, Transform, Load) #Data Processing #DataOps #Python #Data Engineering #Security #AWS (Amazon Web Services) #Databricks #Storage #Scala #Data Architecture #Data Quality #Data Transformations #Dataiku #AWS Glue #Cloud #Migration
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, American IT Systems, is seeking the following. Apply via Dice today!
AWS Data Engineer (Python, Dataiku)
Location: San Francisco, CA (Hybrid)
Long-term contract position
Role: AWS Data Engineer (Python, Dataiku)
Job Brief
As an AWS Data Engineer, your role will be to design, develop, and maintain scalable data pipelines on AWS. You will work closely with technical analysts, client stakeholders, data scientists, and other team members to ensure data quality and integrity while optimizing data storage solutions for performance and cost-efficiency. This role requires leveraging AWS native technologies and Databricks for data transformations and scalable data processing.
Responsibilities
• Lead and support the delivery of data platform modernization projects.
• Design and develop robust and scalable data pipelines leveraging AWS native services.
• Optimize ETL processes, ensuring efficient data transformation.
• Migrate workflows from on-premise to AWS cloud, ensuring data quality and consistency.
• Design automations and integrations to resolve data inconsistencies and quality issues.
• Perform system testing and validation to ensure successful integration and functionality.
• Implement security and compliance controls in the cloud environment.
• Ensure data quality pre- and post-migration through validation checks, addressing issues of completeness, consistency, and accuracy in data sets.
• Collaborate with data architects and lead developers to identify and document manual data movement workflows and design automation strategies.
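To make the migration-validation responsibility concrete, here is a minimal sketch of the kind of pre-/post-migration completeness and consistency check the role describes. The function name, sample rows, and required fields are hypothetical; in practice the rows would come from an actual source/target extract (e.g. on-premise export vs. S3/Redshift load), not inline literals.

```python
# Hypothetical post-migration data-quality check: compares a source extract
# against the migrated target for row-count consistency and field completeness.

def validate_migration(source_rows, target_rows, required_fields):
    """Return a list of human-readable data-quality issues (empty means clean)."""
    issues = []
    # Consistency: the migrated set should contain the same number of rows.
    if len(source_rows) != len(target_rows):
        issues.append(
            f"row count mismatch: source={len(source_rows)} target={len(target_rows)}"
        )
    # Completeness: every required field must be populated after migration.
    for i, row in enumerate(target_rows):
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            issues.append(f"row {i}: missing {missing}")
    return issues


source = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
target = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}]
print(validate_migration(source, target, ["id", "amount"]))
# prints ["row 1: missing ['amount']"]
```

Checks like this would typically run as an automated step in the pipeline (e.g. a validation stage after an AWS Glue job) rather than as an ad-hoc script.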
Skills And Requirements
• 7+ years of experience with a core data engineering skillset leveraging AWS native technologies (AWS Glue, Python, Snowflake, S3, Redshift).
• Experience in the design and development of robust and scalable data pipelines leveraging AWS native services.
• Proficiency in leveraging Snowflake for data transformations, optimization of ETL pipelines, and scalable data processing.
• Experience with streaming and batch data pipeline/engineering architectures.
• Familiarity with DataOps concepts and tooling for source control and setting up CI/CD pipelines on AWS.
• Hands-on experience with Databricks and a willingness to grow capabilities.
• Experience with data engineering and storage solutions (AWS Glue, EMR, Lambda, Redshift, S3).
• Strong problem-solving and analytical skills.
• Knowledge of Dataiku is required.
• Graduate/Post-Graduate degree in Computer Science or a related field.