

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer II in Newport Beach, CA, on a 12-month contract. It requires 5+ years of data engineering experience, expertise in SQL and Snowflake, and hands-on ETL development with dbt. Skills in Python, Git, and AWS are essential.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 3, 2025
Project duration: More than 6 months
Location type: Unknown
Contract type: 1099 Contractor
Security clearance: Unknown
Location detailed: Newport Beach, CA
Skills detailed: #S3 (Amazon Simple Storage Service) #SQL (Structured Query Language) #Data Pipeline #Deployment #AWS (Amazon Web Services) #Batch #AWS RDS (Amazon Relational Database Service) #Python #Git #Data Engineering #Matillion #Data Mart #Agile #Azure #ETL (Extract, Transform, Load) #dbt (data build tool) #Code Reviews #Snowflake #RDS (Amazon Relational Database Service) #Computer Science #Stories #Scrum #Scala #Data Ingestion
Role description
Job Title: Data Engineer II
Location: Newport Beach, CA, 92660
Duration: 12-month contract (good possibility of extension)
Job Description
Summary:
• Work with technology and business stakeholders to understand data requirements.
• Apply data warehousing concepts; must have participated in implementations building data hubs, data marts, etc.
• Build scalable, reliable data pipelines that support data ingestion (batch and/or streaming) and transformation from sources such as flat files, SQL databases, AWS RDS, and S3, centralizing the information in Snowflake (see the sketch after this list).
• Hands-on experience with ingestion, transformation, and database technologies.
• Expertise in SQL: complex query writing, query organization, and optimization.
• Create unit/integration tests and implement automated build and deployment.
• Participate in code reviews to enforce standards and best practices.
• Deploy, monitor, and maintain production systems.
• Create and update user stories in the backlog; participate in agile scrum ceremonies.
• Collaborate with the product owner, analysts, architects, QA, and other team members.
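As a rough illustration of the batch-ingestion work described above, here is a minimal Snowflake sketch. The external stage (raw_s3_stage), landing table (raw_orders), and CSV file format are hypothetical names for this example, not details from the posting:

```sql
-- Define a CSV file format for incoming flat files (illustrative settings).
CREATE FILE FORMAT IF NOT EXISTS csv_fmt
  TYPE = 'CSV'
  SKIP_HEADER = 1
  FIELD_OPTIONALLY_ENCLOSED_BY = '"';

-- Batch-load files from a hypothetical S3 external stage into a landing table.
COPY INTO raw_orders
  FROM @raw_s3_stage/orders/
  FILE_FORMAT = (FORMAT_NAME = 'csv_fmt')
  ON_ERROR = 'ABORT_STATEMENT';  -- fail fast so bad files surface in monitoring
```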
The experience you will bring:
• Minimum 2 years of hands-on ETL development experience using dbt (a minimal model sketch follows this list)
• 5+ years of hands-on experience working with SQL and the Snowflake database
• Minimum 1 year of hands-on experience (not just training or a POC) using Git and Python
• Strong communication skills and the ability to work in collaborative environments
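For context on the dbt requirement, a minimal staging model might look like the sketch below. It assumes a source named raw with a raw_orders table declared in the project's sources file; all column names are hypothetical:

```sql
-- models/staging/stg_orders.sql: a minimal dbt model sketch.
-- Assumes a source 'raw' with table 'raw_orders' declared in a sources .yml;
-- column names are illustrative only.
{{ config(materialized='view') }}

select
    order_id,
    customer_id,
    cast(order_ts as timestamp_ntz) as ordered_at,
    amount_usd
from {{ source('raw', 'raw_orders') }}
where order_id is not null
```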
What will make you stand out:
• Hands-on experience in ELT development using Matillion (ingestion tool), dbt (transformation tool), and Snowflake (database technology)
• Experience working with Azure DevOps, including build and release CI/CD pipelines
• Experience working with AWS and Control-M
• Experience coding complex transformations (not just extract/load mappings) in dbt; a sketch of what that can mean follows this list
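To show the difference between a plain extract/load mapping and a "complex transformation," here is a hedged dbt sketch combining deduplication, incremental loading, and a window aggregate. It builds on the hypothetical stg_orders model above; every name is illustrative:

```sql
-- models/marts/fct_customer_orders.sql: sketch of a complex dbt transformation
-- (dedup + incremental load + running total), not just an extract/load mapping.
{{ config(materialized='incremental', unique_key='order_id') }}

with ranked as (
    select
        *,
        row_number() over (
            partition by order_id
            order by ordered_at desc
        ) as rn
    from {{ ref('stg_orders') }}
    {% if is_incremental() %}
    -- On incremental runs, only process rows newer than what's already loaded.
    where ordered_at > (select max(ordered_at) from {{ this }})
    {% endif %}
)

select
    order_id,
    customer_id,
    ordered_at,
    amount_usd,
    sum(amount_usd) over (
        partition by customer_id
        order by ordered_at
    ) as customer_lifetime_usd
from ranked
where rn = 1  -- keep only the latest version of each order
```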
Typical Qualifications:
• 5+ years of data engineering experience
• BS degree in IT, Computer Science, or Engineering