

Quality Choice Solutions
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in Durham, NC, on a 12+ month contract at a W2 pay rate. It requires 6-9 years of experience and expertise in Snowflake, SQL, and AWS; an AWS certification is highly desired.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 24, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Durham, NC
-
🧠 - Skills detailed
#Data Engineering #Automation #Informatica #Aurora #AWS RDS (Amazon Relational Database Service) #Oracle #AWS Lambda #Big Data #AWS (Amazon Web Services) #Lambda (AWS Lambda) #Snowflake #SQL Queries #Data Architecture #Databases #Python #Data Lake #S3 (Amazon Simple Storage Service) #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Debugging #RDS (Amazon Relational Database Service) #Cloud #AWS S3 (Amazon Simple Storage Service) #Redshift #Documentation #Data Pipeline
Role description
Title: Data Engineer
Location: Durham, NC (Local candidates only)
Duration: 12+ months Contract
Rate: W2 Only
USC, GC Only
Video Interview (2 Rounds)
Job Description
Looking for an L3 Data Engineer in Durham only, with 6-9 years of experience. The team is Wealth.
Project Details: Outside carrier – core keeping system – moving data from PowerBI to Snowflake
Top Skills: Snowflake, SQL, Informatica, PowerBI (please note: more of a nice-to-have, but if they have this it is a home run)
Nice to Have: AWS
Interviews: 2 rounds
As a Database Engineer, you will apply your software development, server, and automation skills to develop, test, deploy, maintain, and improve a record-keeping, rep-facing application.
The Expertise and Skills You Bring
3+ years of experience in database development
Writing SQL queries and debugging stored procedures within an Oracle environment.
Experience in migrating on-prem relational databases to AWS RDS (Oracle), Aurora, Redshift, DynamoDB (or similar)
Experience in implementing AWS Lambda Architecture
Design and build production data pipelines from ingestion to consumption with a big data architecture using AWS S3, Kinesis, Glue, Lambda, and Python (see the illustrative sketch after this list)
Understanding of data transformation processes to AWS Data Lake / Snowflake
Provide a detailed assessment of the current state of the data platform and create a transition path to the AWS Cloud
AWS Certification is highly desired
Knowledge of Informatica and/or ETL tools
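For illustration only (not part of the original posting): a minimal sketch of the kind of pipeline component the list above describes, assuming a Python AWS Lambda handler triggered by a Kinesis stream that lands raw records in an S3 landing bucket. The bucket, prefix, and environment variable names are hypothetical placeholders.

```python
import base64
import os

import boto3

# Hypothetical landing location; in practice these would come from the project's config.
LANDING_BUCKET = os.environ.get("LANDING_BUCKET", "example-data-lake-raw")
LANDING_PREFIX = os.environ.get("LANDING_PREFIX", "kinesis/raw/")

s3 = boto3.client("s3")


def handler(event, context):
    """Decode the Kinesis records in the trigger event and write them to S3 as JSON lines."""
    lines = []
    for record in event.get("Records", []):
        payload = base64.b64decode(record["kinesis"]["data"]).decode("utf-8")
        lines.append(payload)

    if not lines:
        return {"written": 0}

    # One object per invocation, keyed by the batch's first sequence number.
    key = LANDING_PREFIX + event["Records"][0]["kinesis"]["sequenceNumber"] + ".jsonl"
    s3.put_object(Bucket=LANDING_BUCKET, Key=key, Body="\n".join(lines).encode("utf-8"))
    return {"written": len(lines), "key": key}
```

From that landing zone, a Glue job or a Snowflake COPY INTO would typically handle the downstream transformation toward the AWS Data Lake / Snowflake targets mentioned above.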
The Skills You Bring
Define, maintain, and support our enterprise products
Perform troubleshooting and triaging, and assist in production and non-production environments
Standout colleague, self-starter, collaborative, innovative and eager to learn every day.
Excellent communication and documentation skills.
Enjoy experimenting with development solutions
Ability to multi-task within various initiatives if needed
The Value You Deliver
Accountable for consistent delivery of functional software – sprint to sprint, release to release
Excellence in software development practices and procedures
Participates in application-level architecture
Develops original and creative technical solutions to ongoing development efforts
Responsible for QA readiness of software deliverables (end-to-end tests, unit tests, automation)
Responsible for supporting implementation of moderate-scope projects or major initiatives
Works on complex assignments and often multiple phases of a project






