

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a contract length of 6-12 months, based in Jersey City, NJ/Westlake, TX/Raleigh, NC (Hybrid). Key skills include Snowflake, ETL technologies, cloud development, and Python/Java. A degree in Computer Science or Engineering is required.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 11, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Dallas-Fort Worth Metroplex
Skills detailed: #Scala #Strategy #Kafka (Apache Kafka) #ETL (Extract, Transform, Load) #Informatica #AI (Artificial Intelligence) #AWS SageMaker #AWS (Amazon Web Services) #Python #Cloud #Artifactory #Deployment #ML (Machine Learning) #API (Application Programming Interface) #Agile #Scrum #Jenkins #Computer Science #Data Analysis #SQL (Structured Query Language) #Data Engineering #Snowflake #Data Modeling #Kubernetes #SageMaker #Databases #Azure #Java
Role description
Title: Data Engineer (Need Locals on W2)
Location: Jersey City, NJ/ Westlake, TX/ Raleigh, NC (Hybrid)
Duration: 6-12 months+
Notes:
This role has two open positions. It is a long-term contract, and the client needs candidates for review ASAP.
Contract Description:
• Working as part of an agile Scrum team
• Building scalable and robust ETL data flows and databases using a range of technologies (see the illustrative sketch after this list)
• Exploring new technology trends and leveraging them to simplify our data ecosystem
• Identifying and resolving issues within the production and non-production environments
• Collaborating with internal and external teams to deliver technology solutions that meet business needs
• Documenting and sharing technical solutions and diagrams
• Designing and implementing solutions that align with the wider Fidelity technology strategy
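To make the ETL bullet above concrete, here is a minimal sketch of the kind of flow it describes: consuming JSON events from Kafka and micro-batching them into Snowflake. It is illustrative only and not part of the posting; the topic, table, column names, and credentials are placeholders, and it assumes the kafka-python and snowflake-connector-python packages.

# Illustrative Kafka -> Snowflake micro-batch loader; all names are placeholders.
import json
import os

from kafka import KafkaConsumer             # pip install kafka-python
import snowflake.connector                  # pip install snowflake-connector-python

consumer = KafkaConsumer(
    "orders",                               # placeholder topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
cursor = conn.cursor()

batch = []
for message in consumer:
    event = message.value
    # Transform: keep only the fields the target table expects.
    batch.append((event["order_id"], event["amount"], event["ts"]))
    if len(batch) >= 500:                   # micro-batch to cut round trips
        cursor.executemany(
            "INSERT INTO raw_orders (order_id, amount, ts) VALUES (%s, %s, %s)",
            batch,
        )
        batch.clear()

In practice a bulk load would usually go through a stage and COPY INTO (or Snowpipe) rather than row inserts; the sketch only shows the overall shape of such a flow.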
Qualifications:
• Bachelor's / Master's degree or equivalent in Computer Science or Engineering
• Strong problem-solving and data analysis skills within complex, distributed deployments
• Extensive experience with Snowflake
• Knowledge of ETL and streaming technologies (Informatica, SnapLogic, Kafka, etc.)
• Experience in data modeling, advanced SQL, and performance tuning (see the SQL example after this list)
• Experience developing data applications in the cloud (AWS, Azure, Google Cloud)
• Development experience using Python/Java
• Practical experience with containerization and Kubernetes
• Comfortable working with standard CI/CD tools and pipelines (Jenkins, Artifactory, uDeploy, etc.)
• Any experience in AI/ML (ML fundamentals, AWS SageMaker, etc.) and API development is a big plus
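As a hedged illustration of the advanced-SQL expectation above (again, not part of the posting), Snowflake's QUALIFY clause can deduplicate a landing table in a single pass; the table, columns, and connection details below are invented.

# Illustrative window-function dedup using Snowflake's QUALIFY clause.
import os

import snowflake.connector                  # pip install snowflake-connector-python

# Keep only the latest row per business key; all names are placeholders.
DEDUP_SQL = """
CREATE OR REPLACE TABLE orders_clean AS
SELECT *
FROM raw_orders
QUALIFY ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY ts DESC) = 1
"""

conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    conn.cursor().execute(DEDUP_SQL)
finally:
    conn.close()

Running the dedup inside the warehouse avoids moving data through the client and is a common first step when tuning a raw-to-curated pipeline.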