

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a 6-month contract-to-hire basis in Stratford, CT (Hybrid), paying $70/hr. Requires 7+ years of experience, strong Python and Terraform skills, and expertise in data architecture and machine learning.
Country
United States
Currency
$ USD
Day rate
$636.36
Date discovered
July 11, 2025
Project duration
More than 6 months
Location type
Hybrid
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
Stratford, CT
Skills detailed
#Compliance #Agile #Programming #Tableau #Anomaly Detection #Observability #Kafka (Apache Kafka) #Visualization #Data Warehouse #Data Architecture #Cloud #Documentation #Data Pipeline #GIT #Automated Testing #AWS (Amazon Web Services) #Spark (Apache Spark) #ETL (Extract, Transform, Load) #Version Control #Computer Science #Data Ingestion #Data Engineering #ML (Machine Learning) #Data Quality #Deployment #Security #Data Science #Apache Spark #Storage #Data Lake #Python #Data Processing #Apache Kafka #AI (Artificial Intelligence) #Data Security #Data Modeling #Terraform #AWS Kinesis #Azure #Automation
Role description
Data Engineer
6-Month Contract-to-Hire
Stratford, CT - Hybrid
Rate: $70/hr, W2 + Benefits
Salary: Up to $140,000/yr
Description:
As a Data Engineer, you will be responsible for the development, optimization, and management of data ingestion, transformation, and storage processes using modern frameworks. The successful candidate will implement infrastructure-as-code and automation solutions to streamline and build resiliency into data processing workflows. You will work closely with our data science and analytics teams to create and optimize data and machine learning models that support stakeholder needs.
The responsibilities of this position include, but are not limited to:
• Design, build, and maintain large-scale data systems, including data warehouses, data lakes, and data pipelines (see the sketch after this list)
• Develop and implement data architectures that meet the needs of our business stakeholders
• Collaborate with data scientists and analysts to develop and deploy machine learning models and data products
• Ensure data quality, security, and compliance with standards
• Develop and maintain technical documentation of data systems and architectures
• Troubleshoot and resolve data-related issues and optimize system performance
• Develop and implement automated testing and deployment scripts to ensure smooth and efficient delivery of data solutions
• Collaborate with cross-functional teams to identify and prioritize data requirements and develop solutions to meet business needs
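To make the pipeline and data lake responsibilities concrete, here is a minimal sketch of a batch ETL job, assuming PySpark; the bucket paths, column names, and schema are hypothetical and purely illustrative.

```python
# A minimal batch ETL sketch: ingest raw events, clean and type them,
# and write partitioned Parquet to a data lake. Paths and fields are
# hypothetical assumptions, not the employer's actual schema.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Ingest: read raw JSON events from a (hypothetical) landing zone.
raw = spark.read.json("s3://example-landing-zone/orders/2025/07/")

# Transform: drop malformed rows, normalize types, derive a partition column.
clean = (
    raw.dropna(subset=["order_id", "amount"])
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("created_at"))
)

# Load: write partitioned Parquet for downstream analytics and ML teams.
(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-data-lake/curated/orders/"))
```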
Basic Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or related field (or equivalent combination of education and experience)
• 7+ years of experience in data engineering, software development, or a related field
• Strong programming skills in Python, plus infrastructure-as-code experience with tools such as Terraform
• Strong understanding of data architecture, data modeling, and data warehousing concepts
• Experience with data pipeline tools such as Apache Spark, Apache Kafka, and AWS Kinesis (a streaming ingestion sketch follows this list)
• Experience with Artificial Intelligence and Machine Learning technologies
• Strong understanding of data security and compliance principles and practices
• Excellent problem-solving skills, with the ability to analyze complex data systems and identify areas for improvement
• Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams
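As an illustration of the streaming tools named above, here is a minimal consumer sketch, assuming the kafka-python client; the topic name, broker address, and event fields are hypothetical.

```python
# A minimal streaming ingestion sketch with Apache Kafka via kafka-python.
# Topic, broker, group id, and event fields are illustrative assumptions.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders-events",                      # hypothetical topic
    bootstrap_servers="localhost:9092",   # hypothetical broker
    group_id="data-eng-demo",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # A real pipeline would validate, enrich, and forward each event to a
    # warehouse or lake; here we just print two (assumed) fields.
    print(event.get("order_id"), event.get("amount"))
```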
Desired Skills:
• Experience with cloud-based data platforms such as AWS and Azure
• Experience with agile development methodologies and version control systems such as Git
• Experience with data visualization tools such as Tableau and Cognos Analytics
• Familiarity with complex data regulatory requirements such as NIST 800-171 and SOX
• Familiarity with OpenTelemetry standards, including optimization of observability data and real-time anomaly detection (a short sketch follows)
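To illustrate the real-time anomaly detection mentioned in the last item, here is a minimal sketch using a rolling z-score over a metric stream; the window size, threshold, and sample latencies are illustrative assumptions, not a production design.

```python
# A minimal rolling z-score detector for an observability metric stream.
# Window size and threshold are assumed values for illustration only.
from collections import deque
from statistics import mean, stdev

class RollingZScoreDetector:
    def __init__(self, window: int = 60, threshold: float = 3.0):
        self.values = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, value: float) -> bool:
        """Return True if `value` is anomalous relative to the recent window."""
        anomalous = False
        if len(self.values) >= 2:  # stdev needs at least two samples
            mu, sigma = mean(self.values), stdev(self.values)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                anomalous = True
        self.values.append(value)
        return anomalous

# Example: flag a latency spike in a stream of (hypothetical) p99 readings.
detector = RollingZScoreDetector(window=120, threshold=3.0)
for latency_ms in [12, 13, 11, 12, 14, 13, 250, 12]:
    if detector.observe(latency_ms):
        print(f"anomaly: {latency_ms} ms")
```

A real deployment would consume metrics from an OpenTelemetry pipeline rather than a hard-coded list, but the windowed-statistics approach is the core idea.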