

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 6+ years of experience, specializing in Python, SQL, and AWS. The 2-year remote contract offers $65-$80/hr. Key skills include data pipeline design, Snowflake, and CI/CD experience. Bachelor's degree required.
Country: United States
Currency: $ USD
Day rate: $640
Date discovered: July 7, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Chicago, IL
Skills detailed: #Pandas #Data Manipulation #Data Engineering #Data Ingestion #Cloud #Lambda (AWS Lambda) #S3 (Amazon Simple Storage Service) #Complex Queries #Python #DynamoDB #SQL Queries #Storage #RDS (Amazon Relational Database Service) #Snowflake #Data Pipeline #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #Agile
Role description
Connect Search has a client seeking to hire multiple fully remote W2 AWS Cloud Data Engineers to join their team for a 2-year contract that can be extended or converted to full time.
No Corp-to-Corp or H1B.
Open to W2 and 1099 professionals ONLY.
Medical, Dental, Vision, and 401(k) for eligible employees.
Hourly range, paid weekly: $65/hr to $80/hr.
We are seeking a Data Engineer with experience in Python development and advanced data pipeline design to join our team. The ideal candidate will have strong SQL skills, expertise in dataframes and Pandas, experience with AWS ecosystems, and a deep understanding of enterprise data environments.
Key Responsibilities:
β’ Develop, maintain, and optimize data pipelines for enterprise systems.
β’ Utilize dataframes and Pandas for advanced data manipulation, analysis, and transformation tasks.
β’ Write and execute complex SQL queries for data modifications and transformations, including joining and updating tables.
β’ Load, process, and manage data in Snowflake and establish it as the system of record.
β’ Ensure proper data ingestion and create history tracking columns in RDS.
β’ Work within an AWS environment, leveraging tools like S3, RDS, DynamoDB, Lambda, and Kinesis for data streaming and storage.
β’ Build and tweak CI/CD pipelines to support agile delivery in two-week cycles.
β’ Collaborate with the dealer and enterprise data team to ingest and manage subscription data, customer information, and equipment records.
β’ Operate autonomously to execute high-priority tasks and troubleshoot complex data scenarios.
Requirements:
β’ Bachelors Degree Required
β’ At Least 6 years of experience in Data Engineering
β’ Proficiency in Python and extensive experience with dataframes and Pandas for data manipulation.
β’ Mid to advanced SQL skills for developing and understanding complex queries, with the ability to write and optimize SQL for Snowflake.
β’ Familiarity with AWS infrastructure, including data streaming and storage tools like S3, RDS, and Kinesis.
β’ Experience with CI/CD pipelines and agile workflows.
β’ Enterprise data experience with an understanding of subscription models and data streaming processes.
β’ Strong problem-solving skills and the ability to work autonomously.
Please apply if interested!