Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer on a 24-month fully remote contract, requiring 8+ years of experience in data engineering, proficiency in SQL and Python, and expertise in AWS services. W2 workers only; no visa sponsorship available.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
640
-
πŸ—“οΈ - Date discovered
August 14, 2025
πŸ•’ - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
πŸ“„ - Contract type
W2 Contractor
-
πŸ”’ - Security clearance
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Agile #Data Transformations #Data Engineering #RDS (Amazon Relational Database Service) #ETL (Extract, Transform, Load) #Python #SQL (Structured Query Language) #IoT (Internet of Things) #Pandas #AWS (Amazon Web Services) #Data Ingestion #S3 (Amazon Simple Storage Service) #DynamoDB #SQL Queries
Role description
Our client is going through a digital transformation of its IoT equipment and is seeking to hire multiple engineers. This is a 24-month, fully remote contract; the client can hire only W2 workers and cannot work with H1B holders or sponsor any visas.

Overview: We are seeking multiple experienced Data Engineers with 8+ years of experience, focusing on back-end pipeline development and AWS technologies. Proficiency in SQL and Python is essential, along with the ability to thrive in a fast-paced Agile environment.

Key Responsibilities:
• Python Development: Employ Python and Pandas for data transformations, converting SQL operations into Python code.
• Pipeline Development: Design and implement back-end data engineering pipelines for large-scale processing.
• Data Ingestion: Manage data ingestion from various sources into AWS, utilizing Kinesis and S3 for frequent streaming.
• SQL Proficiency: Write complex SQL queries, including multi-table joins to update records.
• Agile Delivery: Work within Agile teams to deliver features in 2-week cycles.
• CI/CD Implementation: Use CI/CD practices to streamline data workflows.

Qualifications:
• 8+ years of data engineering experience in large-scale systems.
• 2+ years with AWS services (S3, RDS, DynamoDB, Kinesis).
• Proficient in SQL, preferably advanced.
• Experience with Python and Pandas.
• Strong understanding of data streaming and ingestion.

Benefits can include: Medical, Dental, Vision, Life Insurance, Short-Term Disability, Long-Term Disability, 401(k), Vacation, PTO, Sick/Personal Time, Holidays, etc.
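To illustrate the "converting SQL operations into Python code" responsibility, here is a minimal sketch of translating a multi-table SQL join with aggregation into Pandas. The table and column names (devices, readings, temp_c) are hypothetical examples, not taken from the posting:

```python
import pandas as pd

# Hypothetical IoT-style tables; names and values are illustrative only.
devices = pd.DataFrame({
    "device_id": [1, 2, 3],
    "site": ["plant_a", "plant_a", "plant_b"],
})
readings = pd.DataFrame({
    "device_id": [1, 1, 2, 3],
    "temp_c": [20.5, 21.0, 19.8, 22.3],
})

# SQL equivalent:
#   SELECT d.site, AVG(r.temp_c) AS avg_temp
#   FROM readings r JOIN devices d ON r.device_id = d.device_id
#   GROUP BY d.site;
avg_by_site = (
    readings.merge(devices, on="device_id", how="inner")  # JOIN ... ON
            .groupby("site", as_index=False)              # GROUP BY
            .agg(avg_temp=("temp_c", "mean"))             # AVG(...)
)
print(avg_by_site)
```

The same pattern (merge for joins, groupby/agg for aggregation) covers most of the SQL-to-Pandas conversions the role describes.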