

Hanosys Inc
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with 15+ years of experience, offering a contract of over 6 months at $60.85 - $73.28 per hour. Key skills include Snowflake, ETL/ELT processes, Python, SQL, and cloud platforms like AWS.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
584
-
🗓️ - Date
October 14, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Piscataway, NJ
-
🧠 - Skills detailed
#Data Integration #SQL (Structured Query Language) #Data Pipeline #Scala #DMS (Data Migration Service) #S3 (Amazon Simple Storage Service) #Documentation #SQS (Simple Queue Service) #Batch #Cloud #SnowPipe #Schema Design #Snowflake #Kafka (Apache Kafka) #BI (Business Intelligence) #Lambda (AWS Lambda) #AWS S3 (Amazon Simple Storage Service) #Data Engineering #Qlik #AWS (Amazon Web Services) #Looker #Python #Data Warehouse #AWS DMS (AWS Database Migration Service) #dbt (data build tool) #Microsoft Power BI #Data Processing #DevOps #ETL (Extract, Transform, Load) #Tableau
Role description
Job Title: Senior Data Engineer
Experience: 15+ Years
Visa: USC, GC
Type: W2 (no C2C)
Job Description:
12-14+ years developing data pipelines for streaming and batch data processing needs to move data into and out of the Snowflake data warehouse
Collaborate with engineering and DevOps team members to implement, test, deploy, and operate data pipelines and ETL solutions
10 years of hands-on experience with Snowflake, including schema design, query optimization, and data-loading techniques
8 years of hands-on reporting experience leveraging Business Intelligence tools such as Looker, Qlik, Tableau, Power BI, etc.
Experience with DBT, including model development, testing, and documentation
Optimize data pipelines, ETL processes, and data integrations for large-scale data analytics use cases
7+ years of experience in building and maintaining data pipelines and ETL/ELT processes in data-intensive organizations
Design, build, and maintain scalable data pipelines using Snowflake and DBT.
Develop and manage ETL processes to ingest data from various sources into Snowflake.
Strong coding skills with Python and SQL for manipulating and analyzing data
Hands-on experience with data movement using Snowpipe, SnowSQL, etc.
Able to build data integrations and ingestion pipelines for streaming and batch data
5 years of designing and implementing operational, production-grade, large-scale data pipelines, ETL/ELT, and data integration solutions
Hands-on experience with cloud platforms such as AWS and Google Cloud
Proficiency with Kafka, AWS S3, SQS, Lambda, Pub/Sub, AWS DMS, Glue
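As a rough illustration of the Python + SQL batch-ETL work the requirements above describe, the sketch below uses the standard-library sqlite3 module as a stand-in for the warehouse; the sample events, table name, and transform logic are hypothetical, and in the actual role the load target would be Snowflake (e.g. via snowflake-connector-python) rather than a local database.

```python
import sqlite3

# Hypothetical raw extract: amounts arrive as strings, statuses vary.
raw_events = [
    {"user_id": 1, "amount": "19.99", "status": "complete"},
    {"user_id": 2, "amount": "5.00", "status": "pending"},
    {"user_id": 1, "amount": "7.50", "status": "complete"},
]

def transform(events):
    # Keep only completed events and cast amounts to float.
    return [
        (e["user_id"], float(e["amount"]))
        for e in events
        if e["status"] == "complete"
    ]

# Load: sqlite3 stands in for the Snowflake target here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (user_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", transform(raw_events))

# Aggregate per user, as a BI tool (Looker, Tableau, ...) might query it.
totals = dict(
    conn.execute("SELECT user_id, SUM(amount) FROM orders GROUP BY user_id")
)
print(totals)  # only user 1 survives the status filter
```

The same extract-transform-load shape applies whether the load step is an `executemany` as here, a `COPY INTO` from AWS S3, or a continuous Snowpipe ingestion.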
Note: Only W2 (No C2C)
Job Types: Full-time, Contract
Pay: $60.85 - $73.28 per hour
Expected hours: 70 per week
Work Location: On the road