

Senior Data Engineer (No C2C - W2 Only)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer on a W2 contract for 6–12 months, offering a competitive pay rate. Required skills include 6–8+ years in data engineering, expertise in SQL, Snowflake optimization, and experience with ETL pipelines and CI/CD automation.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
August 2, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#GitHub #Jenkins #SQL (Structured Query Language) #Data Lake #Data Warehouse #AI (Artificial Intelligence) #Storage #Data Architecture #Data Pipeline #Automation #Scala #ETL (Extract, Transform, Load) #Data Engineering #Data Science #Scripting #ML (Machine Learning) #Data Quality #Deployment #Data Storage #Snowflake
Role description
This is a W2 contract position; C2C and bench sales candidates will not be considered.
Job Overview:
We are seeking a seasoned Data Engineer with 6–8+ years of experience to support critical data initiatives and drive the development of scalable, high-performance data solutions. This role is ideal for someone with deep technical expertise in data pipeline engineering, Snowflake optimization, and secure data architecture.
Key Responsibilities:
• Design, build, and maintain scalable and efficient data pipelines.
• Develop and manage data warehouses and data lakes for analytics and machine learning use cases.
• Ensure data quality, integrity, and adherence to best practices.
• Optimize data storage, access, and retrieval for performance and cost-efficiency.
• Monitor and tune Snowflake environments for performance and credit usage.
• Implement secure data sharing, row-level access policies, and data masking for sensitive information (e.g., PII, financial data).
• Enforce robust role-based access controls (RBAC) within Snowflake.
• Collaborate with data scientists, analysts, and cross-functional teams to support business goals.
• Create semantic layers, aggregate tables, and governed data models (Star/Snowflake schemas) to enable scalable, business-friendly analytics.
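The data-quality responsibility above can be sketched in miniature. This is a hedged illustration, not the team's actual tooling: the validation rules, column names (`id`, `amount`), and row shapes are hypothetical, and in practice checks like these would run inside the warehouse or an orchestration framework rather than in plain Python.

```python
# Hypothetical row-level quality gate: split incoming rows into rows that
# pass simple integrity rules and rows that get quarantined for review.
def validate_rows(rows):
    """Return (valid, rejected) lists based on two example rules:
    a row must have a non-null primary key and a non-negative amount."""
    valid, rejected = [], []
    for row in rows:
        if row.get("id") is None or row.get("amount", 0) < 0:
            rejected.append(row)
        else:
            valid.append(row)
    return valid, rejected

valid, rejected = validate_rows([
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},   # missing key -> rejected
    {"id": 2, "amount": -3.0},     # negative amount -> rejected
])
```

Separating rejects rather than dropping them silently preserves an audit trail, which is the usual expectation when a pipeline enforces data-quality rules.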
Required Qualifications:
• 6–8+ years of hands-on experience in data engineering.
• Strong expertise in SQL and scripting languages.
• Proven experience with scalable ETL pipelines and scheduler tools.
• Experience with CI/CD orchestration and automation (e.g., Jenkins, GitHub).
• Deep knowledge of Snowflake, including performance tuning and secure data sharing.
• Ability to build semantic data models and design analytics architectures.
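As a rough picture of the "scalable ETL pipelines" this role calls for, here is a toy extract-transform-load sketch. Everything in it is a stand-in: the source records, field names, and in-memory "warehouse" are hypothetical, and a real load step would target a platform such as Snowflake via bulk ingestion rather than a Python list.

```python
# Toy ETL pipeline: each stage is a small, testable function.
def extract():
    # In practice: pull from an API, file drop, or source database.
    return [{"name": " Alice ", "spend": "12.5"}, {"name": "Bob", "spend": "7"}]

def transform(records):
    # Normalize types and trim whitespace before loading.
    return [{"name": r["name"].strip(), "spend": float(r["spend"])}
            for r in records]

def load(records, warehouse):
    # In practice: a bulk COPY into a warehouse table; here, a list append.
    warehouse.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

Keeping extract, transform, and load as separate pure-ish functions is what lets a scheduler or CI/CD system (Jenkins, GitHub Actions, etc.) run, retry, and test each stage independently.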
Preferred Qualifications:
• Experience with Machine Learning, LLMs, or AI model development/deployment.
• Strong written and verbal communication skills for cross-team collaboration.