

Ingress IT Services
Data Engineer || USC/GC (W2 Only)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with 7+ years of experience, offering a W2 contract. It is remote/hybrid, requiring expertise in SQL, Python, ETL processes, cloud platforms, and data warehousing tools like Snowflake and Redshift.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
400
-
🗓️ - Date
April 14, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Security #Data Quality #Tableau #AWS (Amazon Web Services) #Azure #AI (Artificial Intelligence) #Data Architecture #Kafka (Apache Kafka) #Data Engineering #Cloud #GCP (Google Cloud Platform) #Hadoop #Spark (Apache Spark) #Data Modeling #SQL Queries #BigQuery #SQL (Structured Query Language) #Data Lake #BI (Business Intelligence) #ML (Machine Learning) #Big Data #Data Science #DevOps #Data Processing #Python #Data Pipeline #Snowflake #Compliance #Microsoft Power BI #Redshift #Scala #Airflow #Data Analysis
Role description
🚀 Hiring: Senior Data Engineer (7+ Years Experience)
📍 Location: Remote / Hybrid / Onsite
🛂 Visa: USC & GC only
💼 Experience: 7+ Years
📅 Employment Type: Strictly on W2
🔍 Role Overview
We are looking for a highly skilled Senior Data Engineer with 7+ years of experience to design, build, and optimize scalable data pipelines and architectures. The ideal candidate will have strong expertise in data modeling, ETL processes, cloud platforms, and modern data stack technologies.
🛠️ Key Responsibilities
• Design, develop, and maintain robust ETL/ELT pipelines for large-scale data processing
• Build and optimize data architectures and data models for analytics and reporting
• Work with structured and unstructured data from multiple sources
• Ensure data quality, integrity, and governance across systems
• Collaborate with Data Analysts, Data Scientists, and Business stakeholders
• Optimize SQL queries and improve data performance and scalability
• Implement data warehousing solutions (Snowflake, Redshift, BigQuery, etc.)
• Develop and maintain workflows using orchestration tools like Airflow
• Ensure security and compliance of data systems
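To give candidates a feel for the kind of ETL work the responsibilities above describe, here is a minimal, stdlib-only Python sketch of an extract → transform → load pipeline with a basic data-quality gate. The table and column names (`fact_sales`, `user_id`, `amount`) are invented for illustration and are not part of this role's actual stack.

```python
# Illustrative only: a toy ETL pipeline in stdlib Python.
# Extract raw CSV, transform with a data-quality check, load into SQLite.
import csv
import io
import sqlite3

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cast types and drop rows that fail a basic quality check."""
    clean = []
    for row in rows:
        try:
            clean.append((row["user_id"], float(row["amount"])))
        except (KeyError, ValueError):
            continue  # data-quality gate: skip malformed rows
    return clean

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: write validated rows into the target table, return row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS fact_sales (user_id TEXT, amount REAL)")
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM fact_sales").fetchone()[0]

raw = "user_id,amount\nu1,19.99\nu2,bad\nu3,5.00\n"
conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(raw)), conn)
print(loaded)  # the malformed "u2" row is filtered out by the quality gate
```

In production this logic would typically live inside orchestrated tasks (e.g. an Airflow DAG) and target a warehouse such as Snowflake or Redshift rather than SQLite.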
💻 Required Skills & Qualifications
• 7+ years of experience in Data Engineering or related field
• Strong proficiency in SQL and Python
• Hands-on experience with ETL tools and data pipeline development
• Experience with cloud platforms (AWS / Azure / GCP)
• Expertise in data warehousing concepts and tools (Snowflake, Redshift, BigQuery)
• Experience with big data technologies (Spark, Hadoop)
• Knowledge of data modeling (Star Schema, Snowflake Schema)
• Familiarity with CI/CD pipelines and DevOps practices
• Strong problem-solving and analytical skills
⭐ Nice to Have
• Experience with streaming technologies (Kafka, Kinesis)
• Exposure to Data Lakes / Lakehouse architecture
• Experience with BI tools like Power BI or Tableau
• Knowledge of ML/AI data pipelines
