

Chelsoft Solutions Co.
GCP Data Engineer_Local to IL and MD_Only on W2
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer, contract-to-hire for 6 months, paying $48-$55/hr on W2. Candidates must be local to IL or MD, possess strong SQL and BigQuery skills, and have 6+ years of data engineering experience.
Country
United States
Currency
$ USD
Day rate
440
Date
April 2, 2026
Duration
More than 6 months
Location
Hybrid
Contract
W2 Contractor
Security
Unknown
Location detailed
Lisle, IL
Skills detailed
#Scala #Data Science #Migration #Data Management #Cloud #BigQuery #Agile #ETL (Extract, Transform, Load) #Data Engineering #Data Ingestion #Computer Science #REST (Representational State Transfer) #Data Pipeline #Kafka (Apache Kafka) #REST API #Metadata #Python #Data Quality #Programming #SQL (Structured Query Language) #GCP (Google Cloud Platform) #Spark (Apache Spark) #Deployment #Data Lineage #Airflow #Continuous Deployment #Dataflow #Strategy
Role description
GCP Data Engineer
Local to IL and MD Needed
Only on W2 (first 6 months)
Rate: $48/hr-$55/hr on W2
Visa: GC, USC, and H4 EAD only
Role Details:
• Title: Senior Data Engineer
• Location: Hybrid, 4 days onsite (Lisle, IL or Columbia, MD)
• Engagement Type: Contract-to-hire (conversion expected around 6 months)
Key Requirements:
• Strong SQL (top priority)
• Hands-on experience with BigQuery (required)
• Experience building and optimizing data pipelines (ETL/ELT)
• Experience with Airflow / Composer
• Background in GCP (Google Cloud Platform)
• Ability to work with business stakeholders and technical teams
Important Notes:
• This is a hybrid role (4 days onsite); please only submit candidates who are open to this
• Candidates must be open and able to convert to full-time
• BigQuery experience is mandatory; candidates without it will not be considered
• Strong communication skills are critical
Responsibilities
Work closely with various business, IT, Analyst and Data Science groups to collect business requirements.
Design, develop, deploy and support high performance data pipelines both inbound and outbound.
Model the data platform by applying business logic and building objects in its semantic layer.
Optimize data pipelines for performance, scalability, and reliability.
Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products.
Ensure the quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root.
Document the design and support strategy of the data pipelines.
Capture, store, and socialize data lineage and operational metadata.
Troubleshoot and resolve data engineering issues as they arise.
Develop REST APIs to expose data to other teams within the company.
Mentor and guide junior data engineers.
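As a rough illustration of the data-quality responsibility described above (validating critical data elements and preparing remediation input), the following Python sketch shows one common pattern. All column names and rules here are hypothetical examples, not taken from the posting:

```python
# Hypothetical sketch: screening a batch of records for missing critical
# data elements before load. Column names ("order_id", "amount") are
# illustrative only, not from the job description.

def check_quality(rows, required=("order_id", "amount")):
    """Split a batch of dict records into clean rows and an issue report."""
    clean, issues = [], []
    for i, row in enumerate(rows):
        missing = [col for col in required if row.get(col) in (None, "")]
        if missing:
            # Record which row failed and why, for a remediation plan.
            issues.append({"row": i, "missing": missing})
        else:
            clean.append(row)
    return clean, issues

batch = [
    {"order_id": 1, "amount": 9.99},
    {"order_id": None, "amount": 5.00},  # fails the null check
]
clean, issues = check_quality(batch)
```

In a real pipeline the issue report would typically be routed back to business and system owners rather than silently dropped, matching the "fix issues at their root" responsibility.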
Education
Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or other related technical field (Required)
Master's degree in Computer Science, Computer Engineering, Software Engineering, or other related technical field (Nice to Have)
Work Experience
6 years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics (Required), and
2 years of experience in Google Cloud with services such as BigQuery, Composer, GCS, Datastream, and Dataflow (Required)
Knowledge, Skills And Abilities
• Expert knowledge of SQL and Python programming
• Experience working with Airflow as a workflow management tool, including building operators to connect, extract, and ingest data as needed
• Experience tuning queries for performance and scalability
• Experience with real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar
• Excellent organizational, prioritization, and analytical abilities
• Proven experience with incremental execution through successful launches
• Excellent problem-solving and critical-thinking skills to recognize and comprehend complex data issues affecting the business environment
• Experience working in an agile environment
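The incremental-ingestion skills listed above often come down to idempotent merge logic: newer records replace older versions of the same key, so a replayed batch produces the same result. The sketch below shows this pattern in plain Python under assumed record shapes; the field names ("id", "updated_at") are hypothetical, not from the posting:

```python
# Hypothetical sketch: idempotent incremental merge where the latest
# record per key (by event timestamp) wins. Record shapes are
# illustrative only.

def merge_incremental(existing, new_batch, key="id", ts="updated_at"):
    """Upsert new_batch into existing; the newest timestamp wins per key."""
    merged = {r[key]: r for r in existing}
    for rec in new_batch:
        current = merged.get(rec[key])
        # Accept the record if the key is new or the record is newer.
        if current is None or rec[ts] >= current[ts]:
            merged[rec[key]] = rec
    return sorted(merged.values(), key=lambda r: r[key])

state = [{"id": 1, "updated_at": 10, "v": "a"}]
batch = [
    {"id": 1, "updated_at": 20, "v": "b"},  # newer: replaces id 1
    {"id": 2, "updated_at": 5, "v": "c"},   # new key: inserted
]
result = merge_incremental(state, batch)
```

The same upsert semantics are what a BigQuery `MERGE` statement expresses in SQL, which is typically how this is done at warehouse scale rather than in application code.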
Benefits: Health insurance options, 401(k) retirement savings options, statutory benefits, and other benefits may be available to eligible employees in accordance with applicable law and company policy. Immigration and work authorization support, as well as professional development opportunities, may also be available for eligible roles, subject to business needs, client requirements, and applicable law.






