

Compunnel Inc.
IT Data Engineer (GCP)
Featured Role | Apply directly with Data Freelance Hub
This role is for an IT Data Engineer (6-Month Contract, Remote - EST Hours) requiring 8+ years of Data Engineering experience, expertise in Google Cloud Platform, and skills in data ingestion/pipeline development. Familiarity with Cloud Data Fusion is a plus.
Country
United States
Currency
$ USD
-
Day rate
504
-
Date
October 9, 2025
Duration
More than 6 months
-
Location
Remote
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Maine, United States
-
Skills detailed
#Scala #Data Engineering #Jira #Azure #BigQuery #Azure Virtual Desktop #GitHub #GCP (Google Cloud Platform) #Data Extraction #Cloud #Security #Agile #SQL Queries #SQL (Structured Query Language) #Data Pipeline #Data Ingestion #ETL (Extract, Transform, Load) #Migration #SQL Server #Oracle
Role description
Immediate Opportunity: IT Data Engineer (6-Month Contract, Remote - EST Hours)
We are seeking an experienced Data Engineer to join a dynamic, growing organization focused on cloud migration initiatives. This is a fully remote role requiring collaboration during EST business hours, with an on-call rotation approximately every 4-6 weeks to support production needs.
About the Role
As a Data Engineer, you will design, develop, and maintain data ingestion and pipeline workflows on Google Cloud Platform (GCP). This position requires both independent work and teamwork to drive projects to successful completion. You will play a key role in helping the client migrate and optimize their cloud data infrastructure.
Must-Have Skills & Experience
• 8+ years in Data Engineering
• Hands-on experience with Google Pub/Sub, BigQuery, and Google Dataform
• Expertise in data ingestion and pipeline development (a minimal ingestion sketch follows this list)
• Experience with Cloud Composer, GitHub, SQL, and Agile methodologies
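As a rough illustration of the Pub/Sub and BigQuery skills listed above, here is a minimal ingestion sketch in Python. The project, subscription, and table names (my-project, raw-events-sub, my-project.analytics.raw_events) are hypothetical placeholders, not details from this posting, and the sketch assumes the google-cloud-pubsub and google-cloud-bigquery client libraries.

```python
# Minimal sketch: pull JSON messages from a Pub/Sub subscription and
# stream them into a BigQuery table. All names below are placeholders.
import json

from google.cloud import bigquery, pubsub_v1

PROJECT_ID = "my-project"                      # hypothetical project
SUBSCRIPTION_ID = "raw-events-sub"             # hypothetical subscription
TABLE_ID = "my-project.analytics.raw_events"   # hypothetical table

bq_client = bigquery.Client(project=PROJECT_ID)
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)


def handle_message(message: pubsub_v1.subscriber.message.Message) -> None:
    """Decode one Pub/Sub message and insert it as a BigQuery row."""
    row = json.loads(message.data.decode("utf-8"))
    errors = bq_client.insert_rows_json(TABLE_ID, [row])
    if errors:
        message.nack()  # let Pub/Sub redeliver for another attempt
    else:
        message.ack()


if __name__ == "__main__":
    future = subscriber.subscribe(subscription_path, callback=handle_message)
    print(f"Listening on {subscription_path}...")
    try:
        future.result()  # block until cancelled or an error occurs
    except KeyboardInterrupt:
        future.cancel()
```

In practice a one-row streaming insert like this would usually be batched or handled by a managed pipeline (for example Cloud Data Fusion or Dataflow); the sketch only shows the client-library calls involved.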
Nice to Have
• Familiarity with Cloud Data Fusion
• Experience with DBMS such as SQL Server, DB2, Oracle
• Working knowledge of RESTful APIs and SOAP web services
• Integration experience with Google resources and services
• Tools: Jira, Confluence
Key Responsibilities
• Design and implement scalable data pipelines on GCP (an orchestration sketch follows this list)
• Develop, test, and troubleshoot ingestion and workflow processes
• Collaborate with cross-functional teams and communicate project status
• Write and optimize SQL queries for data extraction and troubleshooting
• Document technical specifications and testing results
• Participate in project planning and execution phases
• Support production systems as part of an on-call rotation (every 4-6 weeks)
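Cloud Composer is managed Apache Airflow, so pipeline orchestration on this stack commonly follows the DAG pattern sketched below. The DAG name, schedule, SQL, and table references are hypothetical, and the example assumes the apache-airflow-providers-google package that ships with Composer environments; it is a sketch of the pattern, not a prescribed implementation for this role.

```python
# Minimal Cloud Composer (Airflow) DAG sketch: run a daily BigQuery
# transformation. All names and the SQL itself are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

# Hypothetical transformation: aggregate yesterday's raw events.
AGGREGATE_SQL = """
    SELECT event_date, COUNT(*) AS event_count
    FROM `my-project.analytics.raw_events`
    WHERE event_date = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
    GROUP BY event_date
"""

with DAG(
    dag_id="daily_event_aggregation",  # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="0 6 * * *",              # daily at 06:00
    catchup=False,
) as dag:
    aggregate_events = BigQueryInsertJobOperator(
        task_id="aggregate_events",
        configuration={
            "query": {
                "query": AGGREGATE_SQL,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "daily_event_counts",
                },
                "writeDisposition": "WRITE_APPEND",
                "useLegacySql": False,
            }
        },
    )
```

A real Composer deployment would typically chain several such tasks (ingestion, Dataform or SQL transformations, quality checks, alerting), but an operator call like the one above is the basic building block.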
Remote Work Setup Requirements
• Reliable high-speed internet
• Windows laptop/desktop with a supported OS and updated security software
• Additional monitor, camera, and headset for efficient communication
• Access to Azure Virtual Desktop is provided.