Compunnel Inc.

Data Engineer GCP

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer located in Lisle, IL or Columbia, MD, with a contract length of more than 6 months at a day rate of $545. It requires 6+ years in data engineering, 2+ years in GCP, and expertise in SQL, Python, and Airflow.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
545
🗓️ - Date
November 13, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Lisle, IL
🧠 - Skills detailed
#Deployment #Cloud #Agile #Metadata #REST API #BigQuery #Dataflow #Data Engineering #Data Lineage #Scala #Data Quality #Strategy #Data Ingestion #Continuous Deployment #Airflow #Data Management #GCP (Google Cloud Platform) #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Python #Data Pipeline #Spark (Apache Spark) #Data Science #REST (Representational State Transfer) #Kafka (Apache Kafka) #Programming
Role description
Job Title – GCP Data Engineer
Location – Lisle, IL or Columbia, MD (4 Days in Office)
Responsibilities
• Work closely with various business, IT, Analyst, and Data Science groups to collect business requirements.
• Design, develop, deploy, and support high-performance data pipelines, both inbound and outbound.
• Model the data platform by applying business logic and building objects in the semantic layer of the data platform.
• Optimize data pipelines for performance, scalability, and reliability.
• Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products.
• Ensure the quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root.
• Document the design and support strategy of the data pipelines.
• Capture, store, and socialize data lineage and operational metadata.
• Troubleshoot and resolve data engineering issues as they arise.
• Develop REST APIs to expose data to other teams within the company.
• Mentor and guide junior data engineers.
Work Experience
• 6+ years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics.
• 2+ years of experience in Google Cloud with services such as BigQuery, Composer, GCS, Datastream, and Dataflow.
Knowledge, Skills and Abilities
• Expert knowledge of SQL and Python programming.
• Experience working with Airflow as a workflow management tool, including building operators to connect, extract, and ingest data as needed.
• Experience tuning queries for performance and scalability.
• Experience with real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar.
• Proven experience delivering incrementally through successful launches.
• Experience working in an agile environment.
Thank You