Compunnel Inc.

Data Engineer (GCP - Google Cloud Platform)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (GCP) based in Lisle, IL or Columbia, MD, offering a 6-month contract with a pay rate of "X". It requires 6+ years of data engineering experience and expertise in SQL, Python, and GCP services.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
Unknown
πŸ—“οΈ - Date
November 14, 2025
πŸ•’ - Duration
More than 6 months
🏝️ - Location
On-site
πŸ“„ - Contract
Fixed Term
πŸ”’ - Security
Unknown
πŸ“ - Location detailed
Lisle, IL
🧠 - Skills detailed
#Strategy #Data Quality #BigQuery #Data Lineage #Metadata #Programming #Cloud #Deployment #Airflow #GCP (Google Cloud Platform) #REST (Representational State Transfer) #Spark (Apache Spark) #Data Ingestion #SQL (Structured Query Language) #Continuous Deployment #Kafka (Apache Kafka) #Data Science #Data Management #Agile #Data Engineering #Data Pipeline #REST API #Dataflow #ETL (Extract, Transform, Load) #Computer Science #Python #Scala
Role description
Job Title: GCP Data Engineer
Location: Lisle, IL or Columbia, MD (4 days in office)
Duration: 6 months, contract-to-hire or direct full-time hire

Responsibilities
• Work closely with various business, IT, analyst, and data science groups to collect business requirements.
• Design, develop, deploy, and support high-performance data pipelines, both inbound and outbound.
• Model the data platform by applying business logic and building objects in the platform's semantic layer.
• Optimize data pipelines for performance, scalability, and reliability.
• Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products.
• Ensure the quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root.
• Document the design and support strategy of the data pipelines.
• Capture, store, and socialize data lineage and operational metadata.
• Troubleshoot and resolve data engineering issues as they arise.
• Develop REST APIs to expose data to other teams within the company.
• Mentor and guide junior data engineers.

Education
• Required: Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field.
• Nice to have: Master's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field.

Work Experience
• Minimum 6 years of experience with data engineering solutions such as data platforms, ingestion, data management, or publication/analytics.
• 2 years of experience on Google Cloud with services such as BigQuery, Composer, GCS, Datastream, and Dataflow.

Knowledge, Skills and Abilities
• Expert knowledge of SQL and Python programming.
• Experience with Airflow as a workflow management tool, including building operators to connect to, extract, and ingest data as needed (a minimal sketch follows this list).
• Experience tuning queries for performance and scalability.
• Experience with real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar (see the streaming sketch below).
• Excellent organizational, prioritization, and analytical abilities.
• Proven experience delivering through incremental execution and successful launches.
• Excellent problem-solving and critical-thinking skills to recognize and understand complex data issues affecting the business.
• Experience working in an agile environment.
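
To give a concrete sense of the Airflow, Python, and BigQuery work this role describes, here is a minimal sketch of a batch pipeline: land raw files from GCS into a staging table, then apply business logic in SQL to build a semantic-layer table. This is an illustration only, not part of the posting; the project, bucket, dataset, and DAG names (example-project, example-landing, analytics, example_daily_ingest) are hypothetical placeholders, and it assumes Airflow 2.4+ with the Google provider package installed.

```python
"""Minimal Airflow batch-ingestion sketch (all names hypothetical)."""
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="example_daily_ingest",
    schedule="@daily",
    start_date=datetime(2025, 1, 1),
    catchup=False,
    default_args=default_args,
) as dag:
    # Inbound: land the day's raw files from GCS into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="example-landing",
        source_objects=["orders/{{ ds }}/*.json"],
        destination_project_dataset_table="example-project.staging.orders_raw",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )

    # Semantic layer: apply business logic in SQL to build the modeled table.
    build_semantic = BigQueryInsertJobOperator(
        task_id="build_semantic",
        configuration={
            "query": {
                "query": """
                    SELECT order_id, customer_id, SUM(amount) AS total_amount
                    FROM `example-project.staging.orders_raw`
                    GROUP BY order_id, customer_id
                """,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "analytics",
                    "tableId": "orders_daily",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )

    load_raw >> build_semantic
```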
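
For the real-time ingestion requirement, here is a minimal streaming sketch using the google-cloud-pubsub client's standard streaming-pull pattern. Again an illustration only: the project ID, subscription ID, and payload fields are hypothetical, and the callback body stands in for whatever validation and loading logic a production pipeline would apply.

```python
"""Minimal GCP Pub/Sub streaming-ingestion sketch (hypothetical IDs)."""
import json
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

# Hypothetical project and subscription; not details from this posting.
PROJECT_ID = "example-project"
SUBSCRIPTION_ID = "orders-sub"

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # Decode the payload; a real pipeline would validate it and write
    # downstream (e.g., to BigQuery) before acknowledging.
    record = json.loads(message.data.decode("utf-8"))
    print(f"Received order {record.get('order_id')}")
    message.ack()  # Acknowledge so Pub/Sub does not redeliver.


# Open a streaming pull; messages are dispatched to the callback on a
# background thread pool until the future is cancelled.
streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)

with subscriber:
    try:
        streaming_pull_future.result(timeout=60)  # Run for 60s in this demo.
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()  # Block until shutdown completes.
```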