Insight Global

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer; contract length and pay rate are unspecified. Key skills include GCP services, ETL tools, and proficiency in Python, Java, or Scala. Google Cloud Professional Data Engineer certification is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 28, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Quality #GCP (Google Cloud Platform) #Python #Dataflow #Java #Security #Data Modeling #Data Processing #Cloud #Programming #Documentation #ETL (Extract, Transform, Load) #Data Engineering #Storage #Data Integration #Data Governance #Data Science #Data Ingestion #Scala #BigQuery
Role description
Required Skills & Experience Proven experience as a Data Engineer with a focus on data ingestion Strong proficiency in Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage Experience with ETL tools and frameworks Proficiency in programming languages such as Python, Java, or Scala Familiarity with data modeling, data warehousing, and data integration concepts Nice to Have Skills & Experience Experience with real-time data processing and streaming technologies Knowledge of data governance and security best practices Certification in Google Cloud Professional Data Engineer or similar Responsibilities: Design, develop, and maintain scalable data ingestion pipelines on Google Cloud Platform (GCP). Implement data integration solutions to collect, process, and store large volumes of data from various sources. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and ensure data quality and integrity. Optimize and troubleshoot data ingestion processes to ensure high performance and reliability. Monitor and manage data workflows, ensuring timely and accurate data delivery. Develop and maintain documentation for data ingestion processes and workflows. Stay up-to-date with the latest GCP services and best practices in data engineering.