

Insight Global
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer; the contract length and pay rate are unspecified. Key skills include advanced SQL, Google Cloud Platform (GCP), and Python. Experience building ETL/ELT data pipelines is required.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
May 6, 2026
Duration
Unknown
-
Location
Unknown
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
United States
-
Skills detailed
#Data Quality #dbt (data build tool) #Scala #GCP (Google Cloud Platform) #Airflow #Data Integrity #Data Accuracy #Data Engineering #ETL (Extract, Transform, Load) #BigQuery #Data Pipeline #Documentation #Data Processing #SQL (Structured Query Language) #Data Modeling #Debugging #Python #Automation #Cloud #Complex Queries
Role description
About the Role
We're looking for a driven and self-motivated Data Engineer to join a fast-paced, collaborative team building scalable, production-grade data solutions. This is a great opportunity for someone who thrives in ambiguity, takes ownership end-to-end, and enjoys partnering across teams to deliver impactful data products.
You'll play a key role in designing, building, and optimizing data pipelines that power critical business insights, while ensuring high standards of performance, scalability, and data quality.
Key Responsibilities
• Design, build, and maintain scalable ETL/ELT data pipelines (a minimal sketch follows this list)
• Perform data validation, debugging, and root cause analysis to ensure data integrity
• Partner with cross-functional teams (data, product, engineering) to deliver data solutions aligned with business needs
• Support and troubleshoot production data issues with minimal disruption
• Continuously improve existing data systems and contribute to enhancements
• Maintain documentation, governance, and best practices across workflows
• Deliver high-quality, production-ready solutions focused on performance and reliability
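For illustration only, here is a minimal sketch of the kind of load-and-validate step these responsibilities describe, assuming the google-cloud-bigquery client library; the project, table, and bucket names are placeholders, not part of this posting:

# Hypothetical ELT step: load newline-delimited JSON from GCS into BigQuery,
# then run a simple data-quality check. All names below are placeholders.
from google.cloud import bigquery

PROJECT = "example-project"                       # placeholder project id
TABLE = "example-project.analytics.orders"        # placeholder destination table


def load_and_validate(source_uri: str) -> None:
    client = bigquery.Client(project=PROJECT)

    # Load (the "EL" in ELT): append the source files into the destination table.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    client.load_table_from_uri(source_uri, TABLE, job_config=job_config).result()

    # Validate: fail loudly on missing keys so issues surface before downstream
    # transforms run, which keeps root cause analysis close to the source.
    check_sql = f"SELECT COUNTIF(order_id IS NULL) AS null_ids, COUNT(*) AS total FROM `{TABLE}`"
    row = next(iter(client.query(check_sql).result()))
    if row.total == 0 or row.null_ids > 0:
        raise ValueError(f"Data quality check failed: {row.null_ids} null ids out of {row.total} rows")


if __name__ == "__main__":
    load_and_validate("gs://example-bucket/orders/*.json")  # placeholder GCS path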
Required Skills & Experience
• Advanced SQL expertise (complex queries, optimization, data validation)
• Hands-on experience with Google Cloud Platform (GCP), especially BigQuery (see the query sketch after this list)
• Strong proficiency in Python and/or Node.js for data processing and automation
• Experience building and maintaining data pipelines (ETL/ELT)
• Strong problem-solving skills with the ability to work independently
• Excellent communication and collaboration skills in cross-functional environments
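As a hedged illustration of the SQL and BigQuery expectations above (table and column names are invented for the example), the snippet below uses a dry run to estimate scanned bytes before executing a windowed deduplication query:

# Hypothetical example of query optimization and validation against BigQuery.
# Table and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

dedup_sql = """
    SELECT *
    FROM `example-project.analytics.orders`
    QUALIFY ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY updated_at DESC) = 1
"""

# Dry run first: estimates bytes scanned without actually executing the query.
dry = client.query(dedup_sql, job_config=bigquery.QueryJobConfig(dry_run=True, use_query_cache=False))
print(f"Estimated bytes processed: {dry.total_bytes_processed}")

# If the estimate looks reasonable, execute for real and consume the rows.
for row in client.query(dedup_sql).result():
    ...  # downstream processing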
Core Expectations
• Self-starter who can take ambiguous problems and drive them to resolution
• Strong sense of ownership and accountability
• Clear and effective communication around progress, risks, and dependencies
• Focus on system reliability, scalability, and performance
• Commitment to data accuracy and operational excellence
Nice to Have
• Experience with Airflow, dbt, or similar orchestration tools (an illustrative DAG sketch follows this list)
• Exposure to large-scale data environments and cloud-native architectures
• Background in data modeling, warehousing, or analytics platforms
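Purely as a sketch of the orchestration tools mentioned above, assuming Airflow 2.4 or later, a daily pipeline might be wired up like this; the DAG id, schedule, and task callables are illustrative placeholders:

# Hypothetical Airflow DAG wiring an extract/load task ahead of a transform task.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_load(**_):
    """Placeholder: would call a load-and-validate step like the one sketched earlier."""


def transform(**_):
    """Placeholder: would trigger a dbt run or a BigQuery transformation."""


with DAG(
    dag_id="orders_elt",              # placeholder DAG id
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load = PythonOperator(task_id="extract_load", python_callable=extract_load)
    build = PythonOperator(task_id="transform", python_callable=transform)
    load >> build                     # run the transform only after the load succeeds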






