Insight Global

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer; the contract length and pay rate are unspecified. Key skills include advanced SQL, Google Cloud Platform (GCP), and Python, and experience building ETL/ELT data pipelines is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
May 6, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Data Quality #dbt (data build tool) #Scala #GCP (Google Cloud Platform) #Airflow #Data Integrity #Data Accuracy #Data Engineering #ETL (Extract, Transform, Load) #BigQuery #Data Pipeline #Documentation #Data Processing #SQL (Structured Query Language) #Data Modeling #Debugging #Python #Automation #Cloud #Complex Queries
Role description
About the Role
We're looking for a driven, self-motivated Data Engineer to join a fast-paced, collaborative team building scalable, production-grade data solutions. This is a great opportunity for someone who thrives in ambiguity, takes end-to-end ownership, and enjoys partnering across teams to deliver impactful data products. You'll play a key role in designing, building, and optimizing the data pipelines that power critical business insights, while upholding high standards of performance, scalability, and data quality.

Key Responsibilities
• Design, build, and maintain scalable ETL/ELT data pipelines
• Perform data validation, debugging, and root cause analysis to ensure data integrity
• Partner with cross-functional teams (data, product, engineering) to deliver data solutions aligned with business needs
• Support and troubleshoot production data issues with minimal disruption
• Continuously improve existing data systems and contribute enhancements
• Maintain documentation, governance, and best practices across workflows
• Deliver high-quality, production-ready solutions focused on performance and reliability

Required Skills & Experience
• Advanced SQL expertise (complex queries, optimization, data validation)
• Hands-on experience with Google Cloud Platform (GCP), especially BigQuery
• Strong proficiency in Python and/or Node.js for data processing and automation
• Experience building and maintaining ETL/ELT data pipelines
• Strong problem-solving skills and the ability to work independently
• Excellent communication and collaboration skills in cross-functional environments

Core Expectations
• Self-starter who can take ambiguous problems and drive them to resolution
• Strong sense of ownership and accountability
• Clear, effective communication about progress, risks, and dependencies
• Focus on system reliability, scalability, and performance
• Commitment to data accuracy and operational excellence

Nice to Have
• Experience with Airflow, dbt, or similar orchestration tools
• Exposure to large-scale data environments and cloud-native architectures
• Background in data modeling, warehousing, or analytics platforms
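For candidates gauging fit, the core responsibility above (ETL/ELT pipelines with data validation) can be sketched in plain Python. This is a minimal illustrative example only, not part of the role or any real pipeline; every function and field name here is a hypothetical placeholder, with the extract and load steps standing in for what would be, e.g., BigQuery reads and writes in practice.

```python
# Illustrative ETL sketch: extract -> validate -> transform -> load.
# All names are hypothetical placeholders for this example.

def extract():
    # Stand-in for a source read (e.g. a warehouse SELECT).
    return [
        {"user_id": 1, "amount": "19.99"},
        {"user_id": 2, "amount": "5.00"},
        {"user_id": None, "amount": "3.50"},  # bad row: missing required key
    ]

def validate(rows):
    # Data-quality gate: drop rows missing a required key, count rejects.
    good = [r for r in rows if r["user_id"] is not None]
    return good, len(rows) - len(good)

def transform(rows):
    # Normalize types so downstream aggregation is safe.
    return [{"user_id": r["user_id"], "amount": float(r["amount"])} for r in rows]

def load(rows, sink):
    # Stand-in for a warehouse write (e.g. a load job).
    sink.extend(rows)
    return len(rows)

def run_pipeline(sink):
    rows, rejected = validate(extract())
    loaded = load(transform(rows), sink)
    return loaded, rejected

sink = []
loaded, rejected = run_pipeline(sink)
print(loaded, rejected)  # 2 rows loaded, 1 rejected
```

In a production setting each stage would typically be an orchestrated task (e.g. in Airflow) with the rejected-row count surfaced as a data-quality metric rather than a return value.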