GCP Data Engineer (contract)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer (contract) with a pay rate of $47.34-$73.96/hour, requiring 8-12 years of data engineering experience, expertise in SQL and DBT, and proficiency in GCP services, primarily BigQuery.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
584
-
🗓️ - Date discovered
September 5, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
New York, NY
-
🧠 - Skills detailed
#Data Pipeline #Computer Science #Datasets #dbt (data build tool) #Python #Data Science #GCP (Google Cloud Platform) #Compliance #Consulting #Data Engineering #Data Extraction #SQL (Structured Query Language) #Cloud #ETL (Extract, Transform, Load) #Scala #Version Control #GIT #Data Security #SQL Queries #Automation #BigQuery #Data Governance #Security #Data Modeling #Data Quality #Scripting #Data Analysis #Complex Queries
Role description
We are seeking a GCP Data Engineer who has hands-on experience with Google Cloud Platform (GCP) services, particularly BigQuery, and is proficient in DBT (Data Build Tool) for data transformation and modeling. This role requires strong SQL skills, a solid understanding of data warehousing concepts, and light proficiency in Python for scripting and automation.
Key Responsibilities
• Design, build, and maintain scalable data pipelines on GCP, primarily using BigQuery.
• Develop and manage DBT models to transform raw data into clean, tested, and documented datasets.
• Write complex and optimized SQL queries for data extraction, transformation, and analysis.
• Implement and maintain data warehousing solutions, ensuring performance, scalability, and reliability.
• Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions.
• Monitor and troubleshoot data pipeline performance and data quality issues.
• Automate data workflows and tasks using Python scripts where necessary.
• Ensure data governance, security, and compliance standards are met.
Required Qualifications
• Bachelor’s degree in Computer Science, Information Systems, or a related field.
• 8-12 years of experience in data engineering or a similar role.
• Strong expertise in SQL with the ability to write efficient, complex queries.
• Proficiency in DBT for data modeling and transformation.
• Hands-on experience with BigQuery and other GCP data services.
• Solid understanding of data warehousing principles and best practices.
• Basic to intermediate skills in Python for scripting and automation.
• Familiarity with version control systems such as Git.
• Excellent problem-solving and communication skills.
The pay range that the employer in good faith reasonably expects to pay for this position is $47.34/hour - $73.96/hour. Our benefits include medical, dental, vision, and retirement benefits. Applications will be accepted on an ongoing basis.
Tundra Technical Solutions is among North America’s leading providers of Staffing and Consulting Services. Our success and our clients’ success are built on a foundation of service excellence.
We are an equal opportunity employer, and we do not discriminate on the basis of race, religion, color, national origin, sex, sexual orientation, age, veteran status, disability, genetic information, or any other applicable legally protected characteristic. Qualified applicants with arrest or conviction records will be considered for employment in accordance with applicable law, including the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act.
Unincorporated LA County workers: we reasonably believe that criminal history may have a direct, adverse, and negative relationship with the following job duties, potentially resulting in the withdrawal of a conditional offer of employment: safeguard client-provided property, including hardware (both of which may include data) entrusted to you, from theft, loss, or damage; return all portable client computer hardware in your possession (including the data contained therein) upon completion of the assignment; and maintain the confidentiality of client proprietary, confidential, or non-public information. In addition, job duties require access to secure and protected client information technology systems and related data security obligations.