

Hope Tech
BigQuery Developer / GCP Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a contract position for a "BigQuery Developer / GCP Data Engineer" with a pay rate of "unknown." It is primarily remote, requiring occasional onsite presence near Woonsocket, RI. Candidates need 3-5 years of data engineering experience, strong SQL and BigQuery skills, and proficiency in Python, Dataflow, and Cloud Composer.
🌎 - Country
United States
💱 - Currency
Unknown
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 19, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Woonsocket, RI
-
🧠 - Skills detailed
#Scala #Apache Airflow #Data Processing #Data Engineering #Data Modeling #Data Warehouse #Dataflow #Debugging #Clustering #Python #Data Pipeline #Cloud #Airflow #EDW (Enterprise Data Warehouse) #Apache Beam #BigQuery #Batch #ETL (Extract, Transform, Load) #GitHub #GCP (Google Cloud Platform) #Datasets #SQL (Structured Query Language) #Version Control #Data Analysis
Role description
Description:
Role Type: Contract position. Engagement options available on W2 or C2C basis.
Location: Primarily remote. Occasional onsite presence may be required based on project needs. Preference will be given to candidates within driving distance of Woonsocket, RI.
Job Title: BigQuery Developer / GCP Data Engineer
Job Summary:
We are looking for a motivated and detail-oriented BigQuery Developer with hands-on experience in Google Cloud Platform to support and enhance our enterprise data warehouse and analytics solutions. The ideal candidate will have strong SQL, BigQuery, and Python development experience, along with working knowledge of Dataflow, Cloud Composer, GitHub, and CI/CD practices. This role requires strong analytical skills, problem-solving ability, and effective communication to work with cross-functional teams.
Key Responsibilities
BigQuery Development (Primary Focus)
Develop, maintain, and optimize BigQuery datasets, tables, views, procedures, and queries.
Write efficient and scalable SQL for reporting and analytics.
Implement partitioning and clustering to improve query performance (see the sketch after this list).
Support data warehouse design and data modeling activities.
Monitor query performance and optimize cost usage.
Troubleshoot and resolve data-related issues in BigQuery.
Support data validation and quality checks.
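
For flavor, here is a minimal sketch of the partitioning and clustering work above, using the google-cloud-bigquery Python client. The project, dataset, table, and column names are hypothetical placeholders, not details from this posting.

# Minimal sketch: create a date-partitioned, clustered BigQuery table.
# All identifiers below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

ddl = """
CREATE TABLE IF NOT EXISTS `my-project.analytics.orders`
(
  order_id    STRING,
  customer_id STRING,
  order_date  DATE,
  amount      NUMERIC
)
PARTITION BY order_date    -- lets BigQuery prune scans by date
CLUSTER BY customer_id     -- co-locates rows for common filter columns
"""

client.query(ddl).result()  # blocks until the DDL job completes

Partitioning limits how much data a date-filtered query scans (and therefore costs), while clustering sorts storage within each partition so filters on the cluster column read fewer blocks.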
Data Pipeline Development
Develop and maintain batch pipelines using Python and Google Cloud Dataflow (Apache Beam); a short sketch follows this list.
Load, transform, and integrate data from various sources into BigQuery.
Work on ETL/ELT processes and ensure reliable data processing.
Assist in debugging and performance tuning of pipelines.
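
A minimal sketch of the kind of batch pipeline described above, using the Apache Beam Python SDK. The bucket, table name, and schema are assumptions for illustration only.

# Minimal batch pipeline: read CSV files from Cloud Storage, parse rows,
# append them to a BigQuery table. Names are hypothetical placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line):
    # Assumes a simple two-column CSV: order_id,amount
    order_id, amount = line.split(",")
    return {"order_id": order_id, "amount": float(amount)}

options = PipelineOptions(
    runner="DirectRunner",  # swap in "DataflowRunner" (plus project/region/temp_location) on GCP
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/orders/*.csv")
        | "Parse" >> beam.Map(parse_line)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:analytics.orders_raw",
            schema="order_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )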
Workflow Orchestration
Develop and maintain workflows using Cloud Composer (Apache Airflow); see the sketch after this list.
Integrate workflows with Tidal Job Scheduler for enterprise scheduling.
Monitor production jobs and support issue resolution.
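
A minimal Cloud Composer sketch: a daily Airflow DAG that submits a BigQuery job. The DAG ID, schedule, and stored procedure are hypothetical; in practice a DAG like this would be one node in a larger schedule, possibly triggered through an enterprise scheduler such as Tidal.

# Minimal Airflow DAG for Cloud Composer: run a BigQuery job once a day.
# DAG ID, schedule, and the stored procedure are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_orders_load",
    start_date=datetime(2026, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    refresh_summary = BigQueryInsertJobOperator(
        task_id="refresh_order_summary",
        configuration={
            "query": {
                "query": "CALL `my-project.analytics.sp_refresh_order_summary`()",
                "useLegacySql": False,
            }
        },
    )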
Version Control & CI/CD
Use GitHub for source control and collaboration.
Contribute to CI/CD pipelines using GitHub Actions.
Follow best practices for code versioning and peer reviews.
Collaboration & Communication
Work closely with data analysts, business users, and technical teams.
Translate business requirements into efficient BigQuery solutions.
Document data flows, technical designs, and operational processes.
Provide production support as needed.
Requirements:
3–5 years of experience in data engineering or data development.
Strong hands-on experience with BigQuery.
Strong SQL skills (joins, aggregations, window functions, performance tuning); a short example follows this list.
Experience with Google Cloud Platform (GCP).
Experience building batch pipelines using Python and Dataflow.
Experience with Cloud Composer (Airflow).
Working knowledge of GitHub and GitHub Actions.
Experience with enterprise job schedulers such as Tidal.
Understanding of data warehousing concepts.
Strong analytical and problem-solving skills.
Good verbal and written communication skills.
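
To illustrate the window-function requirement, here is a hypothetical query run through the Python client: a running total per customer, computed without collapsing rows the way a GROUP BY would.

# Window-function example via the google-cloud-bigquery client.
# Table and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical

sql = """
SELECT
  customer_id,
  order_date,
  amount,
  -- Running total per customer, ordered by date
  SUM(amount) OVER (
    PARTITION BY customer_id
    ORDER BY order_date
  ) AS running_total
FROM `my-project.analytics.orders`
"""

for row in client.query(sql).result():
    print(row.customer_id, row.order_date, row.running_total)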




