GCP BigQuery Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP BigQuery Data Engineer on a contract basis, offering remote work. Key skills include Dataform, BigQuery, Cloud Composer/Airflow, SQL, and Python. Proven experience in data engineering and cloud platforms is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 24, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Data Catalog #Data Orchestration #BigQuery #Git #Programming #Data Pipeline #Scala #Version Control #Cloud #Metadata #Data Governance #Python #Airflow #Data Lineage #DevOps #ETL (Extract, Transform, Load) #Data Ingestion #Code Reviews #SQL (Structured Query Language) #Data Management #Apache Airflow #Data Engineering #GCP (Google Cloud Platform)
Role description

Job Title: GCP BigQuery Data Engineer

Location: Remote

Job Type: Contract

Job Description:

We are seeking a highly skilled Data Engineer to join our team. The ideal candidate will have strong experience in modern data engineering tools and practices, with a focus on building and maintaining robust data pipelines and solutions. This role requires a solid understanding of cloud data platforms, orchestration tools, and programming languages, along with the ability to integrate various services effectively.

Key Responsibilities:

Design, build, and maintain scalable data pipelines using Dataform and BigQuery.

Develop and orchestrate workflows using Cloud Composer or Apache Airflow.

Integrate and manage data from diverse sources through APIs, ensuring secure and efficient data exchange.

Implement and utilize Data Catalog and Dataplex for effective data governance, metadata management, and data lineage tracking.

Use Git for version control, collaborating with team members on code reviews and CI/CD pipelines.

Apply Medallion Architecture principles to design well-structured data models and layers (Bronze, Silver, and Gold).

Write clean, efficient, and well-documented code using SQL, Python, and Dataform SQLX.

Leverage Object-Oriented Programming (OOP) concepts, where applicable, to create modular and maintainable solutions.

Ensure seamless integration and interoperability of various data services and platforms.
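As a toy illustration of the Bronze/Silver/Gold layering named in the responsibilities above, the sketch below walks a batch of raw records through the three Medallion layers in plain Python. This is only a conceptual sketch: in the actual role these layers would be BigQuery tables managed through Dataform, and every function, field, and value here is hypothetical.

```python
# Toy Medallion Architecture pipeline (illustrative only):
# Bronze = raw records as ingested, Silver = cleaned/deduplicated,
# Gold = business-level aggregate. All names here are hypothetical.

def to_bronze(raw_rows):
    """Bronze layer: land raw records as-is, tagged with a source marker."""
    return [{"raw": row, "source": "api"} for row in raw_rows]

def to_silver(bronze_rows):
    """Silver layer: validate, type-cast, and deduplicate on id."""
    seen, silver = set(), []
    for row in bronze_rows:
        rec = row["raw"]
        if rec.get("id") is None or rec["id"] in seen:
            continue  # drop malformed or duplicate records
        seen.add(rec["id"])
        silver.append({"id": rec["id"], "amount": float(rec.get("amount", 0))})
    return silver

def to_gold(silver_rows):
    """Gold layer: aggregate the cleaned rows for reporting."""
    return {
        "row_count": len(silver_rows),
        "total_amount": sum(r["amount"] for r in silver_rows),
    }

raw = [
    {"id": 1, "amount": "10.5"},
    {"id": 1, "amount": "10.5"},  # duplicate, dropped in Silver
    {"id": 2, "amount": "4.5"},
    {"amount": "99"},             # missing id, dropped in Silver
]
gold = to_gold(to_silver(to_bronze(raw)))
```

In a Dataform/BigQuery implementation the same flow would typically be expressed as chained SQLX definitions rather than Python functions, with each layer materialized as its own dataset.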

Qualifications:

Proven experience as a Data Engineer or in a similar role.

Hands-on expertise with Dataform and BigQuery.

Proficiency in Cloud Composer or Apache Airflow for data orchestration.

Solid understanding of working with APIs for data ingestion and transformation.

Familiarity with Data Catalog and Dataplex.

Strong version control practices using Git.

Knowledge of Medallion Architecture and its application in data engineering.

Programming skills in SQL, Python, and Dataform SQLX.

Understanding of OOP principles is a plus.

Ability to design and implement data solutions that integrate multiple cloud services.

Preferred Skills:

Experience with cloud platforms like Google Cloud Platform (GCP).

Familiarity with CI/CD pipelines and DevOps practices.

Strong analytical and problem-solving skills.

Excellent communication and collaboration abilities.