

GCP BigQuery Data Engineer
Job Title: GCP BigQuery Data Engineer
Location: Remote
Job Type: Contract
Job Description:
We are seeking a highly skilled Data Engineer to join our team. The ideal candidate will have strong experience in modern data engineering tools and practices, with a focus on building and maintaining robust data pipelines and solutions. This role requires a solid understanding of cloud data platforms, orchestration tools, and programming languages, along with the ability to integrate various services effectively.
Key Responsibilities:
Design, build, and maintain scalable data pipelines using Dataform and BigQuery.
Develop and orchestrate workflows using Cloud Composer or Apache Airflow (see the illustrative sketch after this list).
Integrate and manage data from diverse sources through APIs, ensuring secure and efficient data exchange.
Implement and utilize Data Catalog and Dataplex for effective data governance, metadata management, and data lineage tracking.
Use Git for version control, collaborating with team members on code reviews and CI/CD pipelines.
Apply Medallion Architecture principles to design well-structured data models and layers (Bronze, Silver, and Gold).
Write clean, efficient, and well-documented code using SQL, Python, and Dataform SQLX.
Leverage Object-Oriented Programming (OOP) concepts, where applicable, to create modular and maintainable solutions.
Ensure seamless integration and interoperability of various data services and platforms.
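
For context, the orchestration work described above often looks something like the minimal sketch below. This is an illustrative example only, assuming Airflow 2.x with the apache-airflow-providers-google package installed; the project, dataset, and table names are hypothetical placeholders, not details of this engagement.

```python
# Illustrative sketch: an Airflow DAG that promotes raw (Bronze) rows into a
# cleaned Silver table with a BigQuery query job. All identifiers are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="bronze_to_silver_orders",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Run a BigQuery SQL job that builds the Silver-layer table from the Bronze layer.
    build_silver_orders = BigQueryInsertJobOperator(
        task_id="build_silver_orders",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-project.silver.orders` AS
                    SELECT
                        CAST(order_id AS INT64) AS order_id,
                        PARSE_DATE('%Y-%m-%d', order_date) AS order_date,
                        amount
                    FROM `my-project.bronze.orders_raw`
                    WHERE order_id IS NOT NULL
                """,
                "useLegacySql": False,
            }
        },
        location="US",
    )
```

In practice, transformations like this would typically live in Dataform SQLX files, with Composer/Airflow invoking the Dataform workflow; the inline query here simply keeps the sketch self-contained.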
Qualifications:
Proven experience as a Data Engineer or in a similar role.
Hands-on expertise with Dataform and BigQuery.
Proficiency in Cloud Composer or Apache Airflow for data orchestration.
Solid understanding of working with APIs for data ingestion and transformation.
Familiarity with Data Catalog and Dataplex.
Strong version control practices using Git.
Knowledge of Medallion Architecture and its application in data engineering.
Programming skills in SQL, Python, and Dataform SQLX.
Understanding of OOP principles is a plus.
Ability to design and implement data solutions that integrate multiple cloud services.
Preferred Skills:
Experience with cloud platforms like Google Cloud Platform (GCP).
Familiarity with CI/CD pipelines and DevOps practices.
Strong analytical and problem-solving skills.
Excellent communication and collaboration abilities.