

The Judge Group
GCP Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer with a contract length of "unknown" and a day rate of $600. Key skills include expertise in GCP services, Python, SQL, and ETL processes. A degree in a related field and 5+ years of data management experience are required.
Country: United States
Currency: $ USD
Day rate: $600
Date: December 24, 2025
Duration: Unknown
Location: Unknown
Contract: Unknown
Security: Unknown
Location detailed: Lone Tree, CO
Skills detailed: #BigQuery #GCP (Google Cloud Platform) #Security #Cloud #Data Analysis #Data Security #Data Engineering #Data Mart #Python #Data Architecture #Storage #ETL (Extract, Transform, Load) #Data Management #Automation #Data Pipeline #Data Quality #Data Extraction #Google Cloud Storage #Scala #Data Processing #Visualization #Agile #SQL Queries #Data Governance #Cybersecurity #SQL (Structured Query Language) #Computer Science
Role description
We are looking for an experienced Data Engineer to build and maintain scalable data pipelines on Google Cloud Platform (GCP). In this role, you will play a crucial part in serving our Cyber Security data mart and supporting security analytics.
Must Have:
• Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
• 5+ years of hands-on data management experience: gathering data from multiple sources, consolidating it into a single centralized location, and transforming it with business logic into a form that is consumable for visualization and data analysis.
• Strong expertise in Google BigQuery, Google Cloud Storage, Cloud Composer, and related Google Cloud Platform (GCP) services.
• Proficiency in Python and SQL for data processing and automation (see the sketch after this list).
• Experience with ETL processes and data pipeline design.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration skills.
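To make the Python and SQL expectations above concrete, here is a minimal sketch of the kind of task involved: running a parameterized BigQuery query from Python and pulling the result into a DataFrame. The project, dataset, and table names are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch: a parameterized BigQuery query run from Python.
# Project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-security-project")  # hypothetical project ID

query = """
    SELECT event_date, source_system, COUNT(*) AS event_count
    FROM `my-security-project.security_mart.events`
    WHERE event_date >= @start_date
    GROUP BY event_date, source_system
    ORDER BY event_date
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2025-01-01"),
    ]
)

# Run the query and materialize the result as a pandas DataFrame
# (requires the pandas and db-dtypes extras of the BigQuery client).
df = client.query(query, job_config=job_config).to_dataframe()
print(df.head())
```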
What you're good at
• Design, build, and maintain scalable data pipelines using Google Cloud Platform tools such as BigQuery, Cloud Storage, and Cloud Composer (a minimal DAG sketch follows this list).
• Develop and optimize SQL queries to support data extraction, transformation, and loading (ETL) processes.
• Collaborate with cross-functional teams, including business customers and Subject Matter Experts, to understand data requirements and deliver effective solutions.
• Implement best practices for data quality, data governance, and data security.
• Monitor and troubleshoot data pipeline issues, ensuring high availability and performance.
• Contribute to data architecture decisions and provide recommendations for improving the data pipeline.
• Stay up to date with emerging trends and technologies in cloud-based data engineering and cybersecurity.
• Exceptional communication skills, including the ability to gather relevant data and information, actively listen, dialogue freely, and verbalize ideas effectively.
• Ability to work in an Agile environment, delivering incremental value to customers by managing and prioritizing tasks.
• Proactively lead investigation and resolution efforts when data issues are identified, taking ownership to resolve them in a timely manner.
• Ability to implement and document processes and procedures for producing metrics.
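For a concrete sense of the pipeline work described above, here is a minimal Cloud Composer (Airflow) DAG sketch that stages a daily CSV from Cloud Storage into BigQuery. All bucket, dataset, and table names are hypothetical, and a production pipeline for a security data mart would add schema management, data-quality checks, and access controls on top of this.

```python
# Minimal sketch of a Cloud Composer (Airflow) DAG that loads a daily CSV
# from Google Cloud Storage into BigQuery. Bucket, dataset, and table names
# are hypothetical placeholders, not details from this posting.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="security_events_daily_load",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # Airflow 2.4+ syntax, as used by current Composer
    catchup=False,
) as dag:
    # Append the day's security events into the data mart table.
    load_events = GCSToBigQueryOperator(
        task_id="load_events_to_bigquery",
        bucket="my-security-landing-bucket",            # hypothetical bucket
        source_objects=["events/{{ ds }}/events.csv"],  # partitioned by run date
        destination_project_dataset_table="security_mart.events",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_APPEND",
    )
```

In practice the single load task here would be one step among several (extraction, validation, transformation into consumable marts), which is the kind of end-to-end pipeline design this role calls for.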