Ampstek
Data Analyst with GCP (US Citizens and Green Card Holders Only)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Analyst with GCP, offering a remote position in the USA for over 6 months. Requires 5+ years of experience in SQL, BigQuery, and Python, with strong analytical skills and familiarity with GCP and Airflow.
Country
United States
Currency
$ USD
Day rate
Unknown
Date
November 20, 2025
Duration
More than 6 months
Location
Remote
Contract
Unknown
Security
Unknown
Location detailed
United States
Skills detailed
#AI (Artificial Intelligence) #GCP (Google Cloud Platform) #Scala #ETL (Extract, Transform, Load) #Data Analysis #Computer Science #Python #Data Science #Data Accuracy #Data Quality #Pig #Data Manipulation #SQL (Structured Query Language) #Libraries #Airflow #BigQuery #Datasets #Looker #Migration #Anomaly Detection #ML (Machine Learning) #Time Series #SQL Queries #Cloud #Data Pipeline #Java #BI (Business Intelligence) #Trend Analysis
Role description
Title: Data Analyst with GCP
Location: Remote USA
Job Type: Full time
Key Responsibilities:
• Write, optimize, and execute complex SQL queries in BigQuery to validate data accuracy, identify inconsistencies, and support analytics and reporting (a minimal validation sketch follows this list).
• Analyze large datasets to assess data quality, compare trends across systems, and surface anomalies or unexpected behaviors.
• Utilize advanced BigQuery features such as authorized views, materialized views, UDFs, partitioned tables, and joins to support scalable, high-performance analysis (see the partitioned-table and UDF sketch after this list).
• Use Python (including data frames and relevant libraries) for exploratory analysis, data manipulation, and supporting validation workflows.
• Support time series analysis and, where applicable, anomaly detection using SQL or Python-based approaches (see the anomaly-detection sketch after this list).
• Assist with load/transform validation to ensure reliability and accuracy in data pipelines.
• Collaborate with engineering teams to understand data pipelines, with the basic ability to read and interpret Java or Scala code when needed.
• Perform side-by-side comparisons of data across systems to ensure consistency during and after migrations.
• Maintain basic familiarity with orchestration tools such as Airflow (Cloud Composer) to follow pipeline logic and collaborate effectively with engineering.
• Work within the GCP environment, leveraging cloud tools and services to support analysis, troubleshoot issues, and navigate cloud-based workflows.
• Clearly communicate analytical findings and data quality issues to cross-functional stakeholders to support decision-making.
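
To make the side-by-side validation responsibility concrete, here is a minimal sketch of the kind of comparison query an analyst in this role might run. The project, dataset, and table names (my-project.legacy_ds.orders, my-project.new_ds.orders) and the 30-day window are assumptions for illustration, not details from the posting.

```python
# Minimal sketch: side-by-side validation of a migrated BigQuery table.
# All project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

VALIDATION_SQL = """
WITH legacy AS (
  SELECT DATE(event_ts) AS day, COUNT(*) AS row_count, SUM(amount) AS total
  FROM `my-project.legacy_ds.orders`
  WHERE DATE(event_ts) >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
  GROUP BY day
),
migrated AS (
  SELECT DATE(event_ts) AS day, COUNT(*) AS row_count, SUM(amount) AS total
  FROM `my-project.new_ds.orders`
  WHERE DATE(event_ts) >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
  GROUP BY day
)
SELECT
  day,
  l.row_count AS legacy_rows,
  m.row_count AS migrated_rows,
  l.total AS legacy_total,
  m.total AS migrated_total
FROM legacy AS l
FULL OUTER JOIN migrated AS m USING (day)
WHERE l.row_count IS DISTINCT FROM m.row_count
   OR l.total IS DISTINCT FROM m.total
ORDER BY day
"""

# Any rows returned are days on which the two systems disagree.
mismatches = client.query(VALIDATION_SQL).result().to_dataframe()
print(mismatches if not mismatches.empty else "No discrepancies in the last 30 days")
```

The FULL OUTER JOIN (rather than an inner join) also surfaces days that exist in only one system, which is exactly the kind of discrepancy a migration check needs to catch.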
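
The advanced-features bullet covers several mechanisms; as one hedged illustration, the sketch below creates a date-partitioned table and a SQL UDF, then queries with a partition filter so BigQuery prunes the scan to a single day. The analytics dataset, events table, and safe_ratio function are invented for the example.

```python
# Minimal sketch: a date-partitioned table plus a SQL UDF in BigQuery.
# Dataset, table, and function names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

# Multi-statement DDL runs as a single scripting job in BigQuery.
DDL = """
CREATE TABLE IF NOT EXISTS `my-project.analytics.events`
(
  event_ts    TIMESTAMP,
  clicks      INT64,
  impressions INT64
)
PARTITION BY DATE(event_ts);  -- daily partitions enable pruning

-- A small data-quality UDF: avoids divide-by-zero when computing rates.
CREATE OR REPLACE FUNCTION `my-project.analytics.safe_ratio`(num FLOAT64, den FLOAT64)
RETURNS FLOAT64 AS (IF(den = 0, NULL, num / den));
"""
client.query(DDL).result()

# The filter on DATE(event_ts) lets BigQuery scan only one partition.
SQL = """
SELECT
  DATE(event_ts) AS day,
  `my-project.analytics.safe_ratio`(SUM(clicks), SUM(impressions)) AS ctr
FROM `my-project.analytics.events`
WHERE DATE(event_ts) = '2025-11-19'
GROUP BY day
"""
for row in client.query(SQL).result():
    print(row.day, row.ctr)
```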
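
For the time-series and anomaly-detection work, one common lightweight approach, offered here as an assumption rather than the team's actual method, is a rolling z-score over a daily metric. The sketch below runs on synthetic data; in practice the series would come from a BigQuery query.

```python
# Minimal sketch: rolling z-score anomaly detection on a daily metric.
# Synthetic data stands in for a series pulled from BigQuery.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
days = pd.date_range("2025-01-01", periods=90, freq="D")
values = rng.normal(loc=1000, scale=50, size=len(days))
values[60] = 1600  # injected spike so the detector has something to find

ts = pd.Series(values, index=days, name="daily_total")

window = 14  # trailing two-week baseline
baseline_mean = ts.rolling(window).mean().shift(1)  # shift excludes the current point
baseline_std = ts.rolling(window).std().shift(1)
z_scores = (ts - baseline_mean) / baseline_std

anomalies = ts[z_scores.abs() > 3]  # flag points more than 3 sigma from baseline
print(anomalies)
```

Shifting the rolling window by one keeps the current point out of its own baseline, so a genuine spike cannot mask itself.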
Qualifications:
• Bachelor's degree in Computer Science, Data Science, Engineering, or a related field.
• 5+ years of experience in data analyst or analytics engineering roles with strong BigQuery, SQL, and Python skills.
• 5+ years of experience building and operating solutions on Google Cloud Platform (GCP).
• Strong ability to write and optimize SQL queries to validate data, analyze trends, and detect inconsistencies.
• Proficient in Python, including use of data frames and common analytical libraries.
• Experience with advanced BigQuery features such as authorized views, materialized views, UDFs, partitions, and time series analysis.
• Strong analytical skills and experience validating data across systems during migrations and ongoing operations.
• Basic ability to read and understand Java or Scala code to support engineering collaboration.
• Familiarity with Airflow (Cloud Composer) to interpret and trace data pipeline workflows (a minimal example of the DAG structure you would be reading follows this list).
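
Because the role asks only for reading familiarity with Airflow, the sketch below shows the skeleton of a typical Composer-style DAG as an analyst would encounter it; the DAG id, task ids, and callables are hypothetical. The structural points to recognize are the DAG block, the operators, and the >> dependency chaining.

```python
# Minimal sketch of an Airflow DAG an analyst might trace in Cloud Composer.
# The DAG id, task ids, and callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def validate_load():
    # Placeholder: a real DAG might run a BigQuery validation query here.
    print("running post-load validation checks")


with DAG(
    dag_id="daily_orders_pipeline",  # assumed pipeline name
    schedule="0 6 * * *",            # daily at 06:00 UTC
    start_date=datetime(2025, 1, 1),
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=lambda: print("extracting"))
    load = PythonOperator(task_id="load", python_callable=lambda: print("loading"))
    validate = PythonOperator(task_id="validate", python_callable=validate_load)

    # Reading this one line tells you the pipeline order end to end.
    extract >> load >> validate
```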
Nice to have:
• Familiarity with Looker or other BI tools for metric validation and reporting support.
• Exposure to BigQuery ML and Vertex AI.
• Basic familiarity with legacy systems such as Oozie or Pig for reading existing scripts.
Must haves:
• Proficiency in SQL, BigQuery, and Python.
• Advanced SQL skills in BigQuery for complex data validation, anomaly detection, and trend analysis.
• Experience comparing datasets across systems.
• Proven ability to identify and investigate data discrepancies across platforms.
• Strong analytical intuition to sense-check metrics and flag issues that may not trigger formal alerts.
• Ability to perform side-by-side metric and trend comparisons to confirm post-migration accuracy.
• Skilled in root cause analysis using SQL, domain expertise, and supporting context.
• Effective communicator who can document findings and share insights with both technical and non-technical stakeholders.