

HatchPros
Lead GCP Analytics Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Lead GCP Analytics Engineer with a contract length of over 6 months, offering competitive pay. Key skills include GCP, SQL, Python, and experience in financial payment processing. A bachelor's degree and 10+ years in analytics engineering are required.
Country: United States
Currency: $ USD
Day rate: $544
Date: October 15, 2025
Duration: More than 6 months
Location: Unknown
Contract: Unknown
Security: Unknown
Location detailed: Boston, MA
Skills detailed: #Scala #Apache Beam #Version Control #Data Science #Data Quality #Datasets #SQL (Structured Query Language) #Data Governance #dbt (data build tool) #Data Engineering #GCP (Google Cloud Platform) #GIT #ETL (Extract, Transform, Load) #Docker #Fivetran #AWS (Amazon Web Services) #Airflow #Data Modeling #Snowflake #Data Pipeline #Mathematics #Cloud #Looker #Computer Science #Redshift #Data Warehouse #Data Integrity #BigQuery #Leadership #Python #Vault
Role description
GC holders or US citizens only; local candidates with a LinkedIn profile are required.
Key Skills: GCP, SQL, Python; financial payment processing project experience preferred.
We're seeking a Lead Analytics Engineer to help design, model, and scale a modern data environment for a global payment processing organization. The company manages large volumes of data across multiple business units, and this role will play a key part in organizing and maturing that landscape as part of a multi-year strategic roadmap. This position is ideal for a lead-level analytics engineer who can architect data solutions, build robust models, and stay hands-on with development.
Qualifications
• 10+ years of experience in an Analytics Engineering role.
• Lead-level experience driving design and architecture decisions.
• Expert in SQL and dbt, with demonstrated modeling experience (Vault, 3NF, Dimensional).
• Hands-on experience with BigQuery or other cloud data warehouses.
• Proficiency in Python and Docker.
• Experience with Airflow (Composer), Git, and CI/CD pipelines.
• Strong attention to detail and communication skills; able to interact with both technical and business stakeholders.
• Experience in financial services or payments is a plus but not required.
• Bachelor's degree in Economics, Mathematics, Computer Science, or a related field.
Role Focus
• Architect and build new GCP data models using dbt and modern modeling techniques (see the sketch after this list).
• Partner closely with leadership and business teams to translate complex requirements into technical solutions.
• Support initiatives focused on the Finance and Payments data domains.
• Drive structure and clarity within a growing analytics ecosystem.
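As a rough illustration of the dbt-on-BigQuery modeling work described above, here is a minimal sketch of an incremental fact model. The model names (stg_payments, dim_merchants, fct_payments) and all columns are hypothetical stand-ins, not details taken from this role.

```sql
-- models/marts/fct_payments.sql: hypothetical dbt fact model (BigQuery SQL).
-- Table and column names are illustrative only.
{{ config(materialized='incremental', unique_key='payment_id') }}

with payments as (
    select * from {{ ref('stg_payments') }}
),

merchants as (
    select * from {{ ref('dim_merchants') }}
)

select
    p.payment_id,
    p.merchant_id,
    m.merchant_name,
    p.payment_method,
    p.amount_usd,
    p.created_at
from payments p
left join merchants m
    on p.merchant_id = m.merchant_id
{% if is_incremental() %}
  -- on incremental runs, only process rows newer than what is already loaded
  where p.created_at > (select max(created_at) from {{ this }})
{% endif %}
```

Incremental materialization like this is a common way to keep BigQuery scan costs down on large payment tables, which ties into the performance and cost-efficiency goals listed under Responsibilities.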
Technical Environment
• Primary Data Warehouse: Google BigQuery (mandatory)
• Nice to Have: Snowflake, Redshift
• Orchestration: Airflow (GCP Composer)
• Languages: Expert-level SQL / dbt; strong Python required
• Other Tools: GCP or AWS, Fivetran, Apache Beam, Looker or Preset, Docker
• Modeling Techniques: Vault 2.0, 3NF, Dimensional Modeling, etc. (a hub-table sketch follows this list)
• Version Control: Git / CI-CD
• Quality Tools: dbt-Elementary, dbt-Osmosis, or Great Expectations preferred
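For the Vault 2.0 technique named in the list, a minimal sketch of a Data Vault hub built as a dbt model follows; the staging source (stg_merchants), the business key, and the MD5 hash-key convention are assumptions for illustration only, not details from the posting.

```sql
-- models/raw_vault/hub_merchant.sql: hypothetical Data Vault 2.0 hub.
-- A hub stores one row per unique business key plus load metadata.
{{ config(materialized='incremental', unique_key='merchant_hk') }}

select distinct
    -- hash of the business key: a common (but here assumed) DV2.0 convention
    to_hex(md5(cast(merchant_business_key as string))) as merchant_hk,
    merchant_business_key,
    current_timestamp() as load_date,
    'payments_source' as record_source
from {{ ref('stg_merchants') }}
{% if is_incremental() %}
-- only insert business keys not already present in the hub
where to_hex(md5(cast(merchant_business_key as string)))
      not in (select merchant_hk from {{ this }})
{% endif %}
```

Links and satellites would follow the same pattern, keyed off the hub's hash key.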
Responsibilities
• Business Stakeholder Engagement
• Gather and document complex business requirements.
• Translate business needs into scalable, maintainable data products.
• Serve as a trusted data partner across multiple departments.
• Data Modeling & Transformation
• Design and implement robust, reusable data models within the warehouse.
• Develop and maintain SQL transformations in dbt.
• Optimize existing models and queries for performance, cost-efficiency, and maintainability.
• Data Pipeline & Orchestration
• Build and maintain reliable data pipelines in collaboration with data engineering.
• Utilize orchestration tools (Airflow) to manage and monitor workflows.
• Manage and support dbt environments and transformations.
• Data Quality & Governance
• Implement validation checks and quality controls to ensure data integrity (see the test sketch after this list).
• Define and enforce data governance best practices, including lineage and access control.
• Enable Data Democratization & Self-Service Analytics
• Curate and prepare datasets for analysts, business users, and data scientists.
• Develop semantic layers for consistent and accessible reporting.
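One common way to implement the validation checks mentioned under Data Quality & Governance in this stack is a dbt singular test: a SELECT saved under tests/ that passes only when it returns zero rows. The model and column names below are hypothetical.

```sql
-- tests/assert_no_invalid_payment_amounts.sql: hypothetical dbt singular test.
-- dbt runs this query during `dbt test` and fails the test if any rows return.
select
    payment_id,
    amount_usd
from {{ ref('fct_payments') }}
where amount_usd < 0
   or amount_usd is null
```

The posting's preferred quality tools (dbt-Elementary, dbt-Osmosis, Great Expectations) layer richer monitoring and documentation on top of checks like this.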