

Holistic Partners, Inc
Data Engineer (GCP / Telecom / Genesys Migration)
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (GCP / Telecom / Genesys Migration) in Phoenix, AZ or Walnut Creek, CA, on a 12-month contract-to-hire (C2H) basis. It requires 5+ years of experience and expertise in BigQuery, GCS, Airflow, Python, SQL, and telecom data handling.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: April 18, 2026
Duration: More than 6 months
Location: On-site
Contract: Unknown
Security: Unknown
Location detailed: Walnut Creek, CA
Skills detailed: #Airflow #JSON (JavaScript Object Notation) #Data Ingestion #Data Pipeline #XML (eXtensible Markup Language) #Data Modeling #Datasets #Metadata #Computer Science #Scrum #Scala #Data Quality #"ETL (Extract, Transform, Load)" #Data Warehouse #Data Lake #Data Migration #Python #Cloud #Data Engineering #GCP (Google Cloud Platform) #Databases #Migration #Google Cloud Storage #SQL (Structured Query Language) #Storage #Agile #GitHub #BigQuery #EDW (Enterprise Data Warehouse)
Role description
Job Title: Data Engineer (GCP / Telecom / Genesys Migration)
Location: Phoenix, AZ (Preferred) or Walnut Creek, CA (Onsite)
Duration: 12 Months C2H
Must-Have Skills
β’ BigQuery
β’ Google Cloud Storage (GCS)
β’ Airflow / Cloud Composer
β’ Python
β’ SQL
β’ Telecom / Contact Center IVR
β’ Agile Environment & Communication Skills
Job Overview
We are seeking a Data Engineer (Level 2–3) to support a large-scale enterprise data migration initiative from Avaya to the Genesys contact center platform.
This role focuses on data engineering, not telecom engineering. The environment processes high-volume telecom interaction data such as call recordings, transcripts, and voice metadata.
The ideal candidate will have strong experience building and maintaining data pipelines in Google Cloud Platform (GCP) and handling both structured and semi-structured data.
Key Responsibilities
β’ Design, build, and maintain scalable data ingestion and transformation pipelines
β’ Work with telecom datasets:
β’ Call Detail Records (CDR)
β’ Transcripts
β’ Event-based data
β’ Process multiple data formats:
β’ JSON transcripts
β’ XML datasets
β’ Relational databases
β’ Build and optimize datasets in BigQuery
β’ Develop workflows using Airflow / Cloud Composer
β’ Ingest and manage data using GCS
β’ Perform data modeling and transformation for analytics/reporting
• Collaborate during the Avaya-to-Genesys migration
β’ Ensure data quality, reliability, and performance
β’ Work in an Agile environment
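The pipeline responsibilities above can be illustrated with a minimal sketch: flattening a nested JSON call transcript into row-per-segment records suitable for a newline-delimited JSON load into BigQuery. The transcript shape and field names here are hypothetical, not taken from the posting; real Avaya or Genesys exports will differ.

```python
import json

# Hypothetical transcript shape; actual contact-center exports will differ.
raw = '''
{
  "conversationId": "abc-123",
  "startTime": "2026-04-18T14:02:11Z",
  "segments": [
    {"speaker": "agent", "text": "Thank you for calling."},
    {"speaker": "customer", "text": "I have a billing question."}
  ]
}
'''

def flatten_transcript(doc: dict) -> list[dict]:
    """Flatten one nested transcript into one record per segment,
    e.g. for a newline-delimited JSON load into BigQuery."""
    return [
        {
            "conversation_id": doc["conversationId"],
            "start_time": doc["startTime"],
            "segment_index": i,
            "speaker": seg["speaker"],
            "text": seg["text"],
        }
        for i, seg in enumerate(doc["segments"])
    ]

rows = flatten_transcript(json.loads(raw))
# Serialize as NDJSON, one record per line, ready for ingestion.
ndjson = "\n".join(json.dumps(r) for r in rows)
print(ndjson)
```

In practice a pipeline like this would run as an Airflow / Cloud Composer task that writes the NDJSON to GCS and triggers a BigQuery load job; the sketch shows only the transformation step.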
Required Experience
• 5+ years as a Data Engineer (Mid–Senior level)
β’ Strong hands-on experience with Google Cloud Platform (GCP)
β’ Expertise in:
β’ BigQuery
β’ Cloud Storage (GCS)
β’ Airflow / Cloud Composer
β’ Python
β’ SQL
β’ Experience with ETL/ELT pipeline design
β’ Ability to independently build and support production-grade pipelines
Preferred Experience
β’ Telecom / Contact Center domain knowledge
β’ Experience with:
β’ Call Detail Records (CDR)
β’ IVR systems
β’ Voice analytics
β’ Familiarity with data lake architecture and cloud-native solutions
Additional Responsibilities
• Use GCS, BigQuery, Airflow / Cloud Composer, Cloud Data Fusion, Cloud Functions, Python, SQL, and GitHub to build and maintain the Enterprise Data Warehouse (EDW)
β’ Collaborate with:
β’ Data Engineering Scrum Team
β’ Data & Integrations Engineering Team
β’ Production Data Support Team
β’ Business Stakeholders
β’ Design new features and troubleshoot production issues
Key Competencies
β’ Strong collaboration and communication skills
β’ Ability to learn new technologies quickly
β’ Effective time management and prioritization
β’ Strong problem-solving and analytical thinking
β’ Experience with Agile Scrum methodology
β’ Commitment to teamwork, transparency, and delivery excellence
Education & Certifications
• Bachelor's degree in Computer Science or a related field
• 4–5+ years of relevant experience
β’ Experience working in Agile environments
β’ GCP certifications (preferred)