Procom

Data Engineer Senior (Google Cloud)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer (Google Cloud) on a 9-month contract, 100% onsite in Southlake or Austin, Texas. Requires 8+ years of experience with Google Cloud tools, Python, SQL, and ETL processes.
🌎 - Country
United States
πŸ’± - Currency
Unknown
-
πŸ’° - Day rate
Unknown
-
πŸ—“οΈ - Date
March 4, 2026
πŸ•’ - Duration
More than 6 months
-
🏝️ - Location
On-site
-
πŸ“„ - Contract
Unknown
-
πŸ”’ - Security
Unknown
-
πŸ“ - Location detailed
Southlake, TX
-
🧠 - Skills detailed
#Storage #Terraform #ETL (Extract, Transform, Load) #Cloud #Data Integrity #GCP (Google Cloud Platform) #Google Cloud Storage #DevOps #Dataflow #Data Management #Security #Computer Science #SQL (Structured Query Language) #BigQuery #Data Mart #Python #Automation #Data Pipeline #Visualization #Data Architecture #Scala #Snowflake #Data Quality #Data Engineering #Databricks #Microsoft Power BI #Data Processing #BI (Business Intelligence) #Kafka (Apache Kafka)
Role description
Data Engineer Senior (Google Cloud)

On behalf of our financial services client, Procom is searching for a Senior Data Engineer (Google Cloud) for a 9-month role. This position is 100% onsite at our client’s Southlake, Texas, or Austin, Texas, office.

Job Description:
In this role, you will be contracted to build and maintain scalable data pipelines on Google Cloud Platform (GCP) to serve our fraud data mart customers. You will collaborate with cross-functional teams to ensure data integrity, reliability, and scalability, leveraging your expertise in Google BigQuery, Google Cloud Storage, Dataflow, Cloud Composer, Python, and SQL to develop effective data solutions supporting fraud data analytics and reporting efforts.

Responsibilities:
• Design, build, and maintain scalable data pipelines using Google Cloud Platform tools such as BigQuery, Cloud Storage, Dataflow, Cloud Composer, and Pub/Sub.
• Write high-performance, production-grade Python and SQL, optimizing queries for ETL processes.
• Implement complex data models in BigQuery for optimal performance.
• Collaborate with cross-functional teams to understand data requirements and deliver effective solutions.
• Implement best practices for data quality, governance, and security.
• Monitor and troubleshoot data pipeline issues, ensuring high availability and performance.
• Contribute to data architecture decisions and provide recommendations for pipeline improvements.

Mandatory Skills:
• Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering, or a related field.
• 8+ years of hands-on experience with data management and consolidation.
• Strong expertise in Google BigQuery, Google Cloud Storage, Dataflow, Pub/Sub, and Cloud Composer.
• Proficiency in Python and SQL for data processing and automation.
• Experience with ETL processes and data pipeline design.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration skills.
Nice-to-Have Skills:
• Deep expertise in real-time processing using Kafka or Pub/Sub.
• Experience with Power BI development and visualization.
• Experience with modern data stack tools such as Snowflake and Databricks.
• Knowledge of DevOps practices and tools such as Terraform.
• Google Professional Data Engineer certification.

Assignment Length: This is a 9-month contract position.
Start Date: ASAP.
Assignment Location: Southlake, Texas, 100% onsite. Austin, Texas is a secondary work location.