

Reqroute, Inc
GCP Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Engineer based in Dallas, TX, with a contract length of "unknown." The pay rate is "unknown." Requires 10+ years of experience in Python/GCP, SQL, and geospatial analysis, with expertise in Cloud SQL, PySpark, and data processing tools.
Country
United States
-
Currency
$ USD
-
Day rate
Unknown
-
Date
January 15, 2026
-
Duration
Unknown
-
Location
Hybrid
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Dallas, TX
-
Skills detailed
#Teradata #Datasets #Oracle #Visualization #Spatial Data #Data Integrity #Cloud #Security #Agile #Data Pipeline #Data Processing #Programming #Debugging #ETL (Extract, Transform, Load) #Spark (Apache Spark) #PySpark #Dataflow #Python #Data Profiling #Geospatial Analysis #Hadoop #Complex Queries #SQL (Structured Query Language) #Compliance #Data Analysis #SQL Queries #BigQuery #GCP (Google Cloud Platform) #Data Security #SQL Server
Role description
Role: GCP Engineer
Location: ONSITE Dallas, TX (LOCALS ONLY)
Experience: 10+ Years
Job summary:
At least 10 years of experience in Python/GCP. Familiar with writing, interpreting, tuning, and debugging complex SQL queries and stored procedures in distributed, multi-tier environments.
Experience with scheduling tools such as Autosys or similar.
Strong data profiling, data analysis, and data validation skills.
Deep DBMS experience on multiple platforms (including Oracle, Teradata, and SQL Server). Advanced understanding of data warehousing and ETL concepts, especially change data capture (see the example after the required skills line).
Working experience as an Agile developer.
Required Skills: Python, PySpark, Cloud Composer, Cloud Dataflow, Cloud Dataproc, Cloud SQL, Google BigQuery
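The change data capture point above is easiest to picture with a small example. The sketch below is illustrative only and assumes BigQuery (one of the listed required skills) as the warehouse; the project, dataset, table, and column names, and the 'op' change-flag convention, are hypothetical.

from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Apply a staged batch of captured changes (op = 'I'/'U'/'D') to the
# warehouse table in one atomic MERGE statement.
merge_sql = """
MERGE `example-project.dw.customers` AS tgt
USING `example-project.staging.customer_changes` AS src
ON tgt.customer_id = src.customer_id
WHEN MATCHED AND src.op = 'D' THEN
  DELETE
WHEN MATCHED THEN
  UPDATE SET name = src.name, email = src.email, updated_at = src.updated_at
WHEN NOT MATCHED AND src.op != 'D' THEN
  INSERT (customer_id, name, email, updated_at)
  VALUES (src.customer_id, src.name, src.email, src.updated_at)
"""
client.query(merge_sql).result()

A MERGE like this is the usual warehouse-side CDC apply pattern: inserts, updates, and deletes from the change feed land in a single atomic statement.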
Responsibilities:
- Analyze geospatial data using advanced cloud technologies to provide actionable insights for business strategies.
- Collaborate with cross-functional teams to design and implement data solutions that enhance operational efficiency.
- Utilize Cloud SQL to manage and query large datasets, ensuring data integrity and accessibility.
- Implement data workflows using Cloud Composer to automate processes and improve data pipeline efficiency.
- Develop and optimize PySpark scripts to process and analyze large volumes of geospatial data (see the sketch after this list).
- Leverage Google BigQuery to perform complex queries and generate reports that support decision-making.
- Apply Python programming skills to develop custom solutions for geospatial data analysis and visualization.
- Utilize Cloud Dataproc to manage and scale Hadoop and Spark clusters for efficient data processing.
- Implement Cloud Dataflow to streamline data processing and ensure real-time data analysis capabilities.
- Work in a hybrid model, balancing remote and on-site collaboration to maximize productivity and innovation.
- Ensure compliance with data security and privacy regulations in all data handling and processing activities.
- Contribute to the company's strategic goals by providing insights that drive informed decision-making and societal impact.
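As a concrete illustration of the PySpark bullet above, here is a minimal sketch that filters hypothetical event records to a rough Dallas bounding box and counts events per day. The GCS paths and the lat, lon, and event_ts columns are assumptions for illustration, not details from the role.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("geo-sketch").getOrCreate()

# Hypothetical event data with latitude, longitude, and timestamp columns.
events = spark.read.parquet("gs://example-bucket/events/")

# Keep only points inside a rough Dallas-area bounding box.
dallas = events.filter(
    F.col("lat").between(32.6, 33.1) & F.col("lon").between(-97.1, -96.5)
)

# Count events per calendar day to feed downstream reporting.
daily_counts = dallas.groupBy(F.to_date("event_ts").alias("day")).count()
daily_counts.write.mode("overwrite").parquet("gs://example-bucket/daily_counts/")

Production geospatial work would typically use real geometry handling rather than a bounding box; this is just the simplest spatially meaningful transform to show the shape of the job.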
Qualifications
- Possess a minimum of 6 years of experience in geospatial analysis and cloud technologies.
- Demonstrate expertise in Cloud SQL, Cloud Composer, PySpark, Google BigQuery, and Python (an example query follows this list).
- Experience with Cloud Dataproc and Cloud Dataflow is essential for efficient data processing.
- Strong analytical skills to interpret complex geospatial data and provide actionable insights.
- Ability to collaborate effectively in a hybrid work model, balancing remote and on-site tasks.
- Excellent communication skills to convey technical information to non-technical stakeholders.
- Proven track record of implementing data solutions that enhance business operations and decision-making.
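To make the BigQuery geospatial expertise above concrete, here is a minimal query sketch using the official google-cloud-bigquery Python client. ST_GEOGPOINT and ST_DWITHIN are standard BigQuery GoogleSQL geography functions (ST_GEOGPOINT takes longitude first; ST_DWITHIN's distance is in meters); the project, table, and columns are hypothetical.

from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Count hypothetical store locations within 5 km of downtown Dallas.
query = """
SELECT COUNT(*) AS nearby_stores
FROM `example-project.retail.stores`
WHERE ST_DWITHIN(ST_GEOGPOINT(lon, lat),
                 ST_GEOGPOINT(-96.7970, 32.7767),  -- downtown Dallas
                 5000)
"""
for row in client.query(query).result():
    print(f"Stores within 5 km of downtown Dallas: {row.nearby_stores}")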