

GCP Data Engineer - Geospatial Data Specialist - Houston, TX - 12+ Months Contract
Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer - Geospatial Data Specialist in Houston, TX, with a 12+ month contract. Requires 3+ years of GCP experience, strong SQL and Python/Java skills, and expertise in geospatial data handling and processing.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 27, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Houston, TX
Skills detailed: #Dataflow #Java #SQL (Structured Query Language) #Apache Beam #Data Processing #GCP (Google Cloud Platform) #Indexing #Libraries #Batch #Cloud #Data Engineering #Kafka (Apache Kafka) #Pandas #Python #Data Lake #Airflow #Spatial Data #BigQuery
Role description
Job Title: GCP Data Engineer - Geospatial Data Specialist
Location: Houston, TX (Onsite)
Duration: 12+ Months Contract
Required Qualifications
• 3+ years of hands-on experience as a Data Engineer on Google Cloud Platform (GCP).
• Strong proficiency with SQL (especially BigQuery GIS) and Python or Java.
• Experience handling geospatial data formats (GeoJSON, KML, shapefiles, raster, tiles) and spatial indexing techniques.
• Familiarity with PostGIS, GDAL, GeoPandas, or other GIS libraries and tools.
• Experience with streaming and batch data processing (Kafka/PubSub, Apache Beam, Dataflow).
• Solid understanding of data warehousing and data lake architectures.
• Experience with Airflow or Cloud Composer for orchestration.
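As a flavor of the "spatial indexing techniques" the qualifications above ask for, here is a minimal, self-contained Python sketch (an illustrative example, not part of the role description) that encodes a WGS84 point as a quadkey, the tile-indexing scheme used by many map-tile and geospatial data pipelines:

```python
import math

def latlon_to_quadkey(lat: float, lon: float, zoom: int) -> str:
    """Encode a WGS84 lat/lon point as a quadkey string at the given zoom level."""
    # Web Mercator tile coordinates: n x n tiles at zoom z, where n = 2**z
    lat_rad = math.radians(lat)
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)

    # Interleave the bits of x and y, most significant bit first;
    # each quadkey digit (0-3) identifies one quadrant per zoom level.
    key = []
    for z in range(zoom, 0, -1):
        digit = 0
        mask = 1 << (z - 1)
        if x & mask:
            digit += 1
        if y & mask:
            digit += 2
        key.append(str(digit))
    return "".join(key)

# Example: index a point near Houston, TX at zoom level 8
print(latlon_to_quadkey(29.76, -95.37, 8))
```

Points sharing a quadkey prefix fall in the same map tile at the corresponding zoom level, which makes quadkeys convenient as a clustering or partitioning key in warehouses like BigQuery.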