Harnham

Senior Data Engineer - Geospatial Focus

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with a geospatial focus, offering a 6-month contract at $65-$80/hr. Required skills include strong Python, Snowflake, AWS, and geospatial tools. Flexible hours, ramping from 20 to 40 hours/week.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
640
-
🗓️ - Date
March 19, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, NY
-
🧠 - Skills detailed
#Snowflake #Libraries #Spark (Apache Spark) #Scala #Scripting #Cloud #Datasets #Pandas #Data Ingestion #Spatial Data #Airflow #ETL (Extract, Transform, Load) #PySpark #Documentation #AWS (Amazon Web Services) #ML (Machine Learning) #Data Engineering #Python #Code Reviews
Role description
About the Role
We are seeking a Senior Data Engineer (Contract) to help build and scale the ingestion, transformation, and orchestration layers for a new data product entering beta launch across multiple U.S. states. This role is ideal for an engineer with strong Python foundations, deep experience working with geospatial data, and a passion for building reliable, production-ready data systems. You'll work closely with a highly collaborative, research-driven engineering team to design data models, develop pipelines, troubleshoot complex geospatial challenges, and ensure the platform is ready for its next phase of growth.

What You'll Do
• Develop, own, and maintain data ingestion and transformation pipelines in a type-safe Python environment
• Design and optimize data models, with an emphasis on geospatial datasets and real-estate-oriented data problems
• Build and manage orchestration workflows (e.g., Prefect) to support scalable, reliable production systems
• Work within a sprint-based process, delivering high-quality, well-tested code
• Collaborate closely with a thoughtful, academically minded engineering team to solve complex data challenges
• Contribute to documentation, code reviews, and continuous improvement of engineering standards

Required Skills & Experience

Core Technical Background
• Strong data engineering experience
• Hands-on experience with:
• Python (custom/bespoke codebases, not just notebooks or light scripting)
• Snowflake
• AWS (data-oriented cloud services)
• Workflow orchestration tools (e.g., Prefect, Dagster, Airflow)

Geospatial Expertise (Required)
Experience with geospatial data tools, libraries, formats, or platforms such as:
• ArcGIS
• GeoPandas
• Apache Sedona
• PySpark GIS
• QGIS
• Shapefiles, raster/tiling systems, spatial joins, projections, coordinate systems, etc.
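As a rough illustration of the projection and coordinate-system fluency the role asks for, the sketch below converts WGS84 longitude/latitude to Web Mercator (EPSG:3857) meters and back using the standard spherical formulas. The function names are hypothetical; in a real pipeline this would typically be handled by libraries such as pyproj or GeoPandas rather than hand-rolled math.

```python
import math

# WGS84 semi-major axis in meters, the radius used by spherical Web Mercator.
EARTH_RADIUS_M = 6378137.0

def lonlat_to_web_mercator(lon_deg: float, lat_deg: float) -> tuple[float, float]:
    """Project WGS84 degrees to Web Mercator (EPSG:3857) meters."""
    x = EARTH_RADIUS_M * math.radians(lon_deg)
    y = EARTH_RADIUS_M * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y

def web_mercator_to_lonlat(x: float, y: float) -> tuple[float, float]:
    """Inverse projection: Web Mercator meters back to WGS84 degrees."""
    lon = math.degrees(x / EARTH_RADIUS_M)
    lat = math.degrees(2 * math.atan(math.exp(y / EARTH_RADIUS_M)) - math.pi / 2)
    return lon, lat

# Round-trip check on New York City coordinates (lon, lat).
x, y = lonlat_to_web_mercator(-74.0060, 40.7128)
lon, lat = web_mercator_to_lonlat(x, y)
```

Getting this transformation right (and knowing when a spherical approximation is acceptable) is exactly the kind of detail that matters when joining datasets stored in different coordinate reference systems.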
Nice to Have
• Experience with Machine Learning, feature engineering, or ML model pipelines
• Familiarity with real-estate or location-based data products
• Experience building systems for early-stage product launches

Logistics
• Flexible time zones; mandatory standups Monday/Wednesday/Friday at 9:30 AM ET
• 20 hours/week, ramping to 40 hours/week within 2-3 months, then continuing full-time for ~6 months
• Hourly rate: $65-$80/hr