

Infotree Global Solutions
ONLY W2 CANDIDATES :: Data Engineer/ Scientist (Geospatial) :: Remote Role
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer/Scientist (Geospatial) on a remote contract for "X months" at a pay rate of "$X/hour". Key skills include SQL, Python, DBT, and geospatial tools (PostGIS, ArcGIS). Experience in data modeling and analysis is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 2, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#ML (Machine Learning) #dbt (data build tool) #Spatial Data #Data Storage #Pandas #Database Performance #Data Engineering #Data Quality #Python #Storage #Documentation #PostgreSQL #Data Access #ETL (Extract, Transform, Load) #Airflow #Data Science #SQL (Structured Query Language) #Snowflake #Metadata
Role description
Responsibilities:
• Design, create, refine, and maintain data processes and pipelines used for modeling, analysis, and reporting using SQL (ideally Snowflake and PostgreSQL), Python, and pipeline/transformation tools such as Airflow and dbt.
• Conduct detailed data research on internal and external geospatial data (POI, geocoding, map layers, geometric shapes), identify changes over time, and maintain geospatial data (shapefiles, polygons, and metadata).
• Operationalize data products with detailed documentation, automated data quality checks and change alerts.
• Support data access through various sharing platforms, including dashboard tools.
• Collaborate with other data scientists, analysts, and engineers to build full-service data solutions.
• Develop and communicate architectures, code patterns, and data structure design choices to a team of data scientists, analysts, and engineers, laying out trade-offs.
• Optimize query and database performance by designing, creating, refining, and maintaining a performance-management system.
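To make the "automated data quality checks" responsibility concrete, here is a minimal, hedged sketch in plain Python of the kind of checks the role describes; dbt expresses the same idea declaratively with its built-in `unique` and `not_null` generic tests. The table and column names below are hypothetical, not from the posting.

```python
# Hypothetical data-quality checks of the sort a pipeline might run
# automatically before publishing a data product. dbt's `unique` and
# `not_null` tests cover the same ground declaratively.

def check_not_null(rows, column):
    """Return indices of rows where `column` is missing or None."""
    return [i for i, r in enumerate(rows) if r.get(column) is None]

def check_unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

# Hypothetical POI rows with one null name and one duplicate id.
pois = [
    {"poi_id": 1, "name": "cafe"},
    {"poi_id": 2, "name": None},
    {"poi_id": 2, "name": "airport"},
]
print(check_not_null(pois, "name"))   # [1]  -> row 1 fails not_null
print(check_unique(pois, "poi_id"))   # [2]  -> id 2 fails unique
```

In a real pipeline, non-empty results from checks like these would trigger the change alerts the posting mentions rather than a print.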
Skills Required:
• Experience with SQL, Python, and dbt.
• Ability to use geospatial data tools (PostGIS, ArcGIS/ArcPy, QGIS, GeoPandas, etc.) to optimize and normalize spatial data storage and to run the spatial queries and processes that power analysis and data products.
• Simulation, Statistical Analysis, Data Science models, Machine Learning, Predictive models.
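As a conceptual illustration of the spatial queries listed above, here is a pure-Python sketch of the point-in-polygon predicate that PostGIS exposes as `ST_Contains` and GeoPandas via `sjoin(..., predicate="within")`. The polygon and points are hypothetical; production work would use those libraries rather than hand-rolled geometry.

```python
# Ray-casting point-in-polygon test: the core predicate behind spatial
# joins such as "which POIs fall inside this map-layer polygon?".
# Polygon and points below are hypothetical illustration data.

def point_in_polygon(x, y, polygon):
    """Return True if (x, y) lies inside `polygon`, a list of (px, py)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray extending right from (x, y);
        # an odd crossing count means the point is inside.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

zone = [(0, 0), (1, 0), (1, 1), (0, 1)]   # a unit-square "map layer"
print(point_in_polygon(0.5, 0.5, zone))   # True  (POI inside the zone)
print(point_in_polygon(5.0, 5.0, zone))   # False (POI outside)
```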






