Infotree Global Solutions

Data Engineer - ONLY W2

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a 12-month, fully remote Data Engineer position paid on a W2 basis (rate not disclosed). Key skills include SQL (Snowflake, PostgreSQL), Python, Airflow, and dbt, with a focus on geospatial data management and data pipeline optimization.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
December 3, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Data Engineering #Data Quality #Spatial Data #"ETL (Extract, Transform, Load)" #Data Ingestion #Data Science #PostgreSQL #Python #Airflow #SQL (Structured Query Language) #dbt (data build tool) #Database Performance #Data Access #Metadata #Documentation #Snowflake
Role description
Role: Data Engineer
Location: Remote
Duration: 12 Months
Job description:
• Design, create, refine, and maintain data processes and pipelines used for modeling, analysis, and reporting using SQL (ideally Snowflake and PostgreSQL), Python, and pipeline/transformation tools such as Airflow and dbt (see the sketch after this description)
• Conduct detailed data research on internal and external geospatial data (POI, geocoding, map layers, geometric shapes), identify changes over time, and maintain geospatial data (shapefiles, polygons, and metadata)
• Operationalize data products with detailed documentation, automated data quality checks, and change alerts
• Support data access through various sharing platforms, including dashboard tools
• Troubleshoot failures in data processes, pipelines, and products
• Communicate with and educate consumers on data access and usage, managing transparency in metric and logic definitions
• Collaborate with other data scientists, analysts, and engineers to build full-service data solutions
• Develop and communicate architectures, code patterns, and data structure design choices to a team of data scientists, analysts, and engineers, laying out tradeoffs
• Optimize query and database performance by designing, creating, refining, and maintaining a performance management system
• Work with cross-functional business partners and vendors to acquire and transform raw data sources
• Design, create, refine, and maintain data ingestion processes
• Provide frequent updates to the team on progress and status of planned work
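To give a sense of the pipeline and tooling work described above (Airflow orchestration, dbt transformations, automated data quality checks on geospatial extracts), here is a minimal sketch in Python. It assumes Airflow 2.4+ and a dbt project that tags its geospatial models; the DAG id "geo_poi_refresh", the task names, the staging path, and the dbt selector are illustrative placeholders and do not come from the posting.

from datetime import datetime

from airflow.decorators import dag, task
from airflow.operators.bash import BashOperator


@dag(
    dag_id="geo_poi_refresh",          # hypothetical DAG name
    schedule="@daily",
    start_date=datetime(2025, 1, 1),
    catchup=False,
    tags=["geospatial"],
)
def geo_poi_refresh():
    @task
    def ingest_poi_extract() -> str:
        # Placeholder: land the latest vendor POI / shapefile extract in a
        # staging area (e.g. a Snowflake stage or object storage prefix).
        return "staging/poi/latest"

    @task
    def check_data_quality(staged_path: str) -> str:
        # Placeholder: automated data quality checks and change alerts,
        # e.g. row counts, geometry validity, unexpected polygon churn.
        print(f"running quality checks against {staged_path}")
        return staged_path

    # Run the dbt models that transform the staged geospatial data;
    # the tag-based selector is an assumed project convention.
    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --select tag:geospatial",
    )

    check_data_quality(ingest_poi_extract()) >> run_dbt


geo_poi_refresh()

In practice the ingest and quality-check tasks would call out to the warehouse or a geospatial library rather than print, but the shape (ingest, validate, transform, alert on changes) mirrors the responsibilities listed in the role.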