

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with 5+ years of experience in data engineering and analytics, specializing in Cloud Database Platforms (preferably Snowflake), Python, and MRP/ERP Systems (SAP). The contract runs 5+ months, the work is remote on Arizona hours, and no sponsorship is available.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 12, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: Arizona, United States
Skills detailed: #Deployment #Azure Data Factory #Python #Snowflake #Data Ingestion #Batch #Libraries #Azure #Alteryx #Agile #GitHub #SQL (Structured Query Language) #Liquibase #Databases #Pandas #Databricks #SAP #Datasets #SSIS (SQL Server Integration Services) #Data Engineering #ADF (Azure Data Factory) #SQL Queries #Jenkins #ML (Machine Learning) #ETL (Extract, Transform, Load) #Matillion #Scala #Data Pipeline #Data Cleansing #SnowPipe #DevOps #Cloud #Database Administration
Role description
We have an urgent need for a data engineer with advanced expertise in Cloud Database Platforms (preferably Snowflake), Python, and MRP/ERP Systems (SAP).
Title: Data Engineer
Duration: 5+ months, will extend
Location: Remote (Arizona hours)
NO SPONSORSHIP
The engineer will join an experienced team to design and deliver impactful data products. The ideal candidate is passionate about data engineering and about empowering customers with analytics solutions.
Key Responsibilities
• Develop and optimize complex batch and near-real-time SQL queries to meet product requirements and business needs (see the sketch after this list).
• Design and implement core foundational datasets that are reusable, scalable, and performant.
• Architect, implement, deploy, and maintain data-driven solutions in Snowflake and Databricks.
• Develop and manage data pipelines supporting multiple reports, tools, and applications.
• Conduct advanced statistical analysis to yield actionable insights, identify correlations/trends, measure performance, and visualize disparate sources of data.
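As an illustration of the batch SQL work described above, here is a minimal sketch using the snowflake-connector-python package; the account, credentials, warehouse, and ORDERS table are hypothetical placeholders, not details of this engagement.

```python
# Minimal sketch of a batch query against Snowflake, assuming the
# snowflake-connector-python package; connection details and the
# ORDERS table are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="my_user",
    password="my_password",
    warehouse="ANALYTICS_WH",  # hypothetical warehouse
    database="SUPPLY_CHAIN",   # hypothetical database
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Example batch aggregation: daily order volume over a 30-day window.
    cur.execute(
        """
        SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM ORDERS
        WHERE order_date >= DATEADD(day, -30, CURRENT_DATE)
        GROUP BY order_date
        ORDER BY order_date
        """
    )
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```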
Required
• 5+ years in operations, supply chain, data engineering, data analytics, and/or database administration.
• Experience with design, implementation, and optimization in Snowflake or other relational SQL databases
• Experience with data cleansing, curation, mining, manipulation, and analysis from disparate systems
• Experience with configuration control (GitHub preferred) and Data DevOps practices using GitHub Actions, Jenkins, or other deployment pipelines that provide Continuous Integration and Continuous Delivery (CI/CD)
• Experience with Data Extract, Transform and Load (ETL) processes using SQL as the foundation (see the sketch after this list)
• Experience with database structures and data modeling, such as third normal form
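To illustrate an ETL process with SQL as the foundation, here is a self-contained sketch; SQLite stands in for a production warehouse such as Snowflake, and the staging and target tables are hypothetical.

```python
# Minimal ETL sketch with SQL as the foundation; SQLite stands in here
# for a production warehouse, and the tables are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: a raw staging table as it might arrive from a source system.
cur.execute("CREATE TABLE stg_orders (id INTEGER, amount TEXT, region TEXT)")
cur.executemany(
    "INSERT INTO stg_orders VALUES (?, ?, ?)",
    [(1, "100.50", "west"), (2, "  75.00", "WEST"), (3, None, "east")],
)

# Transform + Load: cleanse types and casing in SQL, drop unusable rows.
cur.execute(
    """
    CREATE TABLE orders AS
    SELECT id,
           CAST(TRIM(amount) AS REAL) AS amount,
           UPPER(region) AS region
    FROM stg_orders
    WHERE amount IS NOT NULL
    """
)

print(cur.execute("SELECT * FROM orders").fetchall())
# [(1, 100.5, 'WEST'), (2, 75.0, 'WEST')]
conn.close()
```

The point of the pattern is that the cleansing (casting, trimming, normalizing case) happens in the SQL itself, so the database engine rather than application code does the transform.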
Preferred Skills
• Experience with Manufacturing, Operations, and/or Supply Chain processes and systems
• Experience using MRP/ERP systems (SAP)
• Experience with Python and related libraries such as Pandas for advanced data analytics (see the sketch after this list)
• Experience with schema deployment solutions such as SchemaChange or Liquibase
• Working knowledge of Agile software development methodologies
• Ability to filter, extract, and analyze information from large, complex datasets
• Strong verbal and written communication skills for cross-functional collaboration
• Experience with data deployment solutions such as Snowflake Tasks, Matillion, SSIS, Azure Data Factory (ADF), or Alteryx
• Experience with Snowflake Streams, Stages, and Snowpipe for data ingestion
• Experience with VS Code and GitHub Desktop for integrated development
• Experience with web application relational database models and APIs
• Knowledge of statistical modeling and machine learning methods
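To illustrate the Python/Pandas analytics bullet above, here is a minimal cleansing-and-summary sketch; the column names and values are hypothetical.

```python
# Minimal Pandas sketch of data cleansing and a quick trend summary;
# the columns and values are hypothetical.
import pandas as pd

raw = pd.DataFrame(
    {
        "part_no": ["A-1", "A-1", "B-2", None],
        "qty": ["10", "10", "7", "3"],
        "ship_date": ["2025-08-01", "2025-08-01", "2025-08-02", "2025-08-02"],
    }
)

clean = (
    raw.dropna(subset=["part_no"])   # drop rows missing a key
       .drop_duplicates()            # remove exact duplicates
       .assign(
           qty=lambda d: pd.to_numeric(d["qty"]),
           ship_date=lambda d: pd.to_datetime(d["ship_date"]),
       )
)

# Daily shipped quantity: the kind of foundational dataset a report builds on.
print(clean.groupby("ship_date")["qty"].sum())
```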