SBS Creatix

Associate Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Associate Data Engineer in Clayton, MO, offering a contract-to-hire position with a pay rate of $360 USD per day. Requires 0–2 years of data engineering experience, proficiency in SQL and Python, and familiarity with Azure cloud platforms.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
360
-
🗓️ - Date
March 7, 2026
-
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
St Louis, MO
-
🧠 - Skills detailed
#Datasets #Synapse #Azure SQL #Scala #Tableau #Data Quality #Security #ETL (Extract, Transform, Load) #Computer Science #Data Framework #Data Modeling #Metadata #Monitoring #Data Management #AWS (Amazon Web Services) #BI (Business Intelligence) #Spark (Apache Spark) #SQL (Structured Query Language) #Data Lake #dbt (data build tool) #Azure SQL Database #Databases #DevOps #Python #Microsoft Power BI #SaaS (Software as a Service) #Data Pipeline #Azure Synapse Analytics #Cloud #Classification #DAX #Azure #Migration #Data Engineering
Role description
General Info:
• US Citizens or GC Holders only
• Must be on our W2 (no C2C)
• Contract to hire
• Hybrid: 3 days a week in the office after getting up to speed
• Location: Clayton, MO

Key Responsibilities:

Data Platform & Infrastructure Development
• Design, develop, and maintain cloud-based data infrastructure using technologies such as Azure SQL Database, Azure Data Lake, Azure Synapse Analytics, and DuckDB (AWS experience also considered).
• Build scalable data pipelines capable of processing large and complex datasets from multiple enterprise sources.
• Develop and manage cloud data solutions utilizing Azure Synapse Spark Pools, Data Pipelines, Spark Notebooks, and dbt.

Data Engineering & Modeling
• Create optimized data models and implement efficient ETL/ELT processes, transformations, and metadata frameworks.
• Integrate data from transactional systems, on-premise warehouses, enterprise databases, and SaaS platforms into modern cloud data environments.
• Maintain and enhance existing data models and analytics infrastructure.

Data Quality & Optimization
• Implement data validation, monitoring, error handling, security controls, and performance optimization techniques.
• Ensure reliability, scalability, and resilience across data platforms.
• Analyze datasets to identify anomalies, trends, and meaningful business insights.

Cloud Migration & Analytics Enablement
• Support initiatives migrating legacy data and BI platforms to cloud-based data lakes and lakehouse architectures.
• Enable advanced analytics and reporting capabilities for business stakeholders.
• Troubleshoot data issues and recommend practical technical solutions.

Required Qualifications:
• 0–2 years of experience in Data Engineering, ETL development, or related roles (internship experience acceptable).
• Strong proficiency in SQL and Python.
• Experience with, or exposure to, cloud data platforms (Azure preferred).
• Understanding of:
  • Data warehousing concepts
  • Data modeling principles
  • ETL/ELT processes
  • Structured and unstructured data management
• Bachelor's degree in Computer Science, Information Systems, Analytics, or a related field.

Preferred Qualifications:
• Experience with Azure Synapse Analytics, Spark environments, or similar cloud analytics platforms.
• Exposure to Power BI or Tableau.
• Familiarity with DAX or Power Query (M).
• Understanding of CI/CD pipelines and DevOps practices.
• Experience working with large-scale or distributed datasets.

Additional Responsibilities:
• Build and scale data infrastructure aligned with organizational needs.
• Support statistical analysis and data classification initiatives.
• Leverage existing data environments to fulfill analytics and reporting requests.
• Monitor data performance and recommend improvements.
• Perform additional duties as required to support client and project success.