Sharp Decisions

Sr Data Engineer (967)

โญ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr Data Engineer (967) on a contract-to-hire basis, hybrid in Phoenix, AZ, with a W2 pay rate. Key skills include Azure Data Factory, Databricks, Python, and Power BI. A BS in Computer Science or equivalent experience is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
๐Ÿ—“๏ธ - Date
April 15, 2026
🕒 - Duration
Unknown
๐Ÿ๏ธ - Location
Hybrid
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
๐Ÿ“ - Location detailed
Scottsdale, AZ
🧠 - Skills detailed
#Jira #Data Warehouse #Data Lineage #Azure SQL Database #Azure DevOps #Kubernetes #Microsoft Power BI #Azure #Java #JSON (JavaScript Object Notation) #SQL Server #GCP (Google Cloud Platform) #Data Quality #Database Performance #Scripting #JavaScript #Batch #Data Engineering #Scala #Computer Science #Jenkins #Storage #Data Modeling #AWS (Amazon Web Services) #ADLS (Azure Data Lake Storage) #Databases #Azure Data Factory #Delta Lake #Cloud #DevOps #SSIS (SQL Server Integration Services) #BI (Business Intelligence) #REST (Representational State Transfer) #Visualization #Data Profiling #REST API #Power Automate #Spark (Apache Spark) #Spark SQL #GitHub #Agile #Data Pipeline #Azure SQL #Databricks #Python #Documentation #Azure ADLS (Azure Data Lake Storage) #Data Transformations #ETL (Extract, Transform, Load) #Data Lake #Bash #ADF (Azure Data Factory) #C++ #SQL (Structured Query Language) #Shell Scripting #Docker #Collibra #API (Application Programming Interface)
Role description
Contract to Hire · Hybrid · Phoenix, AZ · W2 Only · No 3rd Parties

Key Responsibilities

Data Pipelines & ETL
• Design, develop, and maintain data pipelines and ETL/ELT processes from various sources into the data warehouse.
• Deliver on-prem SQL Server reporting requirements using SSIS and integration processes.
• Implement batch and streaming ETL using Azure Data Factory, Spark, Python, and Scala on Databricks.

Azure & Cloud Infrastructure
• Work with Azure Data Lake Storage, Azure SQL Databases, and Azure Data Factory.
• Implement Databricks Delta Lake, Medallion Architecture, and Unity Catalog.
• Optimize ingestion and transformation processes for efficient, scalable data flows.

Data Modeling & Quality
• Develop and maintain data models, schemas, and data dictionaries.
• Perform data profiling, cleansing, validation, and quality checks.
• Monitor and optimize database performance, including query tuning and index optimization.

Power BI & Reporting
• Develop Power BI dashboards, reports, and visualizations.
• Implement data transformations, modeling, and integration processes in Power BI.
• Communicate data insights clearly to business stakeholders.

DevOps & Agile Delivery
• Use Azure DevOps, GitHub, and Jenkins for build and deploy workflows.
• Deliver features incrementally on an Agile team using Jira and Confluence.
• Create and maintain technical documentation for databases, schemas, and reporting solutions.

Continuous Improvement
• Stay current with industry trends in data engineering and Power BI.
• Proactively recommend and implement improvements to data infrastructure.
• Multi-task across priorities and adapt within a team environment.
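The data profiling, cleansing, and validation duties above can be sketched in plain Python. This is an illustrative sketch only, not the employer's codebase: the record layout (orders with `customer_id` and `amount`) and the rule names are hypothetical assumptions.

```python
# Minimal sketch of a row-level data-quality check, as might run in a
# pipeline's validation step before loading to the warehouse.
# The schema and rules are illustrative assumptions, not from the posting.

def profile_and_validate(rows):
    """Split rows into valid/invalid and collect per-rule failure counts."""
    failures = {"missing_customer_id": 0, "non_positive_amount": 0}
    valid, invalid = [], []
    for row in rows:
        errors = []
        if not row.get("customer_id"):
            failures["missing_customer_id"] += 1
            errors.append("missing_customer_id")
        if not isinstance(row.get("amount"), (int, float)) or row["amount"] <= 0:
            failures["non_positive_amount"] += 1
            errors.append("non_positive_amount")
        (invalid if errors else valid).append(row)
    return valid, invalid, failures


if __name__ == "__main__":
    sample = [
        {"customer_id": "C1", "amount": 125.0},
        {"customer_id": None, "amount": 40.0},
        {"customer_id": "C3", "amount": -5},
    ]
    good, bad, stats = profile_and_validate(sample)
    print(len(good), len(bad), stats)  # prints: 1 2 {'missing_customer_id': 1, 'non_positive_amount': 1}
```

In practice the same pattern is expressed as Spark SQL expectations or Delta constraints on Databricks; the point here is only the profile-then-quarantine shape of the validation step.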
Required & Preferred Skills

Required
• Azure Data Factory
• Databricks
• Delta Lake
• Medallion Architecture
• Unity Catalog
• Python
• Scala
• Spark
• SQL Server
• SSIS
• Azure Data Lake Storage
• Azure SQL
• Power BI
• Data Warehousing
• ETL / ELT
• Azure DevOps
• GitHub
• Jenkins
• Jira / Confluence
• MS Office

Preferred
• REST API / JSON
• Banking / Financial Domain
• Azure / GCP / AWS Cert.
• Java / JavaScript / C++
• Docker / Kubernetes
• Bash / Shell Scripting
• Power Automate
• Power Apps
• Collibra Data Lineage
• Collibra Data Quality

Education
• BS Computer Science
• BS Information Systems
• Equivalent Work Experience