Career Soft Solutions Inc

Data Engineer (Warehouse, BI & SQL DBA)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer (Warehouse, BI & SQL DBA) based in Scottsdale, AZ, with a contract length of "unknown" and a pay rate of "unknown." Key skills include SQL Server, Azure SQL, ETL/ELT development, and Power BI.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
May 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Scottsdale, AZ
-
🧠 - Skills detailed
#Databases #Presto #GIT #Deployment #Security #Logging #Documentation #Data Pipeline #DBA (Database Administrator) #Indexing #Database Design #Spark (Apache Spark) #ADF (Azure Data Factory) #Scala #ETL (Extract, Transform, Load) #SQL Server #Agile #Observability #EDW (Enterprise Data Warehouse) #Azure Data Factory #Azure SQL Database #Azure DevOps #Data Governance #Data Integrity #DevOps #Cloud #GitHub #Monitoring #Snowflake #BI (Business Intelligence) #Synapse #Azure SQL #Data Warehouse #Semantic Models #Azure #Schema Design #SQL (Structured Query Language) #Data Quality #Datasets #Microsoft Power BI #Data Engineering
Role description
Job Title: Data Engineer (Warehouse, BI & SQL DBA)

Location: Scottsdale, AZ, United States, 85250

Position Purpose:

We are seeking a Data Engineer (Warehouse, BI & SQL DBA) to join our Scottsdale-based team and own our end-to-end Microsoft data platform: from database design and administration, through reliable data pipelines, into a governed enterprise Data Warehouse/Lakehouse (Azure Synapse and/or Microsoft Fabric), and ultimately enabling analytics and reporting in Power BI. This role blends hands-on SQL Server/Azure SQL engineering and DBA responsibilities with modern ELT/ETL development, dimensional modeling, performance tuning, security, and data governance. You will partner closely with application teams, business stakeholders, and analytics users to deliver trusted, well-documented datasets and scalable solutions that support operational and strategic decision-making.

Requirements

Key Responsibilities:

• Design, implement, and administer SQL Server and Azure SQL databases, including schema design, indexing strategies, constraints, and data integrity controls.

• Own core DBA functions: backup/restore, HA/DR, patching and upgrades, capacity planning, monitoring/alerting, and routine maintenance.

• Build and optimize ELT/ETL pipelines using Microsoft tooling (e.g., Azure Data Factory/Synapse Pipelines and/or Fabric Data Factory), integrating data from on-prem and cloud sources.

• Develop scalable transformation patterns using T-SQL and Spark (where applicable), implementing repeatable, testable data-processing frameworks.

• Design and manage the enterprise Data Warehouse/Lakehouse, including dimensional modeling (star/snowflake), incremental loads, historization (SCD), and semantic-layer alignment for BI.

• Administer and optimize Azure Synapse and/or Microsoft Fabric (Lakehouse/Warehouse), including workspace organization, resource governance, and performance/cost tuning.

• Enable Power BI reporting by delivering curated datasets, developing and supporting semantic models, and partnering with analysts on performance, refresh strategies, and best practices.

• Implement data quality, reconciliation, and observability (logging, lineage, SLAs), and drive root-cause analysis for data issues across pipelines and reporting.

• Establish and enforce security and governance controls (RBAC, least privilege, encryption, PII handling, auditability) across databases, lakehouse/warehouse, and BI.

• Automate deployments and operations using Git-based workflows and CI/CD (Azure DevOps/GitHub), including infrastructure-as-code where appropriate.

• Create technical documentation for data models, source-to-target mappings, operational runbooks, and support procedures.

• Collaborate with application engineering, product, and business stakeholders to translate requirements into reliable data products and deliver iteratively in an Agile environment.