

Brooksource
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with 5–7+ years of experience, starting Jan. 5th, 2026, in downtown Indianapolis. Key skills include Azure Data Factory, Databricks, and SQL. Requires expertise in data engineering, ETL development, and Medallion architecture.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
544
-
🗓️ - Date
December 4, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Indianapolis, IN
-
🧠 - Skills detailed
#Debugging #Python #Data Warehouse #Agile #SQL (Structured Query Language) #Azure SQL #Microsoft Power BI #Oracle #Scala #Data Pipeline #Data Lake #GCP (Google Cloud Platform) #ADF (Azure Data Factory) #Azure #Data Architecture #Azure Data Factory #Deployment #DevOps #ETL (Extract, Transform, Load) #Documentation #Automation #Jira #Cloud #Data Modeling #BI (Business Intelligence) #AWS (Amazon Web Services) #Data Engineering #Databricks #SQL Server #Data Lakehouse
Role description
Senior Data Engineer – Azure Data Warehouse (5–7+ Years Experience)
Start Date: Jan. 5th, 2026
Long-term renewing contract
Location: On-site, downtown Indianapolis
• Contribute to Azure-based data warehouse and dashboarding initiatives.
• Work alongside architects, analysts, and researchers to build scalable, auditable, and business-aligned data assets using modern cloud tools and best practices.
Key Responsibilities
· Design and implement scalable data pipelines using ADF, Databricks, and Azure SQL Server
· Apply Medallion architecture principles and best practices for data lake and warehouse design
· Collaborate with Data Architects, Analysts, and Researchers to translate business needs into technical solutions
· Develop and maintain CI/CD pipelines for data workflows and dashboard deployments
· Lead troubleshooting and debugging efforts across ETL, SQL, and cloud environments
· Mentor junior team members and promote best practices in data modeling, cleansing, and promotion
· Support dashboarding initiatives with Power BI and wireframe collaboration
· Ensure auditability, lineage, and performance across SQL Server and Oracle environments
Required Skills & Experience
· 5–7+ years in data engineering, data warehouse design, and ETL development
· Strong expertise in Azure Data Factory, Databricks, and Python
· Deep understanding of SQL Server, Oracle, PostgreSQL, and Cosmos DB, and of data modeling standards
· Proven experience with Medallion architecture and data lakehouse best practices
· Hands-on with CI/CD, DevOps, and deployment automation
· Agile mindset with the ability to manage multiple priorities and deliver on time
· Excellent communication and documentation skills
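For candidates less familiar with the term, the Medallion architecture referenced above organizes data into bronze (raw), silver (cleansed), and gold (business-level) layers. The following is a minimal plain-Python sketch of that layering for illustration only; an actual pipeline for this role would run on Databricks with PySpark and Delta tables, and the record fields shown here are hypothetical.

```python
# Hypothetical Medallion-style flow: bronze (raw) -> silver (cleansed) -> gold (aggregated).
# Plain Python stand-in; a real implementation would use PySpark/Delta Lake on Databricks.

bronze = [  # raw ingested records, retained unmodified for auditability
    {"order_id": "1", "amount": "100.50", "region": "IN"},
    {"order_id": "1", "amount": "100.50", "region": "IN"},  # duplicate
    {"order_id": "2", "amount": "bad",    "region": "IN"},  # malformed amount
    {"order_id": "3", "amount": "75.00",  "region": "KY"},
]

def to_silver(rows):
    """Cleanse: drop duplicate keys and rows that fail type validation."""
    seen, silver = set(), []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # a production pipeline would quarantine these rows
        if r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        silver.append({"order_id": r["order_id"], "amount": amount, "region": r["region"]})
    return silver

def to_gold(rows):
    """Aggregate cleansed rows into business-level totals per region."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'IN': 100.5, 'KY': 75.0}
```

Each layer is persisted separately so that cleansing and aggregation logic can be audited and replayed, which is what makes the pattern attractive for lineage and auditability requirements like those in this role.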
Bonus Skills
· Experience with GCP or AWS
· Familiarity with Jira, Confluence, and AppDynamics