
Databricks Senior Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Databricks Senior Data Engineer in Pittsburgh, PA, on a contract basis. It requires 10+ years of data engineering experience, expertise in supply chain, data architecture, ETL development, and proficiency in Azure Databricks and SQL.
Country: United States
Currency: $ USD
Day rate: $480
Date discovered: September 18, 2025
Project duration: Unknown
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Pittsburgh, PA
Skills detailed: #Databricks #Data Quality #Data Engineering #Migration #Database Management #Data Analysis #ETL (Extract, Transform, Load) #Data Integration #Requirements Gathering #ADF (Azure Data Factory) #Data Warehouse #SQL (Structured Query Language) #Azure #Data Migration #Data Architecture #EDW (Enterprise Data Warehouse) #Data Cleansing #Data Access #Azure Databricks
Role description
Role: Databricks Senior Data Engineer
Location: Pittsburgh, PA - Onsite
Position Type: Contract
JOB DESCRIPTION
Required Skills and Experience:
• 10+ years of experience as a Data Warehouse Data Engineer.
• Knowledge of the Supply Chain domain is mandatory.
• Data Architecture Design: Develop and modify the architecture for the EDW, including data models, data quality standards, data integration strategies, data migration, and data warehousing technologies, so that data from various sources can be stored and accessed effectively as per requirements.
• Business Requirements Gathering: Collaborate with business stakeholders to understand their data needs, identify key performance indicators (KPIs), and translate them into data warehouse requirements.
• Data Integration and ETL Development: Design and implement ETL processes to extract data from various source systems, transform it into a consistent format, and load it into the EDW using Databricks (a minimal illustrative sketch of this pattern follows this list). A proven track record of implementing Databricks for modernization projects is required.
• Performance Optimization: Monitor and optimize the performance of the EDW to ensure efficient data access and query response times (see the second sketch further below).
• Overall Technical Leadership: Provide technical guidance on data warehouse technologies, tools, and best practices to the relevant stakeholders.
• Collaboration with Stakeholders: Work closely with business users, data analysts, and application developers to understand their data needs and deliver data-driven insights.
• Data Warehousing Expertise: Deep understanding of data warehousing concepts, architectures, and technologies such as dimensional modeling, star schemas, Databricks schemas, and ETL processes.
• Database Management Systems (DBMS): Proficient in SQL and Databricks.
• Data Integration Tools: Familiarity with data integration tools for ETL development and data cleansing.
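As a point of reference for the ETL responsibilities above, below is a minimal, illustrative PySpark sketch of the extract-transform-load pattern on Azure Databricks. All storage paths, table names, and column names (the raw orders path, edw.fact_orders, order_id, and so on) are hypothetical placeholders, not details taken from this role description.

```python
# Illustrative sketch only: paths, table names, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession is already provided; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Extract: read raw supply-chain data from a (hypothetical) landing zone.
raw_orders = spark.read.parquet(
    "abfss://raw@examplestorage.dfs.core.windows.net/supply_chain/orders/"
)

# Transform: cleanse and standardize into a consistent format.
clean_orders = (
    raw_orders
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("quantity", F.col("quantity").cast("int"))
)

# Load: append into a (hypothetical) EDW fact table stored as Delta.
(
    clean_orders.write
    .format("delta")
    .mode("append")
    .saveAsTable("edw.fact_orders")
)
```

In practice a pipeline like this would typically be parameterized per source system and orchestrated from Azure Data Factory (ADF) or Databricks Workflows.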
Communication Skills:
• Excellent communication skills to effectively collaborate with stakeholders from various departments.
Tech Stacks:
• Azure Databricks, ADF, Azure apps.
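To illustrate the Performance Optimization and Data Warehousing Expertise points on the same stack, here is a second minimal sketch: a star-schema join between a fact table and a dimension table, followed by a Delta OPTIMIZE / ZORDER pass to keep query response times predictable. Again, all table and column names are hypothetical.

```python
# Illustrative sketch only: edw.fact_orders, edw.dim_product, and the columns
# referenced here are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Star-schema access pattern: join a fact table to a dimension table.
daily_volumes = spark.sql("""
    SELECT d.product_category,
           f.order_date,
           SUM(f.quantity) AS total_quantity
    FROM edw.fact_orders f
    JOIN edw.dim_product d ON f.product_key = d.product_key
    GROUP BY d.product_category, f.order_date
""")
daily_volumes.show()

# Delta maintenance: compact small files and co-locate data on a common
# filter column so scans stay efficient as the table grows.
spark.sql("OPTIMIZE edw.fact_orders ZORDER BY (order_date)")
```

Z-ordering co-locates rows that share values of the chosen column, which reduces the amount of data scanned when queries filter on that column.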
Best Regards,
Bismillah Arzoo (AB)