Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a contract basis, requiring expertise in Databricks, Power BI, Azure, AWS, and PySpark. The position involves building scalable ETL pipelines and developing executive dashboards. Strong cloud and data governance knowledge is essential.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
September 24, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
New Jersey, United States
🧠 - Skills detailed
#Programming #Airflow #ETL (Extract, Transform, Load) #Oracle #Delta Lake #Cloud #ADF (Azure Data Factory) #DAX #Kafka (Apache Kafka) #Informatica #SQL (Structured Query Language) #Azure Data Factory #Data Engineering #Azure #PySpark #Data Integration #Automation #Data Governance #Spark (Apache Spark) #Spark SQL #Databricks #Microsoft Power BI #Scala #AWS (Amazon Web Services) #BI (Business Intelligence)
Role description
We are seeking a Senior Data Engineer with proven success designing and delivering enterprise-scale BI solutions, data platforms, and cloud-native analytics systems. The ideal candidate has expert-level proficiency in Databricks, Microsoft Power BI, and modern data engineering toolsets including Azure, AWS, PySpark, SQL, and Airflow. You will contribute to the buildout of a robust Databricks Lakehouse architecture, engineering scalable ETL/ELT pipelines with PySpark and SQL and integrating diverse data sources (Kafka, APIs, Oracle DB, Informatica). Power BI development is a core strength we are looking for: the candidate should be able to ingest a wide range of sources and deliver high-impact, executive-ready dashboards and KPI reporting suites that track key business metrics.
Key strengths include:
• Scalable Power BI dashboard creation using DAX, star schema modeling, and real-time data integrations
• Hands-on Databricks pipeline development with Delta Lake, Unity Catalog, and Spark optimization (see the PySpark sketch below)
• Deep knowledge of cloud platforms (Azure, AWS), SQL/PL/SQL programming, and data governance
• Orchestration expertise with Airflow and Azure Data Factory, enabling complex workflow automation (see the DAG sketch below)
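
To give a concrete sense of the Databricks pipeline work described above, here is a minimal, illustrative PySpark sketch of a streaming ETL step that lands Kafka events in a Delta Lake table. It is a sketch only: the topic name, broker address, payload schema, checkpoint path, and target table are hypothetical placeholders, and it assumes a Databricks (or Delta-enabled Spark 3.1+) runtime with the Kafka connector available.

```python
# Illustrative sketch only: names, schema, and paths below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("orders-ingest-sketch").getOrCreate()

# Assumed JSON payload schema for the incoming events (hypothetical).
event_schema = StructType([
    StructField("order_id", StringType()),
    StructField("customer_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read the raw event stream from Kafka (broker and topic are placeholders).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load()
)

# Parse the JSON payload into typed columns and stamp the ingestion time.
parsed = (
    raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
       .select("e.*")
       .withColumn("ingested_at", F.current_timestamp())
)

# Append to a Delta table, with a checkpoint so the stream can restart safely.
query = (
    parsed.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/orders")  # placeholder path
    .outputMode("append")
    .toTable("bronze.orders")  # placeholder table name
)
```

A batch ELT variant of the same step would swap readStream/writeStream for read/write and pull from sources such as Oracle DB over JDBC instead of Kafka.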
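
Similarly, the orchestration bullet might translate into an Airflow DAG along these lines. This is a minimal sketch assuming Airflow 2.4+; the DAG id, task names, and the dataset-refresh callable are hypothetical stand-ins, and in practice the ETL task would typically invoke a Databricks job or Azure Data Factory pipeline through the corresponding provider operator.

```python
# Illustrative sketch only: DAG id, tasks, and callables are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def refresh_reporting_dataset(**context):
    # Stand-in for a dashboard/dataset refresh step (e.g., via a REST call).
    print("Refreshing reporting dataset for", context["ds"])


with DAG(
    dag_id="daily_lakehouse_pipeline",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    run_etl = PythonOperator(
        task_id="run_etl",
        python_callable=lambda: print("Run the PySpark ETL step here (placeholder)"),
    )

    refresh_dashboards = PythonOperator(
        task_id="refresh_dashboards",
        python_callable=refresh_reporting_dataset,
    )

    # The ETL must complete before the reporting refresh is triggered.
    run_etl >> refresh_dashboards
```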