Ascendum Solutions

Sr. Data Engineer (Azure/Databricks/Python)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. Data Engineer (Azure/Databricks/Python) with a contract length of "unknown" and a pay rate of "$XX/hour". Requires a Bachelor's Degree, 5+ years of experience, strong SQL skills, and expertise in data lakes, ETL/ELT, and cloud technologies.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
πŸ—“οΈ - Date
May 2, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Plano, TX
-
🧠 - Skills detailed
#Deployment #JSON (JavaScript Object Notation) #Code Reviews #Scala #Snowflake #Data Modeling #Microsoft SQL #Data Engineering #Data Mart #REST (Representational State Transfer) #dbt (data build tool) #Python #Data Pipeline #Microsoft Power BI #Tableau #XML (eXtensible Markup Language) #Leadership #ADF (Azure Data Factory) #GIT #SQL (Structured Query Language) #Web API #SQL Server #SSRS (SQL Server Reporting Services) #ADLS (Azure Data Lake Storage) #Microsoft SQL Server #SSAS (SQL Server Analysis Services) #BI (Business Intelligence) #Azure #Database Systems #Databricks #Java #GraphQL #Data Warehouse #Cloud #BigQuery #MS SQL (Microsoft SQL Server) #Airflow #Data Lake #API (Application Programming Interface) #ETL (Extract, Transform, Load) #Continuous Deployment
Role description
Responsible for delivering senior-level, innovative, and compelling software solutions for our consumer, internal operations, and value chain constituents across a wide variety of enterprise applications. The job duties and requirements are defined for the backend. Provides technical leadership and mentorship to junior team members.

Responsibilities
• Designs, develops, and delivers solutions that meet business line and enterprise requirements.
• Creates data and reporting infrastructure by building and optimizing production-grade data pipelines using continuous integration/continuous deployment.
• Participates in rapid prototyping and POC development efforts.
• Understands business and technical requirements and constraints to design effective data engineering solutions.
• Assists in efforts to develop and refine functional and non-functional requirements.
• Champions engineering excellence, including software design patterns, code reviews, and automated unit/functional testing.
• Creates conceptual architectures and detailed designs for data engineering solutions.
• Contributes to overall enterprise technical architecture and implementation best practices.
• Participates in iteration and release planning.
• Performs other duties and projects as assigned.

Qualifications
Bachelor's Degree required, and a minimum of five or more years of related work experience.

Required Skills
• Design and implementation of data lakes, data warehouses, and data marts.
• Web API, REST, GraphQL, XML/JSON.
• ETL/ELT, and one of the data engineering languages (Python/Java/Scala).
• ADF, ADLS, Airflow, dbt, and any cloud-native data warehouse (Snowflake, Databricks, BigQuery).
• Strong SQL experience, with expert-level skills in query performance tuning and data modeling.
• Designs enterprise database systems using Microsoft SQL Server.
• Working knowledge of Git; develops branching and merging strategies.
• Ability to design, develop, and maintain scalable, reusable code.
• Estimates tasks with a level of granularity and accuracy commensurate with the information provided, to ensure expectations of delivery are reasonable.
• Excels in a rapid-iteration environment with short turnaround times.

Preferred Skills
• Knowledge of cubes and SSAS.
• Knowledge of SSRS, Power BI, and Tableau.