

Ascendum Solutions
Sr. Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. Data Engineer on a 12-month contract paying a $600 day rate on a W2 basis. Candidates must work PST hours and have 7+ years of experience with Databricks, Python/Scala, and Microsoft Azure, including ETL and data governance expertise.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
🗓️ - Date
October 25, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Irvine, CA
-
🧠 - Skills detailed
#Azure Data Factory #Scala #Data Lake #Business Analysis #ADF (Azure Data Factory) #NoSQL #Code Reviews #Datasets #Data Pipeline #Documentation #Data Engineering #SQL Server #Databricks #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Azure #Data Architecture #Microsoft Azure #Batch #Databases #Python #Data Governance
Role description
This position can ONLY consider W2 candidates. C2C/1099 candidates are not eligible to be hired for this opportunity.
Title: Sr. Data Engineer
Term: Contract – 12 months extendable
Must be open to working PST hours
Data Engineer – Databricks/Python
What You’ll Do
• Design, develop, optimize, and maintain data architecture and pipelines that adhere to ETL principles and business goals
• Collaborate with data engineers, data consumers, and other team members to come up with simple, functional, and elegant solutions that balance the data needs across the organization
• Solve complex data problems to deliver insights that help the organization achieve its goals
• Create data products that will be used throughout the organization
• Advise, consult, mentor, and coach other data and analytics professionals on data standards and practices
• Foster a culture of sharing, re-use, design for scale and stability, and operational efficiency of data and analytics solutions
• Develop and deliver documentation on data engineering capabilities, standards, and processes; participate in coaching, mentoring, design reviews and code reviews
• Partner with business analysts and solutions architects to develop technical architectures for strategic enterprise projects and initiatives.
• Deliver awesome code
What You’ll Bring
• 7+ years relevant and progressive data engineering experience
• Deep technical knowledge of and experience with Databricks, Python/Scala, and the Microsoft Azure architecture and platform, including Azure Event Hub and ADF (Azure Data Factory) pipelines
• Hands-on experience building data pipelines across a variety of source and target locations (e.g., Databricks, SQL Server, Data Lake, file-based sources, and SQL and NoSQL databases)
• Thorough knowledge of SQL Server including T-SQL and stored procedures
• Experience in engineering practices such as development, code refactoring, and leveraging design patterns, CI/CD, and building highly scalable data applications and processes
• Experience developing batch ETL pipelines; real-time pipelines are a plus (see the illustrative sketch after this list)
• Knowledge of advanced data engineering concepts such as dimensional modeling, ETL, data governance, data warehousing involving structured and unstructured data
• A successful history of manipulating, processing and extracting value from large disconnected datasets.
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
• Advanced working knowledge of SQL and experience with relational databases and query authoring, as well as working familiarity with a variety of databases.
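For illustration only, here is a minimal PySpark sketch of the kind of batch ETL step described in the bullets above (Databricks notebook, data lake source, Delta target). The storage path, column names, and target table are hypothetical assumptions, not details of this role.

```python
# Minimal batch ETL sketch for Databricks (PySpark).
# All paths, columns, and table names below are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # already provided in a Databricks notebook

# Extract: read raw JSON files landed in an (assumed) ADLS Gen2 container
raw = spark.read.format("json").load(
    "abfss://raw@examplelake.dfs.core.windows.net/orders/"
)

# Transform: basic cleansing and conformance to a curated schema
orders = (
    raw
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_date", F.to_date("order_timestamp"))
    .select("order_id", "customer_id", "order_date", "total_amount")
)

# Load: write a Delta table partitioned by date for downstream consumers
(
    orders.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("analytics.orders_curated")
)
```

In practice a job like this would typically be orchestrated through ADF or Databricks Workflows and run incrementally rather than as a full overwrite; the sketch only shows the basic extract-transform-load shape.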






