

Experion Technologies
Senior Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer, offering a remote contract with a competitive pay rate. Candidates should have expertise in data lakes, ETL/ELT pipelines, and tools such as Azure Data Factory and Snowflake, along with strong SQL skills.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 9, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Plano, TX
-
🧠 - Skills detailed
#Tableau #SQL (Structured Query Language) #Data Mart #REST (Representational State Transfer) #BigQuery #API (Application Programming Interface) #Microsoft Power BI #Storage #ADLS (Azure Data Lake Storage) #SQL Server #Data Architecture #Git #Leadership #GraphQL #Java #Data Lake #Airflow #ETL (Extract, Transform, Load) #Visualization #Apache Airflow #SSAS (SQL Server Analysis Services) #XML (eXtensible Markup Language) #Snowflake #Data Engineering #dbt (data build tool) #JSON (JavaScript Object Notation) #Data Pipeline #Data Warehouse #Microsoft SQL Server #Python #Azure Data Factory #Azure #Database Systems #Scala #BI (Business Intelligence) #SSRS (SQL Server Reporting Services) #Web API #Cloud #Code Reviews #Databricks #Data Modeling #ADF (Azure Data Factory)
Role description
The Senior Data Engineer is responsible for delivering innovative, compelling, and production-grade software solutions for our consumer-facing products, internal operations, and value chain constituents across a wide variety of enterprise applications.
About the Role
This role focuses on backend data engineering and requires a practitioner who can lead through expertise — designing robust data architectures, building scalable pipelines, and providing technical mentorship to junior team members.
Responsibilities
• Design, develop, and deliver data engineering solutions that meet business line and enterprise requirements.
• Build and optimize production-grade data pipelines using CI/CD practices to create robust data and reporting infrastructure.
• Participate in rapid prototyping and proof-of-concept (POC) development efforts.
• Understand business and technical requirements and constraints to design effective, scalable data engineering solutions.
• Assist in developing and refining functional and non-functional requirements.
• Champion engineering excellence, including software design patterns, thorough code reviews, and automated unit/functional testing.
• Create conceptual architectures and detailed designs for data engineering solutions.
• Contribute to overall enterprise technical architecture and implementation best practices.
• Participate actively in iteration and release planning ceremonies.
• Provide technical leadership and mentorship to junior and mid-level team members.
• Perform other duties and projects as assigned.
Qualifications
• Design and implementation of data lake, data warehouse, and data mart solutions.
• Web API, REST, GraphQL, and XML/JSON data formats.
• ETL/ELT pipelines; proficiency in at least one data engineering language: Python, Java, or Scala.
• Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Apache Airflow, dbt, and at least one cloud-native data warehouse platform (Snowflake, Databricks, or BigQuery).
• Strong SQL skills with expert-level proficiency in query performance tuning and data modeling.
• Experience designing enterprise database systems using Microsoft SQL Server.
• Solid working knowledge of Git, including branching and merging strategies.
Required Skills
• Ability to design, develop, and maintain scalable, reusable code.
• Ability to accurately estimate tasks at an appropriate level of granularity given the available information.
• Ability to thrive in a rapid-iteration environment with short turnaround times and shifting priorities.
• Strong communication skills; able to translate complex technical concepts for non-technical stakeholders.
Preferred Skills
• Knowledge of OLAP cubes and SQL Server Analysis Services (SSAS).
• Experience with reporting and visualization tools: SSRS, Power BI, and/or Tableau.
• Familiarity with real estate, mortgage, or proptech domain data.
This role is primarily remote; however, candidates are expected to relocate to Plano, TX, or Irvine, CA, in the future. Initially, occasional travel to these locations may be required.
