

Holistic Partners, Inc
Data Engineer (W2)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (W2) with a long-term remote contract. Key requirements include 5+ years of experience in data engineering, strong skills in Python, SQL, and Azure, plus expertise in Data Vault modeling and Databricks.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: November 25, 2025
Duration: Unknown
Location: Remote
Contract: W2 Contractor
Security: Unknown
Location detailed: United States
Skills detailed:
#PySpark #Unix #API (Application Programming Interface) #GCP (Google Cloud Platform) #Consulting #Perl #RDBMS (Relational Database Management System) #Cloud #SQL (Structured Query Language) #Data Quality #Data Warehouse #Data Science #Azure ADLS (Azure Data Lake Storage) #Debugging #Storage #Fivetran #XML (eXtensible Markup Language) #Data Modeling #Azure Data Factory #Jira #AWS (Amazon Web Services) #Data Lake #GitHub #Datasets #JSON (JavaScript Object Notation) #Databricks #Computer Science #Scripting #Security #Version Control #Azure SQL #Code Reviews #Big Data #ML (Machine Learning) #REST (Representational State Transfer) #Vault #Data Bricks #ETL (Extract, Transform, Load) #Migration #Data Engineering #SAS #Agile #ADF (Azure Data Factory) #AI (Artificial Intelligence) #Python #GIT #Public Cloud #Compliance #Data Security #Data Governance #Documentation #Data Pipeline #Scala #Spark (Apache Spark) #.Net #Data Vault #Databases #Azure #SQL Server #Project Management #REST API #Spark SQL #DevOps #Logic Apps #C# #Batch #Azure DevOps #Programming #ADLS (Azure Data Lake Storage) #Java
Role description
Job Title: Data Engineer (W2)
Location: Remote
Duration: Long Term
Interview Process: Phone screen, then Skype interview to hire
Tax Terms: W2 only
One of my candidates interviewed but was rejected.
Take a look at these roles and let me know if you can find me candidates for them. A key requirement for the Data Engineer role that differs from the last round is that we need candidates with Data Vault experience. I remember this being niche, so we need to ask candidates whether they have done Data Vault modeling, as opposed to Azure Key Vault.
Top Skills - Must Haves:
Python
SQL
Data
Big Data
Data Warehouse
Azure
PySpark
Databricks
Top Skills' Details
1. Databricks Expertise: Hands-on advanced experience with Databricks and Spark (PySpark preferred) for building and optimizing workflows and pipelines (see the sketch after this list).
2. Hands-on proficiency in designing, building, and maintaining ETL/ELT pipelines using Python, PySpark, and related tools.
3. Strong SQL skills.
4. Cloud Platforms: Hands-on experience with cloud services (AWS, Azure, or GCP) and cloud-native data tools.
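For context, here is a minimal sketch of the kind of PySpark pipeline work items 1 and 2 screen for. The source path, table name, and column names are hypothetical, and the Delta write assumes a Databricks (or Delta-enabled Spark 3.x) environment:

# Minimal PySpark ETL sketch (hypothetical paths and columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales_etl_sketch").getOrCreate()

# Extract: read raw JSON landed in the lake (placeholder path).
raw = spark.read.json("/mnt/raw/sales/orders/")

# Transform: basic cleansing plus a derived partition column.
clean = (
    raw
    .dropDuplicates(["order_id"])                     # de-dupe on the business key
    .filter(F.col("order_total") > 0)                 # drop obviously bad records
    .withColumn("order_date", F.to_date("order_ts"))  # normalize timestamp to date
)

# Load: write a partitioned Delta table for downstream consumers.
(clean.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .saveAsTable("sales.orders_clean"))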
Job Description:
Client is building a new data platform to support their Modern Sales initiative. This data platform will help sales reps at Constellation gain better access to customer, market, pricing, and real-time data. This is a new organization that is being built out; they are hiring data science, data enablement, full stack development, and testing teams.
Primary Duties and Accountabilities:
• Produce high-quality, secure, and maintainable code/configurations
• Learn and understand business processes with limited guidance
• Work collaboratively as a member of the development team to build best-in-class software solutions in an agile environment
• Support and research issues across all application layers and the database
• Identify areas to improve and scale our Azure architecture and application design
• Ensure code can be deployed using Azure DevOps
• Design and query database tables, views, functions, stored procedures, and batch processes
• Develop, implement, and support interfaces that connect our websites, back-end systems, and various 3rd-party cloud solutions, vendors, and customers
• Engage with business partners to gather information, analyze requirements, and deliver practical, efficient, and cost-effective solutions that satisfy business needs. Write product specifications and design documentation for assigned system components.
• Design, develop, and/or review complex code or configurations to ensure AI solutions meet the requirements of functional and technical specifications. Develop and/or review project technical architecture design and development. Solutions may include, but are not limited to, the development of various types of applications (web, mobile app, full stack, or integrations) hosted in on-premises data centers or in the cloud. Develop automated unit tests and/or automated UI tests.
• Design and build large-scale AI data platforms/products and frameworks for processing high volumes of data, in real time as well as batch, that will be used by teams across the organization
• Develop complex data algorithms for AI/ML, data analytics, machine learning, or scientific computing
• Utilize a deep understanding of AI/ML/data applications to translate business needs into technical solutions
• Lead team members and provide oversight for less experienced engineers; stay on top of the latest technologies and trends. Support version control principles (e.g., Git) and working within an agile environment.
• Enhance knowledge of and compliance with preferred technologies, methodologies, standards, and policies. Maintain technical knowledge and business acumen within own discipline or function. Strong debugging and problem-solving skills; lead peer code reviews.
• Provide IT teams and business personnel with technology solutions by weighing the advantages of technology trends, market availability of products, and the risks and benefits of technology to meet business/IT needs. Participate in IT architecture review and standards setting. Assist in the development of white papers, conducting presentations as needed to explain why a technology is being recommended by IT.
Big Data and Integration:
Expertise in handling big data systems and integrating data from diverse sources (e.g., APIs, flat files, databases).
Network and External Connections:
Experience in configuring secure external connections to data sources (e.g., Azure Data Lake Storage, on-prem SQL Server) using service principals, SAS tokens, OAuth, or mounting techniques (sketched below).
Security and Governance:
Strong understanding of data security, encryption, and compliance standards.
Experience implementing data governance frameworks to manage data quality.
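As an illustration of the secure-connection pattern above, here is a sketch of reading ADLS Gen2 through a service principal with OAuth. The storage account, tenant ID, secret scope, and container names are placeholders; the Hadoop ABFS settings are the standard client-credential options documented for Azure, and dbutils/spark are assumed to be the globals available in a Databricks notebook:

# Sketch: OAuth access to ADLS Gen2 via a service principal (Databricks notebook).
storage_account = "examplelake"                           # placeholder storage account
tenant_id = "00000000-0000-0000-0000-000000000000"        # placeholder tenant

# Pull the app registration's credentials from a secret scope rather than
# hard-coding them (scope and key names are hypothetical).
client_id = dbutils.secrets.get(scope="data-platform", key="sp-client-id")
client_secret = dbutils.secrets.get(scope="data-platform", key="sp-client-secret")

# Standard Hadoop ABFS OAuth settings for client-credential auth.
base = f"{storage_account}.dfs.core.windows.net"
spark.conf.set(f"fs.azure.account.auth.type.{base}", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{base}",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{base}", client_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{base}", client_secret)
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{base}",
               f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")

# With auth configured, the lake is readable over abfss:// paths.
df = spark.read.parquet(f"abfss://raw@{base}/sales/")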
Azure Data Factory (ADF):
Proven expertise in using ADF for orchestrating and automating data workflows across cloud and on-premises systems.
• Bachelor's degree (ex: Computer Science or related discipline) and 5-8 years of relevant experience, or minimum 9-12 years of relevant combination of education and work experience
• Experience with private and public cloud architectures, pros/cons, and migration considerations
• Minimum of 5 years of RDBMS experience
• Experience with JSON, JSON-LD, XML data structures
• Experience implementing data pipelines using the latest technologies and techniques
• Experience with SDLC products (Azure DevOps, JIRA, Confluence, GitHub) or similar agile project management tools
• 3-5+ years of hands-on experience in programming languages such as Java 8, C#, Node.js, Python, SQL, and Unix shell/Perl scripting
• 5+ years of industry experience as a data engineer, with involvement in the data component of the AI/ML lifecycle, and a solid understanding of applied machine learning topics
• At least 3-5 years of consulting or client service delivery experience on Azure
• Experience handling structured and unstructured datasets
• Strong T-SQL skills with experience in Azure SQL DW
• Experience in data modeling and advanced SQL techniques
• Cloud migration methodologies and processes with tools like Azure Data Factory, Event Hub, etc.
• Excellent problem-solving, analytical, and critical thinking skills
• Bachelor's degree (ex: Computer Science or related discipline) and 5-8 years of relevant experience (programming technologies), or minimum 9-12 years of relevant combination of education and work experience
• Knowledge of ETL tools
• Experience with cloud-native Azure technologies like Azure Functions, Event Grids, Service Bus Queues, Cosmos DB, Azure Data Factory, or related is a plus
• Experience designing integration applications, both upstream and downstream
• Experience with Git, Azure DevOps, build pipelines, code branching/merging, and code management
• Experience writing and executing stored procedures
• Knowledge of event-driven architecture
Additional Skills & Qualifications:
Preferred Qualifications:
• Logic Apps
• Azure Function Apps
• Microservice architecture
• Background and knowledge of .NET REST APIs
• Service Bus Queues & Event Grids
• Experience in U-SQL, Hive SQL, Spark SQL, Databricks
Nice-to-Have Skills:
Data Vault Modeling:
Knowledge of the Data Vault methodology for designing scalable and flexible data models (see the sketch below).
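To make the Data Vault screening question concrete: in Data Vault modeling, a hub holds one row per business key (identified by a deterministic hash key), and a satellite holds the descriptive attributes keyed by that hash plus a load timestamp and a hashdiff for change detection. A minimal PySpark loading sketch follows; the table and column names are illustrative only, and a production load would also anti-join against the target tables so only new keys and changed satellite rows are inserted:

# Minimal Data Vault loading sketch (illustrative names), assuming an active
# SparkSession named `spark` and a Delta-enabled environment.
from pyspark.sql import functions as F

src = spark.table("staging.customers")   # hypothetical staged source

# Hub: one row per business key, identified by a deterministic hash key.
hub_customer = (
    src.select("customer_id")
       .dropDuplicates()
       .withColumn("hub_customer_hk", F.md5(F.col("customer_id").cast("string")))
       .withColumn("load_ts", F.current_timestamp())
       .withColumn("record_source", F.lit("crm"))
)

# Satellite: descriptive attributes keyed by the hub hash key, with a
# hashdiff so changed attributes can be detected on later loads.
sat_customer = (
    src.withColumn("hub_customer_hk", F.md5(F.col("customer_id").cast("string")))
       .withColumn("hashdiff", F.md5(F.concat_ws("||", "name", "email", "segment")))
       .withColumn("load_ts", F.current_timestamp())
       .select("hub_customer_hk", "hashdiff", "name", "email", "segment", "load_ts")
)

hub_customer.write.format("delta").mode("append").saveAsTable("vault.hub_customer")
sat_customer.write.format("delta").mode("append").saveAsTable("vault.sat_customer")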
Other Orchestration Tools:
Familiarity with workflow orchestration tools like Fivetran
Employee Value Proposition (EVP)
As the nation's largest producer of clean, carbon-free energy, Client is focused on our purpose: accelerating the transition to a carbon-free future. We have been the leader in clean energy production for more than a decade, and we are cultivating a workplace where our employees can grow, thrive, and contribute.
Our culture and employee experience make it clear: We are powered by passion and purpose. Together, we're creating healthier communities and a cleaner planet, and our people are the driving force behind our success. Here, you can build a fulfilling career with opportunities to learn, grow, and make an impact. By doing our best work and meeting new challenges, we can accomplish great things and help fight climate change. Join us to lead the clean energy future.
Work Environment
This is a brand-new initiative that will bring cutting-edge technology in data science, AI, and machine learning.
Business Drivers/Customer Impact
The Digital Transformation office is leading a Modern Sales initiative focused on creating a more efficient and transformative sales process: more data for customized customer offerings, and more tools and analytics. They are building data pods and are seeking a vendor that can scale quickly.