

Data Warehouse Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Warehouse Engineer; the contract length and pay rate are unspecified. It requires expertise in Python, SQL, and ETL tools, along with certifications in Epic Clarity and Caboodle Development. Remote work is permitted in select states.
Country
United States
Currency
$ USD
Day rate
Unknown
Date discovered
June 14, 2025
Project duration
Unknown
Location type
Remote
Contract type
Unknown
Security clearance
Unknown
Location detailed
Arcata, CA
Skills detailed
#SQL (Structured Query Language) #Data Pipeline #BI (Business Intelligence) #Compliance #Python #Data Modeling #Programming #SQL Server #Data Engineering #Security #Microsoft SQL #Data Extraction #Data Warehouse #Data Quality #Azure #ETL (Extract, Transform, Load) #MS SQL (Microsoft SQL Server) #Automation #Scala #Microsoft SQL Server #Documentation #Data Manipulation
Role description
We are seeking a highly skilled and proactive Data Warehouse Engineer to join our team. In this role, you will transform complex, unstructured data from a variety of sources into consistent, machine-readable formats that support predictive and prescriptive analytics. You will design, develop, and test scalable data pipelines and ETL architectures, playing a key role in driving data-informed decision-making across the organization. This position requires strong analytical abilities, programming expertise, and a collaborative mindset. If you're passionate about building data infrastructure that powers business intelligence, we want to hear from you.
Remote, but candidates must reside in one of the following states: Arkansas, California, Colorado, Georgia, Idaho, Illinois, New Hampshire, Nevada, Oklahoma, Oregon, Pennsylvania, Texas, or Washington.
Essential Duties And Responsibilities
• Design, develop, and maintain scalable and efficient data pipelines that ensure high data quality and integrity.
• Integrate and harmonize data from multiple sources to support analytics and reporting initiatives.
• Build and test robust ETL architectures to streamline data extraction, transformation, and loading processes.
• Partner with the Business Intelligence team to support data modeling and analytical frameworks.
• Monitor, troubleshoot, and optimize data systems to ensure optimal performance and reliability.
• Ensure thorough documentation of data pipelines, transformations, and governance processes.
• Work with analysts and stakeholders to understand data needs and provide technical support for analytical projects.
• Implement and maintain security and compliance protocols in line with relevant standards and regulations.
• Resolve data-related issues in a timely and effective manner.
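To give candidates a concrete sense of the extract-transform-load work described above, here is a minimal, illustrative Python sketch. It is not code from this employer; the table and column names are hypothetical, and it uses SQLite only so the example is self-contained and runnable.

```python
# Illustrative ETL pipeline sketch. All table/column names are hypothetical
# examples, not taken from the job posting.
import sqlite3

def extract(conn):
    """Extract: pull raw rows from a source table."""
    return conn.execute("SELECT id, name, amount FROM raw_orders").fetchall()

def transform(rows):
    """Transform: normalize names and drop rows with invalid amounts."""
    return [(rid, name.strip().title(), amount)
            for rid, name, amount in rows
            if amount is not None and amount >= 0]

def load(conn, rows):
    """Load: write cleaned rows into the warehouse table."""
    conn.executemany("INSERT INTO clean_orders VALUES (?, ?, ?)", rows)
    conn.commit()

# Demo with an in-memory database standing in for a real warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, name TEXT, amount REAL)")
conn.execute("CREATE TABLE clean_orders (id INTEGER, name TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                 [(1, "  alice ", 10.0), (2, "bob", -5.0), (3, "carol", 7.5)])
load(conn, transform(extract(conn)))
print(conn.execute("SELECT * FROM clean_orders").fetchall())
# → [(1, 'Alice', 10.0), (3, 'Carol', 7.5)]
```

In production this pattern would typically target Microsoft SQL Server or Azure rather than SQLite, with orchestration, logging, and data-quality checks around each stage.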
Qualifications And Requirements
• Proven ability to build positive, collaborative working relationships across diverse teams.
• Proficiency in Python and SQL for data manipulation and automation.
• Hands-on experience with ETL tools and data warehousing platforms, particularly Microsoft SQL Server and Azure.
• Certification in the Epic Clarity data model and Caboodle Development required; additional Epic certifications are a plus.
• Experience developing extracts from OCHIN Epic preferred.
• Strong analytical thinking with attention to detail and problem-solving capabilities.
• Excellent verbal and written communication skills.
• Ability to work independently and manage multiple priorities in a fast-paced environment.
• Creative thinker who brings new ideas and solutions to the table.
• Must be able to work during Pacific Standard Time (PST) business hours.
• Experience with healthcare industry data and standards is highly desirable.
Education And Experience
• Bachelor's degree or equivalent combination of education and related experience.
• Minimum of 5 years of advanced data engineering experience.