Vertex Elite LLC

HVR Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an HVR Data Engineer with a contract length of "unknown," offering a day rate of $440. It requires strong experience in Fivetran/HVR, Python, SQL, and cloud technologies, preferably in the healthcare industry. Remote work only.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
440
🗓️ - Date
October 23, 2025
🕒 - Duration
Unknown
🏝️ - Location
Remote
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Airflow #Data Engineering #Data Quality #Programming #Deployment #SQL (Structured Query Language) #Scripting #ETL (Extract, Transform, Load) #Data Pipeline #Monitoring #Unit Testing #API (Application Programming Interface) #dbt (data build tool) #Python #Scala #Replication #Fivetran #DevOps #Databases #Metadata #Snowflake #Data Replication #Database Management #Data Integration #Cloud #Computer Science #Automation
Role description
Dear All,

Vertex Elite is currently seeking a qualified HVR Data Engineer to join our team. If you or someone you know is interested, please feel free to reach out for more details or share your updated resume.

Work Authorization: USC or any valid USA work authorization (at this moment, we are not providing sponsorship)
Job Type: W2 with Vertex Elite LLC
Location: Remote

Required Skills:
• Fivetran/HVR for high-volume data replication
• Cloud experience
• Snowflake (if no Snowflake, MUST have very strong Fivetran/HVR experience)
• Python and SQL

Plusses:
• Healthcare industry experience would be a big plus

Primary Purpose:
Responsible for implementing a technology framework and providing technical support for initiatives in cloud computing, integration, and automation, with a focus on the design of systems and services that run on cloud platforms. The primary focus will be to support the design and development of end-to-end data integration solutions in AAH cloud infrastructure using approved technologies. Contributes to the Cloud Data Engineering team's effort to provide architecture and design support for data movement within AAH cloud infrastructure, and helps ensure the integrity, reliability, and quality of the data services implemented on the platform.

Major Responsibilities:
• Drive scope definition, requirements analysis, data and technical design, pipeline build, product configuration, unit testing, and production deployment.
• Design scalable ingestion processes that bring on-prem, API-driven, third-party, and end-user-generated data sources into a common cloud infrastructure.
• Design reusable assets, components, standards, frameworks, and processes to accelerate and facilitate data integration projects.
• Develop data integration and transformation jobs using Python, SQL, and ETL/ELT tools.
• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
• Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
• Design parameter-driven orchestration to allow for change data capture and monitoring (a sketch of this pattern follows the list).
• Develop and implement scripts for data process maintenance, monitoring, and performance tuning.
• Test and document data processes through data validation and verification procedures.
• Collaborate with cross-functional teams to resolve data quality and operational issues.
• Ensure delivered solutions meet technical and functional/non-functional requirements.
• Ensure delivered solutions are realized in the committed time frame.
• Provide technical guidance and mentorship to junior engineers, ensuring best practices in data engineering.
• Maintain overall industry knowledge of the latest trends, technology, etc.
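To make the "parameter-driven orchestration for change data capture" responsibility concrete, here is a minimal sketch of a high-watermark incremental load in Python against Snowflake. Everything beyond the technologies the posting names is an assumption: the etl_watermarks metadata table, the table and column names, and the connection placeholders are hypothetical.

```python
"""
Parameter-driven incremental load using a high-watermark (a simple CDC pattern).

Assumptions (not from the posting): the etl_watermarks table, the source/target
table names, and all credentials are hypothetical placeholders.
"""
import logging

import snowflake.connector  # pip install snowflake-connector-python

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("incremental_load")


def run_incremental_load(conn, source_table: str, target_table: str, cursor_column: str) -> int:
    """Copy only rows newer than the stored watermark, then advance it."""
    cur = conn.cursor()

    # 1. Read the last high-watermark for this pipeline.
    #    The sketch assumes a seed row already exists in etl_watermarks.
    cur.execute(
        "SELECT last_value FROM etl_watermarks WHERE pipeline = %s",
        (source_table,),
    )
    row = cur.fetchone()
    watermark = row[0] if row else "1970-01-01"
    log.info("Loading %s rows newer than %s", source_table, watermark)

    # 2. Insert only new/changed rows into the target.
    cur.execute(
        f"INSERT INTO {target_table} SELECT * FROM {source_table} "
        f"WHERE {cursor_column} > %s",
        (watermark,),
    )
    loaded = cur.rowcount

    # 3. Advance the watermark so the next run starts where this one ended.
    cur.execute(
        f"UPDATE etl_watermarks SET last_value = "
        f"(SELECT MAX({cursor_column}) FROM {source_table}) WHERE pipeline = %s",
        (source_table,),
    )
    conn.commit()
    log.info("Loaded %d rows into %s", loaded, target_table)  # basic monitoring hook
    return loaded


if __name__ == "__main__":
    # Placeholder credentials; in practice these would come from a secrets manager.
    conn = snowflake.connector.connect(
        user="<user>", password="<password>", account="<account>",
        warehouse="<warehouse>", database="<database>", schema="<schema>",
    )
    run_incremental_load(conn, "raw.orders", "staging.orders", "updated_at")
```

Storing the watermark in a metadata table keeps the job restartable and lets one generic, parameterized load serve many source tables.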
Licensure, Registration and/or Certification Required:
Must have experience in data transformation and data pipeline development using GUI-based tools or programming languages like SQL and Python.

Education Required:
Bachelor's degree in Computer Science or a related field.

Experience Required:
Typically requires 5 years of experience in at least two IT disciplines, including database management, cloud engineering, data engineering, and middleware technologies. Includes 2 years of work experience with cloud platforms, including experience with data integration, performance optimization, and platform administration.

Knowledge, Skills & Abilities Required:
• Experience defining, designing, and developing solutions with data integration platforms/tools.
• Proven experience building and optimizing data pipelines and data sets.
• Advanced working SQL knowledge and experience with relational databases, including query authoring and working familiarity with a variety of databases.
• Hands-on experience with cloud-based, modern ETL/ELT tools and technologies such as Fivetran, HVR, dbt, and Airflow (an orchestration sketch follows this description).
• Proficiency in Python and SQL for scripting and building data transformation processes is preferred.
• Experience in test automation with a focus on testing integrations, including APIs and data flows between enterprise systems.
• Must have experience with DevOps toolchains and processes.
• Understanding of and exposure to Snowflake Data Cloud.

Physical Requirements and Working Conditions:
This position does not require travel.

This job description indicates the general nature and level of work expected of the incumbent. It is not designed to cover or contain a comprehensive listing of the activities, duties, or responsibilities required of the incumbent. The incumbent may be required to perform other related duties.
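As a companion to the ETL/ELT tooling bullet above, here is a minimal Airflow DAG sketch chaining ingestion, a dbt run, and a validation gate. The dag_id, the dbt project path, and the stubbed ingest/validation steps are illustrative assumptions, not details from the posting.

```python
"""
Minimal Airflow DAG sketching an ingest -> transform -> validate chain.

Assumptions (not from the posting): the dag_id, the dbt project path, and the
stubbed ingest/validation logic are placeholders.
"""
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def check_row_counts() -> None:
    """Placeholder data-quality gate; a real check would query the warehouse
    (e.g., compare staging vs. mart row counts and raise on a mismatch)."""
    print("row-count validation passed")


with DAG(
    dag_id="daily_elt",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    # In a real pipeline this step might trigger a Fivetran/HVR sync via the
    # Fivetran REST API or a provider package; stubbed here.
    ingest = BashOperator(task_id="ingest", bash_command="echo 'trigger sync'")

    # Run dbt transformations; the project directory is hypothetical.
    transform = BashOperator(
        task_id="transform",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    validate = PythonOperator(task_id="validate", python_callable=check_row_counts)

    ingest >> transform >> validate
```

Keeping ingestion, transformation, and validation as separate tasks gives per-step retries and monitoring, which matches the posting's emphasis on orchestration and data quality.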