TEK NINJAS

Senior Data Engineer (HRIS Exp)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (HRIS Exp) in Boston, MA (Hybrid) for a contract length of "unknown." The pay rate is "unknown." Requires 10+ years of experience, expertise in HRIS/HCM systems, Snowflake, SQL, and cloud data warehousing.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 22, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Boston, MA
-
🧠 - Skills detailed
#EC2 #Data Ingestion #Redshift #Data Vault #Cloud #dbt (data build tool) #Data Modeling #Automation #Data Migration #Data Warehouse #Computer Science #Programming #Workday #Kafka (Apache Kafka) #Security #SnowSQL #Data Quality #Airflow #Data Pipeline #Azure #Synapse #Amazon Redshift #SQL (Structured Query Language) #Scala #SnowPipe #ADLS (Azure Data Lake Storage) #Python #Datasets #Scripting #GCP (Google Cloud Platform) #Data Governance #Azure Synapse Analytics #Snowflake #BI (Business Intelligence) #Migration #SAP #Data Engineering #AWS (Amazon Web Services) #Oracle #S3 (Amazon Simple Storage Service) #BigQuery #Vault #ETL (Extract, Transform, Load)
Role description
Job Role: Senior Data Engineer (HRIS Exp)
Location: Boston, MA (Hybrid)

Summary
We are seeking an experienced and highly skilled Senior Data Engineer. The ideal candidate will be instrumental in designing, building, and maintaining robust, scalable data pipelines, with a specific focus on integrating and transforming data from our Human Resources Information System (HRIS) and Human Capital Management (HCM) platforms. This role requires deep expertise in modern cloud data warehousing, particularly Snowflake, and a strong background in working with diverse data technologies.

Key Responsibilities
• HRIS/HCM Data Expertise: Design, develop, and optimize data ingestion and transformation pipelines to extract, load, and transform large datasets from HRIS/HCM systems (e.g., Workday, SAP SuccessFactors, Oracle HCM).
• Data Warehouse Development: Build and maintain high-performance data pipelines and data models within our cloud data warehouse, primarily using Snowflake.
• ETL/ELT Development: Write complex SQL, apply advanced data warehousing concepts, and use modern tooling to ensure data quality, integrity, and timely delivery of data to downstream consumers for reporting and analytics.
• Cross-Platform Integration: Develop solutions for data migration and integration across a variety of cloud data platforms, including Redshift, Synapse, and BigQuery.
• Performance Optimization: Tune and optimize data warehouse queries and structures for speed and efficiency, especially in high-volume environments.
• Collaboration: Work closely with HR analysts, business stakeholders, and other engineering teams to understand data requirements and deliver actionable data solutions.

Required Skills and Experience
• 10+ years of professional experience in a Data Engineering, BI Engineering, or similar role.
• Deep experience with HRIS or HCM systems as a data source, including familiarity with common HR data domains (e.g., employee lifecycle, payroll, benefits, talent).
• Extensive hands-on experience with Snowflake (e.g., SnowSQL, Snowpipe, warehouse administration, performance tuning).
• Expert-level proficiency in SQL and data modeling (Kimball, Inmon, Data Vault).
• Proven experience with at least two of the following cloud data warehouses: Amazon Redshift, Azure Synapse Analytics, or Google BigQuery.
• Strong programming skills in a language such as Python for scripting and ETL/ELT pipeline automation.
• Experience with workflow orchestration and transformation tools (e.g., Airflow, dbt).

Great to Have
• Direct hands-on experience with Workday as an HRIS/HCM platform (including Workday Studio, Report Writer, or Prism Analytics).
• Experience with real-time data streaming technologies (e.g., Kafka, Kinesis).
• Familiarity with cloud platform (AWS, Azure, or GCP) services relevant to data engineering (e.g., S3, ADLS, GCS, EC2, Azure VMs).
• Understanding of data governance, security, and PII/PHI handling best practices, particularly in an HR context.

Education
• Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field, or equivalent practical experience.
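As a rough illustration of the kind of ETL/ELT scripting this role involves (not part of the posting — the field names, status values, and rows below are invented for the example), here is a minimal Python sketch that normalizes raw HRIS employee records before loading them into a warehouse table:

```python
from datetime import date

def transform_employees(raw_rows):
    """Deduplicate by employee_id, keeping the latest effective_date,
    and standardize status values for warehouse loading."""
    latest = {}
    for row in raw_rows:
        key = row["employee_id"]
        # Keep only the most recent record per employee
        if key not in latest or row["effective_date"] > latest[key]["effective_date"]:
            latest[key] = row
    return [
        {
            "employee_id": r["employee_id"],
            "status": r["status"].strip().upper(),       # normalize casing/whitespace
            "effective_date": r["effective_date"].isoformat(),
        }
        for r in sorted(latest.values(), key=lambda r: r["employee_id"])
    ]

# Invented sample rows standing in for an HRIS extract
rows = [
    {"employee_id": 101, "status": "active ", "effective_date": date(2025, 1, 1)},
    {"employee_id": 101, "status": "terminated", "effective_date": date(2025, 6, 1)},
    {"employee_id": 102, "status": "Active", "effective_date": date(2025, 3, 15)},
]
print(transform_employees(rows))
```

In a production pipeline, a step like this would typically run inside an Airflow task or a dbt model rather than a standalone script.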