Cypress HCM

Senior Data Engineer, 36670111

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with a contract length of "unknown" and a pay rate of "$85-91.03/hr." Located in Kirkland, WA, the role requires 5+ years of experience with Python, SQL, and ETL, plus experience with Hadoop and cloud data solutions.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
728
-
🗓️ - Date
February 22, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Kirkland, WA
-
🧠 - Skills detailed
#Snowflake #Compliance #ETL (Extract, Transform, Load) #Automation #Data Governance #Data Management #Trino #Base #Storage #DevOps #Data Warehouse #SQL (Structured Query Language) #Tableau #Forecasting #Kafka (Apache Kafka) #MIS Systems (Management Information Systems) #NoSQL #Lambda (AWS Lambda) #Data Mart #Data Processing #Shell Scripting #Data Storage #Jenkins #Agile #HDFS (Hadoop Distributed File System) #Apache Airflow #Data Lake #SaaS (Software as a Service) #Spark (Apache Spark) #Scripting #Hadoop #Mathematics #Python #Data Transformations #Data Engineering #Programming #Unix #Data Science #API (Application Programming Interface) #Cloud #Data Pipeline #Computer Science #Airflow
Role description
Role Overview

The FinOps Tools team is seeking a Senior Data Engineer. Your role will involve analyzing and preparing data for ingestion, and developing new, and supporting existing, highly available data products, models, and data marts in a Hadoop environment. Your work will drive our FinOps Cloud Infrastructure Cost Management and Capacity Forecasting decisions.

What you get to do in this role:
• Analyze and prepare data for ingestion, including data governance and compliance reviews, to support various analytics initiatives.
• Develop and maintain highly available data products, models, and data marts.
• Collaborate with cross-functional teams to drive Cloud and Infrastructure Cost Management capabilities.
• Support FinOps and Capacity Analytics through data engineering solutions.
• Troubleshoot data currency and integration issues to ensure data reliability and accuracy.

To qualify for and be successful in this role, you should have:
• 5+ years of demonstrated end-to-end Python, SQL, and ETL tool development to automate and accelerate data acquisition and to manage data products, transformations, enrichment, and data pipelines at scale.
• 3+ years of experience working with on-premises and cloud-based infrastructure services such as SaaS, IaaS, FinTech, capacity planning, supply chain, or data analytics environments.
• Extensive experience in Data Engineering and DevOps within a data-, FinOps-, or analytics-focused organization.
• Strong programming skills in Python, SQL, Rust, Unix shell scripting, and similar languages.
• Experience working with on-premises and cloud data warehouse, DBMS, and data lake platforms and technologies, specifically Hadoop, Snowflake, NoSQL, HDFS, Spark, Kafka, Trino, etc.
• Familiarity with data transformation and data pipeline concepts, including ETL, ELT, streaming, and Lambda; patterns such as change data capture (CDC), slowly changing dimensions (SCD), and lineage; and orchestration tools (e.g., Apache Airflow or similar; a minimal sketch follows below).
• Familiarity with CI/CD best practices, automation tools such as Jenkins, and agile development environments.
• Experience developing data insights and dashboards using Tableau and Tableau Prep.
• Excellent written and verbal communication skills, with a willingness to prepare knowledge base articles, operational procedures, and change plans.
• Experience working with the ServiceNow platform, data management frameworks, and APIs is a plus.
• Ability to work independently, execute with agility, and learn in a fast-paced environment.
• Exceptional team player with a collaborative mindset.

Education
• Bachelor's degree in Computer Science, Data Science, Management Information Systems, Mathematics, or a related field. Equivalent work experience is also acceptable for the position.
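For candidates gauging the depth expected, here is a minimal sketch of the kind of Airflow orchestration the qualifications describe. The DAG, task, and data names are hypothetical illustrations, not part of this team's actual stack:

```python
# Minimal Airflow DAG sketch: a daily extract -> transform -> load pipeline,
# the ETL shape referenced in the qualifications. All names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_cost_data(**context):
    # Placeholder: pull raw cloud-cost records for the logical run date.
    print(f"extracting cost records for {context['ds']}")


def transform_cost_data(**context):
    # Placeholder: apply enrichment and data transformations.
    print("transforming cost records")


def load_cost_mart(**context):
    # Placeholder: load curated rows into a data mart table.
    print("loading cost data mart")


with DAG(
    dag_id="finops_cost_pipeline",   # hypothetical name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",               # "schedule_interval" on older Airflow 2.x
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_cost_data)
    transform = PythonOperator(task_id="transform", python_callable=transform_cost_data)
    load = PythonOperator(task_id="load", python_callable=load_cost_mart)

    # Linear dependency chain: each task runs after the previous succeeds.
    extract >> transform >> load
```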
Manager's Notes

Based on the job description, what are the must-have, non-negotiable items a candidate needs to be successful in this role?
• Proficient in programming languages like Python, SQL, Rust, and shell scripting to build and support automations and data products.
• Skilled in working with both on-site and cloud-based data storage solutions, including Hadoop, Snowflake, NoSQL, and others.
• Knowledgeable about data processing and pipeline concepts, such as ETL and ELT, and tools like Apache Airflow.
• Understanding of GitOps and CI/CD best practices and tools like Jenkins, as well as agile development.
• Experienced in managing and creating data insights and dashboards using Tableau.

What backgrounds/skills can we be more flexible with that can be learned on the job? (See the ServiceNow sketch after these notes.)
• ServiceNow Platform and API integrations
• Homegrown ingestion pipelines

Does this position require sitting onsite or travel?
• Yes. Hybrid in Kirkland, WA; onsite 3 days a week.

Does this position have the opportunity to extend beyond the initial contract or convert to FTE?
• Yes.

What will the interview process look like?
• 45-minute technical competency and capabilities
• 30-minute behavioral and team fit
• 30-minute with the manager

Pay Rate Range
• $85-91.03/hr.
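Since ServiceNow API integration is listed as learnable on the job, here is a minimal sketch of a read against ServiceNow's REST Table API, assuming basic auth. The instance URL, table, and credentials are hypothetical placeholders:

```python
# Minimal sketch of a ServiceNow Table API read (GET /api/now/table/<table>).
# Instance URL, table name, and credentials below are hypothetical.
import requests

INSTANCE = "https://example.service-now.com"  # hypothetical instance
TABLE = "cmdb_ci_server"                      # standard CMDB server table

resp = requests.get(
    f"{INSTANCE}/api/now/table/{TABLE}",
    auth=("api_user", "api_password"),        # hypothetical credentials
    headers={"Accept": "application/json"},
    params={"sysparm_limit": 10},             # cap the number of records returned
    timeout=30,
)
resp.raise_for_status()

# The Table API wraps records in a "result" array.
for record in resp.json()["result"]:
    print(record.get("name"), record.get("sys_id"))
```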