

Hunter Scouts
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Contract Data Engineer position based in California, requiring 5+ years of data engineering experience, proficiency in SQL and Python, and familiarity with Apache Airflow. Experience with financial data and ETL processes is essential.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
October 22, 2025
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
California, United States
🧠 - Skills detailed
#Redshift #Cloud #Luigi #Data Processing #Data Quality #Airflow #Data Pipeline #Linux #Documentation #GIT #SQL (Structured Query Language) #Scala #Python #Datasets #Java #Metadata #Snowflake #SAP #Apache Airflow #Data Engineering #Oracle #Data Architecture #BigQuery #Version Control #ETL (Extract, Transform, Load)
Role description
Job title: Data Engineer
Client: Hunter Scouts
Location: California, United States (on-site)
Contract type: Contract
Contract duration: Not specified (40 hours/week)
Salary: Hourly; rate not specified
About the role
Hunter Scouts is seeking a Contract Data Engineer to support a confidential, large professional services client. This is a hands-on role focused on designing and operating robust ETL/data pipelines to support Accounting and Treasury workflows. You will build and maintain SQL/Python pipelines, orchestrate workflows with Airflow (or similar), and collaborate with finance stakeholders to deliver reliable, well-documented data solutions.
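To give a concrete picture of the day-to-day work, here is a minimal, illustrative sketch of the kind of Airflow-orchestrated SQL/Python pipeline described above. It is not taken from the client's environment: the DAG name, task names, and the treasury-transactions subject matter are all hypothetical placeholders (Airflow 2.4+ assumed).

```python
# Illustrative sketch only -- every identifier below is a hypothetical
# placeholder, not a detail of the client's actual stack.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_transactions():
    # In a real pipeline this step would run SQL against the source
    # system (e.g. via a warehouse hook) and stage the raw rows.
    print("extracting raw treasury transactions")


def transform_and_load():
    # Placeholder transform: normalize, validate, and load the staged
    # rows into a reporting table used by Accounting/Treasury.
    print("transforming and loading transactions")


with DAG(
    dag_id="treasury_transactions_etl",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # daily batch run; requires Airflow 2.4+
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_transactions",
        python_callable=extract_transactions,
    )
    load = PythonOperator(
        task_id="transform_and_load",
        python_callable=transform_and_load,
    )
    extract >> load  # transform/load runs only after extraction succeeds
```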
Responsibilities
• Design, build, and maintain scalable ETL/data pipelines using SQL and Python
• Orchestrate workflows and schedules using Apache Airflow or similar orchestration tools
• Develop and optimize data models following data architecture best practices
• Ensure data quality, reliability, and documentation across pipelines and datasets
• Collaborate with Accounting and Treasury stakeholders to translate business requirements into data solutions
• Monitor, troubleshoot, and improve pipeline performance and cost efficiency
• Contribute to and promote data engineering standards and best practices
Requirements
• Must be based in California, United States, and authorized to work in the U.S. without employer sponsorship
• Minimum 5 years of hands-on experience in data engineering or data pipeline development
• Strong proficiency in SQL and Python for data engineering
• Experience building and maintaining production ETL processes and data models
• Proficiency with Apache Airflow or a similar orchestration tool (e.g., Prefect, Luigi)
• Experience working with financial data and a solid understanding of Accounting and Treasury concepts
Preferred Skills
• Experience integrating with ERP or Treasury Management Systems (e.g., SAP, Oracle, Kyriba)
• Familiarity with modern cloud data platforms and warehousing patterns (e.g., Snowflake, BigQuery, Redshift)
• Experience with additional data processing languages (e.g., Scala, Java)
• Experience implementing data quality frameworks, lineage, or metadata tooling (see the sketch after this list)
• Familiarity with version control (Git), Linux, and CI/CD for data pipelines
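In practice, the data-quality work mentioned above often starts as simple assertions that gate each batch before it is loaded. A minimal sketch, assuming a hypothetical check_batch helper with invented field names and thresholds:

```python
# Illustrative only: a tiny data-quality gate of the kind the posting
# alludes to. Field names and thresholds are invented placeholders.
from typing import Iterable, Mapping


def check_batch(rows: list[Mapping], required: Iterable[str], min_rows: int = 1) -> list[str]:
    """Return human-readable failures; an empty list means the batch passes."""
    failures = []
    if len(rows) < min_rows:
        failures.append(f"expected at least {min_rows} rows, got {len(rows)}")
    for field in required:
        missing = sum(1 for row in rows if row.get(field) in (None, ""))
        if missing:
            failures.append(f"{missing} row(s) missing required field {field!r}")
    return failures


if __name__ == "__main__":
    batch = [
        {"txn_id": "T1", "amount": 100.0, "currency": "USD"},
        {"txn_id": "T2", "amount": None, "currency": "USD"},  # triggers a failure
    ]
    for problem in check_batch(batch, required=["txn_id", "amount", "currency"]):
        print("data quality failure:", problem)
```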
Notes
• This is a 40-hours-per-week contract role; compensation is hourly.
• Client identity is confidential; do not disclose the company name externally.