

Fusion Life Sciences Technologies LLC
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a remote Data Engineer position (10 on-site days/year) offering a competitive pay rate. It requires 5–8 years of experience, strong skills in AWS, PostgreSQL, Airflow, and Python, and prior work in education or public-sector analytics.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 25, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Automation #SQL (Structured Query Language) #Data Warehouse #Data Automation #Data Modeling #AWS (Amazon Web Services) #Security #AWS RDS (Amazon Relational Database Service) #Docker #IAM (Identity and Access Management) #ETL (Extract, Transform, Load) #PostgreSQL #Data Engineering #Agile #Python #Scrum #Airflow #GitLab #RDS (Amazon Relational Database Service) #S3 (Amazon Simple Storage Service)
Role description
Job Description – Data Engineer
Location: Remote (10 on-site days/year)
Needed skills: AWS / PostgreSQL / Airflow / Python
We are seeking a Data Engineer with strong experience in AWS, PostgreSQL, Airflow, and Python to support and enhance the State of Washington’s education data systems. The engineer will maintain and extend the AWS/PostgreSQL data warehouse, build and optimize ETL/ELT pipelines, and improve data automation and reliability within an Agile team. Prior work in education or public-sector analytics is required.
Responsibilities
• Maintain and enhance AWS/PostgreSQL data warehouse and analytics environment
• Build and optimize ETL/ELT pipelines using SQL and Python
• Develop and manage Airflow DAGs for workflow orchestration (see the sketch after this list)
• Ensure data reliability, quality, and security (including PII handling)
• Support CI/CD automation using GitLab and Docker
• Collaborate in Agile/Scrum activities, perform reviews, and support integrations
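To make the orchestration work concrete, below is a minimal sketch of the kind of Airflow DAG this role would build and maintain. The DAG ID, task names, schedule, and callables are illustrative assumptions, not details from the posting.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_staging(**context):
    # Placeholder extract step; a real pipeline might pull files from S3
    # or a source API for the logical date (context["ds"]).
    print(f"extracting records for {context['ds']}")


def load_to_warehouse(**context):
    # Placeholder load step; a real pipeline might run an idempotent
    # upsert against the PostgreSQL warehouse.
    print(f"loading records for {context['ds']}")


with DAG(
    dag_id="education_warehouse_daily",  # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_to_staging",
                             python_callable=extract_to_staging)
    load = PythonOperator(task_id="load_to_warehouse",
                          python_callable=load_to_warehouse)
    extract >> load  # extract must finish before the load runs

Retries with a short delay and catchup=False are common defaults for daily warehouse loads; actual settings would follow the team's conventions.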
Required Skills
• 5–8 years of Data Engineering experience
• 4+ years building ETL/ELT pipelines (Python, SQL); see the load-step sketch after this list
• 4+ years with Airflow (DAGs, scheduling)
• Strong PostgreSQL skills (data modeling, query tuning, AWS RDS optimization)
• Experience with AWS services: RDS, S3, IAM
• Familiarity with GitLab CI/CD, Docker
• Experience with education data systems or public-sector analytics
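As a taste of the ETL/ELT and PostgreSQL skills listed above, here is a minimal sketch of an idempotent load step using psycopg2. The schema, table, and column names are hypothetical, and the ON CONFLICT clause assumes a unique constraint on (student_id, school_id).

import psycopg2

# Hypothetical staging-to-warehouse upsert; rerunning it is safe because
# conflicting rows are updated in place rather than duplicated.
UPSERT_SQL = """
INSERT INTO warehouse.enrollments (student_id, school_id, enrolled_on)
SELECT student_id, school_id, enrolled_on
FROM staging.enrollments
ON CONFLICT (student_id, school_id) DO UPDATE
SET enrolled_on = EXCLUDED.enrolled_on;
"""

def load_enrollments(dsn: str) -> None:
    # psycopg2's connection context manager commits on success and
    # rolls back on error, so the upsert runs as one transaction.
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(UPSERT_SQL)

if __name__ == "__main__":
    load_enrollments("postgresql://user:password@host:5432/warehouse")  # hypothetical DSN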