

Agile Resources, Inc.
Sr. Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Data Engineer on a 1-year remote contract (EST or CST) with a pay rate of "Rate". Requires 8+ years of experience, strong Python skills, ETL development, and knowledge of PostgreSQL. Familiarity with automation tools like Airflow is essential.
Country
United States
Currency
$ USD
Day rate
640
Date
November 9, 2025
Duration
More than 6 months
Location
Remote
Contract
Unknown
Security
Unknown
Location detailed
United States
Skills detailed
#Observability #Linux #Data Modeling #PostgreSQL #Data Governance #RDBMS (Relational Database Management System) #Data Security #Python #Scala #Programming #Automation #Database Design #BI (Business Intelligence) #Data Science #ETL (Extract, Transform, Load) #JSON (JavaScript Object Notation) #Scripting #Airflow #Data Warehouse #Data Engineering #Data Pipeline #Documentation #Data Integration #Data Lake #Security
Role description
Sr. Data Engineer
Remote - EST or CST
1 year contract (potential for extension)
No C2C or third parties
We're looking for a Data Engineer to design, build, and optimize modern data platforms and pipelines supporting advanced analytics and business intelligence initiatives. This role is ideal for someone who enjoys solving complex data challenges, automating data flows, and making information accessible and actionable at scale.
You'll collaborate closely with analysts, product managers, and engineers to architect and deliver reliable data solutions that drive strategic decision-making.
What You'll Do
• Design, develop, and maintain scalable data pipelines and integrations across multiple systems.
• Support and optimize data warehouse and data lake environments for performance, accessibility, and security.
• Transform raw data into structured, usable formats for analytics and reporting.
• Collaborate with analysts and data scientists on data modeling, mapping, and transformation rules.
• Monitor and troubleshoot production data pipelines to ensure reliability and availability.
• Stay current on emerging data technologies, contributing to continuous improvement and innovation.
What We're Looking For
• 8+ years of experience working with modern data solutions and platforms.
• Strong programming and scripting skills in Python.
• Hands-on experience with ETL development and data integration into data warehouses or data lakes.
• Deep knowledge of Postgres (PostgreSQL), relational database design, and query optimization.
• Experience working with various data formats (CSV, JSON, Excel, APIs, RDBMS, etc.).
• Familiarity with automation and orchestration tools such as Airflow or similar frameworks.
• Working knowledge of Windows and Linux environments.
• Excellent communication, problem-solving, and documentation skills.
Bonus Skills (Nice to Have):
• Experience with Microsoft Fabric or Elastic (ELK).
• Exposure to data governance, observability, or data security best practices.
