Twine

Freelance Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Freelance Data Engineer on a contract of more than 6 months, with an unlisted pay rate. Key skills include ELT pipeline development using Snowflake, Airflow, PySpark, AWS, Fivetran, and dbt, along with strong SQL proficiency.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
January 8, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Spark (Apache Spark) #PySpark #Data Warehouse #Cloud #Data Quality #Snowflake #Data Integration #SQL (Structured Query Language) #Data Engineering #BI (Business Intelligence) #Data Pipeline #Airflow #AI (Artificial Intelligence) #Migration #AWS (Amazon Web Services) #Documentation #Scala #Observability #dbt (data build tool) #Fivetran
Role description
This role offers the opportunity to design, build, and maintain advanced ELT pipelines within a dynamic cloud-based environment. You will be responsible for the reliability, scalability, and efficiency of data workflows, supporting both legacy and modern data systems. The position requires close collaboration with business intelligence and architecture teams to deliver high-quality, timely data solutions that drive business insights and decision-making.

Deliverables
• Develop, implement, and maintain robust ELT pipelines using Snowflake, Airflow, PySpark, AWS, Fivetran, and dbt
• Manage and support both legacy and modern data systems, ensuring seamless data integration and migration
• Monitor and enhance data quality, observability, and reliability across all data workflows
• Automate data integration processes and support migration from legacy environments to modern cloud platforms
• Maintain comprehensive documentation for data pipelines, workflows, and system architecture
• Collaborate with BI and architecture teams to address business requirements and solve complex data challenges

Requirements
• Proven experience in data engineering, with a strong focus on ELT pipeline development and maintenance
• Advanced proficiency in SQL and experience with pipeline orchestration tools such as Airflow
• Hands-on expertise with Snowflake, PySpark, AWS, Fivetran, and dbt
• Solid understanding of data quality, observability, and governance best practices
• Experience supporting both legacy and modern data systems in a cloud environment
• Familiarity with business intelligence tools and data warehouse management
• Strong problem-solving skills and the ability to work collaboratively in cross-functional teams
• Excellent written and verbal communication skills
• Availability for a full-time contract position

About Twine
Twine is a leading freelance marketplace connecting top freelancers, consultants, and contractors with companies needing creative and tech expertise.
Trusted by Fortune 500 companies and innovative startups alike, Twine enables companies to scale their teams globally.

Our Mission
Twine's mission is to empower creators and businesses to thrive in an AI-driven, freelance-first world.