Compunnel Inc.

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 5+ years of experience, offering a hybrid contract in Westbrook, ME. The pay rate is unknown. Key skills include AWS, dbt, Airflow, Snowflake, Python, and SQL. A monthly on-site rotation is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
October 23, 2025
🕒 - Duration
Unknown
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Westbrook, ME
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Agile #SQS (Simple Queue Service) #Storage #Programming #Data Pipeline #SNS (Simple Notification Service) #Airflow #Snowflake #dbt (data build tool) #Terraform #Apache Iceberg #Data Engineering #Metadata #Cloud #Data Processing #Python #Infrastructure as Code (IaC) #Lambda (AWS Lambda) #Scala #S3 (Amazon Simple Storage Service) #Data Accuracy #SQL (Structured Query Language) #AWS (Amazon Web Services)
Role description
Data Engineer – Hybrid (Westbrook, ME)
Experience: 5+ years
Location: Hybrid – minimum 2 days/week onsite in Westbrook, Maine (no fully remote submissions)
Interview Process:
• 1st round: Hiring Manager
• 2nd round: Technical Panel
Rotation: once a month for 1 week to support 24/7 operations

About the Role
We are looking for a motivated Data Engineer to support and enhance global instrument data pipelines. You will design, build, and maintain scalable, fault-tolerant data solutions using modern cloud technologies and collaborate with cross-functional teams.

Top Required Skills
• Strong AWS cloud services expertise: S3, SNS, SQS, Lambda, etc.
• Proven experience building and maintaining operational data pipelines with dbt, Airflow, and Snowflake
• Strong proficiency in Python and SQL

Nice-to-Have Skills
• Familiarity with Terraform (Infrastructure as Code)
• Experience with Quality Engineering (QE) processes

Key Responsibilities
• Design and implement ingestion and storage solutions on AWS
• Develop analytical and ETL/ELT solutions using Snowflake, Airflow, and Apache Iceberg
• Collaborate with teams to understand and meet data processing needs
• Maintain scalable and reliable data solutions for operational and business requirements
• Document data flows, architecture decisions, and metadata
• Implement fault-tolerant, high-availability pipelines
• Participate in testing and quality engineering with the QE team
• Translate complex business requirements into efficient data solutions
• Proactively identify and mitigate data-related risks

Preferred Background
• 8+ years of experience in a similar role, preferably in a technology-driven environment
• Experience building cloud-based ETL/ELT pipelines with IaC and large-scale data processing engines
• Strong experience with SQL and at least one programming language (Python preferred)

Success Metrics
• Deliver projects on schedule
• Collaborate effectively with cross-functional teams
• Ensure data accuracy, accessibility, and platform stability
• Reduce pipeline failures through resilient design
• Positive stakeholder feedback on delivered solutions
• Active contribution to Agile ceremonies and team productivity
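For a rough sense of the day-to-day work described above, here is a minimal sketch of an Airflow DAG of the kind this role builds, assuming Airflow 2.4+; the DAG id, schedule, retry settings, and task body are hypothetical placeholders, not anything specified in the posting, and a real pipeline would add S3 sensors, Snowflake loads, and dbt runs.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_instrument_data(**context):
    # Placeholder body: a real task might copy newly arrived instrument
    # files from S3 into a Snowflake staging table and then trigger dbt.
    print(f"Loading instrument data for {context['ds']}")


with DAG(
    dag_id="instrument_ingestion",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@hourly",             # hypothetical cadence
    catchup=False,
    default_args={
        # Retries are one building block of the fault tolerance
        # the role calls for.
        "retries": 3,
        "retry_delay": timedelta(minutes=5),
    },
):
    PythonOperator(
        task_id="load_instrument_data",
        python_callable=load_instrument_data,
    )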