

Anatta - Shopify Platinum Partner
Data Engineer (Contract) - North America
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer (Contract) position for a fast-growing eCommerce brand, offering $50 - $70 USD per hour. It requires 5+ years of data engineering experience, expertise in BigQuery, dbt, and Looker, and experience with eCommerce data integration. Remote work.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
560
-
🗓️ - Date
February 20, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
1099 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Remote
-
🧠 - Skills detailed
#Data Quality #Data Architecture #Anomaly Detection #ETL (Extract, Transform, Load) #Scala #Data Ingestion #Automation #BigQuery #Looker #A/B Testing #Data Pipeline #SQL (Structured Query Language) #Fivetran #AI (Artificial Intelligence) #Monitoring #dbt (data build tool) #SQL Queries #Data Engineering #Documentation #Cloud #Compliance
Role description
Overview
We are seeking a highly experienced Data Engineer to support a fast-growing eCommerce brand operating within a modern, warehouse-first data architecture. This is a project-based, on-demand engagement for a senior-level professional who can design, build, and optimize scalable data infrastructure across analytics, marketing, and operational workflows. The ideal candidate has deep expertise in BigQuery, dbt, and Looker, with hands-on experience modeling complex eCommerce data from platforms such as Shopify, Klaviyo, subscription systems (e.g., Loop), and 3PL providers.
Key Responsibilities
Design, build, and maintain scalable data pipelines using BigQuery and dbt
Architect and optimize warehouse-first data models to support analytics, marketing, and operational reporting
Develop and maintain Looker dashboards and semantic layers
Integrate and transform data from Shopify, Klaviyo, Loop (subscriptions/returns) and 3PL systems (e.g., ShipHero, ShipBob, etc.)
Build automated workflows for data ingestion, validation, and monitoring
Implement best practices for data quality, governance, and documentation
Leverage AI tools (LLMs, automation frameworks) to:
Accelerate data transformation workflows
Refactor and optimize SQL/dbt models
Automate anomaly detection and QA processes
Collaborate with analytics, product, and marketing teams to translate business requirements into scalable data solutions
Troubleshoot data discrepancies and provide root-cause analysis
Recommend architectural improvements to improve performance, reliability, and scalability
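The responsibilities above include automated validation, monitoring, and AI-assisted anomaly detection. As a rough illustration only (the z-score heuristic, thresholds, and sample figures are assumptions, not Anatta's actual implementation), a minimal ingestion-volume check might look like:

```python
from statistics import mean, stdev

def detect_volume_anomaly(daily_row_counts, z_threshold=3.0):
    """Flag the most recent day's ingestion volume if it deviates
    sharply from the trailing history (simple z-score heuristic)."""
    *history, latest = daily_row_counts
    if len(history) < 2:
        return False  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > z_threshold

# Example: a sudden drop in daily order rows gets flagged.
counts = [10_200, 9_950, 10_400, 10_100, 9_800, 1_200]
print(detect_volume_anomaly(counts))  # True: last day far below trailing mean
```

In a warehouse-first stack, a check like this would typically run as a scheduled query or dbt test rather than standalone Python; the sketch only shows the shape of the logic.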
Service Level Agreements (SLAs)
Response Time: Acknowledge requests within 24 hours
Critical Issues (P0): Immediate response and active resolution
Standard Requests: Timeline provided within 72 hours
Billing Model: Time & materials based on agreed scope and hours worked
Requirements
5+ years experience in data engineering within a modern cloud data stack
Advanced experience with BigQuery, dbt, and SQL performance optimization
Experience building and maintaining Looker dashboards and data models
Strong understanding of:
eCommerce metrics (AOV, LTV, churn, CAC, retention)
Marketing attribution
Subscription data structures
Experience integrating and modeling data from Klaviyo, Loop (or similar subscription platforms), Shopify, and 3PL systems (experience with comparable tools is acceptable)
Experience implementing data quality checks and validation pipelines
Strong written and verbal communication skills
Ability to work independently with minimal oversight
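The core eCommerce metrics listed above reduce to straightforward aggregations. A hedged sketch for illustration (field names, the sample data, and the naive LTV formula are hypothetical assumptions):

```python
def aov(orders):
    """Average order value: total revenue divided by order count."""
    return sum(o["revenue"] for o in orders) / len(orders)

def churn_rate(active_start, active_end, new_customers):
    """Share of period-start customers lost during the period."""
    lost = active_start + new_customers - active_end
    return lost / active_start

def simple_ltv(avg_order_value, orders_per_year, years_retained):
    """Naive lifetime value: yearly spend times expected retention."""
    return avg_order_value * orders_per_year * years_retained

orders = [{"revenue": 80.0}, {"revenue": 120.0}, {"revenue": 100.0}]
print(aov(orders))                   # 100.0
print(churn_rate(1000, 1100, 200))   # 0.1
print(simple_ltv(100.0, 4, 2))       # 800.0
```

Real attribution and retention models are considerably more involved (cohorting, discounting, subscription states); this only shows the definitions behind the acronyms.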
Preferred Qualifications
Experience working with distributed or agency environments
Experience supporting DTC brands
Experience with marketing data pipelines (Google Ads, Meta, TikTok)
Experience building reverse ETL workflows
Familiarity with data ingestion tools (Fivetran, Daton, Airbyte, etc.)
Experience implementing AI-assisted data transformation workflows
Exposure to experimentation analytics (A/B testing frameworks)
Ideal Candidate Profile
We are looking for someone who:
Understands both technical architecture and business implications
Can move quickly in a project-based environment
Thinks in systems, not just SQL queries
Is comfortable leveraging AI tools to improve speed and efficiency
Has experience working across marketing, product, and operations data
The duties and responsibilities described here are not a comprehensive list, and the scope of the job may change as necessitated by business demands. Anatta Design reserves the right to revise the job description as circumstances warrant. This is a contract position; the pay range is $50 - $70 USD per hour. The hourly pay range listed is an estimate and may vary based on skills, experience, location, and project requirements. Final rates will be confirmed at the time of offer.
Compensation Transparency & Pay Philosophy At Anatta Design, we believe in fair and competitive compensation based on location. We post salary ranges in compliance with state requirements for U.S.-based roles, ensuring transparency for candidates in those regions. If you are applying from outside the U.S., please note that our pay scales are adjusted based on the cost of living and market conditions in each country.






