

Compunnel Inc.
Hiring Former Retail Data Engineers
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with experience in Fortune-1-scale retail technology. Contract length is unspecified, with a pay rate of $440 per day. Key skills include SQL, Python, ETL/ELT pipelines, and distributed data processing. Location is remote.
Country: United States
Currency: $ USD
Day rate: $440
Date: December 2, 2025
Duration: Unknown
Location: Remote
Contract: Unknown
Security: Unknown
Location detailed: Bentonville, AR
Skills detailed: #GCP (Google Cloud Platform) #BigQuery #dbt (data build tool) #Azure #Data Pipeline #Data Modeling #PySpark #Scala #Data Engineering #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Spark (Apache Spark) #Airflow #Data Processing #Python #Pandas #AWS (Amazon Web Services) #Cloud
Role description
URGENT: Data Engineering Roles for Former Employees of a Major Retail Tech Enterprise
We're hiring Data Engineers with prior experience working in a Fortune-1-scale retail technology environment.
If you've supported large, complex retail data ecosystems and are open to new remote opportunities, let's connect.
Role: Data Engineer
Location: Bentonville, AR (Remote)
Technical coding: Coderbyte (SQL + Python)
Tech Stack: Airflow, dbt, PySpark, Spark 3, Python 3, SQL, BigQuery, pandas
Cloud: GCP / AWS / Azure
Requirements:
- Strong experience building scalable ETL/ELT pipelines (see the sketches after this list)
- Data modeling, transformations, and orchestration
- Distributed data processing (PySpark / Spark)
- SQL performance tuning
- CI/CD for data pipelines (preferred)
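For candidates gauging the expected depth, here is a minimal PySpark sketch of the kind of ETL/ELT work the requirements describe: read raw events, transform and aggregate, and write a partitioned output. All paths, table names, and columns are hypothetical, not taken from the posting.

```python
# Minimal PySpark (Spark 3) ETL sketch. Paths, table names, and columns
# are hypothetical placeholders, not from the job posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("retail_sales_etl").getOrCreate()

# Extract: read raw sales events from a (hypothetical) Parquet landing zone.
raw = spark.read.parquet("gs://example-bucket/landing/sales/")

# Transform: derive a date, drop bad rows, aggregate revenue per store/day.
daily_revenue = (
    raw.withColumn("sale_date", F.to_date("event_ts"))
       .filter(F.col("amount") > 0)
       .groupBy("store_id", "sale_date")
       .agg(
           F.sum("amount").alias("revenue"),
           F.countDistinct("order_id").alias("orders"),
       )
)

# Load: write a date-partitioned table for downstream BigQuery/dbt models.
(daily_revenue.write
    .mode("overwrite")
    .partitionBy("sale_date")
    .parquet("gs://example-bucket/warehouse/daily_revenue/"))

spark.stop()
```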
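Orchestration in this stack typically means Airflow DAGs; below is a minimal sketch (Airflow 2.4+) wiring extract, transform, and load tasks in sequence. The DAG id, schedule, and task bodies are likewise hypothetical.

```python
# Minimal Airflow 2.4+ orchestration sketch. DAG id, schedule, and task
# logic are hypothetical placeholders, not from the job posting.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw sales files")   # placeholder for the real extract step

def transform():
    print("run the PySpark job")    # e.g., spark-submit the ETL sketch above

def load():
    print("publish to BigQuery")    # placeholder for the real load step

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3   # linear dependency: extract -> transform -> load
```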






