

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This is a remote Sr. Data Engineer role for 6-12 months with an undisclosed pay rate. It requires 8-10 years of data engineering experience; proficiency in Python, SQL, and GCP tools; and expertise in ETL tools and retail media analytics.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 12, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: New York, United States
Skills detailed: #Databricks #API (Application Programming Interface) #ML (Machine Learning) #Python #Scala #SQL (Structured Query Language) #Datasets #GCP (Google Cloud Platform) #Talend #Data Engineering #Data Pipeline #BigQuery #Informatica #ETL (Extract, Transform, Load) #Migration #Dataflow #Slowly Changing Dimensions #Compliance #Kafka (Apache Kafka) #Data Ingestion
Role description
Role: Sr. Data Engineer
Duration: 6-12 months
Location: Remote
Open to Sponsorship: Yes
Candidates must have a complete LinkedIn profile with a photo.
Role Overview
As a Sr. Data Engineer, you will build and optimize data pipelines that support Ulta's retail media campaigns and customer journey analytics. You'll work closely with architects, analysts, and media partners to deliver scalable, privacy-compliant data solutions.
Key Responsibilities
• Build and maintain data pipelines using GCP (BigQuery, Dataflow, Composer)
• Develop Python scripts for data transformation and ingestion
• Support real-time data streaming and campaign analytics
• Collaborate with users to deliver clean, validated datasets
• Integrate with clean room environments and ensure privacy compliance
• Build data models (e.g., slowly changing dimensions) for campaign and customer journey analytics
• Lead the design and proof of concept for migrating from GCP to Databricks, including connecting to Databricks for ML use cases (hosted in GCP)
• Enable real-time data ingestion and streaming with Kafka or similar (a streaming pipeline sketch follows this list)
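For a sense of the day-to-day work these bullets describe, here is a minimal sketch of a streaming Apache Beam job of the kind that runs on Dataflow: it reads JSON campaign events from Pub/Sub and appends them to BigQuery. The project, topic, table, and field names are illustrative assumptions, not details from this posting.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode one JSON campaign event from Pub/Sub into a BigQuery row."""
    event = json.loads(message.decode("utf-8"))
    return {
        "campaign_id": event["campaign_id"],
        "customer_id": event["customer_id"],
        "event_type": event["event_type"],
        "event_ts": event["timestamp"],
    }


def run() -> None:
    # streaming=True puts the job in Dataflow's streaming mode.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Topic and table names are placeholders, not from the posting.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/campaign-events")
            | "ParseJSON" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:retail_media.campaign_events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```

The same pipeline shape works for batch backfills by dropping the streaming flag and swapping the Pub/Sub source for a bounded one.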
Required Skills
• 8-10 years in data engineering
• Proficient in Python, SQL, and GCP tools (BigQuery, Dataflow, Composer); the sketch after this list exercises all three
• Experience with clean rooms (Google, Databricks)
• Familiarity with Kafka or similar streaming tools
• Strong understanding of retail media and campaign KPIs
• Expertise in ETL tools such as Informatica or Talend
• Experience with API integrations (e.g., Facebook/Meta) or similar engineering experience
• Bonus: Experience with Databricks and data/ML pipelines
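As a companion to the Python, SQL, and slowly-changing-dimension requirements, here is a minimal sketch of a Type 2 SCD update in BigQuery, run from Python with the google-cloud-bigquery client. The `dim_customer` and `stg_customer` tables and their columns are hypothetical names for illustration, not from the posting.

```python
from google.cloud import bigquery

# Type 2 SCD step: expire the current row when a tracked attribute
# changes, and insert rows for customers seen for the first time.
SCD2_MERGE = """
MERGE `retail_media.dim_customer` AS target
USING `retail_media.stg_customer` AS source
ON target.customer_id = source.customer_id AND target.is_current = TRUE
WHEN MATCHED AND target.segment != source.segment THEN
  UPDATE SET is_current = FALSE, valid_to = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN
  INSERT (customer_id, segment, valid_from, valid_to, is_current)
  VALUES (source.customer_id, source.segment, CURRENT_TIMESTAMP(), NULL, TRUE)
"""


def apply_scd2(client: bigquery.Client) -> None:
    # Run the MERGE and block until it finishes.
    client.query(SCD2_MERGE).result()


if __name__ == "__main__":
    apply_scd2(bigquery.Client())
```

Note that a complete Type 2 flow needs a second statement (not shown) to insert the new version of the rows expired here; a single MERGE cannot both close out and re-insert the same matched key.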