Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a 12+ month contract, remote (PST preferred), offering competitive pay. Requires 7+ years of experience, proficiency in Python, SQL, Spark, Databricks, and Snowflake, with expertise in ETL processes and retention analytics.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
August 30, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Python #Batch #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Visualization #BI (Business Intelligence) #Customer Segmentation #Azure #Kafka (Apache Kafka) #Agile #Microsoft Power BI #Cloud #Snowflake #Data Pipeline #Big Data #AWS (Amazon Web Services) #Databricks #Datasets #Automation #Tableau #Data Engineering #Monitoring #GCP (Google Cloud Platform) #Data Analysis #ML (Machine Learning) #Spark (Apache Spark) #Strategy
Role description
Title: Data Engineer
Location: Remote (PST preferred)
Job Type: Contract (12+ months)
About The Company
A leading entertainment company at the forefront of data-driven decision-making. This team focuses on optimizing retention strategies across streaming and television products, leveraging advanced data engineering techniques and cloud technologies to drive business impact.
Job Description
The Senior Data Engineer (Retention Analytics) will play a critical role in building and optimizing data pipelines that support retention strategy and campaign performance analytics. This individual will work closely with data analysts and business stakeholders to develop, transform, and maintain high-quality datasets that drive insights into customer churn, financial performance, call volume, and viewership trends.
Key Responsibilities
• Design, develop, and optimize ETL pipelines using Databricks and Snowflake.
• Collect, transform, and load (ETL) data into the warehouse and reporting environments.
• Optimize data performance and troubleshoot inefficiencies within Databricks.
• Work with large, unstructured datasets, managing complex joins and nested queries.
• Automate data dependencies and tasks, ensuring seamless data flow and pipeline reliability.
• Conduct daily monitoring of data flows to proactively resolve errors and job failures.
• Build dashboards and visualizations in Tableau or Power BI (a plus, not a requirement).
• Support retention analytics by enabling data-driven decision-making across customer segmentation models.
• Implement data partitioning and performance tuning techniques (e.g., salting, repartitioning) to optimize workloads.
• Work independently within an agile environment, collaborating with data analysts, business teams, and other engineering partners.
Required Qualifications
• 7+ years of experience in data engineering.
• Proficiency in Python, SQL, Spark, and Databricks.
• Experience working with Snowflake and cloud-based data platforms (AWS, Azure, or GCP).
• Strong understanding of ETL processes, data pipelines, and workflow automation.
• Experience working with churn, financial, subscription, and viewership data.
• Ability to track joins and manage nested, complex code structures in large datasets.
• Hands-on experience with data partitioning strategies for performance optimization.
• Preferred time zone: PST (open to Central, but not East Coast).
Nice-to-Have Qualifications
• Experience developing Tableau or Power BI dashboards.
• Familiarity with machine learning or predictive analytics in a data engineering context.
• Exposure to big data streaming technologies such as Kafka or Spark Streaming.
Interview Process & Timeline
• Open to reviewing candidates immediately; resumes should be submitted on a rolling basis (not in batches).
• Manager prefers go-getters with strong time management skills: independent workers who thrive in a structured, fast-paced environment.
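The salting technique named in the responsibilities is the standard remedy for skewed join keys: appending a random suffix splits one "hot" key into several sub-keys, so no single Spark task has to process the entire hot group. A minimal pure-Python sketch of the idea (the dataset, key names, and counts are invented for illustration, not taken from the role):

```python
import random
from collections import Counter

def salt_key(key: str, num_salts: int) -> str:
    """Append a random integer suffix so one hot key spreads across num_salts buckets."""
    return f"{key}_{random.randrange(num_salts)}"

# Illustrative skewed dataset: one "hot" customer id dominates the join key.
rows = ["hot_customer"] * 9000 + [f"cust_{i}" for i in range(1000)]

NUM_SALTS = 8
random.seed(42)  # deterministic for the example
salted = [salt_key(k, NUM_SALTS) for k in rows]

# Largest key group before vs. after salting: the hot key's 9000 rows
# are split into roughly 9000 / NUM_SALTS rows per salted bucket.
largest_before = Counter(rows).most_common(1)[0][1]
largest_after = Counter(salted).most_common(1)[0][1]
print(largest_before, largest_after)
```

In Spark itself the same idea means adding a salt column to the skewed side (e.g. with `rand()`), replicating each row of the smaller side once per salt value, and joining on the (key, salt) pair; recent Databricks runtimes also offer adaptive skew-join handling that can make manual salting unnecessary.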