Jobs via Dice

Principal Data Engineer (Retail Client Experience)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Principal Data Engineer (Retail) on a 12-month remote contract, offering competitive pay. Required skills include 8+ years in data engineering, expertise in Python, GoLang, Spark, SQL, and cloud platforms (AWS or GCP).
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
April 23, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Scala #SQL (Structured Query Language) #Spark (Apache Spark) #Airflow #Data Quality #GCP (Google Cloud Platform) #GDPR (General Data Protection Regulation) #AWS (Amazon Web Services) #dbt (data build tool) #Data Access #Data Engineering #Data Processing #Computer Science #ETL (Extract, Transform, Load) #Golang #Compliance #Automation #Python #Databricks #Cloud
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Vertical Falls LLC, is seeking the following. Apply via Dice today!

PRINCIPAL DATA ENGINEER (Retail)
100% REMOTE | 12-MONTH CONTRACT

Responsibilities:
• Design, build, and optimize scalable ETL pipelines for structured and semi-structured data supporting insights use cases, growth metrics, live event analytics, and other external-facing data sets.
• Design and implement data models using industry best practices that capture a complete ecosystem view of user experiences, ensuring accuracy, scalability, and long-term usability.
• Architect and implement robust, maintainable, and high-performance data solutions.
• Automate workflows to reduce manual intervention and improve data processing efficiency, including automation for content, growth, and live event analytics.
• Mentor data engineers across all levels, setting best practices and fostering technical growth.
• Optimize query performance and resolve pipeline bottlenecks to improve data accessibility.
• Evaluate and adopt new tools, frameworks, and methodologies to advance data engineering capabilities.
• Support cost optimization by ensuring scalable and efficient data solutions.
• Ensure data quality, governance, and compliance with regulatory standards (e.g., GDPR, CCPA).
• Contribute to the organization's data engineering discipline by shaping infrastructure, standards, tooling, and best practices.

Required Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
• 8+ years of experience in data engineering or a related field.
• Strong expertise in big-data technologies, including Python, GoLang, Spark, Scala, SQL, and tools like Airflow and dbt.
• Proficiency in cloud infrastructure (AWS or Google Cloud Platform) and Databricks.
• Expertise in designing efficient and scalable data models over large data sets, working across multiple teams and in collaboration with insights and data stakeholders.
• Demonstrated ability to work cross-functionally with engineering, analytics, and product teams.
• Proven experience mentoring and guiding other engineers.