

orangepeople
Staff Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Staff Data Engineer with a contract length of "unknown" and a pay rate of "$X/hour." Work is remote. Key skills include Apache Spark, Kafka, SQL, and cloud-native platforms. Requires 8–12+ years in large-scale data systems, with experience in FinTech or gaming preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 22, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Analysis #Alation #Batch #Monitoring #SQL (Structured Query Language) #Scala #Cloud #Trino #Data Pipeline #Data Engineering #Kafka (Apache Kafka) #Data Governance #Apache Spark #Databases #Security #Apache Kafka #Java #Python #AI (Artificial Intelligence) #Compliance #Documentation #Spark (Apache Spark) #Data Quality #Datasets
Role description
OP is partnering with a fast-growing iGaming startup to hire talented individuals building the next generation of sweepstakes-based gaming experiences. Operating at the intersection of gaming, technology, and user engagement, the platform enables seamless cross-game interactions powered by innovative mechanics. With plans to rapidly scale this year, the team is seeking builders who thrive in fast-paced environments where ideas move from concept to live product in weeks.
Role Overview
As a Staff Engineer, Data Platform, you will serve as a key contributor to the company's data foundations. You'll design and evolve the systems that power real-time and batch data pipelines, canonical datasets, transactional systems, analytics, and compliance workflows. This is a hands-on senior individual contributor role with broad influence across engineering, product, analytics, security, and compliance.
Key Responsibilities
• Own and evolve the data platform architecture supporting real-time and batch workloads.
• Design and build scalable, reliable data pipelines for gameplay, payments, and analytics.
• Define canonical datasets, data models, and golden paths for data consumption.
• Ensure data correctness, auditability, and traceability in regulated environments.
• Define SLAs, monitoring, and operational standards for data systems.
• Serve as an escalation point for complex data incidents.
• Mentor senior engineers and influence architecture across teams.
• Leverage AI tools to accelerate development, testing, analysis, and documentation.
Core Data Technology Stack
The ideal candidate has deep, hands-on experience with modern data platforms and large-scale data systems, including:
• Distributed processing: Apache Spark, Apache Flink.
• Streaming & messaging: Apache Kafka.
• Query & analytics engines: Trino.
• Databases: ClickHouse, Postgres.
• Data languages: SQL.
• Cloud-native data platforms and distributed systems.
• Operational tooling for monitoring, alerting, and data quality.
• Applied use of AI/LLMs and agents for data analysis, incident response, and developer productivity.
Required Qualifications
• 8–12+ years building large-scale data systems in production environments.
• Expertise in data platform architecture, pipelines, and distributed systems.
• Strong SQL plus proficiency in Python, Java, or Scala.
• Experience operating data platforms with SLAs, on-call rotation, and compliance requirements.
Nice To Have
• FinTech, gaming, payments, or regulated environment experience.
• Real-time streaming, CDC, or event-driven architectures.
• Experience in stewarding canonical datasets and data governance.
• Passion for gaming or entertainment platforms.
Role Clarifications
• This is an individual contributor role. It is a high-ownership, low-process, high-trust environment with strong emphasis on production quality, reliability, security, and operational excellence.
Benefits
• 100% medical, dental & vision coverage (partial for dependents).
• 401(k) employer match.
