

Sepal AI
Analytics Engineer (Observability Tooling)
Featured Role | Apply directly with Data Freelance Hub
This role is for an Analytics Engineer (Observability Tooling) with 3+ years of data engineering experience, focusing on high-throughput log analysis. Pay ranges from $50-$85/hr. Key skills include SQL, analytical databases, and log ingestion tools. Remote position.
Country
United States
Currency
$ USD
Day rate
680
Date
October 20, 2025
Duration
Unknown
Location
Remote
Contract
Unknown
Security
Unknown
Location detailed
United States
Skills detailed
#Observability #BigQuery #DevOps #AI (Artificial Intelligence) #Docker #Cloud #Python #Redshift #Schema Design #SQL (Structured Query Language) #Snowflake #Datasets #Databases #Logstash #Data Engineering
Role description
Sepal AI builds the world's hardest tests for AI, grounded in real-world software systems. We're hiring a Data Engineer with 3+ years of experience and a strong systems mindset to help us build evaluation environments for AI in high-throughput log analysis contexts.
What You'll Do
- Design and implement analytical schemas and pipelines using tools like BigQuery, ClickHouse, Snowflake, Redshift, and other high-performance columnar databases.
- Work on complex, distributed queries over massive log and telemetry datasets.
- Create and manage synthetic datasets that simulate real-world DevOps, observability, or cloud infrastructure logs.
- Tune and optimize distributed query execution plans to avoid timeouts and reduce over-scanning.
Who You Are
- 3+ years of experience in data engineering or backend systems roles.
- Deep expertise in analytical databases and OLAP engines with a focus on large-scale query optimization, schema design, and performance tuning.
- Hands-on with log ingestion pipelines (e.g., FluentBit, Logstash, Vector) and schema design for observability systems.
- Strong SQL skills: you know how to reason through performance problems and spot inefficient query patterns.
- Bonus: Experience with Python, Docker, or synthetic data generation.
Pay
$50-$85/hr depending on experience
Remote, flexible hours