

Lab IV
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (remote) with a contract length of "unknown" and a pay rate of "unknown." Key skills required include Python, AWS, SQL, PostgreSQL, and experience with ETL processes and MLOps workflows.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
February 13, 2026
Duration
Unknown
-
Location
Remote
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
San Francisco Bay Area
-
Skills detailed
#Data Pipeline #PostgreSQL #Cloud #Python #AWS (Amazon Web Services) #Data Engineering #DevOps #Airflow #Deployment #ETL (Extract, Transform, Load) #Databases #ML (Machine Learning) #SQL (Structured Query Language)
Role description
We are a Berlin-based data consultancy looking for a Data Engineer for our client's project (remote-first).
Project
Build and evolve a backend platform that analyzes multiple data sources to detect service outages and shutdowns.
What you'll do
• Design and implement backend services and APIs
• Build reliable data pipelines and integrations
• Work hands-on with infrastructure and deployment
• Prepare and transform data for analytics and ML use cases
What we're looking for
• Strong data engineering, software design and architecture skills
• Backend experience (Python, APIs, Shell)
• DevOps experience (AWS or similar cloud platforms)
• Fluent SQL
• Relational databases (PostgreSQL preferred)
• Experience with non-relational databases
• Understanding of modern ETL and orchestration (e.g. Airflow); see the illustrative sketch after this list
• Basic understanding of ML concepts and MLOps workflows (data collection, preparation, training readiness)
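To give a concrete picture of the ETL and orchestration work the requirements above describe, here is a minimal sketch of an Airflow DAG that pulls service-status events, keeps only outage signals, and loads them into PostgreSQL. It assumes Airflow 2.4+ with the apache-airflow-providers-postgres package installed; the connection ID analytics_db, the table outage_events, and the event fields are hypothetical placeholders, not details taken from the role description.

# Illustrative sketch only; connection ID, table name, and event fields are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task
from airflow.providers.postgres.hooks.postgres import PostgresHook


@dag(schedule="@hourly", start_date=datetime(2026, 1, 1), catchup=False)
def outage_signals_etl():
    """Extract raw service-status events, keep outage signals, load them to Postgres."""

    @task
    def extract() -> list[dict]:
        # Placeholder for calling a monitoring/status API or reading a feed.
        return [{"service": "api-gateway", "status": "down", "ts": "2026-02-13T10:00:00Z"}]

    @task
    def transform(events: list[dict]) -> list[dict]:
        # Keep only outage events and normalise field names for analytics/ML use.
        return [
            {"service": e["service"], "observed_at": e["ts"]}
            for e in events
            if e["status"] == "down"
        ]

    @task
    def load(rows: list[dict]) -> None:
        # Write the cleaned rows into a relational store (PostgreSQL here).
        hook = PostgresHook(postgres_conn_id="analytics_db")
        hook.insert_rows(
            table="outage_events",
            rows=[(r["service"], r["observed_at"]) for r in rows],
            target_fields=["service", "observed_at"],
        )

    load(transform(extract()))


outage_signals_etl()

In a real project the extract step would fan out across the multiple data sources named in the project brief, and the loaded table would feed the analytics and ML-preparation work listed above.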






