

eSense Incorporated
Python Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Python Engineer; the contract length and pay rate are not specified, and remote work is allowed. Key skills include Python, ETL/ELT pipelines, API development, and cloud platforms (AWS, Azure, GCP).
Country: United States
Currency: $ USD
Day rate: Unknown
Date: February 10, 2026
Duration: Unknown
Location: Unknown
Contract: Unknown
Security: Unknown
Location detailed: United States
Skills detailed: #Libraries #Data Pipeline #SQLAlchemy #Python #AWS (Amazon Web Services) #Databases #Data Engineering #Version Control #Airflow #ETL (Extract, Transform, Load) #Kafka (Apache Kafka) #NoSQL #Django #ML (Machine Learning) #Microservices #GIT #NumPy #Docker #Kubernetes #API (Application Programming Interface) #Cloud #Code Reviews #Flask #Pandas #GCP (Google Cloud Platform) #Data Modeling #FastAPI #Terraform #Spark (Apache Spark) #Azure #Scala
Role description
Role - Python Engineer (Software + Data Engineering)
About the Role
We're looking for a Python Engineer who thrives at the intersection of software engineering and data engineering. This role is perfect for someone who loves building robust systems, designing clean and scalable code, and working with data pipelines, APIs, and backend services. You'll play a key part in shaping our data-driven products by developing high-quality Python applications and ensuring the reliability and performance of the data that powers them.
What You'll Do
Core Responsibilities
• Design, build, and maintain Python-based applications, services, and tools.
• Develop and optimize ETL/ELT pipelines for structured and unstructured data (see the illustrative sketch after this list).
• Collaborate with product, analytics, and engineering teams to integrate data workflows into production systems.
• Implement APIs, microservices, and backend components using modern Python frameworks.
• Ensure code quality through testing, code reviews, and best practices.
• Work with cloud platforms (AWS, Azure, or GCP) to deploy and scale applications.
• Monitor system performance and troubleshoot issues across data and application layers.
• Contribute to architectural decisions and help evolve our engineering standards.
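For illustration only (this sketch is not part of the posting): a minimal example of the kind of ETL pipeline work described above, using pandas and SQLAlchemy. The table names, columns, and connection strings are hypothetical.

# Illustrative ETL sketch; all names and URLs below are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

def run_daily_orders_pipeline(source_url: str, warehouse_url: str) -> int:
    """Extract recent orders, apply light transforms, and load them into a warehouse table."""
    source = create_engine(source_url)
    warehouse = create_engine(warehouse_url)

    # Extract: pull the last day's rows from the operational database.
    raw = pd.read_sql(
        "SELECT * FROM orders WHERE created_at >= CURRENT_DATE - 1",
        source,
    )

    # Transform: normalize column names and derive a revenue column.
    raw.columns = [c.lower() for c in raw.columns]
    raw["revenue"] = raw["quantity"] * raw["unit_price"]

    # Load: append into the analytics table; SQLAlchemy handles dialect details.
    raw.to_sql("fct_orders", warehouse, if_exists="append", index=False)
    return len(raw)

if __name__ == "__main__":
    rows = run_daily_orders_pipeline(
        "postgresql://user:pass@source-db/app",
        "postgresql://user:pass@warehouse-db/analytics",
    )
    print(f"Loaded {rows} rows")

In a production setting, a job like this would typically be scheduled, monitored, and retried by an orchestrator such as Airflow, Prefect, or Dagster rather than run by hand.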
What You Bring
Required Skills
• Strong proficiency in Python, including experience with libraries such as pandas, numpy, requests, sqlalchemy, or similar.
• Solid understanding of software engineering fundamentals: algorithms, data structures, design patterns, and testing.
• Experience building and maintaining data pipelines or workflow orchestration (Airflow, Prefect, Dagster, etc.).
• Hands-on experience with relational and/or NoSQL databases.
• Familiarity with API development using frameworks like FastAPI, Flask, or Django (see the sketch after this list).
• Experience with version control (Git) and CI/CD workflows.
• Comfort working in cloud environments (AWS, Azure, or GCP).
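For illustration only: a minimal sketch of the API development mentioned above, using FastAPI. The service name, resource, and fields are hypothetical.

# Illustrative FastAPI sketch; the endpoint, model, and data are hypothetical.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="orders-service")

class Order(BaseModel):
    order_id: int
    customer_id: int
    total: float

# Stand-in for a real database, just to make the example self-contained.
_FAKE_DB = {1: Order(order_id=1, customer_id=42, total=99.50)}

@app.get("/orders/{order_id}", response_model=Order)
def get_order(order_id: int) -> Order:
    """Return a single order; FastAPI validates the path parameter and serializes the response."""
    order = _FAKE_DB.get(order_id)
    if order is None:
        raise HTTPException(status_code=404, detail="order not found")
    return order

Assuming the file is saved as orders_api.py, it can be run locally with: uvicorn orders_api:app --reload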
Nice-to-Haves
• Experience with containerization (Docker) and orchestration (Kubernetes).
• Knowledge of distributed data systems (Spark, Kafka, etc.).
• Background in data modeling or analytics engineering.
• Exposure to machine learning pipelines or MLOps concepts.
• Experience with infrastructure-as-code tools (Terraform, CloudFormation).