

DivIHN Integration Inc
Python Developer- Back End
Featured Role | Apply directly with Data Freelance Hub
This is a 12-month remote contract for a "Python Developer (Back-End)" at competitive rates. It requires 5+ years of Python development, plus experience with PostgreSQL, FastAPI, and GraphQL, and a BS/MS in Computer Science or equivalent experience.
Country
United States
Currency
$ USD
Day rate
Unknown
Date
October 9, 2025
Duration
More than 6 months
Location
Remote
Contract
Unknown
Security
Unknown
Location detailed
Corning, NY
Skills detailed
#Scala #Microservices #Schema Design #Docker #Automated Testing #GIT #DevOps #Apache Airflow #Computer Science #GraphQL #REST (Representational State Transfer) #Kubernetes #Django #PostgreSQL #Terraform #Normalization #Airflow #FastAPI #Data Science #Monitoring #Python #Indexing #Cloud #Logging #Agile #Observability #SQL (Structured Query Language) #SQL Queries #Data Pipeline #SQLAlchemy #ETL (Extract, Transform, Load) #Migration #AWS (Amazon Web Services) #ML (Machine Learning) #Databases #Version Control #Data Privacy #GitLab
Role description
Title: Python Developer (Back-End) - Remote
Location: Remote
Duration: 12 Months
Hours: Must be able to work 8 AM - 5 PM EST
Travel: Less than 15%
Description
The client is seeking a backend-oriented Software Developer with strong experience in Python to design, build, and operate scalable APIs, data services, and pipelines. You will collaborate with cross-functional teams to deliver secure, reliable, and performant backend systems, leveraging Python, PostgreSQL, ORMs, GraphQL, and Apache Airflow.
Required skills: a minimum of 5 years of software development using Python. Experience with Python, FastAPI, and SQL/PostgreSQL is required. Experience with GraphQL APIs (e.g., Strawberry, Graphene) will set a candidate apart. BS/MS in Computer Science or 7 years of equivalent experience.
What You'll Do
• Design, develop, and maintain backend services using Python and related frameworks.
β’ Build robust REST and GraphQL APIs, including designing schemas and implementing resolvers, pagination, and authorization.
• Model, design, and optimize relational database (PostgreSQL) schemas and queries, including indexing, migrations, query optimization, and performance tuning.
• Implement and integrate ORMs (e.g., SQLAlchemy) with sound patterns for transactions and concurrency.
β’ Architect and operate data pipelines and workflows using Apache Airflow (DAG design, scheduling, monitoring, retry/backfill strategies).
β’ Ensure code quality through unit/integration tests and automated CI/CD pipelines.
β’ Apply secure development practices: authentication/authorization, secrets management, input validation, and data privacy.
β’ Implement observability: structured logging, metrics, tracing, and alerting to ensure reliability and performance.
β’ Collaborate with product owners, data scientists, and front-end engineers to deliver end-to-end solutions aligned with business goals.
β’ Document systems and APIs for maintainability and knowledge sharing.
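The pagination mentioned in the API bullets above can be sketched in plain Python. The `paginate` and cursor helpers below are hypothetical illustrations of GraphQL-connection-style paging, not part of the client's codebase:

```python
import base64
import json
from typing import Optional

def encode_cursor(offset: int) -> str:
    """Encode a list offset as an opaque cursor string (illustrative scheme)."""
    return base64.urlsafe_b64encode(json.dumps({"o": offset}).encode()).decode()

def decode_cursor(cursor: Optional[str]) -> int:
    """Recover the offset from a cursor; no cursor means start from the top."""
    if cursor is None:
        return 0
    return json.loads(base64.urlsafe_b64decode(cursor.encode()))["o"]

def paginate(items: list, first: int, after: Optional[str] = None) -> dict:
    """Return one page plus a cursor for the next page."""
    start = decode_cursor(after)
    page = items[start:start + first]
    next_offset = start + len(page)
    return {
        "nodes": page,
        "end_cursor": encode_cursor(next_offset),
        "has_next_page": next_offset < len(items),
    }
```

A resolver in a library like Strawberry or Graphene would wrap the same idea, with the cursor typically derived from a stable sort key rather than a raw offset.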
What the Client Is Looking For
Qualifications
β’ BS/MS in Computer Science or equivalent experience
β’ Professional backend Python development experience (5+ years) building production-grade services.
β’ Strong grasp of software engineering fundamentals: algorithms, data structures, concurrency, distributed systems basics.
β’ Hands-on experience designing and maintaining microservices.
• Strong proficiency in Python and related frameworks such as FastAPI or Django.
β’ Deep knowledge of SQL and PostgreSQL, including schema design, normalization, indexing strategies, query planning, and performance tuning.
β’ Practical experience with ORMs, transactional integrity, and managing schema evolution/migrations
• Experience with automated testing, linting/typing, and Git-based version control systems.
• Experience with modern software methodologies such as Agile, and with tools and practices such as GitLab, DevOps, and CI/CD.
β’ Clear communication skills and ability to collaborate in a cross-functional, agile environment.
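The transactional-integrity bullet above can be illustrated without SQLAlchemy. A rough sketch using Python's stdlib `sqlite3`, where the connection context manager commits on success and rolls back on any exception (the table and amounts are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 0)")
conn.commit()

def transfer(conn, src, dst, amount):
    """Move funds atomically: both updates commit together or neither does."""
    with conn:  # commits on success, rolls back if an exception escapes
        conn.execute(
            "UPDATE accounts SET balance = balance - ? WHERE name = ?", (amount, src)
        )
        (balance,) = conn.execute(
            "SELECT balance FROM accounts WHERE name = ?", (src,)
        ).fetchone()
        if balance < 0:
            raise ValueError("insufficient funds")
        conn.execute(
            "UPDATE accounts SET balance = balance + ? WHERE name = ?", (amount, dst)
        )

transfer(conn, "alice", "bob", 60)
try:
    transfer(conn, "alice", "bob", 60)  # would overdraw; change is rolled back
except ValueError:
    pass
```

With SQLAlchemy the equivalent pattern is a `Session` scoped to a unit of work, but the guarantee being tested for is the same: no partially applied writes.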
Preferred Qualifications
β’ Experience with GraphQL APIs (e.g., Strawberry, Graphene) and/or RESTful services.
β’ Proficiency with Apache Airflow for orchestration of ETL/ELT workflows and data pipeline reliability.
• Experience operating services in AWS cloud environments, including containerization (Docker) and orchestration (Kubernetes).
β’ Infrastructure-as-code familiarity (Terraform/OpenTofu, CloudFormation) and secrets/config management.
β’ Performance profiling and capacity planning for high-throughput APIs.
β’ Background in applied ML integrations or retrieval-augmented systems using vector stores.
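The retry strategies named in the Airflow bullet boil down to bounded retries with backoff. A generic sketch in plain Python (the `retry` decorator below is illustrative, not Airflow's API):

```python
import functools
import time

def retry(attempts: int = 3, delay: float = 0.0, backoff: float = 2.0):
    """Retry a flaky task a bounded number of times with exponential backoff."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            wait = delay
            for attempt in range(1, attempts + 1):
                try:
                    return func(*args, **kwargs)
                except Exception:
                    if attempt == attempts:
                        raise  # out of retries: surface the failure
                    time.sleep(wait)
                    wait *= backoff
        return wrapper
    return decorator

calls = {"n": 0}

@retry(attempts=3, delay=0.0)
def flaky_extract():
    """Simulated extract step that fails twice before succeeding."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient upstream failure")
    return "rows"
```

Airflow expresses the same idea declaratively via the `retries` and `retry_delay` task parameters, with backfills handled by the scheduler rather than in task code.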
What You'll Gain
• Opportunity to build impactful systems that support the client's innovation and global operations.
β’ Collaborative, inclusive culture focused on continuous learning and technical excellence.
β’ Exposure to a broad technology stack and complex, real-world challenges.
Interview Process: Teams meeting