DigiTran Technologies Inc.

Python Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Python Developer on a 12-month contract, offering competitive pay. Key skills include Python, FastAPI, PostgreSQL, and Snowflake, along with experience in automation frameworks. Strong data validation and API integration expertise are essential. Remote work is available.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
October 9, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Lambda (AWS Lambda) #S3 (Amazon Simple Storage Service) #Docker #Visualization #Data Engineering #Jira #Spark (Apache Spark) #REST (Representational State Transfer) #Kubernetes #API (Application Programming Interface) #REST API #Snowflake #GitHub #PostgreSQL #Data Integrity #Automation #FastAPI #Python #Streamlit #Databricks #Logging #Agile #SQL (Structured Query Language) #SQL Queries #Pandas #Snowpark #EC2 #SQLAlchemy #ETL (Extract, Transform, Load) #JSON (JavaScript Object Notation) #AWS (Amazon Web Services) #pydantic #Kanban #Libraries #Version Control #Data Quality #Pytest #RDS (Amazon Relational Database Service)
Role description
Job Description:
• Develop automation tools to validate and reconcile data between API responses and database queries using PostgreSQL and Snowflake (see the reconciliation sketch following this listing).
• Build and maintain FastAPI endpoints using Pydantic BaseModel for data validation, SQLAlchemy ORM for database integration, and Snowflake connectors for secure connections (see the FastAPI and Snowflake sketches below).
• Leverage the Python Requests and JSON libraries to interact with REST APIs, parse structured responses, and automate API-based testing.
• Design and execute data validation frameworks using pytest and Pandas to ensure accuracy and integrity across multiple data sources.
• Create Streamlit dashboards for real-time visualization of validation results, metrics, and exception reports (see the Streamlit sketch below).
• Integrate automation workflows with Databricks and SQL pipelines to support large-scale data validation and performance optimization.
• Use Jira for Agile sprint tracking, test case management, and issue resolution, and maintain code quality through GitHub version control and CI/CD workflows.
• Document reusable FastAPI modules, validation schemas, and automation utilities to standardize and accelerate development cycles.
Must Have Skills/Requirements:
• Minimum experience: BS plus 12 years of relevant experience.
• Strong proficiency in Python, with experience in automation and data validation frameworks (pytest, Pandas, Requests, json, os, logging, re).
• Hands-on experience with FastAPI, including Pydantic BaseModel and SQLAlchemy ORM, for backend API development.
• Expertise in PostgreSQL and Snowflake, including secure connection handling via SQLAlchemy, snowflake-connector-python, and psycopg2.
• Proven ability to design and automate validation between API responses and database queries, ensuring accuracy and consistency.
• Proficiency in REST API integration using Requests, FastAPI, and Uvicorn with structured JSON schema handling.
• Familiarity with GitHub (version control, branching, pull requests) and Jira (Agile tracking, sprint and Kanban management).
• Strong understanding of CI/CD pipelines, API authentication (JWT/OAuth2), and data integrity testing.
• Experience optimizing SQL queries and FastAPI endpoints to improve performance in high-volume systems.
• Knowledge of Databricks and of integrating validation workflows within large-scale data engineering environments.
Nice to Have Skills/Requirements:
• Experience developing Streamlit dashboards for real-time visualization of data validation metrics and test results.
• Exposure to AWS services (EC2, Lambda, RDS, S3) for deploying data and API solutions.
• Understanding of Docker or Kubernetes for containerizing and orchestrating automation tools and APIs.
• Familiarity with data quality frameworks and ETL testing in enterprise environments.
• Experience with workflow orchestration and scheduling of validation pipelines.
• Knowledge of Snowpark or Spark for advanced data transformation and validation automation.
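For illustration, a minimal sketch of the FastAPI pattern described above, assuming a hypothetical trades table; the connection string is a placeholder, and real credentials would come from environment variables or a secrets manager.

```python
from fastapi import Depends, FastAPI, HTTPException
from pydantic import BaseModel
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base, sessionmaker

# Placeholder PostgreSQL URL; never keep real credentials in source.
engine = create_engine("postgresql+psycopg2://user:pass@localhost/appdb")
SessionLocal = sessionmaker(bind=engine)
Base = declarative_base()

class TradeRecord(Base):
    """Hypothetical ORM model backing the endpoint."""
    __tablename__ = "trades"
    id = Column(Integer, primary_key=True)
    symbol = Column(String, nullable=False)
    quantity = Column(Integer, nullable=False)

class TradeOut(BaseModel):
    """Pydantic model that validates the response payload."""
    model_config = {"from_attributes": True}  # build from ORM objects
    id: int
    symbol: str
    quantity: int

app = FastAPI()

def get_db():
    # One session per request, always closed afterwards.
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

@app.get("/trades/{trade_id}", response_model=TradeOut)
def read_trade(trade_id: int, db: Session = Depends(get_db)):
    trade = db.get(TradeRecord, trade_id)
    if trade is None:
        raise HTTPException(status_code=404, detail="Trade not found")
    return trade
```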
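A minimal sketch of the API-vs-database reconciliation workflow, written as a pytest test over Pandas frames; the endpoint URL, query, and column names are hypothetical.

```python
import pandas as pd
import requests
from sqlalchemy import create_engine

API_URL = "https://api.example.com/v1/trades"  # hypothetical endpoint
engine = create_engine("postgresql+psycopg2://user:pass@localhost/appdb")

def fetch_api_frame() -> pd.DataFrame:
    """Pull the API payload (assumed to be a JSON list of records)."""
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    return pd.DataFrame(resp.json())

def fetch_db_frame() -> pd.DataFrame:
    """Pull the matching rows straight from PostgreSQL."""
    return pd.read_sql("SELECT id, symbol, quantity FROM trades", engine)

def test_api_matches_database():
    api_df = fetch_api_frame()[["id", "symbol", "quantity"]]
    db_df = fetch_db_frame()
    # Order both sides identically before comparing row by row.
    api_df = api_df.sort_values("id").reset_index(drop=True)
    db_df = db_df.sort_values("id").reset_index(drop=True)
    assert len(api_df) == len(db_df), "row counts diverge"
    # check_dtype=False tolerates int64 vs Int64 differences between
    # the JSON-derived frame and the SQL-derived frame.
    pd.testing.assert_frame_equal(api_df, db_df, check_dtype=False)
```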
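A minimal sketch of secure connection handling with snowflake-connector-python, assuming credentials are supplied via environment variables; the account, warehouse, and database names are placeholders.

```python
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",   # placeholder warehouse
    database="VALIDATION_DB",   # placeholder database
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT COUNT(*) FROM trades")
    (row_count,) = cur.fetchone()
    print(f"Snowflake row count: {row_count}")
finally:
    conn.close()  # always release the connection
```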
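A minimal sketch of a Streamlit dashboard for validation results; the results frame is an illustrative stand-in for real reconciliation output.

```python
import pandas as pd
import streamlit as st

st.title("Data Validation Dashboard")

# Illustrative results; in practice these would come from the
# reconciliation runs above.
results = pd.DataFrame({
    "check": ["row_count", "null_check", "api_vs_db"],
    "passed": [True, True, False],
    "mismatches": [0, 0, 42],
})

st.metric("Checks passed", f"{int(results['passed'].sum())} / {len(results)}")
st.dataframe(results)  # full results table
st.bar_chart(results.set_index("check")["mismatches"])  # exceptions per check
```

Saved as, say, dashboard.py, this would be launched with `streamlit run dashboard.py`.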