

Arkhya Tech. Inc.
Technical Lead
Featured Role | Apply directly with Data Freelance Hub
This role is a Technical Lead position in Iselin, NJ (Hybrid) for 6 months at a pay rate of "X". Requires expertise in Python, Databricks, Angular, cloud platforms (Azure preferred), and CI/CD practices. Strong leadership and Agile experience essential.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
February 27, 2026
Duration
Unknown
-
Location
Hybrid
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Iselin, NJ
-
Skills detailed
#Databases #DevOps #Observability #GitHub #Angular #Code Reviews #AWS (Amazon Web Services) #Python #ETL (Extract, Transform, Load) #Data Architecture #Logging #Monitoring #Flask #Databricks #Scrum #Azure DevOps #Azure #Spark SQL #Microservices #NoSQL #Deployment #Scala #Leadership #MySQL #PostgreSQL #Agile #PySpark #Data Engineering #Data Processing #SQL (Structured Query Language) #Docker #Kanban #Git #TypeScript #Django #Automated Testing #GCP (Google Cloud Platform) #Spark (Apache Spark) #Strategy #Cloud #Jenkins #MongoDB #FastAPI #Data Pipeline
Role description
Role: Tech Lead (Python + Databricks + Angular)
Location: Iselin, NJ (Hybrid)
Key Responsibilities
Technical Leadership
• Lead a cross-functional engineering team across backend, frontend, and data engineering streams.
• Own the full SDLC: architecture, design, development, code reviews, DevOps pipeline oversight, and production deployment.
• Provide technical direction, mentor developers, enforce coding best practices, and promote engineering excellence.
• Work closely with product owners, architects, and stakeholders to define technical strategy and roadmap.
Backend Engineering (Python)
• Design and build robust microservices, APIs, and data processing frameworks using Python.
• Develop scalable ETL/ELT workflows and backend logic aligned with enterprise data standards.
• Implement best practices in error handling, logging, performance tuning, and automated testing.
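To make the error-handling and logging bullet concrete, here is a minimal sketch of a backend pipeline step wrapper using only the standard library. The names (`run_step`, the `flaky` callable in the usage note) are hypothetical illustrations, not part of any actual codebase for this role.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_step(step, retries=3, backoff=0.1):
    """Run a pipeline step, retrying transient failures with backoff.

    `step` is any zero-argument callable. Each failure is logged with its
    traceback; after `retries` attempts the last exception propagates.
    """
    for attempt in range(1, retries + 1):
        try:
            result = step()
            log.info("step succeeded on attempt %d", attempt)
            return result
        except Exception:
            log.exception("attempt %d/%d failed", attempt, retries)
            if attempt == retries:
                raise
            time.sleep(backoff * attempt)  # linear backoff between retries
```

Wrapping each step rather than each call keeps retry policy and logging in one place, which also makes the behavior straightforward to unit-test with a deliberately flaky callable.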
Data Engineering (Databricks)
• Build large-scale data pipelines and transformations using Databricks (PySpark/Spark SQL).
• Optimize data workflows for cost, performance, and reliability.
• Collaborate with data architects to define data models, governance, and quality frameworks.
• Integrate Databricks with cloud services (Azure/AWS) for end-to-end data solutions.
Frontend Engineering (Angular)
• Develop user-friendly, responsive, and scalable UI components using Angular.
• Drive best practices in UI architecture, state management, reusable components, and performance tuning.
• Integrate frontend systems with backend APIs and data services.
Cloud & DevOps
• Work with CI/CD pipelines (Azure DevOps, GitHub Actions, Jenkins, etc.) for automated builds and deployments.
• Ensure seamless deployment and monitoring of applications in cloud environments.
• Drive improvements in observability, reliability, and system health using monitoring tools.
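One way to make the observability bullet concrete, sketched with nothing but the standard library (a real deployment would export these measurements to a monitoring tool rather than an in-process dict; `timed` and `p95` are hypothetical helpers):

```python
import time
from collections import defaultdict
from contextlib import contextmanager

# metric name -> list of observed latencies, in seconds (illustrative store)
METRICS = defaultdict(list)

@contextmanager
def timed(name):
    """Record the wall-clock latency of a code block under `name`."""
    start = time.perf_counter()
    try:
        yield
    finally:
        METRICS[name].append(time.perf_counter() - start)

def p95(name):
    """95th-percentile latency for a metric, by the nearest-rank method."""
    samples = sorted(METRICS[name])
    if not samples:
        return None
    idx = max(0, round(0.95 * len(samples)) - 1)
    return samples[idx]
```

Instrumenting with a context manager keeps timing code out of business logic, so the same pattern works whether the sink is this toy dict or a metrics client.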
Collaboration & Stakeholder Management
• Partner with business teams to translate requirements into scalable technology solutions.
• Estimate, plan, and track project deliverables.
• Guide teams in Agile development practices (Scrum/Kanban).
Required Skills & Qualifications
Technical Skills
• Strong hands-on experience with Python (Flask/FastAPI/Django desirable).
• Advanced knowledge of Databricks, PySpark, and Spark SQL.
• Solid expertise in Angular (v10+) and TypeScript.
• Experience building large-scale distributed systems.
• Strong understanding of cloud platforms (Azure preferred; AWS/GCP acceptable).
• Familiarity with SQL/NoSQL databases (PostgreSQL, MySQL, Cosmos DB, MongoDB, etc.).
• Experience with Git, CI/CD pipelines, Docker, and containerized deployments.






