Python Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Python Developer in Iselin, NJ, requiring 8+ years of experience with Python, Databricks, and Kafka. The contract length is unspecified, and the work focuses on capital markets and data streaming solutions. Cloud certification is preferred.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
September 5, 2025
πŸ•’ - Project duration
Unknown
🏝️ - Location type
On-site
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
Iselin, NJ
🧠 - Skills detailed
#Data Pipeline #Python #Azure #Agile #Monitoring #Kafka (Apache Kafka) #NoSQL #GCP (Google Cloud Platform) #Compliance #Spark (Apache Spark) #Data Engineering #Databases #PySpark #SQL (Structured Query Language) #Hadoop #Big Data #Spark SQL #Cloud #ETL (Extract, Transform, Load) #Scala #Version Control #GIT #Documentation #MongoDB #AWS (Amazon Web Services) #Data Governance #Distributed Computing #Security #Code Reviews #Data Processing #Data Quality #Databricks
Role description
Senior Python Developer (Capital Markets, Databricks & Kafka)

Job Title: Sr. Python Developer
Location: Iselin, NJ
Experience: 8+ years in Python development with expertise in Databricks and Kafka

Job Overview:
We are seeking a skilled Python Developer with hands-on experience in Databricks and Kafka to join our technology team. The ideal candidate will design, develop, and optimize large-scale data processing pipelines and real-time data streaming solutions to support our trading, risk, and compliance functions. You will collaborate with business stakeholders and data teams to deliver high-performance data solutions in a fast-paced financial environment.

Responsibilities:
β€’ Develop, test, and maintain scalable ETL/ELT data pipelines using Python, PySpark, and Databricks on cloud platforms (an illustrative PySpark sketch appears after this description).
β€’ Build and manage real-time data streaming solutions with Kafka to support low-latency data feeds (see the Kafka sketch below).
β€’ Collaborate with quantitative analysts, traders, and risk managers to understand data requirements and deliver effective solutions.
β€’ Optimize existing data workflows for performance, reliability, and efficiency.
β€’ Implement data quality checks and monitoring mechanisms.
β€’ Participate in code reviews, documentation, and knowledge sharing within the team.
β€’ Ensure compliance with financial data governance and security standards.
β€’ Stay current with emerging technologies and propose innovative solutions to data processing challenges.

Required Skills & Qualifications:
β€’ 8+ years of experience in Python development.
β€’ Strong experience with the Databricks platform and cloud-based data engineering.
β€’ Proven expertise in Kafka for building scalable, real-time streaming applications.
β€’ Knowledge of relational and NoSQL databases (e.g., SQL, Cassandra, MongoDB).
β€’ Familiarity with investment banking processes, trading systems, risk management, or financial data workflows.
β€’ Good understanding of distributed computing concepts and the big data ecosystem.
β€’ Experience with version control systems (e.g., Git) and Agile development methodologies.
β€’ Excellent problem-solving skills, attention to detail, and the ability to work under tight deadlines.

Preferred Qualifications:
β€’ Experience with other big data tools such as Hadoop, Spark SQL, or Flink.
β€’ Knowledge of financial data standards and regulations.
β€’ Certification in cloud platforms (AWS, Azure, GCP).
β€’ Previous experience working in a regulated financial environment.
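
For candidates gauging the day-to-day work, here is a minimal sketch of the kind of ETL/ELT pipeline the responsibilities describe, assuming a Databricks environment with Delta Lake. The source path, table name, and column names (trade_id, trade_ts, notional) are hypothetical placeholders, not details from the posting:

```python
# Minimal ETL sketch: raw trade records -> cleansed Delta table.
# Paths, tables, and columns are illustrative assumptions only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("trades-etl").getOrCreate()

# Extract: read raw trade records from a landing zone (hypothetical path)
raw = spark.read.json("/mnt/raw/trades/")

# Transform: deduplicate, normalize timestamps, and apply a basic
# data-quality filter of the sort the role calls for
clean = (
    raw.dropDuplicates(["trade_id"])
       .withColumn("trade_ts", F.to_timestamp("trade_ts"))
       .filter(F.col("notional") > 0)
)

# Load: append to a Delta table for downstream risk/compliance consumers
clean.write.format("delta").mode("append").saveAsTable("markets.trades_clean")
```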
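Likewise, a minimal sketch of the real-time streaming side, assuming the confluent-kafka client; the broker address, consumer group, and topic name are placeholders:

```python
# Minimal Kafka consumer sketch for a low-latency trade-event feed.
# Broker, group id, and topic are illustrative assumptions only.
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # placeholder broker
    "group.id": "risk-feed-consumers",      # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["trade-events"])         # hypothetical topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        event = json.loads(msg.value().decode("utf-8"))
        # Downstream handling (enrichment, risk checks) would go here
        print(event)
finally:
    consumer.close()
```

In practice, a production feed would add schema validation, dead-letter handling for bad messages, and the monitoring hooks the responsibilities call for.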