Sr. Python Developer | W2 Only

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. Python Developer in Iselin, NJ, requiring 8+ years of experience in data engineering within financial services. Key skills include Python, Databricks, Kafka, and cloud platforms. The contract type is W2, with 3 days per week onsite.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
September 10, 2025
πŸ•’ - Project duration
Unknown
-
🏝️ - Location type
On-site
-
πŸ“„ - Contract type
W2 Contractor
-
πŸ”’ - Security clearance
Unknown
-
πŸ“ - Location detailed
Iselin, NJ
-
🧠 - Skills detailed
#Compliance #MongoDB #Spark SQL #Kafka (Apache Kafka) #AWS (Amazon Web Services) #Data Pipeline #Python #ETL (Extract, Transform, Load) #PySpark #Data Quality #Cloud #Azure #Hadoop #Apache Kafka #SQL (Structured Query Language) #NoSQL #Databases #Distributed Computing #GIT #GCP (Google Cloud Platform) #Agile #Databricks #Spark (Apache Spark) #Scala #Big Data #Data Engineering
Role description
Job Title: Sr. Python Developer
Location: Iselin, NJ (3 days onsite)
Experience: 8+ years

Job Summary:
We are seeking a highly skilled Sr. Python Developer with a strong background in data engineering, real-time streaming, and cloud platforms to join our team in Iselin, NJ. The ideal candidate will have hands-on experience working with Databricks, Kafka, and distributed data systems in a financial services environment.

Key Responsibilities:
• Design, build, and optimize large-scale ETL/ELT data pipelines using Python, PySpark, and Databricks.
• Develop and maintain real-time data streaming solutions leveraging Apache Kafka.
• Collaborate closely with traders, quants, and risk teams to deliver timely and accurate data solutions.
• Tune workflows for maximum performance, scalability, and reliability.
• Enforce data quality, compliance, and governance standards across all pipelines.

Required Skills:
• 8+ years of hands-on Python development experience in enterprise environments
• Strong expertise in Databricks and cloud data platforms (AWS, Azure, or GCP)
• Advanced knowledge of Apache Kafka for real-time streaming applications
• Solid experience with relational (SQL) and NoSQL databases (e.g., Cassandra, MongoDB)
• Good understanding of investment banking, trading, or risk management systems
• Familiarity with big data technologies and distributed computing frameworks
• Proficiency with Git, Agile development practices, and CI/CD pipelines

Preferred Qualifications:
• Experience with Spark SQL, Apache Hadoop, and Apache Flink
• Cloud certifications (AWS, Azure, or GCP)
• Prior experience in a regulated financial services or capital markets environment
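To illustrate the kind of ETL work the responsibilities above describe, here is a minimal, hypothetical sketch of an extract/transform/load step in plain Python. A production pipeline at this scale would use PySpark on Databricks; the record shape (`trade_id`, `notional`) and the data-quality rule are illustrative assumptions, not requirements from the posting.

```python
# Hypothetical ETL sketch: parse raw rows, enforce a basic
# data-quality rule, and load validated records into a sink.

def extract(raw_rows):
    """Parse CSV-style rows into dicts (Extract)."""
    header = raw_rows[0].split(",")
    return [dict(zip(header, row.split(","))) for row in raw_rows[1:]]

def transform(records):
    """Type-cast and apply a simple quality filter (Transform)."""
    out = []
    for r in records:
        try:
            r["notional"] = float(r["notional"])
        except (KeyError, ValueError):
            continue  # drop malformed rows
        if r["notional"] > 0:  # illustrative quality rule
            out.append(r)
    return out

def load(records, sink):
    """Append validated records to a sink and report the count (Load)."""
    sink.extend(records)
    return len(records)

raw = ["trade_id,notional", "T1,1000.5", "T2,-50", "T3,abc"]
sink = []
loaded = load(transform(extract(raw)), sink)
print(loaded)  # only T1 survives the filter → 1
```

In PySpark the same shape would typically be a `spark.read` source, a chain of DataFrame transformations, and a `write` to a Delta table, with Kafka as the streaming source for real-time feeds.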