Integris Group

Senior Python Data Engineer – 1 to 2 Year Contract | Hybrid in Iselin, NJ!

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Python Data Engineer on a 1 to 2-year hybrid contract in Iselin, NJ, requiring 5+ years of data engineering experience, strong Python skills, and familiarity with cybersecurity data formats. Must be a Green Card holder or US citizen.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
🗓️ - Date
May 13, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte Metro
-
🧠 - Skills detailed
#Libraries #Python #Pandas #Data Modeling #Security #Data Science #Data Pipeline #SQLAlchemy #Spark (Apache Spark) #Scala #Kafka (Apache Kafka) #Splunk #ETL (Extract, Transform, Load) #Cybersecurity #Compliance #GCP (Google Cloud Platform) #Cloud #Computer Science #Apache Airflow #AWS (Amazon Web Services) #Anomaly Detection #Data Security #Azure #PySpark #Airflow #Data Engineering
Role description
Integris Group is currently partnering with a leading financial services organization in Charlotte, NC. Our client has an immediate need for a Senior Python Data Engineer to join their team on a multi-year, long-term contract. This is a HYBRID contract position, and candidates MUST be able to work onsite 2–3 days per week in Iselin, NJ.

THIS POSITION IS ONLY OPEN TO GREEN CARD HOLDERS AND US CITIZENS

Job Summary:
We're looking for a Senior Python Data Engineer with strong Python expertise to join a high-impact eCrime Defense (Cybersecurity) team focused on detecting, investigating, and preventing electronic crimes. This position sits at the intersection of data engineering and cybersecurity, partnering closely with threat hunters, analysts, and data scientists to deliver real-time, actionable intelligence.

What You'll Do:
• Design, build, and maintain scalable Python data pipelines that process security data (logs, APIs, threat feeds)
• Develop robust ETL workflows supporting threat intelligence, digital forensics, and incident response
• Integrate and enrich external threat intelligence data to enhance detection capabilities
• Collaborate with cybersecurity teams to translate investigative needs into data solutions
• Design and optimize data models for high-performance querying in cloud environments
• Implement data validation, quality checks, and anomaly detection processes
• Monitor pipeline performance and troubleshoot issues in real time
• Ensure data security, integrity, and compliance with regulatory standards

Required Qualifications:
• 5+ years of data engineering experience with strong hands-on Python development
• Proven experience building data pipelines and ETL workflows
• Expertise with Python libraries such as Pandas, PySpark, Requests, and SQLAlchemy
• Experience with cloud platforms (AWS, Azure, or GCP)
• Strong understanding of data modeling, validation, and governance
• Familiarity with cybersecurity data formats (e.g., STIX/TAXII, Syslog, NetFlow)
• Bachelor's or Master's degree in Computer Science, Data Engineering, Cybersecurity, or a related field

Preferred Skills:
• Experience with Apache Airflow, Kafka, or Spark (orchestration and streaming)
• Exposure to SIEM platforms (e.g., Splunk, Sentinel)
• Understanding of eCrime / cyber threat TTPs
• Relevant certifications (AWS, GCP, or GIAC) are a plus