Integris Group

Senior Python Data Engineer – Multi-Year Contract | Hybrid in Charlotte, NC

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Senior Python Data Engineer on a multi-year contract, hybrid in Charlotte, NC. Requires 5+ years in Data Engineering, strong Python skills, and experience with cloud platforms. Familiarity with cybersecurity data formats is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
🗓️ - Date
April 24, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte Metro
-
🧠 - Skills detailed
#Data Pipeline #Data Security #AWS (Amazon Web Services) #Computer Science #Cybersecurity #Data Engineering #Data Modeling #Compliance #PySpark #ETL (Extract, Transform, Load) #Splunk #GCP (Google Cloud Platform) #Libraries #Scala #Azure #SQLAlchemy #Apache Airflow #Airflow #Python #Security #Pandas #Cloud #Data Science #Spark (Apache Spark) #Kafka (Apache Kafka) #Anomaly Detection
Role description
Integris Group is currently partnering with a leading financial services organization in Charlotte, NC. Our client has an immediate need for a Senior Python Data Engineer to join their team on a multi-year, long-term contract. This is a hybrid opportunity, and candidates must be able to work onsite 2–3 days per week in Charlotte, NC.

Job Summary:
We’re looking for a Senior Data Engineer with strong Python expertise to join a high-impact eCrime Defense team (Cybersecurity), focused on detecting, investigating, and preventing electronic crimes. This position sits at the intersection of data engineering and cybersecurity, partnering closely with threat hunters, analysts, and data scientists to deliver real-time, actionable intelligence.

What You’ll Do
• Design, build, and maintain scalable data pipelines using Python to process security data (logs, APIs, threat feeds)
• Develop robust ETL workflows supporting threat intelligence, digital forensics, and incident response
• Integrate and enrich external threat intelligence data to enhance detection capabilities
• Collaborate with cybersecurity teams to translate investigative needs into data solutions
• Design and optimize data models for high-performance querying in cloud environments
• Implement data validation, quality checks, and anomaly detection processes
• Monitor pipeline performance and troubleshoot issues in real time
• Ensure data security, integrity, and compliance with regulatory standards

Required Qualifications:
• 5+ years of experience in Data Engineering, with strong hands-on Python development
• Proven experience building data pipelines and ETL workflows
• Expertise with Python libraries such as Pandas, PySpark, Requests, and SQLAlchemy
• Experience with cloud platforms (AWS, Azure, or GCP)
• Strong understanding of data modeling, validation, and governance
• Familiarity with cybersecurity data formats (e.g., STIX/TAXII, Syslog, NetFlow)
• Bachelor’s or Master’s degree in Computer Science, Data Engineering, Cybersecurity, or a related field

Preferred Skills:
• Experience with Apache Airflow, Kafka, or Spark (orchestration & streaming)
• Exposure to SIEM platforms (e.g., Splunk, Sentinel)
• Understanding of eCrime / cyber threat TTPs
• Relevant certifications (AWS, GCP, or GIAC) are a plus