

New York Technology Partners
Big Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Big Data Engineer in St. Louis, MO, on a W2 contract. Key skills include Spark, Scala, PySpark, Python, and AI agent knowledge. Experience with AWS, Databricks, and real-time data pipelines is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 25, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
St. Louis County, MO
-
🧠 - Skills detailed
#Cloudera #Hadoop #NiFi (Apache NiFi) #Scala #Java #Big Data #Data Ingestion #Data Engineering #Python #AWS (Amazon Web Services) #PySpark #AI (Artificial Intelligence) #Impala #Monitoring #SQL (Structured Query Language) #Kafka (Apache Kafka) #Splunk #Cloud #Databricks #Spark (Apache Spark)
Role description
Title: Big Data Engineer
Location: St. Louis, MO (onsite)
Contract Type: W2 Contract Only
Must-Have Skills: Spark, Scala, PySpark, and Python are mandatory.
Role Overview:
Team Focus: Supports the critical data ingestion, streaming, and model monitoring platforms that enable real-time decisioning and fraud-risk capabilities across Mastercard. The team contributes to enhancements and modernization. Its current priorities, in order, are model monitoring, modernization of applications and legacy systems, and leveraging AI tools.
High-level overview: An engineer with experience in Scala, Python, or Java, plus Big Data and AWS/Databricks. AI agent knowledge is needed to build agents within the team that automate its processes.
Core skills:
• Data Engineering / Streaming Platforms (Kafka, real-time pipelines using Scala/Python, Apache NiFi)
• Big Data Platforms (Spark, Hadoop/Ozone, Hive or Impala, AWS, Databricks, Cloudera Manager)
• Production Support & Platform Reliability (monitoring with Splunk, troubleshooting, performance tuning)
• Strong SQL
• AI agent knowledge
• Team schedule: Onsite Monday, Tuesday, and Friday
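To give a flavor of the real-time work described above, here is a minimal, stdlib-only Python sketch of the kind of sliding-window aggregation a fraud-decisioning streaming pipeline performs (counting recent events per card). In production this logic would live in a Kafka/Spark Structured Streaming job; the class and key names here are purely illustrative, not part of the actual Mastercard stack.

```python
from collections import deque

class SlidingWindowCounter:
    """Count events per key within a fixed time window (in seconds).

    Illustrative sketch only: a streaming engine such as Spark
    expresses this as a windowed groupBy, not hand-rolled state.
    """

    def __init__(self, window_seconds):
        self.window = window_seconds
        self.events = {}  # key -> deque of event timestamps (ascending)

    def add(self, key, ts):
        """Record an event at time ts and return the in-window count."""
        q = self.events.setdefault(key, deque())
        q.append(ts)
        # Evict timestamps strictly older than the window boundary.
        while q and q[0] < ts - self.window:
            q.popleft()
        return len(q)

# Usage: three transactions on a hypothetical card over 90 seconds.
counter = SlidingWindowCounter(window_seconds=60)
counter.add("card-123", ts=0)
counter.add("card-123", ts=30)
count = counter.add("card-123", ts=90)  # the ts=0 event has aged out
print(count)  # 2
```

A rule engine downstream could compare this count against a per-card threshold to flag bursts of activity, which is the shape of "real-time decisioning" the role description refers to.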
