

Vista Applied Solutions Group Inc
Sr. Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Data Engineer in Charlotte, NC, on a hybrid basis. Contract length and pay rate are unspecified. Requires a minimum of 4 years of Spark and 3 years each of Kafka, GCP, and Airflow, plus BigQuery experience. US citizenship or Green Card holder status required; interviews are face to face.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 10, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#SQL (Structured Query Language) #Scrum #Data Security #Data Engineering #Security #Data Pipeline #GCP (Google Cloud Platform) #Microservices #Spark (Apache Spark) #Kafka (Apache Kafka) #Cloud #Airflow #Java #Jira #Stories #Python
Role description
HYBRID – Charlotte, NC
Candidates must be US Citizens or Green Card holders
Face-to-face interview required
Below are the must-have, non-negotiable required skills:
• Spark framework – 4 years minimum
• Kafka – 3 years
• GCP – 3 years
• Airflow – 3 years
• BigQuery
Preferred Skills:
• Java – 3 years
• Python – 3 years
• Microservices – 2 years
• SQL – 4 years
Day-to-Day Responsibilities
• Attend Scrum calls to provide daily status updates
• Pick up assigned JIRA stories and work with Analysts or Product Owners to understand requirements
• Develop, maintain, and enhance Data Engineering solutions of varying complexity across data sources such as DBMSs and file systems (structured and unstructured) on on-prem and cloud infrastructure; create level metrics and other complex metrics
• Demonstrate Data Engineering skills (Spark DataFrames, BigQuery) by writing pipelines for business requirements
• Build, test, and enhance data pipeline solutions from a wide variety of sources such as Kafka streaming, Google BigQuery, and file systems; develop solutions with optimized data performance and data security
• Take ownership of testing each feature end to end
