

Strategic Staffing Solutions
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a contract length of "unknown," offering a pay rate of "$XX/hour." Key skills include SQL, Python or Java, ETL tools, and big data technologies. Experience with cloud platforms and Agile methodology is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
640
-
🗓️ - Date
April 10, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte Metro
-
🧠 - Skills detailed
#Apache Spark #Java #Microsoft Azure #Cloud #Azure #Agile #Spark (Apache Spark) #GCP (Google Cloud Platform) #Data Pipeline #ETL (Extract, Transform, Load) #Data Engineering #Automation #Big Data #Hadoop #S3 (Amazon Simple Storage Service) #Kafka (Apache Kafka) #Web Services #AI (Artificial Intelligence) #NoSQL #SQL (Structured Query Language) #AWS (Amazon Web Services) #Python
Role description
• Data Engineering experience
• Proficiency in database design, SQL, and PL/SQL
• Experience with Python or Java
• Experience with ETL tools and data pipeline frameworks
• Experience with Parquet files in S3 buckets
• Familiarity with relational and NoSQL databases
• Familiarity with big data technologies such as Apache Spark, Hadoop, or Kafka
• Performance Tuning and Automation: Hands-on experience with performance tuning, including report automation
• Agile Methodology: Experience working on Agile projects, including participation in backlog grooming, sprint planning, and daily stand-up meetings
• Experience with Artificial Intelligence
• Experience with cloud platforms such as Amazon Web Services (AWS), Google Cloud Platform (GCP), or Microsoft Azure






