Wells Fargo

Cloud Data Engineer (contract)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Cloud Data Engineer in Charlotte, NC, with a contract duration of 100 weeks and an unspecified pay rate. Key skills include data engineering, SQL, Python or Java, ETL tools, and cloud platforms such as AWS or GCP.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 16, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#Web Services #Compliance #Cloud #Agile #Spark (Apache Spark) #Microsoft Azure #SQL (Structured Query Language) #Kafka (Apache Kafka) #NoSQL #Data Pipeline #GCP (Google Cloud Platform) #Hadoop #Java #Big Data #Data Engineering #Azure #Automation #AWS (Amazon Web Services) #S3 (Amazon Simple Storage Service) #Python #Apache Spark #ETL (Extract, Transform, Load) #AI (Artificial Intelligence) #Consulting
Role description
Title: Cloud Data Engineer
Location: Charlotte, NC
Duration: 100 W, 1 D
Work Engagement: W2
Work Schedule: Onsite, Full Time

Benefits on offer for this contract position: Health Insurance, Life Insurance, 401K, and Voluntary Benefits

Summary:
In this contingent resource assignment, you may:
• Consult on complex initiatives with broad impact and large-scale planning for Software Engineering.
• Review and analyze complex, multi-faceted, larger-scale, or longer-term Software Engineering challenges that require in-depth evaluation of multiple factors, including intangibles or unprecedented factors.
• Contribute to the resolution of complex and multi-faceted situations requiring a solid understanding of the function, policies, procedures, and compliance requirements to meet deliverables.
• Strategically collaborate and consult with client personnel.

Required Qualifications:
• Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work or consulting experience, training, military experience, education.

Key Requirements:
• Applicants must be authorized to work for ANY employer in the U.S. This position is not eligible for visa sponsorship.
• Data Engineering experience
• Proficiency in databases, SQL, and PL/SQL
• Experience with Python or Java
• Experience with ETL tools and data pipeline frameworks
• Experience with Parquet files in S3 buckets
• Familiarity with relational and NoSQL databases
• Familiarity with big data technologies such as Apache Spark, Hadoop, or Kafka
• Performance tuning and automation: hands-on experience in performance tuning, including any report automations involved
• Agile methodology: experience working on Agile projects, including participation in backlog grooming, sprint planning, and daily stand-up meetings
• Experience with Artificial Intelligence
• Experience with cloud platforms such as Amazon Web Services (AWS), Google Cloud Platform (GCP), or Microsoft Azure