

Transcend Staffing Solutions LLC, a 100% Women-Owned Minority Staffing Firm
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a 6+ month contract, based in Charlotte, NC. Key skills include PySpark, SQL, Python, and GCP. Financial company experience is required. Work involves data pipeline development and collaboration with business teams.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
376
-
🗓️ - Date
May 14, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#BigQuery #Storage #Stories #Spark SQL #Cloud #Data Pipeline #Spark (Apache Spark) #GCP (Google Cloud Platform) #Programming #PySpark #Indexing #Compliance #Python #ETL (Extract, Transform, Load) #Database Design #SQL (Structured Query Language) #Data Modeling #Big Data #Scala #Security #Data Processing #Dataflow #Data Engineering
Role description
Data Engineer (3 days onsite/week)
6+ months contract
Location: 2151 Hawkins Street, Charlotte, NC
In-person interview; Glider test required
Note: Financial company experience required.
Key skills: PySpark, SQL, Python, and GCP
The Data Engineer II position is on a dynamic and growing team within the client's Global Risk & Compliance Technology (GRCT) organization.
• Evaluates data requirements and stories, documents them for seamless integration into existing architectures, and maintains data models to support business needs.
• Builds and enhances data pipelines and database designs to meet performance, scalability, and security requirements.
• Collaborates in design reviews and testing, and provides production environment support with guidance from peers and leaders.
• Operates data assets according to consistent standards, guidelines, and policies.
• Completes work reviews and fosters a collaborative learning environment.
• Communicates and collaborates with business and product teams to facilitate changes and implementation.
• Completes Big Data requirements by implementing basic partitioning and indexing solutions.
• Collaborates and co-creates effectively with teams in product and the business to align technology initiatives with business objectives.
• GCP Expertise: Hands-on experience with GCP services like BigQuery, Dataflow, Cloud Storage, Pub/Sub, Cloud Composer, and Cloud Functions.
• Programming Languages: Strong proficiency in Python and SQL for data processing and manipulation.
• ETL/ELT: Experience with ETL/ELT processes and data warehousing concepts.
• Data Modeling: Understanding of data modeling principles and techniques.
• Cloud Technologies: Knowledge of cloud computing concepts and standard processes.
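As a point of reference for candidates, the partitioning work described above often means preparing records for a date-partitioned warehouse table. A minimal sketch in plain Python follows (no Spark or GCP dependencies; the field names `event_time` and `amount` are illustrative assumptions, not the client's actual schema):

```python
from collections import defaultdict
from datetime import datetime

def partition_key(record):
    """Derive a daily partition key (YYYYMMDD) from a record's event timestamp.
    Date-based partitioning like this lets query engines such as BigQuery
    prune partitions instead of scanning the whole table."""
    ts = datetime.fromisoformat(record["event_time"])
    return ts.strftime("%Y%m%d")

def partition_records(records):
    """Group cleaned records by partition key -- a toy stand-in for writing
    each group to its own date partition during a pipeline load step."""
    partitions = defaultdict(list)
    for rec in records:
        # Transform step: drop records missing required fields.
        if "event_time" not in rec or "amount" not in rec:
            continue
        partitions[partition_key(rec)].append(rec)
    return dict(partitions)

if __name__ == "__main__":
    rows = [
        {"event_time": "2026-05-14T09:30:00", "amount": 120.0},
        {"event_time": "2026-05-14T17:45:00", "amount": 75.5},
        {"event_time": "2026-05-15T08:00:00", "amount": 10.0},
        {"event_time": "2026-05-15T08:05:00"},  # missing "amount": filtered out
    ]
    parts = partition_records(rows)
    print(sorted(parts))           # ['20260514', '20260515']
    print(len(parts["20260514"]))  # 2
```

In a real PySpark/BigQuery pipeline the same idea appears as a partition column on the target table rather than an in-memory grouping.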






