CBTS

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a long-term contract for a Senior Data Engineer, paying "X" per hour, located in a hybrid setting (3 days in-office). Key skills include Quantexa, Scala, GCP, and ETL development, with 6-9 years of relevant experience required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 25, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York City Metropolitan Area
-
🧠 - Skills detailed
#Cloud #Data Integration #Data Architecture #Data Modeling #Deployment #Data Quality #Data Pipeline #Datasets #ETL (Extract, Transform, Load) #Security #Scala #GCP (Google Cloud Platform) #Data Processing #Batch #Compliance #Data Engineering
Role description
CBTS is one of the world's leading IT Solutions Providers, with an exceptional track record of delivering results to its clients. Backed by over 27 years of experience, time-tested business acumen, and a vendor-neutral approach that ensures unbiased solution design, we "Design, Build, and Operate" complex, best-of-breed data and data center solutions for highly recognizable customers.
Role Overview
This is a long-term, hybrid contract role (3 days in the office). Our client is seeking an experienced Senior Data Engineer with strong hands-on expertise in Quantexa, Scala, Google Cloud Platform (GCP), and ETL development. The role focuses on building and configuring batch and real-time entity resolution solutions that support large-scale data processing and analytics initiatives. The ideal candidate has practical experience developing, configuring, and deploying Quantexa modules, and works closely with data engineering and architecture teams to deliver high-quality, scalable solutions.
Key Responsibilities
• Design, develop, and configure Quantexa modules for batch and real-time entity resolution
• Build and maintain scalable data pipelines using Scala and ETL frameworks (a minimal sketch of this kind of pipeline follows below)
• Develop and optimize data processing solutions on Google Cloud Platform
• Collaborate with data architects, engineers, and business stakeholders to translate requirements into technical solutions
• Ensure performance, data quality, and reliability across large-scale datasets
• Support deployment, testing, and troubleshooting of Quantexa-based solutions
• Follow best practices for data engineering, security, and compliance in enterprise environments
Required Qualifications
• 6 to 9 years of overall experience in data engineering or related roles
• Strong hands-on experience with Quantexa, including batch and real-time entity resolution modules
• Proficiency in Scala for data processing and application development
• Experience building ETL pipelines and large-scale data integrations
• Hands-on experience with Google Cloud Platform (GCP)
• Strong understanding of data modeling, data quality, and performance optimization
Preferred Qualifications
• Quantexa certification
• Experience working in large enterprise or financial services environments
• Familiarity with real-time streaming and event-driven architectures
• Strong communication skills and ability to work cross-functionally
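For context on the kind of batch ETL work described above, the sketch below shows a minimal Scala pipeline. It assumes Apache Spark as the ETL framework and GCS paths for input and output (neither is named in the posting), and the bucket, file paths, and column names are hypothetical placeholders. The "entity resolution" step is deliberately reduced to simple normalization and deduplication; it does not use or represent Quantexa's APIs.

```scala
// Minimal batch ETL sketch in Scala using Apache Spark (an assumption; the
// posting only names Scala, ETL, and GCP). Paths and column names are
// hypothetical, and the dedup step is a stand-in for real entity resolution.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, lower, trim}

object CustomerBatchEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("customer-batch-etl")
      .getOrCreate()

    // Extract: raw customer records from a GCS bucket (hypothetical path).
    val raw = spark.read
      .option("header", "true")
      .csv("gs://example-bucket/raw/customers/*.csv")

    // Transform: normalize key fields, then collapse obvious duplicates.
    val cleaned = raw
      .withColumn("email", lower(trim(col("email"))))
      .withColumn("name", trim(col("name")))
      .dropDuplicates(Seq("email"))

    // Load: write curated output back to GCS as Parquet for downstream use.
    cleaned.write
      .mode("overwrite")
      .parquet("gs://example-bucket/curated/customers/")

    spark.stop()
  }
}
```

In practice, a production pipeline of this type would add schema enforcement, data-quality checks, and job orchestration, and the matching logic would be handled by the dedicated entity resolution tooling the role calls for rather than a single deduplication step.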