

Nasscomm
Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer, remote for 3.5 months; the pay rate is not disclosed. Key skills include Snowflake, SQL, Azure Data Factory, and API integration. Finance industry experience is preferred.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: April 15, 2026
Duration: 3 to 6 months
Location: Remote
Contract: Unknown
Security: Unknown
Location detailed: United States
Skills detailed: #Data Engineering #Kafka (Apache Kafka) #Automation #Data Layers #Snowflake #Unit Testing #Azure #ETL (Extract, Transform, Load) #ADF (Azure Data Factory) #Deployment #SQL (Structured Query Language) #ADLS (Azure Data Lake Storage) #Azure Data Factory #Data Pipeline #UAT (User Acceptance Testing) #API (Application Programming Interface)
Role description
Role: Sr. Data Engineer
Location: Remote
Duration: 3.5 Months
Scope / Role Summary: The Data Engineer will serve as a hands-on delivery resource responsible for designing, building, testing, and promoting data pipelines and data products that support Cambridge's Snowflake-based data platform. This role will work across ingestion, transformation, API integration, delta processing, and production support to deliver prioritized use cases for the Cambridge Data Program.
Responsibilities
• Build and maintain source-to-target pipelines using Azure Data Factory, ADLS, and Snowflake to support bronze, silver, and gold data layers (see the sketch after this list).
• Develop and enhance Snowflake objects and transformation logic, including tables, views, stages, and stored procedures for client, account, compensation, and other priority use cases.
• Support API and integration delivery, including API payload updates, APIM-related work, Azure Functions connectivity, and test automation.
• Implement and support Kafka / delta / CDC processing for ongoing client and account data movement and near-real-time integration patterns.
• Execute testing, validation, and promotion activities across dev, UAT, and production, including QA support, unit testing, and deployment readiness.
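To make the layering and delta-processing responsibilities above concrete, the following is a minimal sketch of a bronze-to-silver promotion in Snowflake using the snowflake-connector-python library. All database, schema, stage, table, and column names (RAW_DB, CURATED_DB, ADLS_LANDING_STAGE, CLIENT_BRONZE, CLIENT_SILVER, client_id, updated_at) are hypothetical placeholders, not details from this posting; an actual implementation would follow the Cambridge Data Program's own naming, orchestration, and credential management.

import os

import snowflake.connector


def promote_client_bronze_to_silver() -> None:
    # Hypothetical connection settings; credentials are read from the environment.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",
        role="DATA_ENGINEER",
    )
    try:
        cur = conn.cursor()

        # Bronze: load the latest ADF-landed files from an external (ADLS) stage.
        cur.execute("""
            COPY INTO RAW_DB.BRONZE.CLIENT_BRONZE
            FROM @RAW_DB.BRONZE.ADLS_LANDING_STAGE/client/
            FILE_FORMAT = (TYPE = PARQUET)
            MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
        """)

        # Silver: apply delta/CDC-style processing with a MERGE so re-runs are idempotent,
        # keeping only the newest record per client_id from the bronze layer.
        cur.execute("""
            MERGE INTO CURATED_DB.SILVER.CLIENT_SILVER AS tgt
            USING (
                SELECT client_id, client_name, updated_at
                FROM RAW_DB.BRONZE.CLIENT_BRONZE
                QUALIFY ROW_NUMBER() OVER (PARTITION BY client_id ORDER BY updated_at DESC) = 1
            ) AS src
            ON tgt.client_id = src.client_id
            WHEN MATCHED AND src.updated_at > tgt.updated_at THEN
                UPDATE SET client_name = src.client_name, updated_at = src.updated_at
            WHEN NOT MATCHED THEN
                INSERT (client_id, client_name, updated_at)
                VALUES (src.client_id, src.client_name, src.updated_at)
        """)
    finally:
        conn.close()


if __name__ == "__main__":
    promote_client_bronze_to_silver()

In practice this logic might instead live in Snowflake stored procedures or tasks triggered from an Azure Data Factory pipeline; the Python wrapper here simply keeps the example self-contained.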
Skills - Must Have (Ideal Candidate Profile)
• Strong experience with Snowflake, SQL, ADF, ADLS, APIs, and modern data pipeline delivery.
• Comfortable acting as a hands-on builder who can move from design to implementation to testing and deployment.
Skills - Nice to Have
• Finance company experience





