

Nasscomm
Data Engineer (W2 Candidate Only)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with 6+ years of experience, focusing on data architecture for Wallet, Payments, and Commerce products. The contract runs 9+ months at an undisclosed pay rate. The position is hybrid (3 days/week onsite) in Cupertino, CA; New York City, NY; or Austin, TX. Key skills include SQL, Python, Spark, and cloud platforms such as AWS or Azure.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 17, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, NY
-
🧠 - Skills detailed
#Scala #Data Modeling #Snowflake #SQL (Structured Query Language) #AWS (Amazon Web Services) #Data Architecture #Airflow #Cloud #ETL (Extract, Transform, Load) #ML (Machine Learning) #Batch #GCP (Google Cloud Platform) #Azure #Dimensional Data Models #Java #Compliance #Data Pipeline #Data Governance #Kafka (Apache Kafka) #Spark (Apache Spark) #Tableau #Python #Data Engineering #Databricks
Role description
Role: Data Engineer (W2 Candidate only)
Duration: 9+ Months
Location: Cupertino, CA; New York City, NY; Austin, TX (Hybrid – 3 days/week onsite)
Job Description:
We are seeking an experienced Data Engineer to design and scale modern data architectures supporting Wallet, Payments, and Commerce products. The ideal candidate will build high-performance batch and near real-time data pipelines that enable advanced analytics and machine learning use cases. This role requires strong expertise in data modeling, scalable data systems, and modern data platforms.
Key Responsibilities:
• Design and implement scalable batch and streaming data pipelines.
• Build and optimize ETL/ELT workflows for performance, reliability, and cost efficiency.
• Develop dimensional data models and standardize key business metrics.
• Capture behavioral and transactional data by instrumenting APIs and user journeys.
• Ensure data governance, quality, privacy, and compliance across platforms.
• Maintain reliability and availability of mission-critical data systems.
Required Qualifications:
• 6+ years of experience in data engineering supporting analytics or ML systems.
• Strong SQL skills and proficiency in Python, Scala, or Java.
• Experience with Spark, Kafka, and Airflow or similar tools.
• Knowledge of lakehouse architectures such as Iceberg.
• Experience with cloud platforms (AWS, Azure, or GCP) and tools like Snowflake, Databricks, or Tableau.
• Exposure to MLOps, GenAI/RAG pipelines, and FinTech or Payments domain is preferred.