

Novia Infotech
Data Engineer With Banking
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with banking experience, offering a contract length of "X months" at a pay rate of "$X/hour". Key skills include Python, PySpark, Apache Airflow, and advanced SQL. Strong ETL and data warehousing knowledge required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 9, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York City Metropolitan Area
-
🧠 - Skills detailed
#C++ #SQL Server #Big Data #Data Management #Data Modeling #Programming #Qlik #Java #Agile #MDM (Master Data Management) #Databases #SQL (Structured Query Language) #Data Processing #Tableau #SSIS (SQL Server Integration Services) #Triggers #Apache Spark #Airflow #Scala #ETL (Extract, Transform, Load) #Python #Spark (Apache Spark) #JSON (JavaScript Object Notation) #RDBMS (Relational Database Management System) #Data Engineering #Data Lake #Microsoft Power BI #MS SQL (Microsoft SQL Server) #Oracle #REST (Representational State Transfer) #PySpark #Apache Airflow #BI (Business Intelligence) #SQL Queries #Data Pipeline #Cloud
Role description
Role Overview
We are looking for a highly skilled Data Engineer to join an Agile, SDLC-based delivery team. The role focuses on building scalable, reliable, and reusable data pipelines and delivering data as a product/service to support analytics and business needs.
Key Responsibilities
• Contribute as an individual contributor in an Agile SDLC environment.
• Design and implement end-to-end data engineering solutions aligned with data-as-a-product and data-as-a-service principles.
• Develop ETL pipelines from scratch using Python and big data technologies.
• Build, manage, and optimize Airflow DAGs for orchestration and scheduling.
• Develop and optimize Spark / PySpark jobs for large-scale data processing.
• Design and implement data models, schemas, and entity relationships.
• Write and optimize complex SQL queries across multiple RDBMS platforms.
• Develop and maintain stored procedures, triggers, and materialized views.
• Integrate data across systems using REST, SOAP, ETL, and SSIS.
• Develop dashboards and reports using BI tools.
• Write efficient, reusable, and maintainable code following best practices.
• Collaborate with cross-functional teams and continuously learn new cloud-based tools and applications.
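The ETL responsibilities above can be sketched as a minimal extract–transform–load pipeline. The following is an illustrative sketch in plain Python using only the standard library; the function names (`extract`, `transform`, `load`) and the sample payload are invented for the example and do not describe the team's actual codebase.

```python
import json

# Hypothetical minimal ETL sketch: extract raw JSON records,
# transform them (validate and rename fields), and load them into
# a list that stands in for a warehouse table.

RAW = '[{"id": 1, "amt": "120.50"}, {"id": 2, "amt": "bad"}]'

def extract(raw: str) -> list[dict]:
    # Parse the raw JSON payload into Python records.
    return json.loads(raw)

def transform(records: list[dict]) -> list[dict]:
    # Keep only records whose amount parses as a number,
    # casting it to float along the way.
    out = []
    for r in records:
        try:
            out.append({"id": r["id"], "amount": float(r["amt"])})
        except ValueError:
            continue  # drop malformed rows
    return out

def load(rows: list[dict], table: list[dict]) -> None:
    # Append cleaned rows to the target "table".
    table.extend(rows)

warehouse: list[dict] = []
load(transform(extract(RAW)), warehouse)
print(warehouse)  # [{'id': 1, 'amount': 120.5}]
```

In a production setting the same three stages would typically run as tasks in an Airflow DAG, with Spark handling the transform step at scale.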
Required Skills & Technical Expertise
• Programming & Frameworks
• Expert in Python
• Strong experience with PySpark / Apache Spark
• Hands-on experience with Node.js and JSON
• Proficiency in at least one object-oriented or systems language: C, C++, or Java
• Data Engineering & Big Data
• Strong understanding of ETL, Big Data, and Data Warehousing
• Expertise in data modeling, data lakes, and master data management
• Experience delivering data as a product/service
• Workflow & Orchestration
• Expert-level experience with Apache Airflow (DAG development and management)
• Databases & SQL
• Deep knowledge of RDBMS concepts
• Advanced SQL expertise across SQL Server, Oracle, DB2
• Hands-on experience with stored procedures, triggers, and DML packages
• BI & Analytics
• Experience building reports and dashboards using Qlik, Tableau, or Power BI
• Strong understanding of data analytics concepts
• Integration & Tools
• Experience integrating systems via REST, SOAP, ETL, SSIS
• Willingness to learn and adapt to cloud-based business applications and tools
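As a rough illustration of the database requirements above (triggers and aggregate SQL queries), here is a sketch using Python's standard-library `sqlite3` as a stand-in for SQL Server, Oracle, or DB2; the table and column names are invented for this example.

```python
import sqlite3

# Illustrative sketch: a trigger plus an aggregate query, using
# stdlib sqlite3 in place of SQL Server / Oracle / DB2.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE txn (id INTEGER PRIMARY KEY, account TEXT, amount REAL);
CREATE TABLE audit (txn_id INTEGER, note TEXT);

-- Trigger: log every large transaction into the audit table.
CREATE TRIGGER big_txn AFTER INSERT ON txn
WHEN NEW.amount > 1000
BEGIN
    INSERT INTO audit VALUES (NEW.id, 'large transaction');
END;
""")
conn.executemany("INSERT INTO txn VALUES (?, ?, ?)",
                 [(1, "A", 500.0), (2, "A", 2500.0), (3, "B", 900.0)])

# Aggregate query: per-account totals, largest first.
totals = conn.execute(
    "SELECT account, SUM(amount) FROM txn GROUP BY account ORDER BY 2 DESC"
).fetchall()
print(totals)  # [('A', 3000.0), ('B', 900.0)]
```

The same pattern carries over to the listed platforms, where triggers and stored procedures are written in T-SQL or PL/SQL rather than SQLite's dialect.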
Skills Summary
• Apptio
• Big Data
• ETL / Data Warehousing
• MS SQL
• Oracle PL/SQL
• Stored Procedure Coding
• Python
• PySpark
• Node.js
• RDBMS
• Data Analytics