Ztek Consulting

Azure Data Engineer with Wealth Management

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an "Azure Data Engineer with Wealth Management" on a contract basis in Hamilton Township, NJ. Requires 10+ years in data engineering within financial services, expertise in Azure Databricks, Python, SQL, and Bloomberg data feeds.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
March 10, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Hamilton Township, NJ
-
🧠 - Skills detailed
#Data Quality #ADF (Azure Data Factory) #Azure Databricks #Microsoft Power BI #BI (Business Intelligence) #Cloud #Logic Apps #Storage #Azure DevOps #Python #SQL (Structured Query Language) #Batch #Visualization #Data Pipeline #DevOps #Azure cloud #AI (Artificial Intelligence) #Azure #GIT #Data Engineering #Azure Data Factory #Programming #Databricks #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Data Ingestion #Kafka (Apache Kafka) #Automation
Role description
Hi, I would like to share an excellent contract opening for an "Azure Data Engineer with Wealth Management". Please go through the details and kindly send me your updated resume.

Location: Hamilton Township, NJ (Hybrid), local candidates only
Type of Hire: Contract
Mode of Interview: WebEx / Teams

We are seeking an experienced Data Engineer with strong expertise in Databricks, Azure cloud services, and batch job support. The ideal candidate will have a background in financial markets or trading systems, with hands-on experience managing data pipelines, FTP file transfers, and real-time/batch processing in a production environment.

Key Responsibilities:
• Manage and support batch processes and data pipelines in Azure Databricks and Azure Data Factory.
• Integrate and process Bloomberg market data feeds and files into trading or analytics platforms.
• Monitor, troubleshoot, and resolve data and system issues related to trading applications and market data ingestion.
• Develop, automate, and optimize ETL pipelines using Python, Spark, and SQL.
• Manage FTP/SFTP file transfers between internal systems and external vendors.
• Ensure data quality, completeness, and timeliness for downstream trading and reporting systems.
• Collaborate with operations, application support, and infrastructure teams to resolve incidents and enhance data workflows.

Required Skills & Experience:
• 10+ years of experience in data engineering or production support within financial services or trading environments.
• Hands-on experience with Azure Databricks, Azure Data Factory, Azure Storage, Logic Apps, and Fabric.
• Strong Python and SQL programming skills.
• Experience with Bloomberg data feeds (BPIPE, TSIP, SFTP).
• Experience with Git, CI/CD pipelines, and Azure DevOps.
• Proven ability to support batch jobs, troubleshoot failures, and manage job scheduling.
• Experience handling FTP/SFTP file transfers and automation (e.g., using scripts or managed file transfer tools).
• Solid understanding of equities trading, fixed income trading, trading workflows, and financial instruments.
• Excellent communication, problem-solving, and stakeholder management skills.

Nice to Have:
• Experience with Power BI or other visualization tools.
• Familiarity with Kafka, Event Hubs, or real-time streaming architectures.
• Knowledge of ITIL / incident management best practices.
• Knowledge of building AI agents and LLMs.
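The posting does not specify the team's tooling, but the "support batch jobs, troubleshoot failures" responsibility typically involves retrying transient failures (e.g., a momentary market-data feed outage) before escalating. As a minimal illustration only, here is a generic retry-with-backoff sketch in plain Python; all function names here are hypothetical, not from the job description:

```python
import time

def run_with_retries(job, max_attempts=3, base_delay=1.0, sleep=time.sleep):
    """Run a batch-job callable, retrying transient failures with
    exponential backoff. Returns the job's result, or re-raises the
    last exception once max_attempts is exhausted."""
    for attempt in range(1, max_attempts + 1):
        try:
            return job()
        except Exception as exc:
            if attempt == max_attempts:
                raise  # out of attempts: surface the failure to the scheduler
            delay = base_delay * 2 ** (attempt - 1)  # 1s, 2s, 4s, ...
            print(f"attempt {attempt} failed ({exc}); retrying in {delay:.0f}s")
            sleep(delay)

# Hypothetical extract step that fails twice, then succeeds on the third try.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient feed outage")
    return ["row1", "row2"]

# sleep is injected so the example runs instantly (no real waiting).
rows = run_with_retries(flaky_extract, sleep=lambda _: None)
print(rows)  # ['row1', 'row2']
```

In practice, a candidate would delegate scheduling and retries to Azure Data Factory or Databricks job retry policies rather than hand-rolling them; this sketch only shows the underlying pattern.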