

Azure Data Engineer (Banking)
Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Engineer (Banking) on a contract basis in Baltimore, MD, requiring 5-7 years of ETL development experience, proficiency in Snowflake, and knowledge of Azure Data Factory. Pay rate and contract length are unspecified.
Country: United States
Currency: $ USD
Day rate: Unspecified
Date discovered: September 13, 2025
Project duration: Unknown
Location type: On-site
Contract type: Fixed Term
Security clearance: Unknown
Location detailed: Baltimore, MD
Skills detailed: #Cloud #R #Azure #Data Pipeline #Snowflake #Data Engineering #Azure Data Factory #Synapse #BigQuery #Base #Python #Automation #Data Warehouse #Data Extraction #Data Lake #Data Processing #Data Integration #Fivetran #Data Analysis #Data Transformations #ETL (Extract, Transform, Load) #ADF (Azure Data Factory) #Databases #SnowPipe #Storage
Role description
Job Title: ETL Developer
Location: Baltimore, MD (5 days onsite)
Hiring Mode: Contract
Job Description
Seeking an experienced ETL/ELT Developer for a fixed-term engagement in support of our Information Technology team. Working alongside a team, the ETL/ELT Developer will be responsible for developing integrations using industry-standard tools for data extraction, transformation, and loading (ETL) to and from core enterprise systems. The ETL Developer should specialize in designing, building, and maintaining data pipelines that move data between various sources and cloud-based data warehouses or data lakes, with a focus on ensuring data is extracted, transformed, and loaded efficiently and reliably for analytics and other downstream uses.
ETL/ELT Developer Responsibilities:
β’ Data Transformation: Develop and optimize data transformations using cloud-based tools and technologies to cleanse, enrich, aggregate, and reshape data according to business requirements.
• Data Loading: Load transformed data into cloud data warehouses (such as Snowflake, Azure Synapse, or BigQuery) or data lakes for storage and further analysis.
• Performance Optimization: Ensure efficient data processing and pipeline performance within on-premises and cloud environments by leveraging cloud-native services and optimizing resource utilization.
• Tooling: Utilize cloud-specific tools and services (e.g., Fivetran, Snowpipe, Streams and Tasks, Azure Data Factory) for ETL processes.
β’ Orchestration and Automation: Automate ETL workflows using orchestration tools or cloud-based workflow services.
β’ Data Integration: Design and implement ETL pipelines that extract data from diverse cloud and on-premises sources (databases, APIs, files, etc.).
β’ Collaborate with data analysts, systems analysts, developers and other stakeholders to understand data requirements and ensure the successful delivery of data for analytics and other business needs.
β’ Create and/or update knowledge base articles regarding procedures, workflows, user guides, process run books, etc.
• Participate in the change management process.
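By way of illustration, the extract → transform → load flow described in the responsibilities above can be sketched in Python. This is a hypothetical minimal example with made-up data; the stdlib sqlite3 module stands in for a cloud warehouse such as Snowflake, where a production pipeline would instead use a warehouse connector or an ingestion service like Snowpipe:

```python
import csv
import io
import sqlite3

# Hypothetical raw source data (stand-in for a file landed from an
# upstream system); note the whitespace and the missing balance.
RAW_CSV = """account_id,balance,branch
A-001, 1500.50 ,Baltimore
A-002,,Towson
A-003, 320.00 ,Baltimore
"""

def extract(source: str) -> list[dict]:
    """Extract: read raw rows from the CSV source."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: cleanse (strip whitespace), default missing balances
    to 0.0, and cast balances to numeric form."""
    out = []
    for row in rows:
        balance = row["balance"].strip()
        out.append({
            "account_id": row["account_id"].strip(),
            "balance": float(balance) if balance else 0.0,
            "branch": row["branch"].strip(),
        })
    return out

def load(rows: list[dict], conn: sqlite3.Connection) -> None:
    """Load: write transformed rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS accounts "
        "(account_id TEXT, balance REAL, branch TEXT)"
    )
    conn.executemany(
        "INSERT INTO accounts VALUES (:account_id, :balance, :branch)", rows
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute("SELECT COUNT(*), SUM(balance) FROM accounts").fetchone())
# → (3, 1820.5)
```

A real engagement would orchestrate these steps with a workflow service (e.g., Azure Data Factory pipelines) rather than a single script, but the extract/transform/load separation is the same.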
Job Requirements
β’ BS in Information Systems, Information Technology, or a related field is required. Work experience in lieu of degree or concentration is acceptable.
• Minimum of 5-7 years of similar work experience as an ETL Developer in a comparable environment and industry.
• Snowflake: intermediate-level knowledge required.
• Proficiency in Python and/or R is a plus.
β’ Excellent analytical and troubleshooting skills.
β’ Excellent organizational, teamwork and time management skills.
β’ Excellent oral and written communication skills.
β’ Ability to handle multiple parallel tasks.