

Galent
ETL Developer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL Developer in Baltimore, MD, with a contract length of "fixed term" and a pay rate of "unknown." Key requirements include 5–7 years of experience and intermediate-level Snowflake knowledge; proficiency in Python and/or R is a plus.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 16, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
📍 - Location detailed
Baltimore, MD
-
🧠 - Skills detailed
#Fivetran #Data Lake #Data Warehouse #Snowpipe #Data Integration #BigQuery #Cloud #Data Pipeline #Data Transformations #Databases #Python #R #Snowflake #Data Extraction #Synapse #Azure Data Factory #Knowledge Base #ETL (Extract, Transform, Load) #Storage #Azure #Data Analysis #Data Processing #Automation #ADF (Azure Data Factory)
Role description
Role : ETL Developer
Location : Baltimore, MD
Job Description
Client is seeking an experienced ETL/ELT Developer for a fixed-term engagement in support of our Information Technology team. Working alongside a team, the ETL/ELT Developer will be responsible for developing integrations that use industry-standard tools for data extraction, transformation, and loading (ETL) to and from core critical enterprise systems. The ETL Developer should specialize in designing, building, and maintaining data pipelines that move data between various sources and cloud-based data warehouses or data lakes, with a focus on ensuring data is extracted, transformed, and loaded efficiently and reliably for analytics and other downstream uses.
ETL/ELT Developer Responsibilities:
• Data Transformation: Develop and optimize data transformations using cloud-based tools and technologies to cleanse, enrich, aggregate, and reshape data according to business requirements.
• Data Loading: Load transformed data into cloud data warehouses (such as Snowflake, Azure Synapse, or BigQuery) or data lakes for storage and further analysis.
• Performance Optimization: Ensure efficient data processing and pipeline performance within on-premises and cloud environments by leveraging cloud-native services and optimizing resource utilization.
• Utilize cloud-specific tools and services (e.g., Fivetran, Snowpipe, Streams and Tasks, Azure Data Factory) for ETL processes.
• Orchestration and Automation: Automate ETL workflows using orchestration tools or cloud-based workflow services.
• Data Integration: Design and implement ETL pipelines that extract data from diverse cloud and on-premises sources (databases, APIs, files, etc.); a brief illustrative Python sketch of such a pipeline follows this list.
• Collaborate with data analysts, systems analysts, developers and other stakeholders to understand data requirements and ensure the successful delivery of data for analytics and other business needs.
• Create and/or update knowledge base articles regarding procedures, workflows, user guides, process run books, etc.
• Participate in the change management process.
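To ground the integration and loading duties above, the following is a minimal, illustrative Python sketch of the kind of pipeline described: it extracts records from a hypothetical REST endpoint, reshapes them with pandas, and loads them into Snowflake using the snowflake-connector-python write_pandas helper. The source URL, table name, column names, warehouse/database/schema values, and credential environment variables are all placeholders, not details of the client's actual environment.

```python
# Illustrative only: a minimal extract-transform-load script of the kind this
# role describes. Source URL, table, columns, and credentials are placeholders.
import os

import pandas as pd
import requests
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

SOURCE_URL = "https://example.com/api/orders"  # hypothetical source API
TARGET_TABLE = "ORDERS_STG"                    # hypothetical staging table (assumed to exist)


def extract() -> pd.DataFrame:
    """Pull raw records from the source system."""
    response = requests.get(SOURCE_URL, timeout=30)
    response.raise_for_status()
    return pd.DataFrame(response.json())


def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Cleanse and reshape: drop duplicates, normalize types and column names."""
    df = raw.drop_duplicates(subset=["order_id"]).copy()
    df["order_date"] = pd.to_datetime(df["order_date"])
    df["amount"] = df["amount"].astype(float)
    # Upper-case column names to match Snowflake's unquoted-identifier convention.
    df.columns = [c.upper() for c in df.columns]
    return df


def load(df: pd.DataFrame) -> None:
    """Load the transformed frame into Snowflake via write_pandas."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ETL_WH",      # placeholder warehouse
        database="ANALYTICS",    # placeholder database
        schema="STAGING",        # placeholder schema
    )
    try:
        success, _, nrows, _ = write_pandas(conn, df, TARGET_TABLE)
        print(f"Loaded {nrows} rows into {TARGET_TABLE} (success={success})")
    finally:
        conn.close()


if __name__ == "__main__":
    load(transform(extract()))
```

In practice, a script like this would be scheduled and monitored through an orchestration tool or a cloud workflow service, per the orchestration and automation responsibility above.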
Job Requirements
• BS in Information Systems, Information Technology, or a related field is required. Work experience in lieu of degree or concentration is acceptable.
• Minimum of 5–7 years of similar work experience as an ETL Developer in a comparable environment and industry.
• Snowflake: intermediate-level knowledge required.
• Proficiency in Python and/or R is a plus.
• Excellent analytical and troubleshooting skills.
• Excellent organizational, teamwork and time management skills.
• Excellent oral and written communication skills.
• Ability to handle multiple parallel tasks.