ETL Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an ETL Developer on a 6-month contract; the pay rate is not listed. Remote work is available. Key skills include ETL processes, cloud data integration (Snowflake, Azure), and proficiency in Python or R. A minimum of 5–7 years of experience is required.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
Unknown
πŸ—“οΈ - Date discovered
August 14, 2025
πŸ•’ - Project duration
Unknown
🏝️ - Location type
Unknown
πŸ“„ - Contract type
Fixed Term
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
Baltimore, MD
🧠 - Skills detailed
#Synapse #Fivetran #Automation #Data Extraction #Data Transformations #Data Warehouse #Data Pipeline #"ETL (Extract, Transform, Load)" #Python #Base #Snowflake #BigQuery #R #Storage #Data Lake #Cloud #Snowpipe #Data Processing #ADF (Azure Data Factory) #Azure Data Factory #Azure #Databases #Data Analysis #Data Integration
Role description
Job Description

Brown Advisory is seeking an experienced ETL/ELT Developer for a fixed-term engagement in support of our Information Technology team. Working alongside a team, the ETL/ELT Developer will be responsible for developing integrations that employ industry-standard tools for data extraction, transformation, and loading (ETL) to and from core critical enterprise systems. The ETL Developer should specialize in designing, building, and maintaining data pipelines that move data between various sources and cloud-based data warehouses or data lakes, and will focus on ensuring data is extracted, transformed, and loaded efficiently and reliably for analytics and other downstream systems.

ETL/ELT Developer Responsibilities:
• Data Transformation: Develop and optimize data transformations using cloud-based tools and technologies to cleanse, enrich, aggregate, and reshape data according to business requirements.
• Load transformed data into cloud data warehouses (such as Snowflake, Azure Synapse, or BigQuery) or data lakes for storage and further analysis.
• Performance Optimization: Ensure efficient data processing and pipeline performance within on-premises and cloud environments by leveraging cloud-native services and optimizing resource utilization.
• Utilize cloud-specific tools and services (e.g., Fivetran, Snowpipe, Streams and Tasks, Azure Data Factory) for ETL processes.
• Orchestration and Automation: Automate ETL workflows using orchestration tools or cloud-based workflow services.
• Data Integration: Design and implement ETL pipelines that extract data from diverse cloud and on-premises sources (databases, APIs, files, etc.); a minimal sketch of this flow follows the requirements list below.
• Collaborate with data analysts, systems analysts, developers, and other stakeholders to understand data requirements and ensure the successful delivery of data for analytics and other business needs.
• Create and/or update knowledge base articles covering procedures, workflows, user guides, process run books, etc.
• Participate in the change management process.

Job Requirements
• BS in Information Systems, Information Technology, or a related field is required. Work experience in lieu of a degree or concentration is acceptable.
• Minimum of 5–7 years of similar work experience as an ETL Developer in a comparable environment and industry.
• Snowflake: intermediate-level knowledge required.
• Proficiency in Python and/or R is a plus.
• Excellent analytical and troubleshooting skills.
• Excellent organizational, teamwork, and time management skills.
• Excellent oral and written communication skills.
• Ability to handle multiple parallel tasks.
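For illustration, here is a minimal sketch of the extract-transform-load flow the responsibilities describe, written in Python (the posting's preferred language) using the Snowflake Python connector. It assumes a REST endpoint as the source and a Snowflake staging table as the target; the endpoint URL, table name, warehouse, database, and schema names are all hypothetical placeholders, not details from the posting.

```python
"""Minimal ETL sketch: REST source -> cleanse/reshape -> Snowflake staging table.
All endpoint, table, and connection names below are hypothetical examples."""
import os

import requests
import snowflake.connector

API_URL = "https://example.com/api/records"  # hypothetical source endpoint


def extract() -> list[dict]:
    """Pull raw records from the source system."""
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    return resp.json()


def transform(rows: list[dict]) -> list[tuple]:
    """Cleanse and reshape: drop incomplete records, normalize types."""
    out = []
    for r in rows:
        if r.get("id") is None or r.get("amount") is None:
            continue  # skip records that fail basic validation
        out.append((int(r["id"]), float(r["amount"]), str(r.get("status", "")).upper()))
    return out


def load(rows: list[tuple]) -> None:
    """Insert transformed rows into a Snowflake staging table."""
    conn = snowflake.connector.connect(
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        warehouse="ETL_WH",       # hypothetical warehouse/database/schema
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        cur.executemany(
            "INSERT INTO records_stg (id, amount, status) VALUES (%s, %s, %s)",
            rows,
        )
        conn.commit()
    finally:
        conn.close()


if __name__ == "__main__":
    load(transform(extract()))
```

In practice, a pipeline like this would be scheduled and monitored by an orchestration service such as Azure Data Factory or Snowflake Tasks, as the responsibilities above note, rather than run as a standalone script.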