

ETL/ELT Developer – Cloud Data Platforms
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an ETL/ELT Developer focused on cloud data platforms; the contract length and pay rate are unspecified. It requires 5-7 years of ETL/ELT experience, strong SQL skills, and proficiency in Snowflake and Azure Data Factory.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
August 1, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Baltimore, MD
🧠 - Skills detailed
#Databases #Data Pipeline #Snowflake #SQL (Structured Query Language) #Deployment #ADF (Azure Data Factory) #R #Synapse #Fivetran #Python #Data Engineering #Scala #Azure #Azure Data Factory #ETL (Extract, Transform, Load) #Business Analysis #BigQuery #SnowPipe #Cloud #Documentation
Role description
Job Description:
We are seeking a skilled ETL/ELT Developer to join our data engineering team. This role focuses on building, optimizing, and maintaining scalable data pipelines across cloud and hybrid environments using tools such as Snowflake, Azure Data Factory, and Fivetran.
Key Responsibilities:
• Design, develop, and optimize data transformation workflows to cleanse, enrich, and aggregate data per business needs.
• Load processed data into cloud data platforms like Snowflake, Azure Synapse, or BigQuery.
• Ensure pipeline performance and resource optimization using cloud-native services.
• Automate workflows using orchestration tools (e.g., ADF, Snowpipe, Streams & Tasks).
• Integrate data from diverse sources such as APIs, on-prem systems, and cloud databases.
• Collaborate with business analysts, developers, and data teams to gather and deliver data requirements.
• Maintain documentation, including process guides, runbooks, and support materials.
• Participate in change management and deployment processes.
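The cleanse-enrich-aggregate transform described in the first responsibilities bullet can be sketched in plain Python. The record fields and values below are hypothetical (not from this posting), and in practice this logic would typically live in SQL or an orchestration tool such as ADF or Snowflake Tasks; this is only a minimal illustration of the pattern:

```python
from collections import defaultdict

# Hypothetical raw order records, as they might arrive from an upstream
# API or landing table. Field names are illustrative assumptions.
RAW_ORDERS = [
    {"order_id": "1001", "region": " East ", "amount": "250.00"},
    {"order_id": "1002", "region": "WEST", "amount": "100.50"},
    {"order_id": "1002", "region": "WEST", "amount": "100.50"},  # duplicate
    {"order_id": "1003", "region": "east", "amount": "bad"},     # unparseable
]

def cleanse(records):
    """Drop duplicate IDs and unparseable amounts; normalize region names."""
    seen, clean = set(), []
    for rec in records:
        if rec["order_id"] in seen:
            continue  # skip duplicates
        try:
            amount = float(rec["amount"])
        except ValueError:
            continue  # skip rows that fail type conversion
        seen.add(rec["order_id"])
        clean.append({
            "order_id": rec["order_id"],
            "region": rec["region"].strip().lower(),
            "amount": amount,
        })
    return clean

def aggregate(records):
    """Roll up total amount per region."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["region"]] += rec["amount"]
    return dict(totals)

clean_orders = cleanse(RAW_ORDERS)
print(aggregate(clean_orders))  # {'east': 250.0, 'west': 100.5}
```

The same shape applies regardless of tooling: deduplicate and validate on ingest, normalize keys during enrichment, then aggregate for the target platform.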
Requirements:
• Bachelor’s degree in Information Systems/Technology or equivalent experience.
• 5 to 7 years of experience in ETL/ELT development in enterprise environments.
• Strong working knowledge of Snowflake and cloud-based ETL tools (e.g., Fivetran, ADF, Snowpipe).
• Proficient in SQL; experience with Python and/or R is a plus.
• Solid troubleshooting, communication, and multitasking skills.
• Ability to work collaboratively in a fast-paced, dynamic environment.