

ETL Developer
Featured Role | Apply directly with Data Freelance Hub
This role is for an ETL Developer; the contract length and pay rate are unspecified. It requires travel to New Jersey 3-4 days a month, 4-5 years of ETL experience, proficiency in SQL, and familiarity with Snowflake and ETL tools such as Talend or Informatica.
Country: United States
Currency: $ USD
Day rate: Unknown
Date discovered: June 3, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Spark (Apache Spark) #ADF (Azure Data Factory) #Oracle #SQL Server #Big Data #AWS (Amazon Web Services) #Documentation #Cloud #Informatica #Visualization #Dataflow #Data Modeling #BI (Business Intelligence) #AWS Glue #Data Pipeline #Azure Data Factory #SSIS (SQL Server Integration Services) #ETL (Extract, Transform, Load) #Snowflake #Databases #MySQL #Data Analysis #SQL (Structured Query Language) #Data Governance #Data Integration #Talend #Data Quality #Tableau #Scala #Computer Science #Azure #Microsoft Power BI #Data Mapping #Hadoop
Role description
MUST be willing to travel to New Jersey 3-4 days a month
About Us
We are a forward-thinking technology company dedicated to delivering innovative data solutions. Our focus is on data integration, analytics, and cutting-edge business intelligence, empowering organizations to make informed, data-driven decisions. Our team is made up of passionate professionals who thrive in a collaborative, dynamic, and high-performance environment.
Job Summary
We are seeking an experienced ETL Developer to design, develop, and maintain robust data integration pipelines. The ideal candidate has hands-on experience with ETL tools such as Talend, Informatica, or SSIS, a strong command of SQL, and a solid understanding of data warehousing and transformation best practices. If you have a passion for data optimization, performance tuning, and scalable architecture, we'd love to hear from you.
Key Responsibilities
• Design, develop, and maintain ETL workflows using tools such as Talend, Informatica, SSIS, or similar.
• Extract, transform, and load data from diverse sources including APIs, databases, and flat files (a minimal pipeline sketch follows this list).
• Ensure accuracy, quality, and integrity of the data during transformation and loading.
• Work closely with data analysts, engineers, and business stakeholders to gather and interpret data requirements.
• Optimize ETL processes to ensure high performance and scalability.
• Troubleshoot and debug data pipelines, resolving issues in a timely and efficient manner.
• Maintain documentation for ETL processes, data mappings, and transformation logic.
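To make the extract-transform-load pattern above concrete, here is a minimal, self-contained sketch in plain Python. It is illustrative only: this role's actual pipelines would run in tools like Talend, Informatica, or SSIS, and the file name, table schema, and SQLite target here are all assumptions for the example.

```python
import csv
import sqlite3

SOURCE_CSV = "orders.csv"    # hypothetical flat-file source
TARGET_DB = "warehouse.db"   # hypothetical SQLite target, standing in for a warehouse

def extract(path):
    """Read raw rows from a flat file, one dict per record."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Apply basic data-quality rules: drop rows missing keys, cast types."""
    for row in rows:
        if not row.get("order_id"):   # integrity check: skip bad records
            continue
        yield (int(row["order_id"]),
               row["customer"].strip().title(),
               float(row["amount"]))

def load(records, db_path):
    """Insert cleaned records into the target table."""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS orders
                   (order_id INTEGER PRIMARY KEY, customer TEXT, amount REAL)""")
    con.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", records)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract(SOURCE_CSV)), TARGET_DB)
```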
Qualifications
• Bachelor's degree in Computer Science, Information Technology, or a related discipline.
• 4–5 years of experience in ETL development and data integration.
• Proficient in SQL with experience in databases such as Oracle, SQL Server, or MySQL.
• Strong understanding of data warehousing concepts and best practices.
• Experience with ETL tools such as Informatica, Talend, or SSIS.
• 2–3 years of experience with Snowflake data warehousing solutions (a brief Snowflake query sketch follows this list).
• Familiarity with data modeling, data governance, and data quality principles.
• Strong analytical mindset and attention to detail.
• Excellent verbal and written communication skills; ability to collaborate across teams.
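As an illustration of the SQL and Snowflake experience asked for above, the sketch below runs a typical warehousing aggregation through snowflake-connector-python, one common client library. Every connection parameter and the orders table are placeholders, not details from this role.

```python
import snowflake.connector  # pip install snowflake-connector-python

# All connection parameters are placeholder values, not real credentials.
con = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = con.cursor()
# A typical warehousing query: aggregate raw orders into a monthly reporting shape.
cur.execute("""
    SELECT customer,
           DATE_TRUNC('month', order_date) AS month,
           SUM(amount) AS revenue
    FROM orders
    GROUP BY customer, month
    ORDER BY month
""")
for customer, month, revenue in cur.fetchall():
    print(customer, month, revenue)
cur.close()
con.close()
```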
Preferred Skills
• Experience with Big Data technologies (e.g., Hadoop, Spark); see the PySpark sketch after this list.
• Knowledge of cloud-based ETL and data platforms such as AWS Glue, Azure Data Factory, or Google Cloud Dataflow.
• Hands-on experience with Snowflake for scalable data warehousing and analytics.
• Familiarity with data visualization tools such as Tableau or Power BI.
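For the preferred Spark skills, here is a brief PySpark sketch of the same extract-transform-load shape. The S3 paths and column names are invented for illustration and imply nothing about the team's actual stack.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Paths and column names below are illustrative only.
spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read flat files with a header row, letting Spark infer types.
df = spark.read.csv("s3://example-bucket/orders/*.csv", header=True, inferSchema=True)

# Transform: basic quality filter plus a monthly revenue aggregation.
monthly = (df.filter(F.col("order_id").isNotNull())
             .withColumn("month", F.date_trunc("month", F.col("order_date")))
             .groupBy("customer", "month")
             .agg(F.sum("amount").alias("revenue")))

# Load: write partitioned Parquet for downstream BI tools such as Tableau or Power BI.
monthly.write.mode("overwrite").partitionBy("month").parquet(
    "s3://example-bucket/marts/revenue/")
```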