

Realign LLC
ETL Informatica / Hadoop Developer-1
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a contract position for an ETL Informatica / Hadoop Developer located in Charlotte, NC, offering competitive pay. Key skills required include Informatica PowerCenter/IICS, Hadoop, SQL, and data integration. Experience in data warehousing and cloud platforms is essential.
🌎 - Country
United States
💱 - Currency
Unknown
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 30, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#Spark (Apache Spark) #RDBMS (Relational Database Management System) #Sqoop (Apache Sqoop) #Data Architecture #Data Ingestion #Informatica #SQL Queries #ETL (Extract, Transform, Load) #Batch #SQL (Structured Query Language) #Data Migration #Informatica PowerCenter #HDFS (Hadoop Distributed File System) #Hadoop #Migration #Spark SQL #Business Analysis #Data Pipeline #Cloud #IICS (Informatica Intelligent Cloud Services) #Agile #Data Integration #Data Quality #Documentation #HiveQL #Data Processing #Scala
Role description
Job Type: Contract
Job Category: IT
Job Title: ETL Informatica / Hadoop Developer
Location: Charlotte, NC (FTE Only)
Job Summary
We are seeking an experienced ETL Informatica / Hadoop Developer to design, develop, and maintain scalable data integration solutions across enterprise data platforms. The ideal candidate will have strong expertise in Informatica PowerCenter / IICS, Hadoop ecosystem, and data warehousing concepts, with hands-on experience building high-volume batch and real-time data pipelines.
Key Responsibilities
Design, develop, and optimize ETL workflows using Informatica PowerCenter / IICS.
Build and maintain data pipelines in Hadoop ecosystems (HDFS, Hive, Spark, Sqoop).
Perform data ingestion from multiple sources (RDBMS, APIs, flat files, cloud sources); see the ingestion sketch after this list.
Develop transformations, mappings, sessions, and workflows for large-scale data processing.
Tune ETL jobs for performance, scalability, and reliability.
Write and optimize SQL, HiveQL, Spark SQL queries.
Implement data quality, data validation, reconciliation, and error handling; see the reconciliation sketch after this list.
Support data migration and modernization initiatives from legacy systems to Hadoop / Cloud platforms.
Work closely with business analysts, data architects, and QA teams to deliver data solutions.
Monitor production jobs and provide L2/L3 production support.
Maintain technical documentation and follow SDLC and Agile processes.
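For illustration, a minimal sketch of the kind of batch ingestion described above, written in Scala against the Spark DataFrame API. It assumes a Spark-on-Hadoop cluster with Hive support enabled; the JDBC URL, credentials, table names, and partition bounds are hypothetical placeholders, not details from this posting.

// Minimal ingestion sketch: pull an RDBMS table into a partitioned Hive
// staging table. Connection details and names below are placeholders.
import org.apache.spark.sql.{SaveMode, SparkSession}

object CustomerIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("customer-ingest")
      .enableHiveSupport()
      .getOrCreate()

    // Read the source table over JDBC, split into parallel partitions
    // on a numeric key so the pull scales with the cluster.
    val customers = spark.read
      .format("jdbc")
      .option("url", "jdbc:oracle:thin:@//src-host:1521/ORCL") // placeholder
      .option("dbtable", "SALES.CUSTOMERS")                    // placeholder
      .option("user", sys.env("SRC_DB_USER"))
      .option("password", sys.env("SRC_DB_PASS"))
      .option("partitionColumn", "CUSTOMER_ID")
      .option("lowerBound", "1")
      .option("upperBound", "10000000")
      .option("numPartitions", "16")
      .load()

    // Land the data as Parquet in the Hive metastore for downstream ETL.
    customers.write
      .mode(SaveMode.Overwrite)
      .format("parquet")
      .saveAsTable("staging.customers")

    spark.stop()
  }
}

Sqoop covers the same RDBMS-to-HDFS step as a CLI tool; the Spark JDBC route shown here keeps ingestion inside one engine and parallelizes the pull via partitionColumn.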
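And a minimal sketch of a post-load reconciliation step using Spark SQL, one way to realize the data-quality item above. It assumes both the staging and warehouse tables are registered in the Hive metastore; the table and column names (staging.customers, dw.customers, customer_id) are hypothetical.

// Reconciliation sketch: compare row counts and flag bad business keys.
// Table and column names are placeholders.
import org.apache.spark.sql.SparkSession

object ReconcileCustomers {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("customer-reconcile")
      .enableHiveSupport()
      .getOrCreate()

    // Row-count reconciliation between the landing and warehouse tables.
    val srcCount = spark.sql("SELECT COUNT(*) FROM staging.customers").first().getLong(0)
    val tgtCount = spark.sql("SELECT COUNT(*) FROM dw.customers").first().getLong(0)

    // Flag null or duplicate business keys in the target.
    val badKeys = spark.sql(
      """SELECT customer_id, COUNT(*) AS n
        |FROM dw.customers
        |GROUP BY customer_id
        |HAVING COUNT(*) > 1 OR customer_id IS NULL""".stripMargin)

    if (srcCount != tgtCount || !badKeys.isEmpty) {
      badKeys.show(20, truncate = false)
      // Fail the job so the scheduler can trigger error handling / rerun.
      sys.error(s"Reconciliation failed: src=$srcCount tgt=$tgtCount")
    }

    spark.stop()
  }
}

A scheduler could treat the resulting job failure as the trigger for the L2/L3 error-handling flow mentioned above.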
Required Skills
Cloud Developer
SQL Application Developer