TalentOla

ETL Developer with Actimize

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL Developer with Actimize, requiring 5 days per week onsite in Midtown Manhattan. The contract length and pay rate are unspecified. Key skills include SQL, ETL tools (Informatica, Talend, SSIS), and data warehousing experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 25, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York City Metropolitan Area
-
🧠 - Skills detailed
#Informatica PowerCenter #Hadoop #MS SQL (Microsoft SQL Server) #Scala #SQL Server #Snowflake #Scripting #Data Accuracy #Data Lake #GIT #MySQL #Web Services #Big Data #Tableau #Data Warehouse #Data Engineering #Python #AWS (Amazon Web Services) #Microsoft SQL #Data Integration #SSIS (SQL Server Integration Services) #GCP (Google Cloud Platform) #Microsoft SQL Server #Microsoft Power BI #Talend #Data Analysis #Version Control #Oracle #Azure #Microsoft Azure #Informatica #Shell Scripting #SQL (Structured Query Language) #Data Modeling #PostgreSQL #SQL Queries #Cloud #Data Pipeline #Spark (Apache Spark) #BI (Business Intelligence) #Apache Spark #Databases #ETL (Extract, Transform, Load) #Datasets
Role description
Position: ETL Developer
Location: NYC, Midtown Manhattan (5 days onsite)
Headcount: 1 – need immediate joiner

Role Overview
An ETL (Extract, Transform, Load) Developer is responsible for designing, developing, and maintaining data pipelines that extract data from multiple sources, transform it into usable formats, and load it into data warehouses or data lakes for reporting and analytics.

Key Responsibilities
• Design, develop, and maintain ETL workflows and data pipelines
• Extract data from various sources:
  • Databases (Oracle, SQL Server, MySQL)
  • APIs, flat files, cloud sources
• Transform data by:
  • Cleaning, validating, and standardizing datasets
  • Applying business rules and transformations
• Load data into data warehouses, marts, or data lakes
• Develop and optimize complex SQL queries, stored procedures, and data integration processes for performance and scalability
• Perform data validation, reconciliation, and quality checks
• Troubleshoot ETL job failures and data inconsistencies
• Collaborate with data analysts, data engineers, and business stakeholders
• Support reporting tools and BI platforms

Required Skills & Qualifications

Technical Skills
• Strong proficiency in SQL / PL-SQL and data warehousing concepts
• Hands-on experience with ETL tools such as:
  • Informatica PowerCenter
  • Talend
  • Microsoft SQL Server Integration Services (SSIS)
• Experience working with relational databases (Oracle, SQL Server, PostgreSQL)
• Knowledge of data modeling (star schema, snowflake schema), data integration, and transformation techniques

Preferred / Additional Skills
• Experience with big data tools (Apache Spark, Hadoop)
• Experience with cloud platforms (Amazon Web Services, Microsoft Azure, Google Cloud Platform)
• Familiarity with Python or shell scripting
• Familiarity with CI/CD pipelines for data engineering

Domain Knowledge (Nice to Have)
• Understanding of business intelligence and reporting
• Exposure to financial, healthcare, or retail data domains

Soft Skills
• Strong analytical and problem-solving abilities
• Attention to detail and data accuracy
• Good communication and teamwork skills
• Ability to manage multiple tasks and deadlines

Typical Tools & Technologies
• ETL Tools: Informatica, Talend, SSIS
• Databases: Oracle, SQL Server, PostgreSQL
• Scripting: Python, Shell
• BI Tools: Power BI, Tableau
• Version Control: Git

Certifications (Optional)
• Informatica Certification
• Microsoft Certified: Data Engineer Associate
• Cloud certifications (AWS, Azure, GCP)
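The extract/transform/load cycle this role centers on can be sketched in a few lines. This is a minimal illustration only, not part of the posting: it uses Python's built-in `sqlite3` and a hypothetical CSV string (`RAW_CSV`, with made-up column names) as stand-ins for the real sources and warehouse named above (Oracle, SQL Server, Informatica, etc.).

```python
import csv
import io
import sqlite3

# Hypothetical flat-file source; the stray whitespace, mixed case, and
# bad row model the kind of raw data the "transform" step must clean.
RAW_CSV = """customer_id,amount,currency
1, 120.50 ,usd
2,89.99,USD
3,not_a_number,USD
"""

def extract(raw: str) -> list[dict]:
    """Extract: read raw records from a flat-file source."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: clean, validate, and standardize each record."""
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"].strip())  # validate numeric field
        except ValueError:
            continue  # reject rows that fail validation
        clean.append((int(row["customer_id"]), amount, row["currency"].upper()))
    return clean

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write standardized rows into the warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments "
        "(customer_id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract(RAW_CSV)), conn)
    print(conn.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone())
```

In a production pipeline the same three stages would be handled by a tool such as Informatica, Talend, or SSIS, with reconciliation and quality checks run after the load step.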