

Data Engineer - Informatica
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (Informatica), offering a 3-9 month remote W2 contract at $65 per hour. Key skills include advanced SQL, Informatica PowerCenter, cloud platforms (AWS, Azure), and data quality and governance.
Country
United States
Currency
$ USD
Day rate
$520
Date discovered
September 17, 2025
Project duration
3 to 9 months
Location type
Remote
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
United States
Skills detailed
#Synapse #Data Governance #Automation #BI (Business Intelligence) #Visualization #Data Pipeline #Alation #Data Engineering #Looker #Microsoft Power BI #Cloud #Scala #Data Quality #Storage #Data Management #ETL (Extract, Transform, Load) #Informatica Cloud #Informatica PowerCenter #SQL Server #PostgreSQL #Scripting #Shell Scripting #Azure #Informatica #Tableau #Collibra #MDM (Master Data Management) #Data Warehouse #Security #Informatica IDQ (Informatica Data Quality) #Snowflake #Metadata #SQL (Structured Query Language) #Data Security #Databricks #MySQL #Complex Queries #GCP (Google Cloud Platform) #Compliance #Data Manipulation #Data Modeling #Programming #Qlik #Azure Data Factory #Databases #ADF (Azure Data Factory) #R #SQL Queries #Python #Azure Synapse Analytics #Oracle #Data Integration #AWS (Amazon Web Services)
Role description
Data Engineer - Informatica
Location: Remote (EST working hours)
Length of Contract: 3-9 months
Hourly Pay: $65.00/hr (W2)
40 hours per week
• No 3rd Party Vendors
• W2 Contract Only
PERFORMANCE DETAILS
1. What are the major objectives of the role?
• Building and optimizing ETL processes using Informatica PowerCenter to extract, transform, and load data from multiple sources into centralized data warehouses or cloud environments.
• Ensuring data quality, consistency, and accuracy across systems by implementing validation, cleansing, and transformation logic (see the illustrative SQL sketch after this list).
• Developing and optimizing SQL queries for efficient data retrieval, analysis, and reporting.
• Leveraging cloud platforms (such as AWS, Azure, or GCP) to design scalable, secure, and cost-effective BI solutions.
• Collaborating with business stakeholders to understand reporting and analytics needs, then translating them into technical solutions.
• Enabling self-service analytics by delivering structured data models, dashboards, and reporting frameworks for end users.
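For illustration only, here is a minimal SQL sketch of the kind of validation and cleansing logic described above, assuming hypothetical staging and target tables (stg_customer, dim_customer) that are not part of this posting; in practice this logic would typically be built as Informatica PowerCenter mappings and transformations rather than hand-written SQL.

-- Hypothetical cleanse-and-load step; table and column names are illustrative.
INSERT INTO dim_customer (customer_id, email, full_name, load_date)
SELECT
    s.customer_id,
    LOWER(TRIM(s.email)) AS email,        -- standardize casing and whitespace
    TRIM(s.full_name)    AS full_name,    -- basic cleansing
    CURRENT_DATE         AS load_date
FROM stg_customer s
WHERE s.customer_id IS NOT NULL           -- validation: reject rows missing the business key
  AND s.email LIKE '%_@_%._%'             -- crude email format check
  AND NOT EXISTS (                        -- consistency: skip customers already loaded
        SELECT 1
        FROM dim_customer d
        WHERE d.customer_id = s.customer_id
      );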
2. What are the MUST-HAVE technologies for this position?
• ETL / Data Integration Tools
• Informatica PowerCenter (core requirement for ETL design and data integration)
• Informatica Cloud (for hybrid/cloud data integration, if applicable)
• Databases and Query Languages
• SQL (advanced proficiency for writing complex queries, stored procedures, and performance tuning)
• Relational databases such as Oracle, SQL Server, PostgreSQL, or MySQL
• Exposure to data warehousing concepts (star/snowflake schema, fact/dimension modeling; see the illustrative schema sketch after this list)
• Cloud Platforms (at least one major provider)
• Azure (Synapse Analytics, Data Factory, Blob Storage)
• Data Modeling & Warehousing
• Dimensional modeling
• Data warehouse/lakehouse platforms (Snowflake, Databricks, or equivalent)
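Purely as a rough illustration of the star-schema (fact/dimension) modeling noted above, here is a hedged sketch; all table and column names are hypothetical and not taken from this posting.

-- Hypothetical star schema: one additive fact table keyed to two dimensions.
CREATE TABLE dim_date (
    date_key       INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20250917
    calendar_date  DATE NOT NULL,
    fiscal_quarter VARCHAR(6)
);
CREATE TABLE dim_product (
    product_key    INTEGER PRIMARY KEY,   -- surrogate key
    product_code   VARCHAR(20) NOT NULL,
    category       VARCHAR(50)
);
CREATE TABLE fact_sales (
    date_key       INTEGER NOT NULL REFERENCES dim_date (date_key),
    product_key    INTEGER NOT NULL REFERENCES dim_product (product_key),
    quantity       INTEGER,
    sales_amount   NUMERIC(12,2)          -- additive measure summed in reports
);

A snowflake variant would further normalize dim_product (for example, into a separate category dimension); the star form shown here generally keeps BI queries simpler and faster.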
3. What are the MUST-HAVE critical skills for this position?
• Performance Optimization
• Experience in tuning ETL jobs and optimizing SQL queries for large data volumes (see the illustrative tuning example after this list).
• Ensuring data pipelines are efficient, reliable, and scalable.
• Data Quality & Governance
• Implementing data validation, cleansing, and transformation rules.
• Understanding of data security, compliance, and governance best practices.
• Problem-Solving & Analytical Thinking
• Strong skills in analyzing business requirements and translating them into technical solutions.
• Ability to troubleshoot complex ETL, SQL, and data pipeline issues.
• Collaboration & Communication
• Ability to work closely with business stakeholders to understand reporting needs.
• Clear communication of technical concepts to non-technical users.
• Adaptability & Continuous Learning
• Keeping up with evolving cloud technologies and BI tools.
• Flexibility to work across different databases, integration tools, and visualization platforms.
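As a small, hypothetical example of the SQL tuning this section refers to, the rewrite below turns a non-sargable date filter into a range predicate that can use an index; table and column names are illustrative, and exact syntax varies by database.

-- Before (illustrative): wrapping the column in a function blocks index use on order_date.
SELECT order_id, sales_amount
FROM   orders
WHERE  EXTRACT(YEAR FROM order_date) = 2025;

-- After (illustrative): an equivalent sargable range predicate lets the optimizer use the index.
SELECT order_id, sales_amount
FROM   orders
WHERE  order_date >= DATE '2025-01-01'
  AND  order_date <  DATE '2026-01-01';

-- Supporting index on the filter column (index syntax varies slightly by database).
CREATE INDEX ix_orders_order_date ON orders (order_date);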
4. What are the NICE-TO-HAVE technologies?
• Advanced Cloud Data Ecosystem
• Azure: Data Factory, Databricks, Cosmos DB
• Snowflake or Databricks for modern data warehousing and lakehouse solutions
• BI & Visualization Tools
• Power BI, Tableau, Qlik, or Looker for dashboarding and self-service analytics
• Programming & Scripting Languages
• Python or R for data manipulation, automation, and advanced analytics
• Shell scripting for workflow automation and ETL orchestration
• Data Governance & Quality Tools
• Collibra, Alation, or Informatica Data Quality (IDQ) for metadata management and governance (see the illustrative profiling query after this list)
• Master Data Management (MDM) tools for enterprise data consistency
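To make the data-quality angle concrete, here is a hedged sketch of the kind of profiling rule that tools such as Informatica Data Quality automate; the table and columns (stg_customer, email, customer_id) are hypothetical and not from this posting.

-- Hypothetical profiling query: completeness and uniqueness checks on a staging table.
SELECT
    COUNT(*)                                                       AS total_rows,
    SUM(CASE WHEN email IS NULL OR email = '' THEN 1 ELSE 0 END)   AS rows_missing_email,
    COUNT(*) - COUNT(DISTINCT customer_id)                         AS duplicate_customer_ids
FROM stg_customer;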