BI ETL Data Engineer with Informatica PowerCenter Experience

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a BI ETL Data Engineer with Informatica PowerCenter experience, offering a 6-month contract at $50-55/hr. It requires expertise in ETL processes, SQL, and cloud platforms (AWS preferred). Hybrid work is available in Kansas City, MO; Milwaukee, WI; or Wellesley, MA.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
440
-
🗓️ - Date discovered
September 17, 2025
🕒 - Project duration
6 months
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
1099 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Wellesley, MA
-
🧠 - Skills detailed
#Lambda (AWS Lambda) #dbt (data build tool) #Synapse #Redshift #Automation #BI (Business Intelligence) #Visualization #Data Pipeline #Alation #Data Engineering #Looker #Microsoft Power BI #Cloud #Scala #Data Quality #Storage #Data Management #ETL (Extract, Transform, Load) #Informatica Cloud #Pandas #Microsoft SQL Server #Informatica PowerCenter #SQL Server #Delta Lake #PostgreSQL #Scripting #Azure #Shell Scripting #Informatica #Tableau #Collibra #MDM (Master Data Management) #Data Warehouse #Security #Kafka (Apache Kafka) #Informatica IDQ (Informatica Data Quality) #Snowflake #Metadata #Airflow #SQL (Structured Query Language) #Spark (Apache Spark) #Data Security #Databricks #S3 (Amazon Simple Storage Service) #Azure SQL #MySQL #PySpark #SSRS (SQL Server Reporting Services) #GCP (Google Cloud Platform) #Compliance #Data Manipulation #Data Modeling #MS SQL (Microsoft SQL Server) #SSIS (SQL Server Integration Services) #Microsoft SQL #Qlik #Azure Data Factory #Databases #ADF (Azure Data Factory) #R #IICS (Informatica Intelligent Cloud Services) #SQL Queries #Python #Azure Synapse Analytics #Oracle #Data Integration #Apache Iceberg #AWS (Amazon Web Services)
Role description
Position: Business Intelligence Developer
Location: Hybrid, 1-2 days per week in office. Locations: Kansas City, MO / Milwaukee, WI / Wellesley, MA. Locals only.
Contract: C2C/1099
Rate: $50-55/hr

What are the major objectives of the role? Specifically, what does this person need to do to be considered a success? What will they be working on?
• Building and optimizing ETL processes using Informatica PowerCenter to extract, transform, and load data from multiple sources into centralized data warehouses or cloud environments.
• Ensuring data quality, consistency, and accuracy across systems by implementing validation, cleansing, and transformation logic.
• Developing and optimizing SQL queries for efficient data retrieval, analysis, and reporting.
• Leveraging cloud platforms (such as AWS, Azure, or GCP) to design scalable, secure, and cost-effective BI solutions.
• Collaborating with business stakeholders to understand reporting and analytics needs, then translating them into technical solutions.
• Enabling self-service analytics by delivering structured data models, dashboards, and reporting frameworks for end users.

1. What are the MUST-HAVE technologies for this position? (Please list must-have technologies/technical skills and what the candidate needs to do to be considered great at them.)
• ETL and data integration: Informatica PowerCenter (core requirement for ETL design and data integration); Informatica Cloud (for hybrid/cloud data integration, if applicable)
• Databases and query languages: relational databases such as Oracle, SQL Server, PostgreSQL, or MySQL
• Data warehousing concepts: Star/Snowflake schema, fact/dimension modeling
• Cloud platforms (at least one major provider), AWS preferred; Azure (Synapse Analytics, Data Factory, Blob Storage)
• Data modeling and warehousing: dimensional modeling; data warehouse/lakehouse platforms (Snowflake, Databricks, or equivalent)

2. What are the MUST-HAVE critical skills for this position? (For critical skills, please also describe what the person needs to do with them to be considered very good at them.)
• Experience in tuning ETL jobs and optimizing SQL queries for large data volumes.
• Ensuring data pipelines are efficient, reliable, and scalable.
• Implementing data validation, cleansing, and transformation rules (see the sketch after this list).
• Understanding of data security, compliance, and governance best practices.
• Strong skills in analyzing business requirements and translating them into technical solutions.
• Ability to troubleshoot complex ETL, SQL, and data pipeline issues.
• Ability to work closely with business stakeholders to understand reporting needs.
• Clear communication of technical concepts to non-technical users.
• Keeping up with evolving cloud technologies and BI tools.
• Flexibility to work across different databases, integration tools, and visualization platforms.
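To make the data-quality expectations above concrete, the following is a minimal, hypothetical sketch of validation and cleansing logic in Python with pandas (both appear in this role's skills list). The column names (customer_id, email, order_total, updated_at) are illustrative assumptions, not part of the posting; in a PowerCenter-centric stack, equivalent rules would typically be implemented as mapping transformations rather than pandas code.

```python
import pandas as pd

def cleanse_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Illustrative validation/cleansing rules; column names are hypothetical."""
    df = df.copy()
    # Standardize a text field: trim whitespace and lowercase
    df["email"] = df["email"].str.strip().str.lower()
    # Reject rows that are missing the business key
    df = df.dropna(subset=["customer_id"])
    # Coerce a numeric field; invalid values become nulls for later review
    df["order_total"] = pd.to_numeric(df["order_total"], errors="coerce")
    # Deduplicate on the business key, keeping the most recent record
    df = df.sort_values("updated_at").drop_duplicates("customer_id", keep="last")
    return df

if __name__ == "__main__":
    raw = pd.DataFrame({
        "customer_id": [1, 1, None],
        "email": ["  A@X.COM ", "a@x.com", "b@y.com"],
        "order_total": ["10.5", "12.0", "oops"],
        "updated_at": pd.to_datetime(["2025-01-01", "2025-02-01", "2025-03-01"]),
    })
    print(cleanse_orders(raw))
```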
3. What are the NICE-TO-HAVE technologies you wouldn't mind seeing on a candidate's resume?
• Azure: Data Factory, Databricks, Cosmos DB
• Snowflake or Databricks for modern data warehousing and lakehouse solutions
• Power BI, Tableau, Qlik, or Looker for dashboarding and self-service analytics
• Python or R for data manipulation, automation, and advanced analytics
• Shell scripting for workflow automation and ETL orchestration
• Collibra, Alation, or Informatica Data Quality (IDQ) for metadata management and governance
• Master Data Management (MDM) tools for enterprise data consistency

Other Pertinent Information - Responsibilities, Skills, Qualifications, etc.
• ETL Tools: Informatica PowerCenter, IICS, Informatica Developer (IDQ), SSIS, Metadata Manager
• Cloud & Infrastructure: AWS (Redshift, S3, Lambda, Glue, Kinesis), dbt Cloud
• Languages & Tools: Python, SQL, PySpark, dbt, Pandas
• Languages & Querying: T-SQL, Dynamic SQL, PL/SQL
• Databases: Microsoft SQL Server, Azure SQL, Oracle, PostgreSQL
• Data Warehousing: Redshift, Snowflake, Delta Lake, Apache Iceberg (see the dimensional-model query sketch below)
• Data Engineering: Spark, Airflow, Airbyte, Stitch, Kafka
• Reporting Tools: SQL Server Reporting Services (SSRS), Power BI
• Domains: Healthcare, Public Sector, Insurance, Retail, Finance
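As an illustration of the dimensional-modeling and warehousing items above, here is a minimal star-schema (fact/dimension) aggregate query sketch. The fact_sales and dim_date tables are hypothetical, and sqlite3 is used only so the example is self-contained and runnable; the same join-and-aggregate pattern applies on Redshift, Snowflake, SQL Server, or the other platforms named in the posting.

```python
import sqlite3

# Hypothetical star schema: one fact table keyed to one date dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE fact_sales (date_key INTEGER, amount REAL);
INSERT INTO dim_date VALUES (20250101, '2025-01'), (20250201, '2025-02');
INSERT INTO fact_sales VALUES (20250101, 100.0), (20250101, 50.0), (20250201, 75.0);
""")

# Join the fact table to the dimension and aggregate for reporting
query = """
SELECT d.month, SUM(f.amount) AS total_sales
FROM fact_sales AS f
JOIN dim_date AS d ON d.date_key = f.date_key
GROUP BY d.month
ORDER BY d.month;
"""
for month, total_sales in conn.execute(query):
    print(month, total_sales)
```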