Talent Groups

Hybrid // ETL Informatica Admin (Snowflake)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Hybrid ETL Informatica Admin (Snowflake) contract in Wayne, PA, offering competitive pay. Requires 7+ years in data engineering, strong SQL, ETL tools expertise, cloud platform experience, and knowledge of data governance and security best practices.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 23, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Wayne, PA
-
🧠 - Skills detailed
#Data Governance #Scala #Scripting #SQL (Structured Query Language) #Databases #Security #Airflow #Data Quality #Disaster Recovery #Informatica #Documentation #dbt (data build tool) #Data Engineering #Synapse #Data Processing #GitHub #DevOps #Schema Design #Agile #Azure #Indexing #"ETL (Extract, Transform, Load)" #Snowflake #Azure DevOps #Redshift #Automation #Database Administration #Python #Talend #Monitoring #Storage #Data Modeling #Data Architecture #Cloud
Role description
Hybrid ETL Informatica Admin (Snowflake) | Contract | Wayne, PA

Job Summary: Data Operations & Database Engineer

Responsibilities:

Data Operations
• Design and monitor robust ETL/ELT pipelines.
• Monitor data quality, pipeline health, and system performance.
• Troubleshoot and resolve data processing issues quickly and effectively.

Database Administration
• Manage and optimize databases.
• Ensure high availability, backup, and disaster recovery strategies.
• Implement and maintain security policies, access controls, and auditing.
• Perform performance tuning, indexing, and query optimization.
• Support schema design and data modeling efforts.
• Execute ad-hoc data cleanup and corrections.

Collaboration & Advisory
• Partner with data engineers, analysts, and architects to design scalable data solutions.
• Provide guidance on best practices for data architecture, storage, and processing.
• Identify and resolve data infrastructure bottlenecks and inefficiencies.
• Contribute to documentation, standards, and knowledge sharing across the team.

Qualifications:
• 7+ years of experience in data engineering, database administration, or related roles.
• Strong proficiency in SQL and experience with ETL/ELT tools (e.g., dbt, Talend, Informatica, custom Python scripts).
• Hands-on experience with cloud data platforms (e.g., Snowflake, Azure Synapse, Redshift).
• Familiarity with CI/CD tools (e.g., GitHub Actions, Azure DevOps) and workflow orchestration (e.g., Airflow, Talend).
• Solid understanding of data modeling, data governance, and security best practices.
• Strong problem-solving skills and a proactive approach to identifying and resolving issues.

Required Skills:
• Knowledge of Python or other scripting languages for automation.
• Familiarity with monitoring tools.
• Experience working in Agile or DevOps environments.