

Senior Big Data and Informatica Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Big Data and Informatica Engineer with over 10 years in big data and distributed computing, 7 years in Informatica PowerCenter, and expertise in PySpark, SQL, NoSQL, and AWS. Contract duration exceeds 6 months.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 3, 2025
Project duration: More than 6 months
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: Owings Mills, MD
Skills detailed: #Spark (Apache Spark) #Airflow #Big Data #AWS (Amazon Web Services) #Cloud #Informatica #DevOps #Docker #Data Modeling #PostgreSQL #ETL (Extract, Transform, Load) #Snowflake #Informatica PowerCenter #Databases #SQL (Structured Query Language) #Agile #Python #Data Integration #Data Quality #Distributed Computing #Kubernetes #NoSQL #Apache Spark #PySpark
Role description
Education level: Bachelor's degree
Job function: Information Technology
Industry: Information Technology and Services
Job Description
• 7+ years of experience in Informatica PowerCenter.
β’ 10+ years of experience in big data and distributed computing.
• Very strong hands-on experience with PySpark, Apache Spark, and Python.
• Strong hands-on experience with SQL and NoSQL databases (DB2, PostgreSQL, Snowflake, etc.).
β’ Proficiency in data modeling and ETL workflows.
β’ Proficiency with workflow schedulers like Airflow.
• Hands-on experience with AWS cloud-based data platforms.
β’ Experience in DevOps, CI/CD pipelines, and containerization (Docker, Kubernetes) is a plus.
• Strong problem-solving skills and the ability to lead a team.
• Design, develop, modify, configure, and debug Informatica workflows using Informatica PowerCenter and PowerExchange CDC tools.
• Lead the design, development, and maintenance of data integration solutions using Informatica, ensuring data quality.
• Troubleshoot and resolve technical issues; debug, tune, and optimize code for optimal performance.
• Manage new requirements, review existing jobs, perform gap analysis, and fix performance issues.
• Document all ETL mappings, sessions, and workflows.
• Handle tickets and analyze problem tickets in an Agile/POD approach.
Additional Notes
β’ Please submit the candidate's resume in PDF format.
β’ Please note that TCS does not consider former full-time employees (FTEs) for rehire. Additionally, individuals who have previously worked at TCS as contractors must observe a minimum waiting period of six months before being eligible for re-engagement.
Must Have
β’ 10+ years of experience in big data and distributed computing.
β’ 7+ years of experience in Informatica PowerCenter.
β’ Experience with PySpark, Apache Spark, and Python.
β’ Experience with SQL and NoSQL databases (DB2, PostgreSQL, Snowflake, etc.).
β’ Experience with AWS cloud-based data platforms.
β’ Experience in DevOps, CI/CD pipelines, and containerization (Docker, Kubernetes) is a plus.
Skills: distributed computing,informatica,devops,snowflake,big data,etl workflows,db2,aws,data modeling,kubernetes,docker,postgresql,pyspark,python,powercenter,apache spark,sql,nosql,airflow,ci/cd,informatica powercenter