

ETL Developer
Featured Role | Apply directly with Data Freelance Hub
This role is for an ETL Developer with a contract length of "unknown" and a pay rate of "unknown," located onsite in McLean, VA / Dallas, TX. Key skills include IICS, Informatica PowerCenter, PySpark, and API integration. A Bachelor's degree in computer science is required.
Country
United States
Currency
$ USD
Day rate
Unknown
Date discovered
July 9, 2025
Project duration
Unknown
Location type
On-site
Contract type
Corp-to-Corp (C2C)
Security clearance
Unknown
Location detailed
Texas, United States
Skills detailed
#Computer Science #Informatica #Spark (Apache Spark) #JSON (JavaScript Object Notation) #AWS (Amazon Web Services) #API (Application Programming Interface) #Unix #XML (eXtensible Markup Language) #ETL (Extract, Transform, Load) #Informatica PowerCenter #Data Lake #Programming #PySpark #Unit Testing #Agile #Data Architecture #Snowflake #IICS (Informatica Intelligent Cloud Services) #Talend
Role description
Title/Position: IICS ETL Developer
Job Location: Onsite (McLean, VA / Dallas, TX)
Duration: Contract role
Client: Stratacent
Open to both C2C and W2.
Relocation is also acceptable.
| Skills | Experience (in Years) | Logistics | Comments |
| --- | --- | --- | --- |
| Total Experience | | Work Authorization | |
| Informatica | | Current Location | |
| IICS | | Notice Required to Join | |
| CAI/CDI | | Currently working | |
| API Integration | | Reason of job change | |
| PySpark | | LinkedIn profile | |
| Snowflake | | | |
Responsibilities:
• Work with internal SMEs to understand the overall data architecture in scope for project deliverables.
• Understand all existing integration points within the AWS ecosystem that are in scope for project deliverables, e.g., Step Functions, DynamoDB, Glue.
• Work with internal support teams to analyze the existing ETL processes written in Talend and PySpark.
• Develop in-scope transformations using PySpark and IICS.
• Perform unit testing and assist with code promotion to higher environments for user testing.
• Create reusable code for the API component (reading from API endpoints):
• Reusable IICS CAI process to retrieve JSON/XML data from multiple endpoints.
• Reusable PySpark code to populate dimension/history tables in the data lake.
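Dimension/history tables like those in the last bullet are commonly maintained as slowly changing dimensions (SCD Type 2): when a tracked attribute changes, the open history row is closed and a new one is appended. A minimal pure-Python sketch of that merge logic, assuming string dates and illustrative field names (the actual job would express this over PySpark DataFrames):

```python
def scd2_merge(history, incoming, key, tracked, today):
    """Type-2 merge: close changed rows and append new open rows.

    history  -- existing rows, each with 'valid_from'/'valid_to' (None = open)
    incoming -- today's snapshot of source records
    key      -- natural-key field name
    tracked  -- attributes whose change triggers a new history row
    """
    out = [dict(r) for r in history]  # copy so the input is not mutated
    current = {r[key]: r for r in out if r["valid_to"] is None}
    for rec in incoming:
        cur = current.get(rec[key])
        if cur is not None and all(cur[a] == rec[a] for a in tracked):
            continue  # unchanged record: keep its open row as-is
        if cur is not None:
            cur["valid_to"] = today  # close the superseded row
        out.append({**rec, "valid_from": today, "valid_to": None})
    return out

# Illustrative usage: one tracked change plus one brand-new key.
history = [{"id": 1, "status": "active",
            "valid_from": "2025-01-01", "valid_to": None}]
snapshot = [{"id": 1, "status": "closed"}, {"id": 2, "status": "active"}]
rows = scd2_merge(history, snapshot, key="id",
                  tracked=["status"], today="2025-07-09")
```

Here `rows` ends up with three entries: the closed original row for id 1 and open rows for the changed id 1 and the new id 2.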
Skills required:
• Bachelor's degree in computer science or equivalent is required.
• DataFlux exposure (reading code) is good to have.
• IICS is a must.
• An Informatica PowerCenter background is required.
• Multiple years of programming experience. A broad understanding of general business and management practices is preferred.
• Complete understanding of all phases of the software development life cycle.
• Expert knowledge of application design principles, systems development, and analysis.
• Advanced Informatica design and development skills using Informatica PowerCenter and PowerExchange.
• Experience developing Informatica PowerCenter ETL mappings using Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
• Experience using parameter files and parameters/variables, and troubleshooting with advanced Informatica techniques such as session partitioning and pushdown optimization.
• Must be well versed in all Unix commands needed for running ETL jobs and for troubleshooting source and target files.
• Prior experience in an Agile environment.
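The Unix troubleshooting the last requirements mention usually amounts to a handful of file-inspection commands against source and target files. An illustrative sketch, assuming a comma-delimited extract (the file name and layout are hypothetical):

```shell
# Hypothetical source file with one malformed row (row 3 is missing a field)
printf 'id,name,amount\n1,alice,10\n2,bob\n3,carol,30\n' > source_extract.csv

# Record count excluding the header
tail -n +2 source_extract.csv | wc -l

# Peek at the first rows to verify the delimiter and column order
head -n 2 source_extract.csv

# Flag rows whose field count differs from the header's
awk -F',' 'NR==1 {n=NF} NF!=n {print NR": "NF" fields"}' source_extract.csv

# Strip Windows carriage returns that can break downstream parsing
tr -d '\r' < source_extract.csv > source_extract.unix.csv
```

On this sample the `awk` check reports line 3 as having 2 fields instead of 3, which is the kind of malformed-record hunt the role's troubleshooting bullet refers to.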