

ETL Developer (Ab Initio)
Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL Developer (Ab Initio) with 2-3 years of Ab Initio experience in a GCP environment, offering a contract length of "X months" at a pay rate of "$X/hour". Key skills include SQL, BigQuery, and Teradata, with a preferred background in Banking/Financial Technology.
Country: United States
Currency: $ USD
Day rate: $432
Date discovered: July 12, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: Charlotte, NC
Skills detailed: #Cybersecurity #Debugging #Python #Data Processing #Teradata #SQL (Structured Query Language) #Agile #GCP (Google Cloud Platform) #Data Ingestion #Data Pipeline #Ab Initio #AI (Artificial Intelligence) #Cloud #Deployment #BigQuery #Security #Data Science #Data Manipulation #ETL (Extract, Transform, Load) #Migration #Storage #SQL Queries #Jira #Java
Role description
This role is with one of our banking clients.
Role Overview:
• 2-3 years of experience using Ab Initio to load data into BigQuery in a GCP environment
Key responsibilities include:
• Designing, coding, and testing new data pipelines using Ab Initio
• Implementing ETL/ELT processes
• Writing, optimizing, and debugging complex SQL queries for data manipulation, aggregation, and reporting, particularly for data within Teradata and BigQuery
• Data ingestion: developing and managing processes to ingest large volumes of data into GCP's BigQuery
• Managing and monitoring GCP resources used for data processing and storage
• Optimizing cloud data workloads
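As an illustration of the ingestion responsibility above, large data volumes are commonly loaded into BigQuery in batches rather than row by row. The sketch below is a hypothetical example, not the client's actual pipeline: `batch_rows` is plain Python, and the commented-out BigQuery call (which requires GCP credentials and the `google-cloud-bigquery` package) shows where such batches would be sent.

```python
# Illustrative sketch only: chunking records before loading them into BigQuery.
from itertools import islice
from typing import Iterable, Iterator, List

def batch_rows(rows: Iterable, batch_size: int = 500) -> Iterator[List]:
    """Yield successive fixed-size batches so large volumes stream in chunks."""
    it = iter(rows)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

# Hypothetical usage with the BigQuery client (needs credentials; names are
# placeholders, not real project resources):
# from google.cloud import bigquery
# client = bigquery.Client()
# for batch in batch_rows(source_rows, 500):
#     errors = client.insert_rows_json("project.dataset.table", batch)
#     if errors:
#         raise RuntimeError(errors)
```

Batching keeps each insert request within API payload limits and makes retries cheaper, since only the failed chunk is resent.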
Technical Stack & Must-Have Skills:
• Must have 7+ years of overall IT experience
• 6+ years of Ab Initio experience (minimum 3 years hands-on)
• 2+ years of GCP experience
• 2+ years of experience with BigQuery
• Teradata experience
• Expertise with SQL/ETL
• Agile and JIRA experience
• Experience interacting with technical stakeholders
• Enterprise-level experience
• Excellent written and verbal communication skills
Preferred Experience:
• Java experience highly desired
• Python experience highly desired
• Background in Banking/Financial Technology (Deposits, Payments, Cards domain, etc.)
About Us:
Collabera is a leading Digital Solutions company providing software engineering solutions to the world's most tech-forward organizations. With more than 25 years of experience, we have hired over 17,000 employees across 60+ offices globally and currently place 10,000+ professionals annually to support critical IT engagements at more than 500 client sites, 80% of which are Fortune 500 companies.
With Collabera, you:
• Will get to work on numerous challenging and exciting projects, including UI/UX transformation, Blockchain, AI/Data Science, Cloud migrations, Cybersecurity, and Engineering.
• Have an 80% chance of project extension or redeployment to other clients.
• Will have endless opportunities to learn new technologies through our in-house training arm, Cognixia.