

ETL/Ab Initio Developer
Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL/Ab Initio Developer with 6-8 years of experience, requiring a Bachelor's degree in a related field. Key skills include Ab Initio ETL, Hadoop, SQL, and data modeling. The contract length is unspecified; the listed day rate is $440.
Country: United States
Currency: $ USD
Day rate: $440
Date discovered: June 11, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Data Integration #Kafka (Apache Kafka) #ETL (Extract, Transform, Load) #Ab Initio #XML (eXtensible Markup Language) #Hadoop #Visualization #Unix #Spark (Apache Spark) #Web Services #Computer Science #SQL (Structured Query Language) #BI (Business Intelligence) #Sqoop (Apache Sqoop) #Data Modeling #Big Data #NoSQL #Databases #Java
Job description
Requirements:
• 6-8 years' experience.
• Bachelor's degree in computer science, decision support systems (DSS), information engineering, business administration, or another related discipline. A master's degree in a related discipline is a plus.
• Hands-on experience designing and implementing data integration, Big Data, and/or business intelligence solutions.
• Experience implementing complex data applications.
• Knowledge of implementing audit logs on input/output data flows.
• Experience designing, developing, and tuning industry-leading ETL, Big Data, and/or business intelligence software tools in data-intensive, large-scale environments.
• Hands-on Ab Initio ETL experience, with a deep understanding of Ab Initio ICFF (Indexed Compressed Flat File) files.
• Experience working with web services and integrating with IBM DataPower.
• Hands-on development experience with Hadoop and the Hadoop ecosystem, including tools such as Hive, Spark, Spark Streaming, Kafka, Sqoop, and Flume.
• Strong SQL and data modeling skills, including third normal form (3NF) and dimensional modeling.
• Strong DB2, Java, and UNIX skills.
• Working (hands-on) knowledge of newer or emerging trends such as Big Data, columnar databases, NoSQL, Hadoop, MapReduce, social media, data visualization, XML, and unstructured text is a plus.
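To illustrate the dimensional-modeling skill listed above, here is a minimal sketch of a star schema: one fact table joined to one dimension table, built with Python's standard-library sqlite3. The table and column names (dim_customer, fact_sales, etc.) are illustrative assumptions, not part of the posting.

```python
# Illustrative star schema: names and data are made up for this sketch.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes, one row per customer.
cur.execute("""
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name TEXT,
        region        TEXT
    )
""")

# Fact table: numeric measures keyed to the dimension.
cur.execute("""
    CREATE TABLE fact_sales (
        sale_id      INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        amount       REAL
    )
""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme", "East"), (2, "Globex", "West")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0)])

# Typical dimensional query: aggregate facts grouped by a dimension attribute.
cur.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer d ON f.customer_key = d.customer_key
    GROUP BY d.region
    ORDER BY d.region
""")
totals = dict(cur.fetchall())
print(totals)  # → {'East': 150.0, 'West': 75.0}
conn.close()
```

In a 3NF model the same attributes would be split across more normalized tables; a dimensional model deliberately denormalizes the dimension to keep analytical queries like the one above simple.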