

GAC Solutions
Senior Data Engineer with Ab Initio
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with Ab Initio, offering a contract length of "unknown" and a pay rate of "unknown." Key skills include expertise in Ab Initio tools, big data solutions, and programming in Scala, Java, or Python. A bachelor's degree and 10 years of experience are required.
Country
United States
-
Currency
$ USD
-
Day rate
Unknown
-
Date
December 16, 2025
-
Duration
Unknown
-
Location
Unknown
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Atlanta, GA
-
Skills detailed
#Data Quality #Data Engineering #Hadoop #Sqoop (Apache Sqoop) #Programming #Java #Computer Science #Scala #Ab Initio #Business Analysis #HBase #Metadata #GIT #SQL Queries #Zookeeper #SQL (Structured Query Language) #Kafka (Apache Kafka) #Statistics #Python #Spark (Apache Spark) #Big Data
Role description
Qualifications/Requirements:
• Bachelor's degree in a quantitative field (such as Engineering, Computer Science, Statistics, or Econometrics) and a minimum of 10 years of experience
• Minimum of 5 years' experience working with and developing big data solutions
• Expertise in the following Ab Initio tools: GDE (Graphical Development Environment), Co>Operating System, Control Center, Metadata Hub, Enterprise Meta Environment, Enterprise Meta Environment Portal, Acquire>It, Express>It, Conduct>It, Data Quality Environment, and Query>It
• Hands-on experience writing shell scripts, complex SQL queries, and Hadoop commands, and working with Git
• Ability to write abstracted, reusable code components
• Programming experience in at least two of the following languages: Scala, Java, or Python
Desired Characteristics:
• Strong business acumen, including a broad understanding of Synchrony Financial business processes and practices
• Critical thinking and creativity
• Performance tuning experience
• Experience developing with Hive, Sqoop, Spark, Kafka, and HBase on Hadoop
• Familiarity with Ab Initio, Hortonworks, Zookeeper, and Oozie is a plus
• Willingness to learn new technologies quickly
• Superior oral and written communication skills, and willingness to collaborate across teams of internal and external technical staff, business analysts, software support, and operations staff
