

ETL Developer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an ETL Developer in Reston, VA (Hybrid – Once a week) for a 12+ month contract, paying competitive rates. Requires 10+ years in IT, 5+ years in Ab Initio ETL development, and experience with Hadoop, SQL, and Agile practices.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
640
🗓️ - Date discovered
June 12, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
1099 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Reston, VA
🧠 - Skills detailed
#Databases #Java #Cloudera #Unix #Automated Testing #AWS (Amazon Web Services) #SQL Queries #Shell Scripting #Cloud #Computer Science #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Ab Initio #NoSQL #Business Analysis #Automation #Data Management #Data Warehouse #HDFS (Hadoop Distributed File System) #Impala #Data Engineering #Agile #Big Data #Linux #Data Integration #Scripting #Hadoop #Data Pipeline
Role description
Title: Ab-Initio Developer
Location: Reston, VA (Hybrid – Once a week)
Duration: 12+ Months of Contract
The Senior Data Engineer is responsible for orchestrating, deploying, maintaining, and scaling cloud or on-premises infrastructure targeting big data and platform data management (relational and NoSQL, distributed and converged), with an emphasis on reliability, automation, and performance. This role will focus on developing solutions and helping transform the company's platforms to deliver data-driven, meaningful insights and value to the company.
Essential functions:
1. Work with Business Analysts and the Product team to gather data requirements
2. Design and build Ab Initio data graphs and data pipelines to extract data from various databases, flat files, and message queues
3. Transform the data to create a consumable data layer for various application uses
4. Support data pipelines with bug fixes and additional enhancements
5. Document technical design, operational runbook, etc.
Education Level: Bachelor's Degree
Education Details: Computer Science, Information Technology or Engineering or related field
Experience:
• Total of 10+ years of IT experience, predominantly in the Data Integration/Data Warehouse area
• Must have at least 5 years of ETL design and development experience using Ab Initio
• 1-2 years of Data Integration project experience on the Hadoop platform, preferably Cloudera
• Ab Initio CDC (Change Data Capture) experience in a Data Integration/ETL project setting is a plus
• Working knowledge of HDFS, Hive, Impala, and other related Hadoop technologies
• Working knowledge of various AWS services is nice to have
• Sound understanding of SQL and the ability to write well-performing SQL queries
• Good knowledge of OLTP and OLAP data models and other data warehouse fundamentals
• Rigor in high code quality, automated testing, and other engineering best practices; ability to write reusable code components
• Ability to unit test code thoroughly and to troubleshoot issues in production environments
• Must have some working experience with Unix/Linux shell scripting
• Must be able to work independently and support junior developers as needed
• Some Java development experience is nice to have
• Knowledge of Agile development practices is required
“Mindlance is an Equal Opportunity Employer and does not discriminate in employment on the basis of – Minority/Gender/Disability/Religion/LGBTQI/Age/Veterans.”