Matlen Silver

Cloud Engineer with Ab Initio, Python, SQL, and AWS (Richmond or McLean, VA)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Cloud Engineer with expertise in Ab Initio, Python, SQL, and AWS, based on-site in Richmond or McLean, VA. It is a 9-month contract requiring a Bachelor's degree, at least 4 years of application development experience, and at least 2 years with big data technologies.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
February 12, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Richmond, VA
🧠 - Skills detailed
#Redshift #Azure #Unit Testing #Python #SQL (Structured Query Language) #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Cloud #ML (Machine Learning) #Automation #Agile #Snowflake #Ab Initio #Data Engineering #Big Data #Data Processing #Microservices #Code Reviews #Scala #Data Pipeline
Role description
Cloud Data Engineer (Ab Initio / Python / SQL / AWS)
Location: Richmond, VA or McLean, VA (on-site)
Duration: 9-month contract (with potential extension)

Overview
We are seeking a highly skilled Cloud Data Engineer to join a fast-paced Agile team supporting enterprise-scale data and cloud initiatives. This role requires strong expertise in Ab Initio (ETL), Python, SQL, and AWS Cloud to design, develop, and support modern data solutions that drive impactful business outcomes. You will work alongside experienced engineers in machine learning, distributed systems, and full-stack development to deliver robust, scalable cloud-based solutions.

Key Responsibilities
• Collaborate with Agile teams to design, develop, test, implement, and support technical data solutions
• Develop and maintain ETL workflows using Ab Initio
• Build and optimize data pipelines using Python and SQL
• Design and implement scalable data solutions within AWS Cloud environments
• Work with distributed data technologies and cloud-based data warehousing platforms (Redshift, Snowflake)
• Contribute to architecture discussions and solution design for microservices and big data systems
• Perform unit testing and participate in peer code reviews to ensure high-quality, high-performance solutions
• Partner with product managers and stakeholders to deliver secure and scalable cloud-based data applications
• Stay current with emerging technologies and contribute to engineering best practices

Required Skills
• Strong hands-on experience with Ab Initio (ETL development)
• Proficiency in Python for data processing and automation
• Advanced SQL skills (query optimization, performance tuning, complex joins)
• Experience designing and deploying solutions in AWS Cloud
• Experience working in Agile development environments
• Strong problem-solving skills and ability to work in a collaborative team setting

Basic Qualifications
• Bachelor's degree
• At least 4 years of experience in application development (internships do not apply)
• At least 2 years of experience with big data technologies
• At least 1 year of experience with cloud computing (AWS preferred; Azure or Google Cloud acceptable)