VLink Inc

ETL Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This remote role is for an ETL Data Engineer, open to candidates in Southern CA, UT, NV, AZ, and CO; the contract length and pay rate are unspecified. Key skills include ETL tools, SQL, Greenplum, and Unix/Linux.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 6, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Quality #Agile #Batch #Scripting #Shell Scripting #Greenplum #Scala #Consulting #SQL Queries #Linux #Databases #DataStage #Data Engineering #ETL (Extract, Transform, Load) #Data Pipeline #Unix #Data Processing #SQL (Structured Query Language) #Data Architecture #Scrum #Data Extraction #Business Analysis
Role description
Title: ETL Data Engineer

Location: Remote (for candidates in Southern CA, UT, NV, AZ, CO)

About VLink:
Founded in 2006 and headquartered in Connecticut, VLink is one of the fastest-growing digital technology services and consulting companies. Since its inception, our innovative team members have been solving our global clients' most complex business and IT challenges.

Job Description:
The client is looking for a skilled ETL Data Engineer with strong experience in data warehousing, Greenplum, SQL, and Unix environments. The ideal candidate will be responsible for designing, developing, and maintaining ETL pipelines, ensuring high-quality data delivery, and collaborating with cross-functional teams in an Agile/Scrum environment.

Your future duties and responsibilities:
• Design, develop, and maintain ETL workflows using tools like DataStage or similar ETL platforms
• Build and optimize complex SQL queries for data extraction, transformation, and loading
• Work extensively with Greenplum databases for large-scale data processing
• Develop and maintain shell scripts in Unix/Linux environments
• Schedule, monitor, and troubleshoot batch jobs using Control-M
• Ensure data quality, integrity, and consistency across systems
• Perform performance tuning and optimization of ETL processes and database queries
• Collaborate with business analysts, data architects, and stakeholders to gather requirements
• Participate in Scrum ceremonies (daily stand-ups, sprint planning, retrospectives)
• Follow Agile methodologies for iterative development and delivery
• Troubleshoot production issues and provide timely resolution

Required qualifications to be successful in this role:
• 10+ years of experience in data engineering, ETL/ELT development, or related roles
• Strong experience building and maintaining scalable data pipelines
• Strong experience with ETL tools such as IBM DataStage (or similar)
• Proficiency in SQL and relational databases
• Hands-on experience with Greenplum or other MPP databases
• Solid knowledge of Unix/Linux commands and shell scripting
• Experience with Control-M or other job scheduling tools
• Understanding of data warehousing concepts (fact/dimension tables, star schema)
• Familiarity with Agile/Scrum methodologies

Employment Practices:
EEO, ADA, FMLA Compliant. VLink is an equal opportunity employer. At VLink, we are committed to embracing diversity, multiculturalism, and inclusion. VLink does not discriminate on the basis of race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law. All aspects of employment, including the decision to hire, promote, or discharge, will be decided on the basis of qualifications, merit, performance, and business needs.