Ark Infotech LLC

ETL Developer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL Developer located in Woodlawn, MD, with a contract length of 1+ year and a pay rate of "TBD." Key skills include ETL processes, SQL proficiency, Linux/UNIX, Tableau dashboard development, and experience with federal clients.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 17, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Yes
-
📍 - Location detailed
Woodlawn, MD
-
🧠 - Skills detailed
#Documentation #Tableau Desktop #Computer Science #SQL Server #Data Mapping #Migration #Unix #Bash #SQL (Structured Query Language) #BitBucket #Data Architecture #PostgreSQL #Oracle #Axon #"ETL (Extract, Transform, Load)" #Linux #DevSecOps #Data Warehouse #Jira #Ab Initio #Shell Scripting #Greenplum #Data Management #Security #Scrum #Compliance #Data Pipeline #Tableau Server #Automation #Data Migration #Agile #Business Analysis #GIT #Scripting #Visualization #Tableau #Version Control #Informatica #BI (Business Intelligence) #Python #Data Analysis
Role description
Position: ETL Developer
Location: Woodlawn, MD (100% on-site, 5 days/week)
Duration: 1+ year

Candidates must be local to Woodlawn, MD, or within a one-hour drive.

Primary Responsibilities:
• Design and develop Extract, Transform, and Load (ETL) processes to support MI/BI efforts, with strong knowledge of data warehousing concepts.
• Develop advanced and complex ETL pipelines and custom shell/Python scripting.
• Design, develop, and maintain Tableau dashboards and data visualizations to support business intelligence and management reporting needs; connect Tableau to enterprise data sources including Greenplum/PostgreSQL, DB2, SQL Server, and Oracle.
• Perform data mapping and source-to-target analysis across multiple data sources, including DB2, Oracle, Greenplum, and flat files.
• Read, debug, modify, and create complex SQL code associated with a data warehouse environment.
• Load and integrate data from multiple source systems into a Greenplum MPP database; apply expert knowledge of data distribution, table partitioning, and compression strategies.
• Design, develop, test, and optimize ETL solutions for processing large volumes of data; implement efficient migration processes to move ETL objects across development, test, and production environments.
• Translate business requirements into technical specifications; collaborate with Business Analysts, Data Architects, Enterprise Architects, and ETL Architects to ensure solutions meet requirements.
• Identify and promote reuse of ETL components and services to avoid duplicative implementations and reduce technical debt.
• Work with configuration management to maintain software versions using Bitbucket/Git; prepare and deploy software packages per SSA change management standards.
• Prepare and maintain comprehensive technical documentation; deliver knowledge transfer sessions to SSA staff as needed.
• Ensure compliance with SSA security, privacy, and data management policies throughout all development activities.
• Mentor junior team members in developing, documenting, and modifying ETL components and best practices.
• Participate in all design reviews, requirement sessions, and team problem-solving efforts; communicate technical concepts effectively to both technical and non-technical stakeholders.

Minimum Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field with 10+ years of relevant ETL development experience. Additional years of relevant experience may be accepted in lieu of a degree.
• Linux/UNIX proficiency, including Bash shell scripting and Linux OS administration in a database and ETL context.
• Hands-on experience with ETL tools such as Informatica, Ab Initio, and Syncsort/Precisely; proficiency with psql, file transfer protocols (FTP/SFTP), and Bitbucket for version control.
• Experience scripting in a Linux environment to automate ETL solutions and data migration jobs.
• Strong SQL proficiency across multiple platforms; experience with PL/pgSQL for PostgreSQL and Greenplum environments and PL/SQL for Oracle environments; ability to select the appropriate approach based on platform and use case.
• Extensive experience with SQL mapping and integration across DB2, Oracle, and Greenplum database platforms.
• Strong experience extracting data from multiple source systems, including DB2, Oracle, PostgreSQL, Greenplum, and flat files.
• Tableau dashboard development experience, including connecting to enterprise data sources, building calculated fields and parameters, designing interactive dashboards and scorecards, and publishing workbooks to Tableau Server for BI/MI reporting.
• Full comprehension of SQL, cardinality, levels of granularity, normalized vs. de-normalized data models, and data architecture best practices.
• Hands-on experience designing and developing ETL solutions involving large volumes of data; experience with large-scale, complex data migration efforts.
• Strong understanding of data recovery and job rerun procedures in an ETL environment.
• Working knowledge of developing optimal code to maintain high performance when processing large volumes of data.
• Strong attention to detail with excellent analytical, organizational, and troubleshooting skills.
• Excellent written and verbal communication skills.
• Ability to obtain and maintain a Public Trust clearance (U.S. citizenship or lawful permanent residence required).

Desired Qualifications:
• Active Public Trust or higher-level security clearance.
• Prior experience supporting federal government clients, particularly SSA or other large civilian agencies.
• Strong database skills.
• Experience with Python scripting for ETL automation, data pipeline development, and operational tasks.
• Familiarity with Agile/Scrum methodologies and DevSecOps practices in a federal IT environment.
• Experience with Jira and Confluence.
• Knowledge of FISMA compliance requirements and experience working within an ATO (Authority to Operate) framework.
• Tableau Desktop Specialist or Tableau Certified Data Analyst certification.
• Informatica AXON or Ab Initio professional certification.