Fastechnowiz Solutions LLC

Data Engineer Lead

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer Lead position for a 6-month contract, offering $57.05 - $60.70 per hour. Key skills include data extraction, cleansing, and SAP ERP implementation. Requires 5+ years of tech experience and proficiency in Azure and AWS data solutions.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
🗓️ - Date
February 13, 2026
🕒 - Duration
6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Lisle, IL 60532
-
🧠 - Skills detailed
#Azure Data Factory #Data Migration #AWS Glue #Data Extraction #SAP HANA #Python #Data Engineering #Visualization #Documentation #Pandas #Informatica IDQ (Informatica Data Quality) #Spark (Apache Spark) #MDM (Master Data Management) #PeopleSoft #PySpark #ADLS (Azure Data Lake Storage) #SQL (Structured Query Language) #Informatica #Data Pipeline #Cloud #Data Mapping #ADF (Azure Data Factory) #AWS (Amazon Web Services) #Data Quality #Migration #SAP #ETL (Extract, Transform, Load) #Data Cleansing #Data Analysis #Data Management #Synapse #Talend #Data Governance #Data Conversion #Azure
Role description
Job Description: As a Data Engineer Lead, you will play a critical role in ensuring the accuracy and reliability of our data. You will be responsible for selecting appropriate tools, performing technical data extraction, writing programs, and managing data from various sources, including PeopleSoft, BAAN, and Infor LN. You will leverage cutting-edge analytics to unlock the potential within our data, guide critical business decisions, and propel sustainable growth. Your expertise will help us maintain clean, usable data, free from duplication and errors.
Key Responsibilities
• Lead the data conversion team through scoping, extraction, mapping, and loading of both master and transactional data sets from multiple sources into the SAP system across the project phases.
• Design, develop, and implement components to migrate data from legacy platforms to SAP ERP platforms.
• Define development and test templates and methods for extracting data from various sources, with a strong emphasis on PeopleSoft, BAAN, and Infor LN systems.
• Evaluate and select appropriate tools and technologies for data extraction and cleansing.
• Write programs and scripts to automate data extraction, cleansing, transformation, and load processes (see the sketch after this list).
• Identify and resolve data duplication issues, perform data cleansing to ensure the accuracy and quality of data, and participate in governance to prevent duplication at the source.
• Create and maintain data mapping documents that map source data attributes to business partner (customer/supplier) attributes.
• Monitor data quality issues and refine data quality rules and checks as needed.
• Collaborate with cross-functional teams and with technology and architecture groups to understand data requirements and implement data-centric solutions.
• Document processes, methodologies, and best practices for data extraction and cleansing.
• Monitor data quality and continuously improve data management processes.
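By way of illustration, here is a minimal PySpark sketch of the extract-cleanse-deduplicate-load pattern these responsibilities describe. The paths, column names, and dedup key are assumptions for the example, not details from the posting.

```python
# Hypothetical sketch: extract supplier records from a legacy export (a CSV
# stands in for a PeopleSoft/BAAN/Infor LN extract), cleanse the matching
# fields, deduplicate, and stage the result for an SAP load. Paths, column
# names, and the dedup key are illustrative assumptions.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("legacy-supplier-cleanse").getOrCreate()

# Extract: read the legacy export (assumed layout).
raw = spark.read.option("header", True).csv("/landing/legacy/suppliers.csv")

# Cleanse: trim names and normalize tax IDs to uppercase alphanumerics.
cleansed = (
    raw.withColumn("supplier_name", F.trim(F.col("supplier_name")))
       .withColumn(
           "tax_id",
           F.regexp_replace(F.upper(F.col("tax_id")), r"[^0-9A-Z]", ""),
       )
       .filter(F.col("tax_id") != "")
)

# Deduplicate: keep the most recently updated record per tax ID.
w = Window.partitionBy("tax_id").orderBy(F.col("last_updated").desc())
deduped = (
    cleansed.withColumn("rn", F.row_number().over(w))
            .filter(F.col("rn") == 1)
            .drop("rn")
)

# Load: write a clean staging set for the SAP migration tooling to pick up.
deduped.write.mode("overwrite").parquet("/staging/sap/business_partner/suppliers")
```

In practice, a staging output like this would be consumed by whatever migration platform is in use (e.g., Syniti), with the data mapping document driving the column-level transformations.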
Minimum Requirements
• 5+ years of overall technology experience, including at least 4 years of hands-on software development, data engineering, and systems architecture.
• Experience deploying modern data solutions built on components such as Azure Functions, Azure Data Factory, or AWS Glue.
• Experience with data migration platforms such as Syniti or IBM DataStage.
• Experience with at least one data quality product, such as Informatica Data Quality, Talend DQ, or Ataccama.
• Experience implementing SAP ERP solutions (at least one or two implementations), with hands-on SAP HANA experience.
Preferred Qualifications
• Proven experience in data extraction, data cleansing, and data management.
• Strong proficiency in Python, Spark, PySpark, PL/SQL, Pandas, or similar.
• Hands-on experience with Syniti, Azure Data Factory, Synapse, ADLS, and/or AWS Glue and other AWS data services.
• Ability to design, develop, optimize, and maintain data pipelines on Azure/AWS cloud platforms.
• Thorough understanding of data quality, master data management concepts, and ETL/ELT operations.
• Solid experience developing and maintaining data quality dashboards (a minimal example appears at the end of this posting).
• Solid experience maintaining a data quality issue tracker and following up with data stewards or data owners to fix critical issues.
• Strong data analysis background, including the planning and hands-on execution of data cleansing and complex data conversions.
• Ability to develop reports and to analyze, parse, and manipulate data using SQL and tools such as Excel, Access, and SQL*Loader.
• Strong problem-solving skills and attention to detail.
• Ability to work independently and as part of a team.
• Excellent communication and documentation skills.
• Experience with data visualization tools and techniques; familiarity with data warehousing processes.
• Knowledge of data governance and data quality best practices.
Job Type: Contract
Pay: $57.05 - $60.70 per hour
Expected hours: 40 per week
Benefits: Flexible schedule
Work Location: In person
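As a closing illustration of the data quality rule and dashboard work listed under Preferred Qualifications, here is a minimal Pandas sketch; the rule names, columns, and source file are assumptions for the example, not the employer's actual rule set.

```python
# Hypothetical sketch: evaluate a few data quality rules over a staging
# extract and summarize failures per rule. Columns, rules, and the input
# file are illustrative assumptions.
import pandas as pd

df = pd.read_csv("staging_business_partners.csv")  # assumed staging extract

# Each rule returns a boolean Series marking failing rows.
rules = {
    "missing_tax_id": df["tax_id"].isna() | (df["tax_id"].str.strip() == ""),
    "duplicate_tax_id": df["tax_id"].duplicated(keep=False),
    "bad_country_code": ~df["country"].isin(["US", "CA", "MX"]),
}

# Summarize failures per rule; a table like this is the kind of input a
# data quality dashboard or an issue-tracker entry would be built from.
summary = pd.DataFrame(
    {"failed_rows": {name: int(mask.sum()) for name, mask in rules.items()}}
)
summary["pct_of_total"] = (summary["failed_rows"] / len(df) * 100).round(2)
print(summary)
```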