METRIX IT SOLUTIONS INC

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer based in Chicago, IL, with a contract length of "unknown" and a pay rate of "unknown." Key skills include Airflow, Python, Azure, and experience in data pipeline development, integration, and quality management.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 27, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Databases #Python #ETL (Extract, Transform, Load) #Security #Airflow #Data Quality #Database Management #Azure #Documentation #Storage #Datasets #Scala #Agile #Data Engineering #Data Extraction #Data Integration #Automation #Data Pipeline #Stories
Role description
Sr. Data Engineer, local to Chicago, IL (onsite role). In-person interview required.
• Data Pipeline Development: Design, build, and maintain robust, scalable, and efficient data pipelines to collect, process, and store large volumes of data from various sources. Hands-on experience with Airflow, Python, and Azure required.
• Data Integration: Integrate data from multiple sources, including APIs, databases, and external datasets, ensuring data consistency and reliability.
• Data Modelling: Develop and maintain data models and schemas that support efficient data storage, retrieval, and analytics.
• Database Management: Manage and optimize databases, ensuring their performance, availability, and security.
• Data Quality: Implement and monitor data quality checks to ensure the accuracy, completeness, and consistency of data. Perform the analysis required to troubleshoot data-related issues and assist in their resolution.
• Automation: Configure data extraction and load jobs, and automate repetitive tasks and processes to improve efficiency and reduce errors. Improve the performance of jobs and pipelines by applying optimization and automation techniques.
• Documentation: Maintain clear and comprehensive documentation of data pipelines, data models, and data integration processes. Maintain time allocation reports.
• Business Understanding: Develop an understanding of business processes and a high-level understanding of high-quality digital product delivery.
• Ways of Working: Follow Agile and SDLC processes, including the creation of data-related user stories. Perform peer code reviews and testing per SDLC or industry standards.
• Collaboration: Work closely with internal customers, product owners, core team members, and other stakeholders to understand their data needs and provide appropriate solutions.
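The responsibilities above center on pipeline development and data-quality checks. A minimal, self-contained Python sketch of that extract → validate → transform → load flow (all function and field names are hypothetical, and in practice each step would run as an Airflow task rather than a plain function):

```python
# Illustrative ETL sketch; not part of the posting. Each stage mirrors a
# responsibility above: extraction, data-quality checks, transformation, load.

def extract() -> list[dict]:
    # Stand-in for pulling rows from an API or database.
    return [
        {"id": 1, "amount": "10.50"},
        {"id": 2, "amount": "7.25"},
        {"id": 2, "amount": "7.25"},  # duplicate row, caught by the check below
    ]

def check_quality(rows: list[dict]) -> list[dict]:
    # Simple data-quality checks: required fields present, no duplicate ids.
    seen: set = set()
    clean = []
    for row in rows:
        if "id" not in row or "amount" not in row:
            continue  # incomplete row
        if row["id"] in seen:
            continue  # duplicate row
        seen.add(row["id"])
        clean.append(row)
    return clean

def transform(rows: list[dict]) -> list[dict]:
    # Cast amounts to float so downstream analytics can aggregate them.
    return [{"id": r["id"], "amount": float(r["amount"])} for r in rows]

def load(rows: list[dict], store: dict) -> None:
    # Stand-in for writing to a warehouse table keyed by id.
    for r in rows:
        store[r["id"]] = r["amount"]

warehouse: dict = {}
load(transform(check_quality(extract())), warehouse)
print(warehouse)  # {1: 10.5, 2: 7.25}
```

In an Airflow deployment, each of these functions would typically become its own task so that failures are retried and monitored per stage.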