EPITEC

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in Peoria, IL, offering a 24-month contract at $71.00–$74.00/hr. It requires 5+ years of programming experience and proficiency in Python, SQL, and Snowflake, with a focus on data pipeline development and automation.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
592
🗓️ - Date
October 24, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Peoria, IL
🧠 - Skills detailed
#Data Engineering #Automation #DevOps #Snowflake #Programming #Data Architecture #Databases #Python #Computer Science #SQL (Structured Query Language) #Data Processing #Tableau #Database Systems #ETL (Extract, Transform, Load) #Monitoring #GitHub #Visualization #Data Pipeline #Streamlit
Role description
Title: Data Engineer
Location: Peoria, IL
Details: 24-month contract with ongoing need, opportunity for direct hire, fully onsite role
Pay Rate: $71.00–$74.00/hr, based on benefit inclusions

Job Summary
Caterpillar is seeking a skilled Data Engineer to join a dynamic team focused on automating data workflows and supporting critical reporting functions. This role is essential to ensuring our data architecture aligns with strategic goals and delivers timely, accurate insights across the organization. While initially focused on connectivity, the scope has expanded to include broader responsibilities in data operations and reporting automation. Programming experience is essential, as the role demands hands-on development and problem-solving.

Responsibilities
As a Data Engineer, you'll be part of a 5-person team responsible for reporting, visualization, ETL, and data operations. You'll work in an office-based environment with regular collaboration across project teams. Your typical week will include:
• Developing and maintaining data pipelines using Python, SQL, and Snowflake (see the pipeline sketch at the end of this posting).
• Automating data processing and reporting workflows using Snowflake scripts and notebooks.
• Building internal data applications and visualizations using Streamlit (see the Streamlit sketch at the end of this posting).
• Scheduling and monitoring data jobs to ensure timely delivery, especially during month-end cycles.
• Testing and modifying programs or databases to correct errors and improve performance.
• Updating and maintaining database systems to support evolving business needs.
• Collaborating with project teams to define the scope and limitations of database development.
• Reviewing user requests to estimate time and cost for project delivery.

Years of Experience and Education
• Experience: Minimum of 5 years of hands-on programming experience.
• Education: A Bachelor's degree in Computer Science, Engineering, or a related technical field is preferred. Candidates with strong programming experience will be considered regardless of degree.
• Note: Internships do not count toward the experience requirement.

Skills Required

Top 3 Technical Skills
• Python
• SQL
• Snowflake

Additional Technical Skills
• Streamlit
• VS Code
• GitHub
• DevOps
• Power Automate
• Snowflake notebooks

Desired Technical Skills
• Power BI
• Tableau

Required Soft Skills
• Strong verbal and written communication
• Excellent problem-solving and interpersonal skills
• Ability to work independently and manage time effectively

Desired Soft Skills
• Basic mentoring skills to support and provide constructive feedback
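
For illustration, here is a minimal sketch of the kind of scheduled pipeline step the responsibilities describe, assuming the snowflake-connector-python package. The warehouse, database, and table names (REPORTING_WH, ANALYTICS, RAW_EVENTS, DAILY_SUMMARY) are hypothetical, not part of the posting.

```python
# Hypothetical daily ETL step: aggregate yesterday's raw events into a
# reporting table. All object names below are illustrative assumptions.
import os
import snowflake.connector

def run_daily_summary() -> None:
    """Merge yesterday's event counts into the reporting table."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="REPORTING_WH",   # hypothetical warehouse
        database="ANALYTICS",       # hypothetical database
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        # MERGE keeps the step idempotent, so month-end reruns
        # don't produce duplicate rows.
        cur.execute(
            """
            MERGE INTO DAILY_SUMMARY t
            USING (
                SELECT event_date, COUNT(*) AS event_count
                FROM RAW_EVENTS
                WHERE event_date = CURRENT_DATE - 1
                GROUP BY event_date
            ) s
            ON t.event_date = s.event_date
            WHEN MATCHED THEN UPDATE SET t.event_count = s.event_count
            WHEN NOT MATCHED THEN INSERT (event_date, event_count)
                VALUES (s.event_date, s.event_count)
            """
        )
    finally:
        conn.close()

if __name__ == "__main__":
    run_daily_summary()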
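
And a minimal sketch of the internal Streamlit reporting apps the posting mentions, assuming the streamlit and pandas packages; the report name and column names are hypothetical, and a static frame stands in for a Snowflake query to keep the sketch self-contained. Run it with `streamlit run app.py`.

```python
# Hypothetical internal report: a chart plus table of daily event counts.
import pandas as pd
import streamlit as st

st.title("Daily Event Summary")  # hypothetical report name

@st.cache_data(ttl=3600)  # re-query at most once per hour
def load_summary() -> pd.DataFrame:
    # In practice this would query Snowflake; static data keeps the
    # example runnable on its own.
    return pd.DataFrame(
        {
            "event_date": pd.date_range("2025-10-01", periods=7),
            "event_count": [120, 135, 128, 140, 150, 145, 160],
        }
    )

df = load_summary()
st.line_chart(df.set_index("event_date")["event_count"])
st.dataframe(df)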