CEI

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Data Engineer position on a 6-month contract-to-hire basis, onsite in Blue Ash, OH, with a pay rate of $65-75/hr. Key skills include Azure, Python, Spark, and automation. Local candidates only.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
πŸ—“οΈ - Date
October 10, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Blue Ash, OH
-
🧠 - Skills detailed
#Data Science #Azure #Ansible #Jenkins #Spark (Apache Spark) #Scala #NoSQL #Agile #Python #Automation #Migration #Cloud #ETL (Extract, Transform, Load) #API (Application Programming Interface) #Databricks #Puppet #Deployment #Azure Databricks #Programming #Data Security #Data Strategy #Data Engineering #Security #Data Catalog #Infrastructure as Code (IaC) #SQL (Structured Query Language) #Strategy #Databases
Role description
Summary
The team is seeking a Data Engineer experienced in implementing data solutions in Azure. This role involves analyzing, designing, and developing enterprise data and information architecture deliverables, treating data as an asset for the enterprise. The Data Engineer will also support Infrastructure as Code (IaC) initiatives by engineering scalable, reliable, and resilient software running in the cloud. Candidates must be local to Blue Ash, OH, and available to work onsite.
• 6-month contract-to-hire
• Onsite 4 days per week in Blue Ash
• $65-75/hr
Responsibilities
• Develop and deliver technological responses to targeted business outcomes.
• Analyze, design, and develop enterprise data and information architecture deliverables, focusing on data as an asset for the enterprise.
• Understand and follow reusable standards, design patterns, guidelines, and configurations to deliver valuable data and information across the enterprise, including direct collaboration with 84.51 where needed.
• Apply cloud computing skills to deploy upgrades and fixes.
• Administer and configure API gateways (e.g., Apigee or Kong).
• Design, develop, and implement integrations based on user feedback.
• Troubleshoot production issues and coordinate with development teams to streamline code deployment.
• Implement automation tools and frameworks (CI/CD pipelines).
• Analyze code and communicate detailed reviews to development teams to ensure improvement and timely completion of products.
• Collaborate with team members to improve engineering tools, systems, procedures, and data security.
• Deliver quality customer service and resolve end-user issues promptly.
• Draft architectural diagrams, interface specifications, and other design documents.
• Participate in developing and communicating data strategy and roadmaps to support the project portfolio and business strategy.
• Drive the development and communication of enterprise standards for data domains and data solutions, focusing on simplified integration and streamlined operational and analytical uses.
• Drive digital innovation by leveraging new technologies and approaches to renovate, extend, and transform core data assets, including SQL-based, NoSQL-based, and cloud-based data platforms.
• Define high-level migration plans to address gaps between current and future states.
• Lead analysis of the technology environment to detect critical deficiencies and recommend solutions for improvement.
• Mentor team members in data principles, patterns, processes, and practices.
• Promote the reuse of data assets, including management of the data catalog for reference.
• Draft and review architectural diagrams, interface specifications, and other design documents.
Qualifications
• Experience implementing data solutions in Azure.
• 2+ years' experience with production automation systems (Ansible Tower, Jenkins, Puppet, or Selenium).
• Working knowledge of databases and SQL.
• Experience with software development methodologies and the SDLC.
• Problem-solving attitude and ability to work independently.
• Must be organized, able to balance multiple priorities, and self-motivated.
• Top 3 skills: Azure Databricks, Python, and Spark.
• Soft skills: problem solving, attention to detail, and the ability to work independently and as part of an agile team.
• Team: 10 members, working independently with pair programming throughout the day.
• Required working hours: 9–4 EST, with some flexibility.
• Travel: none.
• Only local candidates will be considered.
About the Client
Our client is a leading organization committed to driving innovation in data science and advanced analytics for predictive marketing.