Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in Cincinnati, OH, on a W2 contract. Key skills include DBT, Snowflake, and DataStage, along with experience in big data technologies. A strong software development background and experience with cloud providers are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
September 24, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
On-site
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Cincinnati, OH
-
🧠 - Skills detailed
#DBA (Database Administrator) #ETL (Extract, Transform, Load) #Monitoring #Cloud #Big Data #Docker #PHP #Kafka (Apache Kafka) #Python #Snowflake #JavaScript #Data Engineering #dbt (data build tool) #Ansible #Puppet #Automation #Data Science #Ruby #Spark (Apache Spark) #Hadoop #Security #Data Warehouse #Scala #Kubernetes #API (Application Programming Interface) #Linux #DataStage #AWS (Amazon Web Services) #Deployment
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Ekcel Technologies Inc, is seeking the following. Apply via Dice today!

Role: Data Engineer
Location: Cincinnati, OH (Onsite)
Mode: Contract on W2

JD:
• Data Engineers with DBT, Snowflake, and DataStage experience.

Job Summary:
Responsible for building outstanding software solutions to drive the success of the business and for building the parts of the company's infrastructure that power innumerable conversations at scale.

Primary Responsibilities:
• Maintain the company platform's uptime, performance, stability, and scalability
• Design, guide, mentor, and challenge system architecture and design with others
• Develop and maintain a public API
• Develop the most robust and extensible solutions possible from feature requests
• Work with big data technology (Kafka, Hadoop, Spark, etc.)
• Work with Data Scientists to develop rich, value-added features
• Work with DBAs to create ETL and Data Warehouse systems
• Work with Operations to automate solutions and increase service reliability
• Closely monitor all platform-related production systems
• Build tools and processes to support analytics, monitoring, machine-learning, and data-warehousing platforms
• Define and implement strategies covering everything from subnets to backups to fog networking/computing configuration and deployments
• Provision, configure, maintain, back up, and monitor on-site and cloud-based server resources
• Define and implement deployment strategies for client-facing and internal tool systems
• Continually improve and fine-tune the various alerting and monitoring systems

Qualifications:
• Experience in software/systems development
• Strong software development background and experience building software systems
• Working knowledge of at least one of the following languages: PHP, Ruby, Python, JavaScript, Elixir, Go, or comparable
• Strong background in Linux administration
• Strong experience with cloud providers such as AWS, DigitalOcean, Google Cloud, etc.
• Strong understanding of IT security best practices
• Experience with automation/configuration software (Puppet, Ansible) and/or orchestration software (Docker Swarm, Kubernetes, etc.)
• Understanding of computer networks
• Experience with administration at production scale