

Cloud and Big Data Tools Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Cloud and Big Data Tools Engineer with a 12-month contract in Dallas, TX or Charlotte, NC (Hybrid). Requires strong experience with big data platforms, data virtualization tools, cloud environments, and DevOps practices. Certifications in Cloudera or OpenShift preferred.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
September 13, 2025
Project duration
More than 6 months
Location type
Hybrid
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
Dallas, TX
Skills detailed
#Cloud #Programming #Data Engineering #Automation #Jupyter #Scripting #Hadoop #Data Lake #Dremio #Documentation #Splunk #Virtualization #DevOps #Data Science #Big Data #Cloudera #Grafana
Role description
Outstanding long-term contract opportunity! A well-known Financial Services Company is looking for an Infrastructure Engineer / Cloud and Big Data Tools Engineer in Dallas, TX or Charlotte, NC (Hybrid).
Work with the brightest minds at one of the largest financial institutions in the world. This is a long-term contract opportunity that includes a competitive benefit package! Our client has been around for over 150 years and is continuously innovating in today's digital age. If you want to work for a company that is not only a household name, but also truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.
Contract Duration: 12 months+, with possible extensions. W2 only; Green Card, USC, or H4EAD candidates only.
Required Skills & Experience
• Strong experience with big data platforms: MapR, Hortonworks, Cloudera Data Platform.
• Hands-on expertise with data virtualization tools: Dremio, JupyterHub, AtScale.
• Proficiency in deploying and managing tools in cloud and containerized environments (CDP, OCP).
• Solid understanding of platform engineering, automation scripting, and DevOps practices.
• Proven ability to troubleshoot complex issues and perform root cause analysis.
• Experience in leading technical efforts and mentoring team members.
• Experience with Dremio, Hadoop, Splunk, and Grafana.
What You Will Be Doing
• Administer and support tools on the data private cloud, including CDP, HWX, and MapR.
• Install, configure, and maintain data analytical and virtualization tools such as Dremio, JupyterHub, and AtScale across multiple clusters.
• Develop proof-of-concept solutions leveraging CDP and OCP technologies.
• Deploy tools, troubleshoot issues, perform root cause analysis, and remediate vulnerabilities.
• Act as a technical subject matter expert, supporting programming staff during development, testing, and implementation phases.
• Develop automation scripts for configuration and maintenance of data virtualization tools (a minimal sketch follows this list).
• Lead complex platform design, coding, and testing efforts.
• Drive advanced modeling, simulation, and analysis initiatives.
• Maintain comprehensive documentation of Hadoop cluster configurations, processes, and procedures.
• Generate reports on cluster usage, performance metrics, and capacity utilization.
• Work closely with data engineers, data scientists, and other stakeholders to understand their requirements and provide necessary support.
• Collaborate with IT infrastructure teams to integrate Dremio and Hadoop clusters with existing systems and services.
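For illustration only, not part of the posting's requirements: a minimal Python sketch of the kind of automation script referenced above, assuming a Dremio-style REST API. The host URL, credentials, and source definition are placeholders, and the endpoint paths (/apiv2/login, /api/v3/catalog) should be verified against the Dremio version actually deployed.

```python
"""Sketch: idempotently register a data source in Dremio via its REST API.

Assumptions (not from the posting): DREMIO_URL, credentials, and the NAS
source payload are placeholders supplied via environment variables.
"""
import os

import requests

DREMIO_URL = os.environ.get("DREMIO_URL", "https://dremio.example.com:9047")  # placeholder host


def login(user: str, password: str) -> str:
    """Authenticate and return the token Dremio expects in the Authorization header."""
    resp = requests.post(
        f"{DREMIO_URL}/apiv2/login",
        json={"userName": user, "password": password},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["token"]


def ensure_source(token: str, source_def: dict) -> None:
    """Create the source only if it is not already present in the catalog."""
    headers = {"Authorization": f"_dremio{token}"}
    catalog = requests.get(f"{DREMIO_URL}/api/v3/catalog", headers=headers, timeout=30)
    catalog.raise_for_status()
    existing = {item.get("path", [None])[0] for item in catalog.json().get("data", [])}
    if source_def["name"] in existing:
        print(f"source {source_def['name']} already present, skipping")
        return
    created = requests.post(
        f"{DREMIO_URL}/api/v3/catalog",
        headers=headers,
        json={"entityType": "source", **source_def},
        timeout=30,
    )
    created.raise_for_status()
    print(f"created source {source_def['name']}")


if __name__ == "__main__":
    token = login(os.environ["DREMIO_USER"], os.environ["DREMIO_PASSWORD"])
    # Hypothetical NAS source definition; real configurations would come from version control.
    ensure_source(token, {"name": "analytics_nas", "type": "NAS", "config": {"path": "/data"}})
```

In practice a script like this would be driven from version-controlled configuration and run by the platform's automation tooling rather than by hand.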
Preferred Qualifications:
• Certifications in Cloudera, OpenShift, or related technologies.
• Experience with enterprise-level data lake architectures and governance.