

Senior Storage Analyst - Remote / Telecommute
⭐ - Featured Role | Apply direct with Data Freelance Hub
This is a remote contract role for a Senior Storage Analyst supporting a client in Edmonton, AB. It requires a Bachelor's degree, 5+ years each in Cloudera Data Platform, Hadoop/Spark, and Linux administration, plus security management experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
September 24, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Remote
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Edmonton, AB
-
🧠 - Skills detailed
#Data Privacy #AI (Artificial Intelligence) #YARN (Yet Another Resource Negotiator) #Monitoring #LDAP (Lightweight Directory Access Protocol) #Cloud #Cloudera #HDFS (Hadoop Distributed File System) #Microsoft Azure #ML (Machine Learning) #Compliance #Snowflake #Zookeeper #Capacity Management #Kerberos #GIT #Computer Science #Data Strategy #Azure #Agile #Version Control #Spark (Apache Spark) #Automated Testing #Hadoop #Security #Scala #Strategy #Linux #Migration #Storage #Deployment #Documentation
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Cynet Systems, is seeking the following. Apply via Dice today!
We are looking for a Senior Storage Analyst - Remote / Telecommute for our client in Edmonton, AB
Job Title: Senior Storage Analyst - Remote / Telecommute
Job Location: Edmonton, AB
Job Type: Contract
Job Description:
• The Data Platform Analyst will be responsible for administering and supporting enterprise-level data platforms and cloud technologies, with a primary focus on Cloudera Data Platform, Hadoop/Spark, Azure, and Snowflake.
• This role involves ensuring performance, security, scalability, and compliance across cloud and on-premises environments, while also supporting analytics, AI/ML workloads, and enterprise data strategies.
Responsibilities:
• Perform administration and support of Cloudera Data Platform and related tools.
• Support and optimize Azure and Snowflake platforms.
• Configure and tune cloud-based environments to ensure cost efficiency, security, and compliance.
• Implement and enforce data privacy and security controls in line with enterprise policies.
• Deploy and support analytics, AI/ML, and statistical programs across data platforms.
• Develop technical design, perform capacity planning, cluster setup, performance tuning, monitoring, and scaling.
• Collaborate with infrastructure, network, database, and application teams to ensure platform reliability and availability.
• Manage Hadoop ecosystem components (YARN, HDFS, Hive, Spark, Zookeeper, Kerberos, Ranger, etc.).
• Install, configure, and support Linux (RHEL) and Windows systems in enterprise environments.
• Perform Hadoop platform performance tuning, cluster security, and storage management.
• Set up and manage Hadoop users, Active Directory integrations, and security policies with Apache Ranger.
• Provide file system management, monitoring, and support for HDFS.
• Support enterprise storage solutions and maintain backup/recovery processes.
• Maintain technical documentation, processes, diagrams, and best practices.
• Work with stakeholders, vendors, and internal teams to deliver reliable data solutions.
• Contribute to enterprise data strategy design and modern platform features.
• Participate in Agile processes, sprint planning, version control (Git), CI/CD pipelines, and automated testing.
• Support multiple concurrent projects and deliverables with high quality and within timelines.
Requirement / Must Have:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Minimum 5 years of hands-on Cloudera Data Platform administration (monitoring, configuration, deployment, upgrades, user management, troubleshooting).
• Minimum 5 years of Hadoop/Spark administration, including cluster deployment, high availability, job monitoring, backup/recovery, and security.
• Minimum 5 years of Linux (RHEL/CentOS) administration, including deployment, configuration, performance tuning, and troubleshooting.
• Strong experience with system troubleshooting, capacity management, and OS/storage/network fundamentals.
• Minimum 5 years of Security & Identity Management experience (Kerberos, Active Directory, LDAP).
Should Have / Nice to Have:
• Ability to participate in a 24/7 on-call rotation and work off-hours during change windows.
• 3+ years of experience with Cloudera Machine Learning (CML) administration, upgrades, migration, and troubleshooting.
• 2+ years of experience administering Microsoft Azure environments.
• 2+ years of experience administering Snowflake environments.
• 4+ years of experience working in complex IT environments requiring task prioritization.
• 4+ years of experience in multi-team IT support environments providing services to multiple stakeholders.