

SNI Technology
Data Scientist
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Scientist, contract-to-hire, onsite in Ogden, UT, with a pay rate up to $75/hr. Requires US citizenship, active Secret Clearance, IAT Level II certification, and 3+ years in AI/ML model development, preferably in cybersecurity.
Country: United States
Currency: $ USD
Day rate: 818
Date: October 1, 2025
Duration: More than 6 months
Location: On-site
Contract: Unknown
Security: Yes
Location detailed: Ogden, UT
Skills detailed: #Spark (Apache Spark) #Tableau #Airflow #Data Lake #Computer Science #Python #Monitoring #Data Science #Visualization #Cybersecurity #Scala #Security #Cloud #Automation #Anomaly Detection #Apache NiFi #Data Integration #Big Data #Compliance #AWS (Amazon Web Services) #Jupyter #Azure #Leadership #NiFi (Apache NiFi) #ML (Machine Learning) #ETL (Extract, Transform, Load) #Bash #Scripting #Hadoop #Deployment #GCP (Google Cloud Platform) #Data Ingestion #AI (Artificial Intelligence) #Data Framework #Cloudera #Docker
Role description
Senior Data Scientist
Location: Onsite Monday-Friday in Ogden, UT
Citizenship: Must be a US Citizen
Clearance Required: Active Secret Clearance (or ability to obtain/renew)
Certification Required: IAT Level II
Type: Contract-to-Hire
Pay: Up to $75/hr (conversion salary: $160K-$180K), negotiable based on background and experience.
About the Role
SNI Technology/Paladin is representing our client, a defense subcontractor, in its search for a Data Scientist to support a U.S. Air Force program within a secure data center environment. This is a mission-critical role where you will develop AI models, build a scalable data lake using Cloudera, create advanced dashboards in Tableau, and contribute to cybersecurity initiatives through automation and compliance.
This opportunity is ideal for a motivated, versatile professional who thrives in a fast-paced, highly secure, and collaborative environment.
Key Responsibilities
• Design, train, and deploy AI/ML models to support program objectives
• Build and maintain a secure, scalable data lake architecture using Cloudera
• Develop interactive dashboards and visualizations with Tableau
• Implement cybersecurity analytics use cases such as anomaly detection, threat modeling, and behavioral analysis (an illustrative sketch follows this list)
• Automate data ingestion, transformation, and monitoring processes (Python, Bash, PowerShell)
• Ensure compliance with DoD STIGs, NIST, and other federal cybersecurity standards
• Collaborate with cybersecurity teams to integrate data science outputs into security workflows
• Document workflows, pipelines, and model performance for transparency and reproducibility
• Provide technical guidance and support to leadership and stakeholders
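For context on the anomaly-detection work mentioned above, here is a minimal, illustrative Python sketch (not part of the client's requirements): it assumes scikit-learn, numpy, and pandas are available and uses synthetic per-host features in place of real program data.

    # Illustrative anomaly-detection sketch using an Isolation Forest.
    # Feature names and values are synthetic stand-ins, not program data.
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(42)

    # Synthetic per-host features of the kind an ingestion pipeline might produce.
    features = pd.DataFrame({
        "bytes_out": rng.normal(5_000, 500, size=1_000),
        "failed_logins": rng.poisson(1.0, size=1_000),
        "distinct_ports": rng.integers(1, 20, size=1_000),
    })

    # Inject a few obvious outliers so the detector has something to flag.
    features.loc[:4, "bytes_out"] = 50_000
    features.loc[:4, "failed_logins"] = 30

    # Fit an unsupervised detector; fit_predict() returns -1 for anomalies, 1 for inliers.
    model = IsolationForest(contamination=0.01, random_state=42)
    features["anomaly"] = model.fit_predict(features)

    print(features[features["anomaly"] == -1])

In practice the same pattern would run against features drawn from the Cloudera data lake, with flagged records surfaced to the security team or a Tableau dashboard.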
Required Qualifications
• Active Secret Clearance or ability to obtain/renew one
• IAT Level II Certification
• Bachelor's or Master's degree in Data Science, Computer Science, Cybersecurity, or related field
• 3+ years of experience in AI/ML model development and deployment
• Hands-on experience with Cloudera, Tableau, and scripting languages (Python, Bash, PowerShell)
• Experience applying data science in cybersecurity contexts
Preferred Qualifications
• Experience supporting government or defense-related programs
• Knowledge of Jupyter, Apache NiFi, or Airflow for workflow automation (an example DAG sketch follows this list)
• Familiarity with SIEM platforms and security data integration
• Exposure to containerization (Docker), cloud platforms (AWS, Azure, OCI, GCP), and CI/CD pipelines
• Experience with big data frameworks (Hadoop, Spark, Hive)
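To make the workflow-automation item above concrete, here is a minimal sketch of an Apache Airflow DAG, assuming Airflow 2.4+ (where the schedule argument replaces schedule_interval); the DAG name and task bodies are hypothetical placeholders, not taken from the client's environment.

    # Illustrative Airflow DAG: daily ingest -> transform -> monitor.
    # Task bodies are placeholders; a real pipeline would call the program's
    # ingestion, ETL, and alerting code here.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def ingest():
        print("pull new records from the upstream source")       # placeholder

    def transform():
        print("clean records and load them into the data lake")  # placeholder

    def monitor():
        print("check row counts and freshness; alert on gaps")   # placeholder

    with DAG(
        dag_id="daily_ingest_pipeline",   # hypothetical name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        monitor_task = PythonOperator(task_id="monitor", python_callable=monitor)

        # Enforce strict ordering: ingest, then transform, then monitor.
        ingest_task >> transform_task >> monitor_task

A similar chain could be expressed in Apache NiFi or as a scheduled Jupyter job; Airflow is shown only because it is one of the tools named in the listing.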