Idexcel

Data Scientist

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Scientist (Databricks) in Raleigh, NC, hybrid for 12+ months, with an unspecified pay rate. Requires 15+ years of IT experience, strong Azure Cloud skills, and proficiency in Databricks, Python, and Machine Learning.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 14, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Raleigh, NC
-
🧠 - Skills detailed
#Monitoring #Shell Scripting #Database Management #Data Science #Perl #Cloud #Spark (Apache Spark) #Azure #Scripting #GIT #Programming #Tableau #Continuous Deployment #Ruby #NLP (Natural Language Processing) #Deployment #Python #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Databricks #Databases #ML (Machine Learning) #Apache Spark #Big Data #SAS #Documentation #Azure Cloud #Delta Lake #Hadoop #C++ #DevOps #Migration #Sqoop (Apache Sqoop) #Storage
Role description
Job Title: Data Scientist (Databricks)
Location: Raleigh, NC (Hybrid – 4 Days Onsite / 1 Day Remote)
Duration: 12+ Months

Description
Our client is seeking a resource well skilled in Cloud Architecture to support the following technologies (Delta Lake Storage, Big Data, Machine Learning Analytics, Apache Spark) and help meet organizational priorities. As a technical expert, the resource provides advice and assistance on state-of-the-art software/hardware solutions involving hardware of various capacities, multiple operating environments, database management systems, specialized software, data communications facilities and protocols (including Value Added Networks), fourth-generation technologies, and complex software tools or packages. As a business expert, the resource works with senior client officials to identify enterprise improvement goals, assess organizational and process effectiveness, and implement change strategies.

Roles and Responsibilities (include but are not limited to):
- Minimum 15 years of IT architecture/systems experience
- 10+ years of experience designing technical and business solutions, mentoring and training client staff, and overseeing implementation
- Highly diverse technical and industry experience related to studying and analyzing system needs, systems development, systems process analysis, design, and reengineering
- Skills and experience related to Business Management, Systems Engineering, Operations Research, and Management Engineering
- Typically has specialization in a particular technology or business application
- Keeps abreast of technological developments and industry trends
- Assist with deployment, configuration, and management of the Azure Cloud environment
- Assist with migration of existing ETL jobs into the Azure/Databricks Cloud environment
- Assist ETL staff in following Cloud best practices, maximizing efficiency of cloud resources, and sharing general cloud expertise with AA Cloud staff
- Assist with operationalizing deployments and supporting Cloud services for ETL operations
- Standardize and automate processes and workflows
- Create documentation and knowledge articles
- Support Operations staff with limited Cloud experience
- Deliver written and oral presentations to high-level CIO management on the status of current efforts

Basic Qualifications

Cloud & Infrastructure Skills
- Strong skills and experience with Cloud Operations support in Azure
- Strong experience supporting large-scale/enterprise-level Cloud environments
- Strong experience supporting cloud services related to compute, network, and databases
- Ability to troubleshoot cloud resource problems and perform complex system tests
- Knowledge of best practices and IT operations in an always-up, always-available service
- Ability to automate solutions to repetitive problems/tasks
- Ability to use a wide variety of open-source technologies and cloud services

Big Data / Databricks / ETL Skills
- Experience using Databricks or other Spark-based platforms
- Knowledge of or experience with Sqoop, Oozie, and Flume
- Significant experience with SAS, Python, C++, Hadoop, SQL database/coding, Apache Spark, Machine Learning, Natural Language Processing (NLP), and Tableau
- Demonstrated experience working with unstructured data

Programming / DevOps Skills
- Fluency in at least one scripting language: Python, Perl, Ruby, or equivalent
- Integration of Git in continuous deployment
- Experience with DevOps monitoring tools

Documentation & Communication
- Develop and maintain accurate documentation for internal procedures and services
- Creative thinking skills
- Detail-oriented personality
- Ability to communicate well with other members of the development team

Desired Skills
- Strong understanding of Cloud networking at scale
- Ability to break down complex networking concepts and present them concisely
- Shell scripting experience (PowerShell)

Data Transfer
The selected resource will be required to conduct knowledge-transfer sessions as necessary to share best practices and template code/scripts in each of the following areas:
- Monitoring (infrastructure and application-specific)
- Troubleshooting