Jobs via Dice

Senior Data/AI Engineer - Remote - Contract to Hire - Irvine, CA

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data/AI Engineer, a remote contract-to-hire position lasting over 3 months, offering a competitive pay rate. Requires a Bachelor’s degree, 5+ years in data solutions, 3+ years with AWS, and expertise in real-time data and machine learning.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 18, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Irvine, CA
-
🧠 - Skills detailed
#Lambda (AWS Lambda) #Network Security #Libraries #Security #SQL (Structured Query Language) #IAM (Identity and Access Management) #Data Ingestion #AWS (Amazon Web Services) #AI (Artificial Intelligence) #Data Management #Python #ETL (Extract, Transform, Load) #Cloud #Kafka (Apache Kafka) #Compliance #SageMaker #Talend #Storage #Redshift #Data Security #AWS SageMaker #PyTorch #Data Quality #Data Integration #Data Warehouse #Computer Science #AWS Glue #Snowflake #ML (Machine Learning) #Data Pipeline #Qlik #Scripting #TensorFlow
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Synergent Tech Solutions, is seeking the following. Apply via Dice today!

Senior Data/AI Engineer - Irvine, CA (Remote)
Remote | 3+ months | Contract to Hire

Those authorized to work in the U.S. without sponsorship will be considered. We are not able to sponsor at this time. Our client does not offer H-1B sponsorship for this position.

Education
• Bachelor's degree in computer science or a related field.

Minimum Experience
• Minimum 5 years of experience building data-driven solutions.
• At least 3 years of experience working with AWS services.
• Applicants must be authorized to work in the US without requiring employer sponsorship now or in the future.

Specialized Knowledge & Skills
• Expertise in real-time data solutions; knowledge of stream processing, message-oriented platforms, and ETL/ELT tools is a plus.
• Strong scripting experience with Python and SQL.
• Working knowledge of foundational AWS compute, storage, networking, and IAM.
• Understanding of Gen AI models, prompt engineering, RAG, fine-tuning, and pre-training is a plus.
• Solid scripting experience in AWS using Lambda functions.
• Knowledge of CloudFormation templates preferred.
• Hands-on experience with popular cloud-based data warehouse platforms such as Redshift and Snowflake.
• Experience building data pipelines, including ingestion and transformation of structured, semi-structured, and unstructured data across cloud services.
• Knowledge and understanding of data standards and principles to drive best practices around data management activities and solutions.
• Experience with one or more data integration tools such as Attunity (Qlik), AWS Glue ETL, Talend, or Kafka.
• Strong understanding of data security: authorization, authentication, encryption, and network security.
• Hands-on experience using and extending machine learning frameworks and libraries (e.g., scikit-learn, PyTorch, TensorFlow, XGBoost) preferred.
• Experience with the AWS SageMaker family of services or similar tools to develop machine learning models preferred.
• Strong written and verbal communication skills to facilitate meetings and workshops that collect data, functional, and technology requirements, and to document processes, data flows, gap analyses, and associated data in support of data management/governance efforts.
• Acts with integrity and proactively seeks ways to ensure compliance with regulations, policies, and procedures.
• Demonstrated ability to be self-directed, with excellent organizational, analytical, and interpersonal skills, and a record of consistently meeting or exceeding deadline deliverables.
• Strong understanding of the importance and benefits of good data quality, and the ability to champion results across functions.