Jobs via Dice

Data/AI Engineer || Remote

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data/AI Engineer on a contract-to-hire basis, offering a remote work location. Candidates should have a Bachelor's degree, 5+ years of experience in data solutions, and expertise in AWS, Python, SQL, and data integration tools.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 27, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Pipeline #Computer Science #AI (Artificial Intelligence) #SageMaker #Databricks #Scripting #Cloud #ML (Machine Learning) #Snowflake #Data Integration #Data Warehouse #Data Management #Talend #IAM (Identity and Access Management) #Redshift #AWS Glue #Data Design #Kafka (Apache Kafka) #AWS SageMaker #Scala #Network Security #Data Ingestion #Security #Libraries #PyTorch #Data Quality #Lambda (AWS Lambda) #TensorFlow #AWS (Amazon Web Services) #Python #Storage #Qlik #ETL (Extract, Transform, Load) #Data Security #Data Enrichment #SQL (Structured Query Language) #Compliance #BI (Business Intelligence)
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Isoftech Inc, is seeking the following. Apply via Dice today!

Data/AI Engineer || Remote
Contract to Hire

Job Overview
The Data/AI Engineer will be responsible for solution engineering of enterprise-scale data management best practices. This includes patterns such as modern data integration frameworks and building scalable distributed systems using emerging cloud-based data design patterns. This role will develop data integration tasks in the data and analytics space.

Key Job Functions
• Implement data warehouse solutions using modern data platforms such as Snowflake, Databricks, or Redshift.
• Build data integration solutions between transaction systems and analytics platforms.
• Expand data integration solutions to ingest data from internal and external sources and further transform it to meet business consumption needs.
• Develop tasks for a multitude of data patterns, e.g., real-time data integration, advanced analytics, machine learning, BI, and reporting.
• Apply a fundamental understanding of building data products through data enrichment and ML.
• Act as a team player and share knowledge with existing team members.

Required Qualification
Bachelor's degree in computer science or a related field.

Minimum Experience
• Minimum 5 years of experience building data-driven solutions.
• At least 3 years of experience working with AWS services.
• Expertise in real-time data solutions; knowledge of stream processing, message-oriented platforms, and ETL/ELT tools is good to have.
• Strong scripting experience using Python and SQL.
• Working knowledge of foundational AWS compute, storage, networking, and IAM.
• Understanding of Gen AI models, prompt engineering, RAG, fine-tuning, and pre-training is a plus.
• Solid scripting experience in AWS using Lambda functions; knowledge of CloudFormation templates preferred.
• Hands-on experience with popular cloud-based data warehouse platforms such as Redshift and Snowflake.
• Experience building data pipelines, with an understanding of the ingestion and transformation of structured, semi-structured, and unstructured data across cloud services.
• Knowledge and understanding of data standards and principles to drive best practices around data management activities and solutions.
• Experience with one or more data integration tools such as Attunity (Qlik), AWS Glue ETL, Talend, Kafka, etc.
• Strong understanding of data security: authorization, authentication, encryption, and network security.
• Hands-on experience using and extending machine learning frameworks and libraries, e.g., scikit-learn, PyTorch, TensorFlow, XGBoost, preferred.
• Experience with the AWS SageMaker family of services or similar tools to develop machine learning models preferred.
• Strong written and verbal communication skills to facilitate meetings and workshops to collect data, functional, and technology requirements, and to document processes, data flows, gap analysis, and associated data supporting data management/governance efforts.
• Acts with integrity and proactively seeks ways to ensure compliance with regulations, policies, and procedures.
• Demonstrated ability to be self-directed, with excellent organizational, analytical, and interpersonal skills, and to consistently meet or exceed deadline deliverables.
• Strong understanding of the importance and benefits of good data quality, and the ability to champion results across functions.
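For candidates gauging the scripting side of the role, the posting's core task (ingest, transform, and load data for business consumption) can be sketched in a few lines of Python. This is only an illustrative sketch: the field names, records, and in-memory "warehouse" are hypothetical, and a real pipeline would target a platform such as AWS Glue, Snowflake, or Kafka rather than a Python list.

```python
# Minimal ETL sketch: extract raw records, standardize/enrich them,
# and load them into a target store. All names here are illustrative;
# no specific platform or schema from the posting is assumed.

def extract(source):
    """Ingest raw records from an internal or external source."""
    return list(source)

def transform(records):
    """Standardize types and add a simple enrichment flag."""
    out = []
    for r in records:
        amount = round(float(r["amount"]), 2)
        out.append({
            "customer_id": int(r["id"]),
            "amount_usd": amount,
            "is_high_value": amount >= 1000.0,  # hypothetical enrichment rule
        })
    return out

def load(records, store):
    """Load transformed records into the target store; return count loaded."""
    store.extend(records)
    return len(records)

raw = [{"id": "1", "amount": "1250.75"}, {"id": "2", "amount": "80"}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # number of records loaded
```

The same extract/transform/load structure scales up when each function is swapped for a platform connector (e.g., a Glue job or a warehouse bulk loader) while the transformation logic stays testable in plain Python.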