AI/ML Engineer (W2 only)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AI/ML Engineer with a contract length of "unknown", offering a pay rate of $44.66 - $53.78 per hour. The position is onsite in St. Louis, MO; key skills required include AWS S3, ETL, Snowflake, and Apache Airflow.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
424
-
πŸ—“οΈ - Date discovered
September 27, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
On-site
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#AI (Artificial Intelligence) #Airflow #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #Data Science #Snowflake #AWS S3 (Amazon Simple Storage Service) #Data Analysis #Data Architecture #Cloud #Storage #Scala #SQL (Structured Query Language) #Data Pipeline #Data Governance #Python #Data Quality #Data Storage #S3 (Amazon Simple Storage Service) #Security #Data Engineering #SQL Queries #Compliance #ML (Machine Learning) #Data Security #Data Warehouse #Apache Airflow #Computer Science #Data Modeling #Data Processing
Role description
Hi, hope you are doing great!

Job: Data Engineer – AWS S3, ETL, Snowflake & Airflow
Location: St. Louis, MO (onsite)
Type: Contract (W2 only)

Key Responsibilities:
- Design, develop, and maintain robust ETL pipelines to ingest, process, and transform data from multiple sources into the Snowflake data warehouse.
- Manage and optimize data storage on AWS S3, ensuring efficient data retrieval and security.
- Build and orchestrate complex workflows using Apache Airflow to automate data pipeline tasks.
- Collaborate with data analysts, data scientists, and business teams to understand data requirements and translate them into technical solutions.
- Monitor pipeline performance, troubleshoot issues, and ensure data quality and integrity.
- Implement best practices for data governance, security, and compliance in data processing.
- Continuously improve existing data engineering processes and tools for scalability and efficiency.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
- Proven experience with AWS S3 for data storage and management.
- Strong expertise in ETL pipeline development using Python, SQL, or other relevant tools.
- Hands-on experience with the Snowflake data warehouse platform.
- Experience with Apache Airflow or similar workflow orchestration tools.
- Solid understanding of data warehousing concepts, data modeling, and data architecture.
- Proficiency in writing complex SQL queries and in performance tuning.
- Familiarity with cloud infrastructure and services (AWS preferred).
- Knowledge of data security best practices and compliance standards.
- Strong problem-solving skills and the ability to work independently and in a team environment.

Thanks & Warm Regards,
Ashok Kumar
Tanisha Systems Inc.
99 Wood Ave South, Suite # 308, Iselin, NJ 08830
Desk: (732) 746-0367 • 603
Email: Ashok.Kumar@tanishasystems.com

Job Type: Contract
Pay: $44.66 - $53.78 per hour
Work Location: On the road
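For candidates unfamiliar with the pipeline pattern the responsibilities describe, here is a minimal, stdlib-only sketch of the extract-transform-load shape: parse raw records (as might be pulled from S3), apply a data-quality gate and normalization, and append to a target table (an in-memory stand-in for a Snowflake warehouse). All function and field names here are illustrative assumptions, not part of the actual role's stack, and in practice Airflow would orchestrate these steps as tasks in a DAG.

```python
import json

def extract(raw_lines):
    """Parse newline-delimited JSON records (e.g. objects fetched from S3)."""
    return [json.loads(line) for line in raw_lines if line.strip()]

def transform(records):
    """Normalize field names to lowercase; drop records missing the 'id' key."""
    cleaned = []
    for rec in records:
        if "id" not in {k.lower() for k in rec}:
            continue  # data-quality gate: skip records without an identifier
        cleaned.append({k.lower(): v for k, v in rec.items()})
    return cleaned

def load(records, warehouse):
    """Append records to a table in an in-memory dict standing in for Snowflake."""
    warehouse.setdefault("events", []).extend(records)
    return len(records)  # rows loaded

# Run the pipeline end to end on two valid records and one bad one.
raw = ['{"Id": 1, "Value": 10}', '{"Value": 2}', '{"Id": 3, "Value": 30}']
wh = {}
loaded = load(transform(extract(raw)), wh)
# loaded == 2; the record without an "Id" was filtered out
```

In a production setting each stage would typically be an Airflow task with retries and monitoring, with S3 as the staging layer and Snowflake's bulk-load path as the final step.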