Technology Ventures

Senior Data/AI Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data/AI Engineer on a 6-month contract, offering a pay rate of "$X per hour". Candidates must have 5+ years of experience building data solutions, 3+ years working with AWS, and expertise in Python, SQL, and data pipelines.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
October 3, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Bethesda, MD
🧠 - Skills detailed
#Data Warehouse #Security #IAM (Identity and Access Management) #Compliance #Qlik #Data Ingestion #ML (Machine Learning) #Talend #Libraries #Snowflake #ETL (Extract, Transform, Load) #Data Management #Data Security #Scripting #Data Quality #PyTorch #Redshift #TensorFlow #AWS Glue #SageMaker #Data Integration #Storage #SQL (Structured Query Language) #Data Pipeline #Kafka (Apache Kafka) #Python #Cloud #Lambda (AWS Lambda) #AWS (Amazon Web Services) #Network Security #AI (Artificial Intelligence) #AWS SageMaker
Role description
• Minimum 5 years of experience building data-driven solutions.
• At least 3 years of experience working with AWS services.
• Applicants must be authorized to work in the US without requiring employer sponsorship now or in the future. U.S. FinTech does not offer H-1B sponsorship for this position.

Specialized Knowledge & Skills
• Expertise in real-time data solutions; knowledge of stream processing, message-oriented platforms, and ETL/ELT tools is a plus.
• Strong scripting experience with Python and SQL.
• Working knowledge of foundational AWS compute, storage, networking, and IAM.
• Understanding of Gen AI models, prompt engineering, RAG, fine-tuning, and pre-training is a plus.
• Solid scripting experience in AWS using Lambda functions.
• Knowledge of CloudFormation templates preferred.
• Hands-on experience with popular cloud-based data warehouse platforms such as Redshift and Snowflake.
• Experience building data pipelines, with an understanding of ingestion and transformation of structured, semi-structured, and unstructured data across cloud services.
• Knowledge and understanding of data standards and principles to drive best practices in data management activities and solutions.
• Experience with one or more data integration tools such as Attunity (Qlik), AWS Glue ETL, Talend, or Kafka.
• Strong understanding of data security: authorization, authentication, encryption, and network security.
• Hands-on experience using and extending machine learning frameworks and libraries (e.g., scikit-learn, PyTorch, TensorFlow, XGBoost) preferred.
• Experience with the AWS SageMaker family of services or similar tools to develop machine learning models preferred.
• Strong written and verbal communication skills to facilitate meetings and workshops that gather data, functional, and technology requirements, and to document processes, data flows, gap analyses, and associated data in support of data management/governance efforts.
• Acts with integrity and proactively seeks ways to ensure compliance with regulations, policies, and procedures.
• Demonstrated ability to be self-directed, with excellent organizational, analytical, and interpersonal skills, and to consistently meet or exceed deadline deliverables.
• Strong understanding of the importance and benefits of good data quality, and the ability to champion results across functions.