Saransh Inc

Senior/Lead Data Engineer - Python, Spark & SQL (Only W2)

⭐ - Featured Role
This role is for a Senior Data Engineer with expertise in SQL, Python, and Spark, focusing on AI-related tasks and API integrations. Contract length and pay rate are unspecified; the engagement is W2 only. Remote work is available.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
December 17, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #IAM (Identity and Access Management) #Spark (Apache Spark) #AI (Artificial Intelligence) #API (Application Programming Interface) #Web Services #Cloud #AWS Glue #Data Engineering #Visualization #Athena #Security #Data Processing #Python #SQL (Structured Query Language) #Automation #Scala #AWS (Amazon Web Services)
Role description
Description
We are in search of a Senior Data Engineer who is adept in SQL, Python, and Spark, ideally with experience in AI-related tasks and API integrations for chatbot development and integration.
• SQL Expertise: Essential for complex query handling and database manipulation.
• Python Proficiency: Required for automation, data processing, AWS Bedrock API integration, and chatbot development workflows using the Boto3 SDK.
• Spark Familiarity: Needed for large-scale data processing.
Responsibilities
• Partner closely with stakeholders to define requirements for data, data processing, reporting, and analytics.
• Design and maintain scalable data models and extraction processes.
• Develop and integrate chatbot functionality using AWS Bedrock foundation models, Knowledge Bases, and Agents.
• Implement the AWS Bedrock Converse API for multi-turn conversations and tool-use patterns (see the sketch after this description).
• Build AI-driven features, including conversational interfaces, intelligent automation workflows, and contextual response systems.
• Approach tasks with flexibility, adapting to ad-hoc requirements as needed.
• Foster a continuous-improvement mindset for data practices and visualizations.
Requirements
• A solid background in database engineering and software development.
• Experience with AI agent design and integration.
• Demonstrated ability with Python and Spark in data-intensive environments.
• Experience with cloud services, particularly integrating foundation models through the Bedrock runtime APIs.
• Understanding of prompt engineering, system messages, and model parameter tuning for optimal AI responses.
• Eagerness to learn and apply new technologies in data visualization, automation, and AI.
Nice To Have
• Experience working on Amazon Web Services (in particular with AWS Glue and AWS Athena).
• Familiarity with AWS Bedrock Agents, Knowledge Bases, and RAG patterns.
• Prior experience building chatbots using the various foundation models (Claude, Llama, Titan) available on AWS Bedrock.
• Knowledge of multi-modal AI applications and streaming response implementations.
• Understanding of AWS security best practices, IAM policies, and model access management for Bedrock.
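For candidates unfamiliar with the Converse API work referenced above, the following is a minimal sketch of a multi-turn Bedrock conversation using the Boto3 SDK. The model ID, region, system prompt, and sample questions are illustrative assumptions, not details taken from this posting.

```python
# Minimal sketch: multi-turn conversation via the AWS Bedrock Converse API (boto3).
# Model ID, region, prompts, and parameter values below are assumptions for illustration.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

MODEL_ID = "anthropic.claude-3-sonnet-20240229-v1:0"  # any Bedrock model that supports Converse

def converse(messages, system_prompt="You are a helpful data-engineering assistant."):
    """Send the running message history to Bedrock and return the assistant's reply message."""
    response = bedrock.converse(
        modelId=MODEL_ID,
        system=[{"text": system_prompt}],
        messages=messages,
        inferenceConfig={"maxTokens": 512, "temperature": 0.2},  # basic parameter tuning
    )
    return response["output"]["message"]

# Multi-turn loop: append each user turn and each assistant reply to the same history
# so the model keeps the full conversational context across turns.
history = []
for user_text in ["What tables feed the sales dashboard?", "Summarise yesterday's load failures."]:
    history.append({"role": "user", "content": [{"text": user_text}]})
    reply = converse(history)
    history.append(reply)  # reply already carries role "assistant" and a content list
    print(reply["content"][0]["text"])
```

The same pattern extends to tool use by adding a `toolConfig` argument and handling `toolUse` blocks in the reply, but the exact tool definitions depend on the chatbot being built.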