Technocraft Industries India Ltd

Lead Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Engineer on a contract basis, located in McLean or Richmond, VA (hybrid). Pay ranges from $54.52 to $65.66 per hour. Key skills include Python, SQL, and AWS RDS, along with experience in enterprise data environments, preferably banking.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
520
-
🗓️ - Date
October 16, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Richmond, VA
-
🧠 - Skills detailed
#Monitoring #AWS Glue #REST API #Scala #Strategy #AI (Artificial Intelligence) #Data Pipeline #Snowflake #Spark (Apache Spark) #Python #Data Ingestion #Data Strategy #REST (Representational State Transfer) #AWS RDS (Amazon Relational Database Service) #Databricks #Data Engineering #ETL (Extract, Transform, Load) #Lambda (AWS Lambda) #Data Access #SQL (Structured Query Language) #Big Data #API (Application Programming Interface) #Data Architecture #AWS Lambda #Trend Analysis #AWS (Amazon Web Services) #Automation
Role description
Job Description

Role: Technical Lead / Lead Data Engineer / Data Architect
Location: McLean or Richmond, VA (hybrid)
Visa: USC, GC, H4-EAD on W2
Interview: Video interview

Key Responsibilities
• Design and implement scalable data pipelines and schemas using AWS RDS, Lambda, and related services.
• Develop and optimize Python and SQL solutions for data ingestion, transformation, and analysis.
• Collaborate with data strategy leads to build dashboards and enable self-service data capabilities.
• Integrate data from sources such as OneLake and Snowflake for resiliency metrics and trend analysis.
• Contribute to the automation of audit controls and support AI-driven drift detection.
• Work on future-state initiatives involving Big Data technologies such as Spark and Databricks.
• Ensure data solutions support scalability, resiliency posture reporting, and infrastructure health monitoring.

Required Skills
• Python, SQL
• AWS RDS, AWS Lambda
• Experience in enterprise data environments (banking preferred)

Preferred Skills
• AWS Glue, Spark, Databricks
• REST API development experience
• Snowflake
• AWS Solutions Architect Certification
• Former Capital One experience highly preferred

Qualifications
• 6–10 years of data engineering experience
• Strong understanding of data resiliency, network reliability, and scalable architecture
• Growth mindset with willingness to learn and leverage AI for data insights

Success Criteria
• Fully automated audit controls for the data pod
• Self-service data access enabled
• Faster data delivery and trend visibility
• Resiliency posture reporting supported across teams

Job Type: Contract
Pay: $54.52 – $65.66 per hour
Work Location: On the road