Lead Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Engineer (AWS) in Jersey City, NJ, on a full-time contract for over 6 months. Requires 8+ years in Data Engineering, insurance domain experience, strong SQL, Python, Snowflake, and AWS skills. Hybrid work model.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
September 24, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
Fixed Term
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
New Jersey, United States
-
🧠 - Skills detailed
#Debugging #Data Security #Programming #Data Storage #"ETL (Extract, Transform, Load)" #Consulting #Data Ingestion #DevOps #Cloud #Big Data #Data Pipeline #Jenkins #Leadership #Python #Compliance #RDS (Amazon Relational Database Service) #Snowflake #Git #Computer Science #SQL (Structured Query Language) #Data Engineering #Data Migration #Aurora RDS #AWS Glue #PySpark #Vault #Data Quality #Agile #S3 (Amazon Simple Storage Service) #Data Governance #Spark (Apache Spark) #Data Modeling #Security #Data Processing #Data Vault #Migration #AWS (Amazon Web Services) #Storage #Aurora
Role description
Job Title: Lead Data Engineer (AWS)
Location: Jersey City, NJ
Work mode: Hybrid (3 days a week onsite)
Insurance domain experience: Required
Contract type: Full-Time

Responsibilities:
• Lead the design, development, and implementation of data solutions using AWS and Snowflake.
• Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
• Develop and maintain data pipelines, ensuring data quality, integrity, and security.
• Optimize data storage and retrieval processes to support data warehousing and analytics.
• Provide technical leadership and mentorship to junior data engineers.
• Work closely with stakeholders to gather requirements and deliver data-driven insights.
• Ensure compliance with industry standards and best practices in data engineering.
• Apply knowledge of insurance, particularly claims and loss, to enhance data solutions.

Must have:
• 8+ years of relevant experience in Data Engineering and delivery.
• 8+ years of relevant work experience with Big Data concepts; hands-on cloud implementations.
• Strong experience with SQL, Python, and PySpark.
• Good understanding of data ingestion and data processing frameworks.
• Good experience with Snowflake, SQL, and AWS (Glue, EMR, S3, Aurora, RDS, AWS architecture).
• Good aptitude, strong problem-solving abilities, analytical skills, and willingness to take ownership as appropriate.
• Able to code, debug, tune performance, and deploy applications to the production environment.
• Experience working in Agile methodology.

Good to have:
• Experience with DevOps tools (Jenkins, Git, etc.) and practices, including continuous integration and delivery (CI/CD) pipelines.
• Experience with cloud implementations, data migration, Data Vault 2.0, etc.

Requirements:
• Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
• Proven experience as a Data Engineer, with a focus on AWS and Snowflake.
• Strong understanding of data warehousing concepts and best practices.
• Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
• Experience in the insurance industry, preferably with knowledge of claims and loss processes.
• Proficiency in SQL, Python, and other relevant programming languages.
• Strong problem-solving skills and attention to detail.
• Ability to work independently and as part of a team in a fast-paced environment.

Preferred Qualifications:
• Experience with data modeling and ETL processes.
• Familiarity with data governance and data security practices.
• Certification in AWS or Snowflake is a plus.

--
Thanks & Regards,
Britto V
Sr. Technical Recruiter
ABOTTS Consulting Inc
16755 Von Karman Ave, Suite 200, Irvine, CA 92606, US
Cell: 408-361-8140, Ext. 484
Email: britto.vincent@abotts.com
URL: www.abotts.com