Voto Consulting LLC

Lead AWS Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead AWS Data Architect in Jersey City, NJ, hybrid (4 days per week onsite). Contract length and pay rate are unspecified. Requires 12+ years of data engineering experience, strong Snowflake, SQL, and AWS skills, and insurance industry knowledge.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
Unknown
πŸ—“οΈ - Date
February 4, 2026
πŸ•’ - Duration
Unknown
🏝️ - Location
Hybrid
πŸ“„ - Contract
Unknown
πŸ”’ - Security
Unknown
πŸ“ - Location detailed
Jersey City, NJ
🧠 - Skills detailed
#Git #Big Data #Programming #Compliance #Python #Data Quality #AWS (Amazon Web Services) #Migration #Aurora RDS #Snowflake #Agile #Data Migration #Data Pipeline #Data Vault #Leadership #Data Engineering #Data Processing #DevOps #SQL (Structured Query Language) #Data Governance #Spark (Apache Spark) #S3 (Amazon Simple Storage Service) #Storage #ETL (Extract, Transform, Load) #Security #Aurora #Data Ingestion #Data Storage #Data Security #AWS Glue #PySpark #Debugging #Jenkins #Data Modeling #Cloud #Computer Science #Data Architecture #RDS (Amazon Relational Database Service)
Role description
Job Title: Lead Data Engineer (AWS)
Location: Jersey City
Work mode: Hybrid (4 days a week onsite)
No. of Positions: 3
Experience: Min 12+ years
Key Skills: Snowflake, SQL, AWS Glue, PySpark, Big Data concepts

Responsibilities:
• Lead the design, development, and implementation of data solutions using AWS and Snowflake.
• Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
• Develop and maintain data pipelines, ensuring data quality, integrity, and security.
• Optimize data storage and retrieval processes to support data warehousing and analytics.
• Provide technical leadership and mentorship to junior data engineers.
• Work closely with stakeholders to gather requirements and deliver data-driven insights.
• Ensure compliance with industry standards and best practices in data engineering.
• Apply knowledge of insurance, particularly claims and loss, to enhance data solutions.

Must have:
• 8+ years of relevant experience in data engineering and delivery.
• 8+ years of relevant work experience with Big Data concepts, including cloud implementations.
• Strong experience with SQL, Python, and PySpark.
• Good understanding of data ingestion and data processing frameworks.
• Good experience with Snowflake, SQL, and AWS (Glue, EMR, S3, Aurora, RDS, AWS architecture).
• Good aptitude, strong problem-solving abilities, analytical skills, and the ability to take ownership as appropriate.
• Able to code, debug, performance-tune, and deploy applications to the production environment.
• Experience working in Agile methodology.

Good to have:
• Experience with DevOps tools (Jenkins, Git, etc.) and practices, including continuous integration and delivery (CI/CD) pipelines.
• Experience with cloud implementations, data migration, Data Vault 2.0, etc.

Requirements:
• Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
• Proven experience as a Data Engineer, with a focus on AWS and Snowflake.
• Strong understanding of data warehousing concepts and best practices.
• Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
• Experience in the insurance industry, preferably with knowledge of claims and loss processes.
• Proficiency in SQL, Python, and other relevant programming languages.
• Strong problem-solving skills and attention to detail.
• Ability to work independently and as part of a team in a fast-paced environment.

Preferred Qualifications:
• Experience with data modeling and ETL processes.
• Familiarity with data governance and data security practices.
• Certification in AWS or Snowflake is a plus.
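For candidates gauging fit, here is a minimal, hypothetical sketch of the kind of pipeline this stack implies: a PySpark job that reads claims data from S3, applies a basic data-quality gate, and appends the result to Snowflake via the Snowflake Spark connector. All bucket, account, table, and credential names below are placeholders invented for illustration, not details from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Requires the Snowflake Spark connector (net.snowflake:spark-snowflake)
# on the cluster classpath.

spark = (
    SparkSession.builder
    .appName("claims-ingest-sketch")
    .getOrCreate()
)

# Hypothetical S3 path for raw claims extracts.
claims = spark.read.parquet("s3://example-raw-bucket/claims/")

# Basic data-quality gate: drop rows missing a claim id or a loss amount.
clean = claims.filter(
    F.col("claim_id").isNotNull() & F.col("loss_amount").isNotNull()
)

# Hypothetical Snowflake connection options; in a real Glue or EMR job
# these would come from AWS Secrets Manager, not hard-coded literals.
sf_options = {
    "sfURL": "example_account.snowflakecomputing.com",
    "sfDatabase": "CLAIMS_DB",
    "sfSchema": "STAGING",
    "sfWarehouse": "LOAD_WH",
    "sfUser": "etl_user",
    "sfPassword": "********",
}

# Append the cleaned rows into a Snowflake staging table.
(
    clean.write
    .format("snowflake")
    .options(**sf_options)
    .option("dbtable", "CLAIMS_STAGE")
    .mode("append")
    .save()
)
```

On AWS Glue specifically, the same logic would typically sit inside a Glue job script using a GlueContext and the Glue Data Catalog rather than a raw S3 path; the plain-SparkSession form above is just the portable core.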