

Voto Consulting LLC
Palantir Lead Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Palantir Lead Data Engineer on a contract basis in Jersey City, New Jersey, offering a competitive pay rate. Candidates must have 6-8 years of data engineering experience; expertise in AWS, Snowflake, SQL, and Python; and insurance industry knowledge.
Country
United States
Currency
$ USD
Day rate
Unknown
Date
February 27, 2026
Duration
Unknown
Location
On-site
Contract
Unknown
Security
Unknown
Location detailed
Jersey City, NJ
Skills detailed
#Debugging #DevOps #S3 (Amazon Simple Storage Service) #Vault #Migration #Data Security #Python #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Security #Compliance #Data Storage #Data Quality #Computer Science #Data Vault #Snowflake #Data Ingestion #Data Migration #RDS (Amazon Relational Database Service) #Big Data #Storage #Leadership #Aurora #Agile #Data Governance #Data Engineering #Data Processing #Programming #Aurora RDS #PySpark #SQL (Structured Query Language) #Git #TypeScript #Spark (Apache Spark) #Palantir Foundry #AWS Glue #Cloud #Jenkins #Data Pipeline
Role description
Title: Palantir Lead Data Engineer
Duration: Contract
Location: Jersey City, New Jersey
Interview: 2 rounds, virtual, with video required for both.
Target Start Date / Urgency to Fill Role: As early as possible.
Must Have:
• Palantir/TypeScript OR insurance industry experience
• Snowflake, SQL, Python, Spark, AWS Glue, big data concepts
Responsibilities
β’ Lead the design, development, and implementation of data solutions using AWS & Snowflake.
β’ Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
β’ Develop and maintain data pipelines, ensuring data quality, integrity, and security.
β’ Optimize data storage and retrieval processes to support data warehousing and analytics.
β’ Provide technical leadership and mentorship to junior data engineers.
β’ Ensure compliance with industry standards and best practices in data engineering.
β’ Utilize knowledge of insurance, particularly claims & loss, to enhance data solutions.
Must have
• 6-8 years of relevant experience in data engineering and delivery.
• 5+ years of relevant work experience with big data concepts.
• Experience with cloud implementations.
• Expertise with SQL, Python, and PySpark.
• Good understanding of data ingestion and data processing frameworks.
• Expertise in Snowflake, SQL, and AWS (Glue, EMR, S3, Aurora, RDS, AWS architecture).
• Strong problem-solving abilities, analytical skills, and the ability to take ownership as appropriate.
• Able to code, debug, performance-tune, and deploy applications to the production environment.
• Experience working in Agile methodology.
• Proven expertise or certification in Palantir Foundry / TypeScript is highly preferred.
Good to have
• Experience with DevOps tools (Jenkins, Git, etc.) and practices, including continuous integration and delivery (CI/CD) pipelines.
• Experience with cloud implementations, data migration, Data Vault 2.0, etc.
Requirements
• Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
β’ Proven experience as a Data Engineer, with a focus on AWS & Snowflake.
β’ Strong understanding of data warehousing concepts and best practices.
β’ Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
β’ Experience in the insurance industry, preferably with knowledge of claims & loss processes.
β’ Proficiency in SQL, Python, and other relevant programming languages.
β’ Strong problem-solving skills and attention to detail.
β’ Ability to work independently and as part of a team in a fast-paced environment.
β’ Experience with data modelling and ETL processes.
β’ Familiarity with data governance and data security practices.
β’ Certification in AWS or Snowflake is a plus.
