Wise Equation Solutions Inc.

Foundry Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Foundry Data Engineer with a contract length of "unknown", offering a pay rate of "unknown". Key skills required include Python, PySpark, and Palantir Foundry experience. Minimum 4 years in data engineering and proficiency in SQL and Linux are essential.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
Unknown
-
πŸ—“οΈ - Date
January 27, 2026
πŸ•’ - Duration
Unknown
-
🏝️ - Location
Unknown
-
πŸ“„ - Contract
Unknown
-
πŸ”’ - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Scala #AWS (Amazon Web Services) #Jenkins #Data Pipeline #Linux #Python #GIT #PySpark #GitLab #SQL (Structured Query Language) #Visualization #Spark (Apache Spark) #Monitoring #Automation #Palantir Foundry #Data Engineering #BitBucket
Role description
Key requirements: Python, PySpark, and Palantir Foundry. Palantir Foundry experience is a must.
Responsibilities
• Develop and enhance data-processing, orchestration, monitoring, and related capabilities by leveraging popular open-source software, AWS, and GitLab automation.
• Collaborate with product and technology teams to design and validate the capabilities of the data platform.
• Identify, design, and implement process improvements: automating manual processes, optimizing for usability, and re-designing for greater scalability.
• Provide technical support and usage guidance to the users of our platform’s services.
• Drive the creation and refinement of metrics, monitoring, and alerting mechanisms to give us the visibility we need into our production services.
Qualifications
• Experience building and optimizing data pipelines in a distributed environment.
• Experience supporting and working with cross-functional teams.
• Deep knowledge of the Palantir Foundry platform, including tools such as Workshop, Quiver, and Slate, which are used to design data workflows, build visualizations, and create user interfaces.
• Proficiency working in a Linux environment.
• 4+ years of advanced working knowledge of SQL, Python, and PySpark.
• Experience using tools such as Git/Bitbucket, Jenkins/CodeBuild, and CodePipeline.