Enterprise Solutions Inc.

Data and Knowledge Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data and Knowledge Engineer with an unknown contract length, offering a day rate of $480 USD. Required skills include Python, Snowflake, ETL, and knowledge graphs; experience with data engineering and semantic technologies is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
🗓️ - Date
January 17, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, NY
-
🧠 - Skills detailed
#Python #ETL (Extract, Transform, Load) #Data Pipeline #Data Science #Metadata #AI (Artificial Intelligence) #Snowflake #HBase #Data Quality #Monitoring #RDF (Resource Description Framework) #Knowledge Graph #Data Management #Scala #Data Engineering #Data Ingestion
Role description
Job Summary
We are looking for a Data and Knowledge Engineer with strong hands-on experience in data engineering, knowledge graphs, and semantic technologies. The ideal candidate will design and build scalable data pipelines and knowledge graph solutions that enable advanced analytics, semantic search, and AI-driven insights across enterprise data platforms.
Key Responsibilities
• Design, build, and maintain scalable ELT/ETL data pipelines using Python and modern data engineering frameworks.
• Develop and optimize data ingestion, transformation, and orchestration workflows for structured and semi-structured data sources.
• Implement and manage Snowflake data models, performance tuning, and cost optimization.
• Design, build, and maintain knowledge graphs using Stardog and GraphDB.
• Develop and optimize SPARQL queries for semantic search, inference, and analytics use cases.
• Implement R2RML mappings to transform relational data into RDF for knowledge graph ingestion.
• Integrate enterprise data pipelines with semantic and graph-based systems.
• Collaborate with data scientists, AI engineers, and business stakeholders to enable graph-powered analytics and AI applications.
• Ensure data quality, lineage, governance, and metadata management across data and knowledge platforms.
• Support production systems with monitoring, troubleshooting, and performance optimization.
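To give candidates a feel for the R2RML-style work above (mapping relational data into RDF for knowledge graph ingestion), here is a minimal conceptual sketch in plain Python. All table names, column names, and URIs are hypothetical, and a real implementation would use an R2RML processor or a library such as rdflib rather than hand-rolled tuples.

```python
# Conceptual sketch only: an R2RML-like mapping from relational rows to
# RDF-style (subject, predicate, object) triples, using just the stdlib.
# The "employees" rows, URIs, and mapping structure are all hypothetical.

EX = "http://example.org/"

# Hypothetical relational source: rows from an employees table.
rows = [
    {"id": 1, "name": "Ada", "dept": "Engineering"},
    {"id": 2, "name": "Grace", "dept": "Research"},
]

# Minimal stand-in for an R2RML triples map: a subject URI template
# plus predicate-object maps pairing a predicate URI with a source column.
mapping = {
    "subject_template": EX + "employee/{id}",
    "predicate_object_maps": [
        (EX + "name", "name"),
        (EX + "dept", "dept"),
    ],
}

def rows_to_triples(rows, mapping):
    """Apply the mapping to each row, yielding one triple per predicate-object map."""
    for row in rows:
        subject = mapping["subject_template"].format(**row)
        for predicate, column in mapping["predicate_object_maps"]:
            yield (subject, predicate, row[column])

triples = list(rows_to_triples(rows, mapping))
# Each employee row becomes two triples, e.g.
# ('http://example.org/employee/1', 'http://example.org/name', 'Ada')
```

Once data is in triple form like this, it can be loaded into a graph store such as Stardog or GraphDB and queried with SPARQL, which is the workflow the responsibilities describe.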