Keylent Inc

Senior Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Architect with 14+ years of technology experience, including 12+ years in Data Engineering and related fields. Contract length is unspecified, pay rate is competitive, and remote work is allowed. Key skills include Data Lake, Data Modeling, and expertise in Talend and SQL.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
February 12, 2026
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Exton, PA
🧠 - Skills detailed
#QlikView #NoSQL #Tableau #Data Lake #SQL (Structured Query Language) #Data Governance #Databases #Qlik #Data Integration #Cloud #MDM (Master Data Management) #Kafka (Apache Kafka) #Data Lineage #Data Architecture #Agile #Data Modeling #Oracle #Data Engineering #Data Warehouse #Talend #Big Data #Workday #Data Ingestion #Scala #Visualization #SAP #Data Management #Data Pipeline
Role description
Sr Data Architect
• 14+ years of overall technology experience required
• 12+ years of experience in Data Engineering, Data Modeling, Data Warehousing, Master Data Management, Reference Data Management, Data Lineage, Data Governance, and Metadata Management required
• 7+ years of experience defining Data & Analytics architecture and implementing multiple large technology projects
• 5+ years of experience working with Agile teams preferred
• Expertise in designing, validating, and implementing multiple projects across hybrid infrastructure (on-cloud to on-premise and vice versa)
• Experience in seamless integration of enterprise data models with the data models of packaged solutions (e.g., Oracle Apps, Workday, SAP, ServiceNow)
• Expertise in setting up Data Lakes and analytical environments
• Expertise with Data Engineering tools such as Talend and BODS
• Expertise with relational SQL and NoSQL databases
• Extensive experience building data pipelines, data ingestion, data integration, data preparation, and traditional Data Warehouses and Data Marts
• Experience with visualization tools such as QlikView, Qlik Sense, and Tableau
• Experience in message queuing, stream processing, and highly scalable "big data" data stores
• Experience with big data tools such as Kafka
Data Warehousing, Data Lakes, Data Modeling (traditional and modern), and Data Management practice are mandatory.