Expleo

Data Architect — Operational Technology to Cloud

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Architect — Operational Technology to Cloud, offering a contract longer than 6 months, with a pay rate of $100–$110 per hour. Key skills include expertise in OT environments, Databricks, AWS networking, and data pipeline architecture.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
880
-
🗓️ - Date
March 19, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Minneapolis, MN 55401
-
🧠 - Skills detailed
#AI (Artificial Intelligence) #Data Layers #Compliance #Data Quality #VPC (Virtual Private Cloud) #Security #Spark (Apache Spark) #Scala #Computer Science #Cloud #Automation #Data Modeling #Data Governance #Batch #Observability #Databricks #IAM (Identity and Access Management) #AWS (Amazon Web Services) #ML (Machine Learning) #Data Architecture #Data Engineering #Delta Lake #Strategy #Data Pipeline
Role description
Overview
Location: Remote
Employment Type: Full-Time

Join Trissential and Help Shape a Cloud-Ready Future for Operational Technology Data

If you’re a seasoned Data Architect who thrives at the intersection of industrial data, cloud architecture, and secure data movement, this is your opportunity. At Trissential, we partner with forward-thinking organizations that are modernizing how operational technology (OT) data is leveraged for analytics, automation, and AI. You'll join our client's team as the technical leader responsible for turning complex OT environments into governed, AI-ready cloud platforms built on Databricks.

What’s in It for You?
- High-impact architecture ownership across OT, cloud, and enterprise data domains
- A role where your expertise shapes reference patterns, data governance, and long-term platform strategy
- Opportunity to work across multiple senior stakeholder groups: Security, Networking, Operations, and Data Architecture
- A collaborative project environment backed by Trissential’s culture of growth, transparency, and support
- A chance to help an enterprise build an industrial-grade data fabric that scales for analytics and AI

Your Role & Responsibilities
- Partner with business and technology leaders to translate requirements into secure, scalable cloud architectures
- Define target-state designs for safe and governed data movement from on-prem OT networks into Databricks
- Evaluate and select approaches for ingesting/virtualizing historian data (especially OSI PI and AVEVA Connect)
- Architect streaming, micro-batch, and batch data pipelines from edge to lakehouse
- Design data layers (landing, curated, serving) aligned with Databricks lakehouse and Unity Catalog governance
- Define AWS network and cloud security controls: VPC patterns, subnet designs, routing, encryption, private endpoints
- Ensure Databricks E2 control plane and data plane security standards are followed, with compensating controls documented
- Develop canonical time-series and asset-centric data models to support analytics and AI
- Establish data quality SLAs, lineage standards, and AI data readiness frameworks
- Produce ADRs, architecture blueprints, and engineering playbooks
- Coach engineering teams and participate in architecture reviews
- Collaborate with Security, Networking, and Compliance to validate controls and guide remediation
- Measure and optimize cost, performance, and reliability across data pipelines and platforms

Skills & Experience You Should Possess
- Extensive background architecting data solutions within operational technology (OT) environments
- Expertise designing solutions for industrial/asset-centric data domains
- Deep experience with OSI PI, AVEVA, or similar historian platforms
- Strong hands-on knowledge of Databricks, Spark, Delta Lake, and Unity Catalog
- Proven mastery of data pipeline architecture: batch, micro-batch, streaming, CDC, and edge-to-cloud patterns
- Advanced data modeling experience with time-series data and asset hierarchies
- Strong AWS networking and security knowledge: VPCs, subnets, routing, IAM, KMS, private connectivity
- Ability to interpret and implement enterprise Databricks security guidance (E2 architecture)
- Excellent communication and negotiation skills with senior business and technical leaders
- Familiarity with ML/AI platform requirements such as feature stores, lineage, and observability

Bonus Points If You Have
- Experience integrating AVEVA PI AF or AVEVA Data Views with Databricks
- Prior contributions to enterprise data fabric or Databricks governance standards
- Experience in regulated industrial or utility environments
- A background working with safety-, reliability-, or compliance-heavy data ecosystems

Education & Certifications You Need
- Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field
- Cloud or Databricks certifications are beneficial but not required

What We Offer
At Trissential, we care about delivering meaningful work experiences. When you join our client’s team through us, you receive industry-leading support and flexibility without sacrificing benefits.
- Competitive Compensation: $100–$110 per hour, depending on your skills, experience, and location. Final compensation is determined based on skill alignment, years of experience, and fair, market-based rates by geography.
- Comprehensive Benefits for you and your dependents: medical, dental, vision, free tele-health, HSA with company contribution, life and disability insurance, and 401(k) with matching
- Paid Time Off: paid time away from work
- Remote-first engagement with a high-performing architecture and data engineering community
- Opportunities for ongoing professional development and certifications
- A people-first culture built on partnership, transparency, and growth

Please note: This role is only open to individuals authorized to work in the United States.

Ready to Shape the Future of Industrial Data?
If you’re excited to architect secure, scalable, cloud-ready OT data platforms, we want to meet you. Apply today and take the next step in your architecture career with Trissential!