Groove Jones

Apple Vision Pro Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Apple Vision Pro Developer on a contract basis, offering a competitive pay rate. Key skills include visionOS, SwiftUI, RealityKit, and ARKit. Requires 3+ years on Apple platforms and strong collaboration abilities. Remote work location.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 11, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#Documentation #ComputerScience #Scala #ETL #Agile #iOS #Programming #ObjectDetection #MachineLearning
Role description
About the Company

Groove Jones is a leader in creative technology, partnering with global brands to build immersive experiences that redefine how people interact with digital content. We are investing heavily in Apple Vision Pro and spatial computing, from Apple Immersive Video production to real-time 3D applications, and are expanding our network of visionOS specialists for upcoming client projects.

About the Role

We are looking for an Apple Vision Pro / visionOS Developer (contract) to design, prototype, and ship immersive experiences that take full advantage of Apple’s spatial computing ecosystem. You will work with visionOS, SwiftUI, RealityKit, and ARKit to build performant, production-ready experiences that run across Apple Vision Pro and future spatial devices. This role is highly collaborative: you’ll partner with UX/UI, 3D, and creative technology teams to translate concepts into interactive spatial products for brand, enterprise, and entertainment clients.

Responsibilities

• Design and develop spatial apps and features using visionOS, SwiftUI, RealityKit, and ARKit.
• Implement clean, modular architectures (ECS, MVVM, or similar) for scalable mixed-reality applications.
• Integrate 3D content, scenes, and interactions that feel native to the visionOS ecosystem.
• Build intuitive, gesture- and gaze-driven interfaces that support the eye, hand, and device input patterns defined by Apple’s Human Interface Guidelines.
• Collaborate with designers and 3D artists to optimize assets, lighting, and interactions for Apple Vision Pro’s high-resolution, low-latency requirements.
• Implement shared spatial experiences, multi-user collaboration patterns, or session persistence when needed, using Apple’s spatial computing APIs.
• Profile and optimize performance (frame rate, memory, load times) using Instruments and best practices for Swift concurrency (async/await, actors, structured concurrency).
• Contribute to technical scoping, documentation, and implementation plans for client-facing projects in an agile, remote-friendly workflow.

Qualifications

• Bachelor’s degree in Computer Science, Software Engineering, or equivalent practical experience.
• 3+ years of professional experience on Apple platforms (iOS, iPadOS, macOS, watchOS, tvOS), including at least 1 year working with visionOS or shipping prototypes/experiments for Apple Vision Pro.
• Strong proficiency in Swift and comfort working with SwiftUI, RealityKit, and ARKit in production codebases.
• Experience architecting interactive 3D or AR applications, with familiarity with ECS-style or component-based systems.
• Solid understanding of multithreaded programming, async/await, and performance optimization for real-time interactive experiences.
• Strong grasp of spatial UX principles: depth, scale, and comfort in mixed-reality environments.
• Excellent communication skills and proven success collaborating with distributed, cross-functional teams.

Preferred Skills

• Background with particle effects, physics-based interactions, procedural animation, or VFX-style 3D behaviors in real-time engines or frameworks.
• Familiarity with integrating machine learning or computer vision models into interactive applications (e.g., object detection, body/hand tracking, scene understanding).
• Experience working with immersive media pipelines (stereoscopic or 180/360 video, spatial audio) and optimizing content for HMDs and other XR devices.
• Understanding of rendering pipelines, display characteristics, and latency constraints for head-mounted displays and other spatial computing devices.

If you want to help push what’s possible on Apple Vision Pro with a studio that lives and breathes immersive tech, we’d love to connect.
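For candidates gauging fit, the responsibilities above (RealityKit content, gesture- and gaze-driven input) are the kind of work sketched below: a minimal, self-contained visionOS view that builds a RealityKit entity and moves it with Apple’s gaze-and-pinch drag gesture. This is an illustrative sketch, not Groove Jones code; the view and entity names are placeholders.

```swift
import SwiftUI
import RealityKit

struct SpatialDemoView: View {
    var body: some View {
        RealityView { content in
            // Build a simple sphere entity in code so the sketch is self-contained.
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .cyan, isMetallic: false)]
            )
            sphere.components.set(InputTargetComponent())   // make it a gesture target
            sphere.generateCollisionShapes(recursive: true) // collision shapes are required for hit-testing
            content.add(sphere)
        }
        // System gaze-and-pinch drag, per Apple's spatial input model.
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Convert the gesture's 3D location into the entity's parent space.
                    value.entity.position = value.convert(
                        value.location3D, from: .local, to: value.entity.parent!
                    )
                }
        )
    }
}
```

The `InputTargetComponent` plus collision shapes are what make an entity respond to the eye/hand input patterns mentioned in the responsibilities; without them, the drag gesture never fires.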