Session details and recordings
Session descriptions and links to session recordings from Mixed Reality Dev Days, held on May 21-22, 2020.
Session title | Speaker | Description |
---|---|---|
Opening Keynote | Alex Kipman | Alex Kipman kicks off our first-ever virtual Mixed Reality Dev Days event. |
Intro to Azure Mixed Reality Services: Azure Remote Rendering | Jonathan Lyons, Christopher Manthei, and Marc Appelsmeier | Learn how Azure Remote Rendering renders and streams interactive 3D models with hundreds of millions of polygons to devices like HoloLens 2 in real time. |
Intro to Unreal + MRTK for HoloLens 2 | Summer Wu & Luis Valverde | Unreal Engine support for HoloLens 2 reached production-ready status with the release of UE 4.25 in May 2020. In tandem, our team released the first component of the Mixed Reality Toolkit for Unreal: UX Tools 0.8. This talk provides an overview of the features in Unreal Engine 4 and MRTK for Unreal, and how to use them to build epic experiences for HoloLens 2. |
Getting started with the HoloLens 2 and Unity | Dan Miller - Unity | Basics of setting up Unity and building for the HoloLens 2. This presentation covers best practices, basic features of the HoloLens 2, and how to quickly add hand tracking support and interactivity with native Unity APIs. |
Intro to Azure Mixed Reality Services: Azure Spatial Anchors | Archana Iyer & Vicente Rivera | An overview of Azure Spatial Anchors and relevant scenarios. This talk reviews new capabilities, shares code samples and best practices, and shows how to start integrating ASA into your products. |
Intro to MRTK-Unity | Catherine Diaz | Tutorial on how to create an MRTK app from start to finish. This talk covers interaction concepts and shows MRTK’s multi-platform capabilities. |
Learnings from the MR Surfaces App | Lars Simkins | Join the engineers behind the MRDL Surfaces app for HoloLens 2 as they talk about the app’s design story and technical highlights. |
Azure Kinect Body Tracking Unity Integration | Angus Antley | Learn how to drive characters in Unity using the Azure Kinect Body Tracking SDK. |
MRTK’s UX Building Blocks | Yoon Park | A deep dive into the MRTK’s UX components that help you build beautiful mixed reality experiences. |
MRTK Performance Tools | Kurtis Eveleigh & David Kline | An introduction to performance tools, both in MRTK and external, and an overview of the MRTK Standard Shader. |
The State of Mixed Reality: Where Companies Are Finding Success | Ori Amiga & Matt Fleckenstein | Ultra-low latency edge computing, coupled with AI and mixed reality, is the foundation for the next generation of experiences. By blending the digital and the physical worlds into ubiquitous computing experiences, mixed reality is enabling possibilities we could have only dreamed of previously. This session provides unique insight into the mixed reality market opportunity today and in the future. The session highlights how Microsoft is helping leading enterprises in manufacturing, health care, and retail to harness the power of mixed reality to drive business efficiency and transform customer and employee experiences. |
Fireside Chat | Alex Kipman & René Schulte | Microsoft MVP, Regional Director, and community member extraordinaire René Schulte sits down with Alex Kipman for a fireside chat about the topics the community is most interested in. René gathered questions from the community for about a week before the event, and it made for a great conversation. |
Designing AR/VR experiences using Microsoft Maquette | Ricardo Acosta | Designing a phone app or a website has a well-defined workflow. Unfortunately, designing spatial experiences can be tricky if you use the same 2D workflow or toolset. Luckily, the Microsoft Maquette app focuses on helping UX designers design for mixed reality. |
MRTK Unity v2 & beyond - How community feedback helped us improve MRTK | Bernadette Thalhammer | A talk about how we improved the developer experience by listening to feedback from the community, and how developers can leverage these improvements. Dive into documentation and unit testing, the new object manipulator component, and the migration window, and explore code snippets that address frequently asked questions from the dev community. |
Dark Slope's Unreal Engine plugin for the Azure Kinect DK | Ben Unsworth - Dark Slope | Learn how Dark Slope uses the Azure Kinect DK and its SDKs to build real-time interactive engagements in Unreal Engine. |
Introducing StereoKit - MR Made Easy! | Nick Klingensmith | StereoKit is an easy-to-use open-source mixed reality library for building HoloLens and VR applications with C# and OpenXR. StereoKit prioritizes mixed reality application development, allowing for features such as a first-class mixed reality input system, fast performance by default even on mobile devices, quick iteration time on-device, and a runtime asset pipeline that lets users and developers load real assets from the file system. All of this and more are packaged in a terse API that’s well documented, easy to learn, and easy to write. |
Building Immersive MR Experiences with Babylon.js and WebXR | Jason Carter & Raanan Weber | Discover how easy and powerful it can be to develop MR experiences directly on the web. Babylon.js strives to be one of the most powerful, beautiful, simple, and open web rendering platforms in the world, making it easy to unlock full MR capabilities across platforms, devices, and ecosystems. Check out the latest developments of Babylon.js and its support of WebXR. |
Using Project Acoustics with HoloLens 2 | Mike Chemistruck | See how to apply Project Acoustics to Mixed Reality. Learn how the system recreates real-world effects within the compute budget of a HoloLens 2. Examples include diffracted occlusion and redirection of sounds around physical doorways and corners, and reverberation in complex geometries with multiple connected spaces. |
Holographic Remoting - Rapid iteration & supercharged graphics on HoloLens | Brent Jackson | HoloLens delivers a revolutionary mobile computing platform like no other, but it’s limited to the processing power of a mobile device. Holographic Remoting brings the raw power of a VR-capable computer to HoloLens. With Unity in-editor remoting, you don't have to build and deploy your apps to test them on a device. Learn how Holographic Remoting can increase the performance of your applications, and your developers. |
OpenXR on HoloLens 2: Cross-platform native mixed reality | Alex Turner | If you build mixed reality support into your own engine or native app from the ground up, learn about the key details of OpenXR 1.0. See the OpenXR native API surface, the extensions that bring the full feature set of HoloLens 2 to life, and the partners from Firefox Reality to StereoKit already shipping apps and frameworks built on OpenXR. With OpenXR, you can build cross-vendor mixed reality engines and native apps that span the breadth of devices in the industry. |
Tips from a Year of HoloLens 2 Development | Peter Vale | The HoloLens commercialization team shares tips and lessons learned from working with our partners. Gain insight into the most common issues, along with best practices and techniques that you can use to get your HoloLens 2 application ready to share with your customers. |