
Yellow Bricks

by Duncan Epping


augmented reality

Whitepaper: Running Augmented and Virtual Reality Applications on VMware vSphere using NVIDIA CloudXR

Duncan Epping · May 15, 2020 ·

As many of you know by now, I worked with the VXR team at VMware on a project to run Augmented and Virtual Reality applications on VMware vSphere. The white paper demonstrates that, using VMware vSphere backed by NVIDIA Virtual GPU technology, AR/VR applications can run on a Windows 10 virtual machine with an NVIDIA vGPU and be streamed to a standalone AR/VR device, such as the Oculus Quest or Vive Focus Plus, using NVIDIA’s CloudXR protocol. It was a very interesting project, as we ran into some real challenges I did not expect. I am not going to reveal the outcome of the project and our findings; you will need to read the white paper for that. In my opinion, it will also give you a good understanding of the use cases around these technologies. One thing I can reveal right here, though, is that these workloads are typically graphics-intensive. I want to share one image which, in my opinion, explains why:

Traditional apps/workloads usually run on a single monitor at a frame rate of 30 frames per second. VR applications are presented in a VR headset. A VR headset has a display for each eye, which immediately doubles the number of megapixels per second, and these displays also typically expect 72 frames per second or more to avoid motion sickness. All of this is described in depth in the white paper, of course, including our findings around GPU utilization when running VR/AR applications using NVIDIA CloudXR and NVIDIA vGPU on top of VMware vSphere. I hope you enjoy reading the paper as much as I enjoyed the project!
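The arithmetic above can be sketched in a few lines. This is a back-of-the-envelope comparison only; the display numbers (a 1080p monitor at 30 fps versus the original Oculus Quest’s 1440x1600 per eye at 72 Hz) are illustrative assumptions, not figures taken from the white paper:

```python
# Rough pixel-throughput comparison between a traditional desktop
# workload and a standalone VR headset (illustrative numbers).

def megapixels_per_second(width, height, displays, fps):
    """Pixels pushed to the display(s) per second, in megapixels."""
    return width * height * displays * fps / 1e6

# Traditional workload: one 1080p monitor at 30 fps.
desktop = megapixels_per_second(1920, 1080, displays=1, fps=30)

# Standalone headset: the original Oculus Quest drives
# 1440x1600 per eye (two displays) at 72 Hz.
headset = megapixels_per_second(1440, 1600, displays=2, fps=72)

print(f"desktop: {desktop:.1f} MP/s")        # ~62.2 MP/s
print(f"headset: {headset:.1f} MP/s")        # ~331.8 MP/s
print(f"ratio:   {headset / desktop:.1f}x")  # ~5.3x
```

Even with these conservative assumptions the headset pushes roughly five times the pixel throughput of a desktop workload, which is why the GPU load is so much higher.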

Go here to sign up for the white paper: https://pathfinder.vmware.com/activity/projectvxr

Last day of my Take 3 with the @ProjectVXR team!

Duncan Epping · Feb 28, 2020 ·

Today is the last day of my Take 3 with the Project VXR team. It is ridiculous how fast these 3 months went. It seems like only yesterday that I posted I was going to start this journey of learning more about the world of Spatial Computing, and Remote Rendering of VR in particular. If I say so myself, I feel that over the past 3 months I managed to accomplish quite a lot. I spent the last three months figuring out how to virtualize a Virtual Reality application and how to stream it to a head-mounted display. I tested different solutions in this space, some of which I discussed on my blog in the past months, and I was surprised, to be honest, by how smoothly it worked. If you are interested in this space, my recommendation would be to look into NVIDIA CloudXR in combination with vSphere 6.7 U3 and NVIDIA vGPU technology.

I can’t share all the details just yet. I wrote a white paper, which now needs to go through reviews and copy editing, and hopefully it will be published soon. I will, however, discuss some of my findings and my experience at some of the upcoming VMUGs I will be presenting at. Hopefully, people will enjoy and appreciate it.

One thing I would like to do is thank a few people who helped me tremendously over the past few months. First of all, of course, the folks on the Project VXR team; they gave me all the pointers/hints/tips I needed and shared a wealth of knowledge on the topic of spatial computing. I would also like to thank Grid Factory, Ben in particular, for the many discussions, emails, etc. that we had. Of course, thanks also to NVIDIA for the discussions and help around the lab equipment. Last but not least, I want to thank the VMware OCTO team focusing on Dell Technologies for providing me with a Dell Precision workstation and shipping it out literally within a day or two. Much appreciated, everyone!

Now it is time to get back to reality.

VR and AR, what is it good for besides gaming? (Take 3 learnings)

Duncan Epping · Dec 9, 2019 ·

I was first introduced to Virtual Reality (VR) in the ’90s. Back then it was all about gaming, of course. Even today, the perception is that it is mainly about gaming, and to be honest, that was also my perception. When I first spoke with Alan Renouf about the project he was working on and saw his keynote demo, I didn’t really see the opportunity. It all felt a bit gimmicky, to be honest, but can you blame me when the focus of the demo is moving workloads to the cloud by picking up a VM and throwing it over “the fence”?

In the last few days, as part of my Take 3, I have mainly been reading up on VR and AR use cases. I listened to podcasts and watched a dozen YouTube videos. While listening, reading, and watching, it became clear to me that the perception I had was way off. I had never given this much thought, I guess, but the more I read, watch, and hear, the more excited I get about the opportunities for VR/AR out there.

I believe training is a big opportunity right now. When I heard about this, I related it back to my own job, but that is not really where the opportunity is today. The opportunity here is training for dangerous, challenging, or hazardous scenarios, which are often expensive and difficult to create. Okay, let’s get a bit more specific: one of the examples I learned about last week was training for firefighters. Not just the actual firefighting, but also the investigation of, for instance, how and where the fire started.

It isn’t something I had ever thought about, but to train firefighters, they build a room inside a container, burn down the container, and then have groups of firefighters try to figure out how and where the fire started. The problem, though, is that if they train 10 groups per day, only the last group can touch the objects and do a proper investigation. With VR this problem is solved, as after every training session you simply reset and start over. The same could apply, for instance, to police force training for things like crime scene investigation, or to training personnel working at (nuclear) power plants, oil platforms, etc. Or even customer service training for retailers like Walmart: let employees deal with difficult customers in VR first, handling dozens of difficult situations before they are exposed to “real” customers.

Many companies need (realistic) training of personnel in an easy, repeatable, and relatively affordable way. VR and AR allow you to do just that. If you want to learn more, I recommend listening to the Virtually Speaking Podcast episode covering Spatial Computing.

Taking a break with VMware Take-3

Duncan Epping · Dec 3, 2019 ·

Over the past 6 years, my focus has very much been VMware vSAN. I started focusing on vSAN when we internally started working on Project Marvin in 2012, officially called EVO:RAIL, which then became Dell EMC VxRail. After a brief stop in the corporate Office of the CTO, I joined the Office of the CTO for Storage and Availability to focus solely on vSAN. I think it is fair to say that vSAN has been top of mind for what feels like forever. As such, I figured I needed a break: some time to think and talk about something different for a change, to learn new technologies, and to work on something else.

Fortunately, VMware has this great concept called “Take 3”. Take 3 gives you the opportunity, if you have been with VMware for at least 5 years, to spend 3 months working on something else. No, I am not going to a cabin in the woods to think for 3 months. That would be nice, but it is not an option. Take 3 gives you the option to join projects or teams that have published an opening and are looking for help. I looked around the Take 3 portal to see what kind of opportunities were listed, and I found one that immediately caught my attention: Spatial Computing, aka VR/AR/MR. (If you work for VMware and want to know more, or are interested, simply go to the Take 3 portal; note that the Spatial Computing team has other Take 3 opportunities open.)

Some of you may recall the awesome demo Alan Renouf gave during a VMworld keynote a few years ago. Well, that demo ultimately turned into an incubation project that Alan is running together with my old professional services colleague Matt Coppinger. I had a conversation with Alan, Matt, and their lead developer Arjun Dube and decided to jump on board for 3 months. Note: for 3 months! This doesn’t mean I am leaving the HCI BU or moving away from vSAN. I will return for duty in March, but until then I will dive into virtual/augmented reality. I am aiming to update you folks occasionally over the course of the next 3 months on my experience taking on this project. If you want to know more about what it is all about, listen to Alan on the VMTN Podcast below.

For now, I am looking forward to learning new technologies like AR, VR, GPUs, etc. I am very thankful that VMware provides its employees with opportunities like these. The only thing I wonder is why I waited 11 years before trying this. Ah well, time to put on my goggles and submerge myself in virtual reality!


About the author

Duncan Epping is a Chief Technologist in the Office of CTO of the Cloud Platform BU at VMware. He is a VCDX (# 007), the author of the "vSAN Deep Dive", the “vSphere Clustering Technical Deep Dive” series, and the host of the "Unexplored Territory" podcast.


Copyright Yellow-Bricks.com © 2023