
6 posts tagged with "vr"


PlayCanvas now supports Microsoft volumetric video playback

· 10 min read
Steven Yau
Partner Relations Manager


We are very excited to release our showcase demo for Microsoft Mixed Reality Capture Studios (MRCS) volumetric video technology.

PlayCanvas now supports MRCS volumetric video via a playback library for footage captured at their studios. Watch it on desktop, on mobile with AR or even in a WebXR-enabled VR headset, all from a single URL!

The library can be easily added to any PlayCanvas project and used to create fantastic immersive mixed reality experiences.

About Microsoft Mixed Reality Capture Studios

MRCS records holographic video - dynamic holograms of people and performances. Your audiences can interact with your holograms in augmented reality, virtual reality and on 2D screens.

They are experts at capturing holographic video and advancing capture technology, and have been pioneering its applications since 2010.

Learn more about Microsoft Mixed Reality Capture Studios here.

How was this created?

The demo was created with a combination of several tutorials and kits available on the PlayCanvas Developer Site, the MRCS playback library and freely available online assets.

You can find the public project for the demo here. We've removed the URL to the volumetric video file (due to distribution rights) and the proprietary MRCS devkit library. Please contact MRCS to gain access to the library and example videos.

Microsoft Video Playback Library

In the folder 'holo video', you will find the scripts and assets needed for playing back volumetric video. You will need to add the devkit library file named 'holo-video-object-umd.js', provided by MRCS, to complete the integration and enable video playback.

Holo Video In Assets Panel

Due to the size and how the data files for the video need to be arranged, they have to be hosted on a separate web server (ideally behind a CDN service like Microsoft Azure).

The 'holo-video-player.js' script can be added to any Entity and given a URL to the .hcap file. At runtime, the script will create the necessary meshes, materials, etc. to render and play back the volumetric video.
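To give a feel for the integration, here is a hypothetical sketch of how a playback script like 'holo-video-player.js' might expose its settings as PlayCanvas script attributes (the real script ships with the MRCS devkit, and the attribute names here are assumptions):

```javascript
// Hypothetical sketch only: the real 'holo-video-player.js' ships with the
// MRCS devkit. This shows how such a script might expose its settings.
var HoloVideoPlayer = pc.createScript('holoVideoPlayer');

// URL to the .hcap capture data hosted on your own server/CDN (assumed name)
HoloVideoPlayer.attributes.add('hcapUrl', { type: 'string' });
HoloVideoPlayer.attributes.add('autoPlay', { type: 'boolean', default: true });

HoloVideoPlayer.prototype.initialize = function () {
    // At runtime, the MRCS library ('holo-video-object-umd.js') would be
    // used here to stream the capture and build the meshes and materials
    // attached to this.entity for rendering.
};
```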

Holo Video Script UI

Expect full documentation to be released soon on our site!

Creating a Multi-Platform AR and VR experience

As you can see in the video, we've made the experience available to view in the standard browser, in AR on WebXR-enabled mobile devices (Android) and in VR on devices like the Oculus Quest. iOS support for WebXR is in progress by the WebKit team.

This was done by combining several of our WebXR example projects and the scripts and assets can be found in the 'webxr' folder:

WebXR Folder In Assets Panel

'xr-manager.js' controls how the XR experience is managed throughout the demo (a minimal sketch follows the list below):

  • Entering and leaving AR and VR.
  • Which UI buttons to show based on the XR capabilities of the device it is running on (e.g. hides the VR UI button if AR is available or VR is not available).
  • Showing and hiding Entities that are specific to each experience.
  • Moving specific Entities in front of the user when in AR so the video can be seen more easily without moving.
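
A minimal sketch of this kind of manager (not the project's actual 'xr-manager.js'; the 'cameraEntity' attribute is an assumption):

```javascript
// Minimal sketch of an XR manager: query device capabilities to decide
// which buttons to show, then start or end sessions on the camera component.
var XrManager = pc.createScript('xrManager');

XrManager.attributes.add('cameraEntity', { type: 'entity' });

XrManager.prototype.initialize = function () {
    var xr = this.app.xr;
    // Decide which UI buttons to show, e.g. hide the VR button if AR is
    // available or VR is not
    this.arAvailable = xr.supported && xr.isAvailable(pc.XRTYPE_AR);
    this.vrAvailable = xr.supported && xr.isAvailable(pc.XRTYPE_VR);
};

XrManager.prototype.startAr = function () {
    this.cameraEntity.camera.startXr(pc.XRTYPE_AR, pc.XRSPACE_LOCALFLOOR);
};

XrManager.prototype.startVr = function () {
    this.cameraEntity.camera.startXr(pc.XRTYPE_VR, pc.XRSPACE_LOCALFLOOR);
};

XrManager.prototype.endXr = function () {
    this.app.xr.end();
};
```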

Adding AR

AR mode was added first, taking the 'xr-manager.js' script from the WebXR UI Interaction tutorial as a base. Key changes that had to be made to the project were:

  • Ensuring 'Transparent Canvas' is enabled in the project rendering settings.
  • Creating a second camera specifically for AR, set to render only the layers needed for AR (i.e. not the skybox layer) and with a transparent clear color for the video passthrough.

After copying and pasting the 'xr-manager.js' file from the tutorial project into the demo project, I hooked up the UI elements and buttons to enter AR and added extra functionality to disable and enable Entities for AR and non-AR experiences.

This was handled by adding tags to those Entities, which the manager finds and disables/enables when the user starts and exits the XR experiences.

For example, I only wanted the AR playback controls entity to be available in AR, so the tag 'ar' was added to it.
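
As an illustrative companion sketch (the script name is hypothetical), the tag lookup and toggling could look like this:

```javascript
// Illustrative sketch: toggle every Entity tagged 'ar' when an AR session
// starts and ends.
var ArTagToggler = pc.createScript('arTagToggler');

ArTagToggler.prototype.initialize = function () {
    var app = this.app;
    var arEntities = app.root.findByTag('ar');

    app.xr.on('start', function () {
        // Only enable the 'ar' Entities for AR sessions
        if (app.xr.type === pc.XRTYPE_AR) {
            arEntities.forEach(function (e) { e.enabled = true; });
        }
    });

    app.xr.on('end', function () {
        arEntities.forEach(function (e) { e.enabled = false; });
    });
};
```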

Entity Tagged With AR

There is an additional tag, 'ar-relative', used for entities that need to move in front of the user when the floor is found in AR. It provides a much better experience for the user as they don't have to move or look around to find the content.

When the user leaves the AR session, the Entities are moved back to their original positions, which were saved when the session started.
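
A hedged sketch of that save/move/restore flow (the script name, the 'onFloorFound' hook, the 'cameraEntity' attribute and the 2 metre offset are all assumptions):

```javascript
// Sketch of the 'ar-relative' flow: save positions, move tagged Entities in
// front of the camera, restore them when the session ends.
var ArRelativeMover = pc.createScript('arRelativeMover');

ArRelativeMover.attributes.add('cameraEntity', { type: 'entity' });

// Call this once the AR floor has been found (hypothetical hook)
ArRelativeMover.prototype.onFloorFound = function () {
    var app = this.app;
    var camera = this.cameraEntity;
    var relatives = app.root.findByTag('ar-relative');

    // Save the original positions so they can be restored later
    var saved = relatives.map(function (e) {
        return e.getPosition().clone();
    });

    // Place each Entity 2 metres ahead of the camera, at floor height
    relatives.forEach(function (e) {
        var target = camera.forward.clone().mulScalar(2)
            .add(camera.getPosition());
        target.y = 0;
        e.setPosition(target);
    });

    // When the session ends, put everything back
    app.xr.once('end', function () {
        relatives.forEach(function (e, i) {
            e.setPosition(saved[i]);
        });
    });
};
```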

Adding VR

This was a little trickier than expected as we didn't have a complete example of the needed functionality and it also had to work with the existing AR functionality.

The goal was for the user to be able to move around the holo video, and also to show controllers that matched the VR input devices being used.

Our Starter Kit: VR has the scripts and functionality to interact with objects, teleport and move around an environment. We can tag entities in the scene with 'pickable' for the VR object picker logic in object-picker.js to test against when the VR input device moves or the select button is pressed.

Pickable And Teleportable Tags

Whether an object can be teleported to or interacted with depends on the other tags on the Entity.

In this case, the aim was to be able to teleport around the video, so an Entity with a box render mesh was added to represent the area, and the 'pickable' and 'teleportable' tags were added to it.
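
As a rough illustration (the 'onPick' hook and 'cameraRig' attribute are hypothetical, not the starter kit's actual API), picker logic can branch on an Entity's tags like so:

```javascript
// Rough illustration: branch on an Entity's tags when the picker hits it.
var Teleporter = pc.createScript('teleporter');

Teleporter.attributes.add('cameraRig', { type: 'entity' });

// Hypothetical hook called by the VR object picker with the hit Entity and
// the world-space hit point
Teleporter.prototype.onPick = function (entity, hitPoint) {
    if (entity.tags.has('teleportable')) {
        // Move the VR camera rig to where the user pointed
        this.cameraRig.setPosition(hitPoint);
    }
};
```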

Next up was handling how the controllers should look in VR. The starter kit uses cubes to represent the controllers as they are meant to be replaced with something else by the developer.

VR Controllers

In my case, I wanted to use skinned hands or the representations of the VR controllers instead. Max (who built the PlayCanvas WebXR integration) created a project that does just that: WebXR Controller/Hand Models. And it was just a matter of merging the code and assets together.

WebXR Hand Tracking

Projected skybox

The skybox was obtained from Poly Haven and converted to a cube map with our texture tool. Donovan wrote a shader that projected the cubemap so there was a flat floor that the user could move around in.

It's a nice and easy effect that can be applied in similar scenes without having to build a model or geometry. See the scene without the effect applied (left) and with it (right):

Infinite Skybox / Ground Projected Skybox

The shader code is applied by overriding the global engine chunk in projected-skybox-patch.js on application startup.

World Space UI in VR

In VR, there's no concept of 'screen space' for user interfaces so the playback/exit controls would need to be added somewhere in the world.

It was decided the controls should be placed near the holo-video and would always face the user as, generally, that is where their focus would be.

VR UI

This was done by simply having UI buttons in world space as offset child Entities of a 'pivot' Entity. The pivot Entity is positioned at the feet of the holo video and can be rotated to face the VR camera.

Setting Up UI In Editor

There's a script on the pivot Entity that copies the VR camera position, sets its Y value to match the pivot Entity's own, and then looks at that position so the UI controls always stay parallel to the floor.
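
A minimal sketch of that script (names are illustrative):

```javascript
// Minimal sketch of the pivot's billboard script
var UiPivot = pc.createScript('uiPivot');

UiPivot.attributes.add('cameraEntity', { type: 'entity' });

UiPivot.prototype.update = function (dt) {
    // Copy the VR camera position, flatten Y to the pivot's own height...
    var target = this.cameraEntity.getPosition().clone();
    target.y = this.entity.getPosition().y;
    // ...and look at it, so the UI stays parallel to the floor
    this.entity.lookAt(target);
};
```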

The other common place for UI controls would be somewhere relative to a tracked controller, such as on the left hand/controller. I decided against this because it's not guaranteed that the VR device has two hands/controllers; Google Cardboard, for example, has none.

As the 'floor' is just a projected skybox, a solution was needed to render the shadows of the holo-video onto the scene.

Shadow 'catcher' material

Gustav provided a material shader that would sample the shadow map and make any area that doesn't have a shadow fully transparent.

To make this a bit easier to see, I've shown where the plane would be positioned below. Anywhere it's white on the floor plane will be fully transparent, as there is no shadow being cast there.

Shadow Receiver Quad / Final Shadow Effect

Other tutorials used

There is other functionality in the experience that has been taken from our tutorials and demo projects and slightly modified for this project.

These include:

  • Orbit Camera for the non-XR camera controls. The orbit camera controls are disabled when the camera entity is disabled so that the camera doesn't move while in an XR session.
  • Video Textures for the Microsoft video on the information dialog. It was modified to apply the video texture directly to the Element on the Entity it is attached to (sketched below).
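
A sketch of that Element modification, under the assumption that the video is created in code with an assumed 'videoUrl' attribute:

```javascript
// Sketch: stream frames from a <video> element into a texture and apply it
// directly to the Image Element on this Entity.
var VideoElementTexture = pc.createScript('videoElementTexture');

VideoElementTexture.attributes.add('videoUrl', { type: 'string' });

VideoElementTexture.prototype.initialize = function () {
    var video = document.createElement('video');
    video.src = this.videoUrl;
    video.crossOrigin = 'anonymous';
    video.loop = true;
    video.muted = true; // muted video can autoplay on mobile
    video.play();

    this.texture = new pc.Texture(this.app.graphicsDevice, {
        format: pc.PIXELFORMAT_R8_G8_B8,
        mipmaps: false
    });
    this.texture.setSource(video);

    // Apply the video texture directly to the Image Element
    this.entity.element.texture = this.texture;
};

VideoElementTexture.prototype.update = function (dt) {
    this.texture.upload(); // refresh the current video frame each tick
};
```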

Although not PlayCanvas related, it is worth shouting out: the awesome QR code (that is displayed if the device is not XR compatible) is generated with Amazing-QR. It's able to create colorful and animated QR codes that are more interesting and attractive than the typical black and white versions.

QR Code

Issues found

A couple of issues were found while this project was being developed. We will be searching for proper solutions in the near future; for now, we've worked around them as follows.

In VR, clustered lighting with shadows enabled causes a significant framerate drop. As the shadows in the project come from the directional light, which is processed outside the clustered lighting system, clustered lighting shadows can be disabled with no visual change.
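
Assuming the workaround is a single global setting (a sketch, not the project's exact code), it could be as simple as:

```javascript
// Inside any script's initialize(): turn off clustered lighting shadows
// globally. The directional light's shadows are unaffected.
this.app.scene.lighting.shadowsEnabled = false;
```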

The demo uses screen space UI in AR and there's an issue with the accuracy of UI touch/mouse events when pressing UI buttons. This is because, when the user enters AR, the engine uses a projection matrix that matches the device camera so that objects are rendered correctly relative to the real world.

Unfortunately, the screen-to-world projections don't use the projection matrix directly; they use the FOV properties on the camera component instead. This mismatch is what causes the inaccuracy.

My workaround is to calculate the relevant camera values from the projection matrix on the first AR render frame and apply them back to the camera component. The code can be seen in xr-manager.js.
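
A hedged sketch of the idea: for a perspective projection, element [5] of the matrix encodes the vertical FOV, so it can be recovered and written back to the camera component (the 'cameraEntity' reference is an assumption):

```javascript
// Sketch: run once on the first AR render frame
var camera = this.cameraEntity.camera;
var m = camera.projectionMatrix.data;

// For a perspective projection, m[5] = 1 / tan(fovY / 2)
camera.fov = 2 * Math.atan(1 / m[5]) * pc.math.RAD_TO_DEG;
camera.horizontalFov = false; // ensure the FOV is treated as vertical
```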

Wrapping up

If you have reached here, thank you very much for reading and we hope you have found some useful takeaways that you can use in your own projects!

Useful links:

We would love to get your thoughts and feedback so come join the conversation on the PlayCanvas forum!

WebVR Lab launches with Chrome 56 for Daydream

· 2 min read

Today we're excited to launch the WebVR Lab, a living project built by the PlayCanvas team to help developers learn about creating scalable and responsive WebVR applications for all devices.

Try the lab right now:

Try it in fullscreen

WebVR is a new standard that makes VR experiences accessible to the billions of web browser users and enables developers to reach a user on any type of VR headset. Using the web to deliver VR makes sharing experiences as easy as clicking a link, with no downloads required.

Chrome 56 for Android, launched this week, supports WebVR using the Daydream View headsets.

“WebVR allows developers to build an experience that scales across all VR platforms from Google Cardboard and Daydream to desktop VR headsets, while also supporting 2D displays. Different platforms have different capabilities and the PlayCanvas WebVR Lab project gives developers an example of how to manage that diversity.” Megan Lindsay, Google Product Manager for WebVR

We launched our official WebVR support back in November and the WebVR Lab marks the company’s next step to ensure that developers can quickly and efficiently build the most beautiful WebVR experiences possible.

The WebVR Lab elegantly scales from a simple Cardboard headset to a full desktop VR setup. Devices supported by the project include Google Cardboard, Daydream View, GearVR, Oculus Rift (with Touch controllers) and HTC VIVE.

The project will be continually updated with new experiments that implement core interactions for VR on the web, including teleportation, manipulating virtual objects, user interfaces and controllers.

The first fruit of our work on the WebVR Lab is the Tracked Controllers project, where developers can take our sample code and quickly integrate the Daydream Controller into their projects.

Read more on TechCrunch, UploadVR and on the Google Chrome blog.

See you in Virtual Reality!

WebVR support in PlayCanvas

· 2 min read

Today we're really excited to announce support for WebVR in the PlayCanvas Editor.

This week Google announced that WebVR 1.1 (the current version of the spec) should be released in Chrome for Android in January 2017. But for a feature as complex as virtual reality, browser support is only one piece of the puzzle. At PlayCanvas, we know how important great tools are to making high quality experiences, so today we're launching our WebVR engine integration to make sure that you can create applications right now.

PlayCanvas WebVR

Optimized Engine Support

The PlayCanvas graphics engine is an advanced WebGL graphics engine. We've worked hard to make sure our renderer is optimized specifically for stereo rendering. Unlike most engines, we don't simply render the scene twice, once for each eye. Instead, our renderer knows that a lot of the main render loop is the same for both eyes. So, for example, expensive operations like culling, sorting draw calls and setting uniforms and render states only have to be done once before we draw the scene for each eye. This can lead to a significant performance increase, particularly on mobile.

VR Performance Comparison

Polyfill for unsupported platforms

It's still early days for WebVR which means it's not yet supported on all platforms. When you enable WebVR in your PlayCanvas project, we make sure your browser can support it using the WebVR polyfill library from Google. PlayCanvas is smart enough to load the library only if you need it.
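
Conceptually, the conditional load looks something like this (an illustrative sketch using the WebVR 1.1 API of the time, not the engine's exact code):

```javascript
// Illustrative sketch: load the polyfill only when the browser has no
// native WebVR implementation.
if (!navigator.getVRDisplays) {
    var polyfill = document.createElement('script');
    polyfill.src = 'webvr-polyfill.js'; // path is illustrative
    document.head.appendChild(polyfill);
}
```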

Tutorials and Documentation

PlayCanvas is renowned for its extensive documentation and VR is no different. Basic instructions, API reference and specific optimization tips: we've got it all.

Samples and Starter Kits

These sample projects show you how to construct a VR scene and give you sample code to start from.

Hello World - A very simple 3D scene

360 Image - Just drop in your own 360 panorama

360 Video - Add a link to your own video

Room Scale VR - A more complex scene designed for HTC Vive and other Room Scale VR

The Future

We believe the future for WebVR is very bright and we're committed to making PlayCanvas the best tool for creating WebVR applications. Sign up for free today, we'd love to see what you build!

Getting started with WebVR

· 2 min read


Did you hear? VR is BIG! But what is bigger than VR? The web, that's what. What happens when you mix the web and VR?

WebVR is an emerging standard that lets you create 3D virtual experiences on the web and control them using your mobile phone or VR headset. But creating virtual reality is a complex process involving knowledge of WebGL rendering, new input APIs and (at the moment) a constantly changing spec.

That's why we've introduced the VR Starter Kit in PlayCanvas.

The VR Starter Kit is available when you select New Project in your PlayCanvas dashboard. It sets you up straight away with a boilerplate scene containing a VR Camera (using our open source WebVR plugin). It works immediately with Google Cardboard-style VR headsets for mobile phones or with the special WebVR builds of Chrome or Firefox.

To get started, select the VR Starter Kit in the New Project dialog.

project-select

Once you've created your first scene, take a look at our WebVR Tutorial to see how you can include the camera code yourself and learn how we interact with the virtual world.

Looking for more features around WebVR? Get in touch with us on Twitter or over at our forum.

Easy Cardboard VR in WebGL

· 2 min read

Today we've launched a new library, developer tutorial and sample project showing you how to implement your own Cardboard VR web applications using PlayCanvas.

cardboard-vr

Google's Cardboard VR is an excellent low-cost device for experiencing virtual reality via your phone and a simple head-mounted display. At PlayCanvas, we immediately saw the benefit of using WebGL to display 3D VR experiences right in your browser. With WebGL VR you can distribute VR content quickly and easily to every user with a mobile web browser. With nothing to install, there is no barrier to entry.

The PlayCanvas WebVR plugin makes it simple to add VR support to your application. Simply add a couple of JavaScript files to your PlayCanvas project and add the VR Camera script to your camera entity. That's all it takes to add VR support to your project.

Our demonstration project shows you an example of a simple interactive VR scene that you can use to learn from.

On a mobile device just tap the view above to enable the Cardboard VR mode. Our tutorial will walk you through how to add VR to your projects.

This is the start of VR support in PlayCanvas and we'll be working to integrate Cardboard VR and WebVR more closely into the editor as they grow in popularity.

Virtual Reality and the future of Web Based Gaming

· 4 min read
Community Manager

On Thursday 19th of June, we will be showcasing some of our recent work with the amazing and exciting Oculus Rift Development Kit. In the build-up to this event, we hope to convey why virtual reality and revolutionary hardware from Oculus VR are set to be a part of our future at PlayCanvas. Playing a game in VR is one thing. Making a game in VR... now that really is the future.

playcanvas oculus1

What is the Oculus Rift?

First developed by then-18-year-old Palmer Luckey, the Oculus Rift has been through two evolutions of its development kit, among other improvements, making it arguably the most promising virtual reality system to date. It is a low latency, head-mounted display that receives two independently rendered images on a screen viewed through stereoscopic lenses.

Why Virtual Reality?

Many virtual reality experiences target immersion, where user interaction can open the door to a reality (even if only partially) that is not their own. However, the technology behind the Oculus lets the user into a much deeper experience. Where extreme latency and narrow fields of vision have prevented previous virtual reality technologies from being immersive, they have often proved successful in creating nausea. As humans are sensitive to latencies as low as 20 milliseconds, it is important for the technology in question to be as precise and fast as possible.

Where VR has usually left users craning their necks and grabbing at thin air in disbelief, the Oculus takes over 1,000 readings per second and so far is effective enough to trick the mind and simulate a physical presence. The VR industry is now closer to 'telexistence' than ever before. Mark Zuckerberg, the current owner of the technology (following a $2 billion acquisition by Facebook), described its potential: "Imagine enjoying a courtside seat at a game, studying in a classroom of students and teachers all over the world, or consulting with a doctor face-to-face—just by putting on goggles in your home." The Oculus promises that in (hopefully few) years to come, gamers may be able to act and react naturally in what is still a virtual setting.

The Oculus Rift and PlayCanvas

Dave in VR
Oculus Rift support coming soon to PlayCanvas

Imagine a future where you open your Internet browser, select a VR ready online game, enable your Oculus headset and transport yourself into the game immediately. Mind blowing? Potentially yes. This is why we here at PlayCanvas are committed to intertwining the paths of both WebGL and HTML5 technologies with the capabilities of the Oculus Rift. It's crazy to think that VR games could be played and developed by simply opening your favorite web browser.

Firefox is already thinking about VR on the web. Chrome is too. Be sure that when VR support fully comes to the web, PlayCanvas will be ready to help you get there faster. Game developers have so many challenges to overcome while developing their games that adding VR (or head-mounted display) support could become just another feature you'd love to try but never quite have time for. However, with PlayCanvas it's simple: just drop the OculusCamera script onto your camera and we'll do all the magic to make your game render ready for the headset.

Hyper-realistic gaming experiences should not be limited to core gaming platforms. When web-based gaming can involve products like the Oculus, we're opening up a whole new class of immersive gaming experiences: the best features of the web (low friction, accessible and shareable) combined with the best features of the Oculus (immersive, high-end experiences). It's a brave new world!

Dave Evans (CTO at PlayCanvas) will be showcasing some of our work with the Oculus Rift on June 19th at the Scenario Bar. Check out the Event link here and maybe we will see you there!