
Announcing the New PlayCanvas Asset Store

· 4 min read

The PlayCanvas Asset Store is the first place that users tend to go to find content for their projects. This is especially true for new users who want to get started as quickly as possible. Up until now, the Store has not been particularly easy to use and the content has not changed in quite a long time. In short, a complete overhaul and refresh has been long overdue. So today, we are incredibly excited to announce a major upgrade for the PlayCanvas Asset Store!

First up, check out a little example of building a city scene using content taken from the Asset Store. A skybox, a pack of 3D city block meshes and a camera control script are imported and the city is built via drag and drop. And not one single line of code is needed!

Now, let's examine some of the key highlights that make the new Asset Store so special.

Built Right In To The Editor

It should be possible to grab assets quickly, right from within the Editor itself. Why should you have to open a new tab and go hunting around the web? So to keep things as convenient as possible, the ASSET STORE button (in the Editor's Assets Panel) now opens a nicely designed, responsive Asset Store panel.

Asset Store Panel

One really cool benefit of selecting assets from within the Editor is that the currently selected Asset folder is known. This means you have complete control over where your imported assets will be saved.

Preview Store Assets Before Import

Sometimes, an asset thumbnail just isn't sufficient to tell if a particular asset is what you want.

Asset Store Viewer

Our new Store allows you to select a Store Item and preview it in an appropriate viewer (glTF Viewer for 3D models, Texture Viewer for textures and cubemaps).

Find What You Want Quickly

As the number of store items continues to grow, it's going to be incredibly important for you to be able to narrow down your assets searches. As a result, the new Store comes with powerful searching, sorting and filtering capabilities.

Asset Store Search

You can filter by asset type, search asset names and descriptions and then order search results on a host of criteria.

Fresh New Content

Until recently, the Asset Store content was, let's just say, beginning to show its age. We are now in a world of HDR skyboxes, PBR materials and high polygon meshes. So it made sense to retire old store items and refresh the Store with better, more modern content. We have selected a broad variety of Creative Commons assets from fantastic content sources such as HDRi Haven, Sketchfab and Khronos' glTF Sample Models.

Asset Store Content

If you have any suggestions for content you would like to see added to the Store, please do let us know!

The Future

There's still so much we want to do with the new Asset Store! But here are some things we have in mind:

  1. More Content. The important thing to say about today's update is that it delivers the core infrastructure on which we can iterate. It is now exceptionally easy for us to populate the store with more content. So, in the near term, you can expect to see the range of content expand quite rapidly.

  2. Third Party Stores. Now that we have a solid foundation in place for the Store, we have the ability to host third party stores within the same UI and maximize your choice.

  3. More Asset Types. Today's launch offers models, fonts, textures, skyboxes and scripts. Next, we want to add audio assets and template assets (AKA prefabs). Template assets in particular are very exciting because you would be able to import fully interactive, visual entities into your projects (such as a drivable vehicle or a controllable character).

What would you like to see us add to the Asset Store next? Let us know on the forum.

Happy creating, friends!

Initial WebGPU support lands in PlayCanvas Engine 1.62!

· 4 min read
Martin Valigursky
Software Engineer

WebGPU is a cutting-edge technology that promises to revolutionize the way 3D graphics are handled on the web. As the successor to WebGL, WebGPU provides faster and more efficient rendering capabilities for complex 3D graphics and simulations.

PlayCanvas has been at the forefront of this new technology and has been working on adding WebGPU support to its platform.

With WebGPU, we can expect to see more immersive and interactive 3D experiences on the web in the future.

WebGPU Area Lights
PlayCanvas WebGPU Clustered Area Lights Demo

Refactoring of WebGL engine

Before we could add support for WebGPU, a significant amount of refactoring was required on our existing WebGL engine. Implementing deep architectural changes while preserving backwards compatibility took meticulous care.

  • To enable support for WebGPU, we needed to establish a clear separation of graphics technology that could be shared between WebGL and WebGPU. This involved a significant refactoring effort to extract WebGL-specific code into a separate set of classes.
  • PlayCanvas utilizes a collection of shader chunks to produce GLSL shaders that implement advanced material properties and lighting modes, as well as custom shader chunks defined by users. However, since WebGPU employs the WGSL language, we use the glslang and tint WASM modules to convert these shaders on the fly, injecting support for uniform buffers and making other modifications.
  • The PlayCanvas engine lacked explicit render passes, making the rendering process more rigid and harder to extend. This was solved by implementing a FrameGraph that allowed us to describe the rendering process as a set of render passes, their dependencies, and associated targets, which created a more flexible and performant rendering architecture.
  • Unlike WebGL, which sets render state and shaders using a custom API, WebGPU specifies all those through render pipelines. To support both rendering APIs with optimal performance, we needed to refactor the render states into standalone objects that are efficient to compare and set up.
  • To support the WebGPU platform, we needed to undertake a significant refactoring to organize uniforms into uniform buffers.
  • To facilitate the asynchronous creation of the WebGPU device, we have introduced a new async API for creating a graphics device, which is the primary breaking change required to adopt WebGPU.
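For illustration, the new async device creation looks roughly like the commented snippet below (a sketch based on the engine examples; check the 1.62 release notes for exact signatures). The deviceTypes array is an ordered preference list, and the fallback behaviour can be modeled as a simple first-supported pick:

```javascript
// Sketch of the async graphics device creation (assumed option names):
//   const device = await pc.createGraphicsDevice(canvas, {
//       deviceTypes: [pc.DEVICETYPE_WEBGPU, pc.DEVICETYPE_WEBGL2]
//   });
//
// The fallback behaviour, modeled as a plain function:
function pickDeviceType(preferred, supported) {
    for (var i = 0; i < preferred.length; i++) {
        if (supported.indexOf(preferred[i]) !== -1) {
            return preferred[i];
        }
    }
    return null; // no requested device type is supported
}
```

So a browser without WebGPU support simply falls back to the next entry in the list.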

What is left to do

  • Our primary objective is to align the WebGPU implementation with that of WebGL. While we have made significant progress towards this goal, some features are still missing, and several smaller details require cleanup and rectification.
  • We need to incorporate WebGPU into the Editor environment for both launched and published applications. Currently, only WebGL is available there.
  • While full parity with WebGL is the ultimate goal, we are initially focusing on basic implementations of some concepts to deliver a working version, which will then need to be extended to meet our performance objectives.
  • WebGPU provides developers with access to Compute Shaders, which enables more efficient parallel processing of data on the GPU. This feature can significantly improve the performance of complex algorithms and simulations, which may have been impractical to run on the CPU. With access to Compute Shaders, we can bring new visual features to the next level, such as advanced particle systems, post-processing and global illumination techniques.

Engine examples

As an early pre-release of WebGPU, we have updated several engine examples to use it; these can be accessed in the Examples Browser. To use WebGPU, the Chrome Canary browser is required, with the 'chrome://flags/#enable-unsafe-webgpu' flag enabled.

Examples Browser WebGPU

PlayCanvas Examples Browser

Let us know what you think in the forums!


Oldsmobile Cutlass Supreme Sedan '71 by Barbo is licensed under Creative Commons Attribution

WebXR AR Made Easy with PlayCanvas

· One min read
Steven Yau
Partner Relations Manager

We are excited to announce the launch of our WebXR AR Starter Kit, available in the New Project dialog today!

New Project WebXR

WebXR is a technology that enables immersive, interactive AR and VR experiences to be accessed through supported web browsers. This allows us to build memorable, engaging content and share it with just a URL. No installs needed!

The starter kit comes with all you need to kickstart your AR experience for WebXR including:

  • Real world light estimation
  • AR shadow renderer
  • AR object resizing and positioning controls
  • Physics raycasting
  • And more!

Look how quickly you can create AR experiences below!

Pacman Arcade + animation by Daniel Brück is licensed under CC BY 4.0

Try it on your device

Give the Starter Kit a try today. You can use it for free!

Draco Mesh Compression Arrives in the PlayCanvas Editor

· 2 min read

We are thrilled to announce the immediate availability of Draco Mesh Compression in the PlayCanvas Editor! Our latest feature allows developers to compress meshes using Google's Draco technology, reducing file sizes and enhancing the end-user experience.

At its core, Draco Mesh Compression reduces the amount of data needed to represent 3D graphics without compromising visual quality. The technology achieves this by applying a lossy compression algorithm to the mesh data. With less data to transfer, the result is faster load times and lower bandwidth costs for your applications.

The open source PlayCanvas Engine has been able to load Draco-compressed glTF 2.0 files for quite some time now. But now you can generate these Draco-compressed glTF files in the Editor at import time. Check out how easy it is to use:

"1972 Datsun 240k GT" by Karol Miklas is licensed under Creative Commons Attribution-ShareAlike.

In the example above, a 49.9MB GLB file is crunched down to only 3.67MB. That's a 92.6% reduction in file size! And for the majority of scenes, you should notice no difference in terms of visual quality. The only cost is decompression time when the compressed GLB is downloaded by an end user, but this should be significantly less than what is saved in terms of download time.
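As a quick sanity check, the quoted reduction can be computed directly:

```javascript
// Percentage reduction from original to compressed size, to one decimal place
function reductionPercent(originalMB, compressedMB) {
    return Math.round(((originalMB - compressedMB) / originalMB) * 1000) / 10;
}
// e.g. reductionPercent(49.9, 3.67) gives 92.6
```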

To enable the feature, open your Project Settings in the Inspector, expand the Asset Tasks panel and edit the Mesh Compression setting. Then, simply Re-Import any existing FBX or GLB and compression will be applied. Any FBX or GLB subsequently imported will also respect your mesh compression setting. Read more on the Developer Site.

We believe that mesh compression is going to take many types of applications to the next level, particularly e-commerce applications like product configurators, which need to load detailed meshes as fast as possible.

Get started with PlayCanvas today and make your WebGL dreams a reality!

How to make your HTML5 Games Awesome!

· 12 min read
Associate Partner Support Engineer

How To Make Your HTML5 Games Awesome

The quality of a video game is often determined by how polished it is. It's the attention to detail and the finishing touches that can make a good game great. In this article, we'll take a look at the importance of polish in game development and how it can significantly enhance the overall experience.

We'll use Space Rocks!, a simple Asteroids game created with the PlayCanvas game engine to showcase how even the smallest details can make a big impact.

Game juice is a design term to refer to the small visual and audio effects that are added to a game to make it feel more satisfying to play. This can include things like screen shakes, particle effects, and sound effects that are triggered when the player takes certain actions. Game juice is all about enhancing the overall feel of a game and making it more immersive and enjoyable.

In particular, we'll explore how game polish can be achieved through game juice.

Play it here!

How it started

This was our starting point before we added game juice. While the game is fully functional and plays well, it lacks the visual and audio effects that would make it truly engaging. As a result, it feels a bit dull and uninteresting.

However, with the right attention to detail and some careful implementation of game juice, we can transform this basic Asteroids game into something much more exciting and satisfying to play.

What can we improve?

To decide what should get game juice, I always try to narrow down to the game's most common interaction or core mechanic. In our case, that would probably be:

  • Shooting
  • Destroying asteroids
  • Colliding with asteroids

With those three key pieces in mind, let's start thinking about how we can improve them.

For shooting

It's not very interesting right now:

Basic Shooting

If we want to change that, there are a few key things we can do. We can increase the fire rate through a script attribute that lets us easily tune the fire cooldown.

Gun.attributes.add('cooldown', {
    type: 'number',
    default: 0.25,
    title: 'Cooldown',
    description: 'How long the gun has to wait between firing each bullet'
});

Gun.prototype.update = function (dt) {
    this._cooldownTimer -= dt;

    if (this.canFire()) { // the fire input check was elided from this excerpt
        // ...fire a bullet and reset this._cooldownTimer
    }
};
In fact, while we're at it, let's make shooting a bit more unpredictable. Let's add some spread to our shots!

Gun.attributes.add('spread', {
    type: 'number',
    default: 10,
    title: 'Bullet Spread',
    description: 'Up to how many degrees each bullet should vary in Y rotation.'
});

Gun.prototype.applySpreadOn = function (bullet) {
    var rotation = this.entity.getEulerAngles();
    rotation.y += getRandomDeviation(this.spread);
    bullet.setEulerAngles(rotation); // apply the randomized rotation to the bullet
};
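The getRandomDeviation helper isn't shown in the excerpt; a minimal version (an assumption, not the project's exact code) might look like:

```javascript
// Hypothetical helper: return a random value in [-maxDeviation, maxDeviation]
function getRandomDeviation(maxDeviation) {
    return (Math.random() * 2 - 1) * maxDeviation;
}
```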

A simple but impactful change! Here's how it looks with values I put in for some fun:

Shooting Spread Effect

I highly encourage you to play with these values to see what's fun for you!

It's getting better, but still not there. Let's think about more visual aspects now. What more can we do to make it more visually appealing?

PlayCanvas has a nice feature that allows you to have tons of lights in your scene with very little performance impact! It's called ✨ Clustered Lighting ✨.

We can actually leverage this amazing tech to give every single one of our bullets a point light - this makes the scene considerably more dynamic as everything gets lit up when we fire!

As another touch, let's add a few sparkles when our shots hit something! The extra visual effect will make a big difference instead of just letting our bullets disappear. Particle explosions are always awesome.

Shooting With Particles

Awesome! Our bullets look pretty nice. But we’re still shooting at fairly dull asteroids. Let's make a few changes.

Destroying Asteroids

Firstly, we want our asteroids to stand out from our background. Let's change the background texture to something a bit brighter.

New Background Texture

Much better! But can we make the asteroids themselves prettier? They're currently mapped with a fairly low resolution texture. Moreover, there's no variety - all asteroids are the same, only rotated differently.

Let's import a new mesh and texture for the asteroids.

New Asteroid Model

Nice! Much more visible, and much more variety - I should note I went ahead and added a simple component that further randomizes the scale of the asteroids being spawned!

var ScaleRandomizer = pc.createScript('scaleRandomizer');

ScaleRandomizer.attributes.add('baseScale', {
    type: 'number',
    title: 'Base Scale',
    description: 'The base scale to deviate from'
});

ScaleRandomizer.attributes.add('scaleDeviation', {
    type: 'number',
    title: 'Scale Deviation',
    description: 'The amount by which the effective scale should deviate from the base scale'
});

// initialize code called once per entity
ScaleRandomizer.prototype.initialize = function () {
    this.entity.setLocalScale(this.getRandomScale());
};

ScaleRandomizer.prototype.getRandomScale = function () {
    var deviation = getRandomFloatDeviation(this.scaleDeviation, 3);
    var randomScale = this.baseScale + deviation;

    return new pc.Vec3(randomScale, randomScale, randomScale);
};

Awesome, the asteroids look much nicer. Let's turn our attention back to our background for a moment as it looks very static.

  • Could we give it some life by adding some ‘background’ asteroids?
  • Could we even make destroyed asteroids leave pieces behind as we destroy them?
  • The game currently gets harder as time goes on. Maybe we can indicate that visually somehow? Maybe by changing the background texture?

Let's implement these ideas!

For the background asteroids, I simply reused our asteroid spawner class, but moved the spawn points a bit below.

Spawner Script UI

To make it as non-impactful on performance as possible, I duplicated our template, renamed it to FakeAsteroid and removed all components, except the Mover and Rotator components.

This is one of the beauties of using a component-based architecture! It allows you to quickly alter the behavior of objects without having to write or modify code at all!

I also made the FakeAsteroid texture much darker, so as not to distract or confuse the player.

My approach to the ‘fragment’ asteroids was similar, except I made them much smaller, and gave the regular asteroids a component to spawn fragments on death.

FragmentSpawner.attributes.add('minMaxCount', {
    type: 'vec2',
    title: 'Min, Max Count',
    description: 'The minimum and maximum amount of fragments to spawn.'
});

FragmentSpawner.prototype.spawnFragments = function () {
    if (FragmentSpawner.fragmentParent === null) {
        return; // nowhere to parent the fragments yet
    }

    var spawnCount = getRandomInRange(this.minMaxCount.x, this.minMaxCount.y);

    for (var i = 0; i < spawnCount; i++) {
        this.spawnSingleFragment();
    }
};

FragmentSpawner.prototype.spawnSingleFragment = function () {
    var fragment = this.fragmentTemplate.resource.instantiate();
    var position = this.getFragmentPosition();
    // ...position the fragment and add it to the scene
};
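Like getRandomDeviation earlier, getRandomInRange is not shown in the excerpt; since its result drives the spawn loop, a plausible version (an assumption, not the project's exact code) returns a random integer in [min, max]:

```javascript
// Hypothetical helper: random integer between min and max, inclusive
function getRandomInRange(min, max) {
    return Math.floor(Math.random() * (max - min + 1)) + min;
}
```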

And, while we’re at it, why don't we add a dust puff and small particles when the asteroid gets destroyed to complement our fragments?

I gathered a few textures online, duplicated our bullet hit particle effects and modified them. To spawn the particle effects, I used the same component I had used in the bullet:

// A script that spawns a particle effect on death
var DeathEffect = pc.createScript('deathEffect');

DeathEffect.attributes.add('particleEffects', {
    type: 'asset',
    assetType: 'template',
    array: true,
    title: 'Particle Effect Templates'
});

DeathEffect.effectParent = null;

// initialize code called once per entity
DeathEffect.prototype.initialize = function () {
    this.entity.on('destroy', this.onDestroy, this);

    if (!DeathEffect.effectParent) {
        DeathEffect.effectParent = this.entity.parent;
    }
};

DeathEffect.prototype.onDestroy = function () {
    for (var i = 0; i < this.particleEffects.length; i++) {
        var effect = this.particleEffects[i].resource.instantiate();
        // ...add the effect to the scene at this entity's position
    }
};
And lastly for the background, I added a script to lerp the transparency of our blue space material towards 0. This slowly reveals a purple material underneath.

// A script that manages ambient color.
var AmbientManager = pc.createScript('ambientManager');

AmbientManager.attributes.add('startingColor', {
    type: 'rgba',
    title: 'Starting Color',
    description: 'The starting color for the ambient'
});

AmbientManager.attributes.add('finalColor', {
    type: 'rgba',
    title: 'Final Color',
    description: 'The final color for the ambient'
});

AmbientManager.attributes.add('targetMaterial', {
    type: 'asset',
    assetType: 'material',
    title: 'Target Material',
    description: 'The material whose color to set (matching the ambient color)'
});

// initialize code called once per entity
AmbientManager.prototype.initialize = function () {
    // ...elided from this excerpt
};

AmbientManager.prototype.updateTransition = function (transitionProgress) {
    var color = new pc.Color();
    color.lerp(this.startingColor, this.finalColor, transitionProgress);

    var mat = this.targetMaterial.resource;
    mat.emissive = color;
    mat.opacity = color.a;
    mat.update(); // material changes need an update() to take effect
};

Here's the end result with all of our asteroid changes:

It looks amazingly better! Already a massive difference from our starting point.

Colliding With Asteroids

The last piece of the puzzle is when asteroids hit us! It needs to feel impactful! As if you were in a car, and the car just went over a bump.

We'll want to communicate it a bit better. Right now, all that happens is that the 'n lives left' counter in the top left gets decremented. Not only does it need to be obvious that we've been hit, but the player must be able to see at a glance how many lives they have left.

I downloaded the model for our spaceship, and made a top-down render of it in Blender. The result was a simple plain icon:

Spaceship Icon

Plain, but enough to make a health counter with. Let's make it semi-transparent and add it to the world. Our health counter will display from one to three of these icons to indicate how much life we've got left.

To give it some more juice, let's also make it ‘jump up’ when our health changes, and rotate it inwards towards the game world, to give it a 3D appearance.

And, since using components makes it easy, let's do the same to our score counter:

Score Counter

Much simpler, and much nicer!

Next up, let's try to emulate that ‘bump’ feeling. We can do this by adding some screen shake whenever we get hit! And for extra impact, we can make it slow-mo as well!

Making it slow-mo is fairly simple - one component does it:

var BulletTimeEffect = pc.createScript('bulletTimeEffect');

BulletTimeEffect.attributes.add('effectDuration', {
    type: 'number',
    default: 1,
    title: 'Effect Duration',
    description: 'How long the bullet time effect should last.'
});

BulletTimeEffect.attributes.add('timeCurve', {
    type: 'curve',
    title: 'Time Curve',
    description: 'How much the time scale should be over unscaled time.'
});

// initialize code called once per entity
BulletTimeEffect.prototype.initialize = function () {
    this._time = 0;
    // restore normal time if the effect entity is destroyed mid-effect
    this.entity.on('destroy', function () { = 1;
    }, this);
};

// update code called every frame
BulletTimeEffect.prototype.update = function (dt) {
    // dt arrives scaled by timeScale, so divide it back out to track real time
    this._time += (dt / / this.effectDuration; = this.timeCurve.value(this._time); = Math.min(1,;
};

As for making the screen shake, it's a bit more complex (though certainly not magic!). The underlying logic is to simply move the camera randomly. To do so, we can use a script that tracks the camera's original position, and translates it randomly. We reset to the original position at the beginning of each new frame, and repeat.
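The project's own shake script isn't reproduced here, but the core of that logic can be sketched as follows (names are illustrative): each frame, reset the camera to its saved original position, then translate it by a random offset.

```javascript
// Hypothetical getRandomTranslation: a random offset of up to
// maxDistance on each axis (Z left at 0 for a 2D-style shake).
// In a PlayCanvas script, the camera shake update might look like:
//   this.entity.setLocalPosition(this._originalPosition);
//   var offset = getRandomTranslation(this.shakeDistance);
//   this.entity.translateLocal(offset.x, offset.y, offset.z);
function getRandomTranslation(maxDistance) {
    var axis = function () {
        return (Math.random() * 2 - 1) * maxDistance;
    };
    return { x: axis(), y: axis(), z: 0 };
}
```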


The getRandomTranslation() method could simply return a random Vector3, and it would work. The problem, though, is that this approach can make the camera feel like it is jittering rather than shaking, particularly if the shake distance is large. This can even cause motion sickness.

What else can we do, then? Well, there is a more mathematically sophisticated way of generating random numbers, one that makes our shaking smooth rather than jittery: Perlin Noise!

Perlin Noise is used to create awesome things all over media, from explosion visual effects to Minecraft's world generation. If you're interested in the math or simply curious, you can learn more in this excellent article.

Let's go with Perlin Noise for our game. You can see the implementation we went with in the perlin-camera-shake.js and perlin-noise.js scripts.

Lastly, let's add a small shockwave whenever we get hit! Let's use a particle system for this - just like with the asteroid explosion and bullet hit effects. I grabbed a simple circular texture, coloured it red to indicate something negative, and added a script that spawns the effect whenever the player gets hit.

The combined effects look like this:

You'll notice I've added screen shake to more than just the player getting hit! I'm a big fan of this effect, so I've added it to asteroid explosions and firing bullets as well!

And that about does it

With the effects we added above, the game looks and plays entirely different. Destroying asteroids feels good, and everything else in the game is there to enhance that experience.

Finished Game

As a last finishing touch, I went ahead and added a few post-processing effects that PlayCanvas offers. Namely, Vignette, Bloom and Chromatic Aberration. I also added CRT Scanlines as an overlay for a retro effect.

I hope this guide has been useful to you! Take a look at the project, it is public and is accessible as a Game Demo in our documentation.

PlayCanvas is an excellent cloud-based game engine that allows you to build games for the browser. It has an amazing editor that you can use much as you would Unity or Unreal - tools most developers are accustomed to.

Want to learn more?

Here's a few resources if you want to try and make something similar to juice up your game!

PlayCanvas is an awesome web-first game engine that runs on the cloud. You don't need to download anything, and it's free!

PlayCanvas - The Web-First Game Engine

This Reddit post sums up many tricks you can do in about 60 seconds!

Juice your game in 60 seconds

There's this very nice GDC talk that goes into game juice a bit more deeply. Tons of useful information there!

Juice it or lose it - a talk by Martin Jonasson & Petri Purho

And there's this awesome INDIGO class that goes in-depth about my favorite game juice - screen shake!

Vlambeer - The art of screenshake by Jan Willem Nijman

PlayCanvas now supports Microsoft volumetric video playback

· 10 min read
Steven Yau
Partner Relations Manager

Open in new tab

We are very excited to release our showcase demo for Microsoft Mixed Reality Capture Studios (MRCS) volumetric video technology.

PlayCanvas now supports MRCS volumetric video via a playback library for footage captured at their studios. Watch it on desktop, on mobile with AR, or even in a WebXR-enabled VR headset, all from a single URL!

The library can be easily added to any PlayCanvas project and used to create fantastic immersive mixed reality experiences.

About Microsoft Mixed Reality Capture Studios

MRCS records holographic video - dynamic holograms of people and performances. Your audiences can interact with your holograms in augmented reality, virtual reality and on 2D screens.

They are experts at capturing holographic video, advancing capture technology and have been pioneering its applications since 2010.

Learn more about Microsoft Mixed Reality Capture Studios here.

How was this created?

The demo was created with a combination of several tutorials and kits available on the PlayCanvas Developer Site, the MRCS playback library and freely available online assets.

You can find the public project for the demo here. We've removed the URL to the volumetric video file (due to distribution rights) and the proprietary MRCS devkit library. Please contact MRCS to gain access to the library and example videos.

Microsoft Video Playback Library

In the folder 'holo video', you will find the scripts and assets needed for playing back volumetric video. You will need to add the devkit library file named 'holo-video-object-umd.js' (provided by MRCS) to complete the integration and be able to play back video.

Holo Video In Assets Panel

Due to the size and how the data files for the video need to be arranged, they have to be hosted on a separate web server (ideally behind a CDN service like Microsoft Azure).

The 'holo-video-player.js' script can be added to any Entity and be given a URL to the .hcap file. At runtime, the script will create the necessary meshes, materials, etc. to render and play back the volumetric video.

Holo Video Script UI

Expect full documentation to be released soon on our site!

Creating a Multi Platform AR and VR experience

As you can see in the video, we've made the experience available in the standard browser, in AR on WebXR-enabled mobile devices (Android) and in VR on devices like the Oculus Quest. iOS support for WebXR is in progress by the WebKit team.

This was done by combining several of our WebXR example projects and the scripts and assets can be found in the 'webxr' folder:

WebXR Folder In Assets Panel

'xr-manager.js' controls how the XR experience is managed and handled throughout the application:

  • Entering and leaving AR and VR.
  • Which UI buttons to show based on the XR capabilities of the device it is running on (e.g. hiding the VR button if AR is available or VR is not available).
  • Showing and hiding Entities that are specific to each experience.
  • Moving specific Entities in front of the user when in AR so the video can be seen more easily without moving.

Adding AR

AR mode was added first, taking the 'xr-manager.js' script as a base from WebXR UI Interaction tutorial. Key changes that had to be made to the project were:

  • Ensuring ‘Transparent Canvas’ is enabled in the project rendering settings.
  • Creating a second camera specifically for AR, set to render only the layers needed for AR (i.e. not including the skybox layer) and with a transparent clear color for video passthrough.

After copying and pasting the 'xr-manager.js' file from the tutorial project into the demo project, I hooked up the UI elements and buttons to enter AR and added extra functionality to disable and enable Entities for AR and non-AR experiences.

This was handled by adding tags to those Entities that the manager finds and disables/enables when the user starts and exits the XR experiences.

For example, I only want the AR playback controls entity to be available in AR so the tag 'ar' was added to it.

Entity Tagged With AR

There is also an additional tag 'ar-relative' that is used for entities that need to move in front of the user when the floor is found in AR. It provides a much better experience for the user as they don't have to move or look around to find the content.

When the user leaves the AR session, the Entities are moved back to their original positions, which were saved when the session started.
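The tag-based toggling can be sketched as a small helper (the function name is illustrative; in a PlayCanvas script, `this.app.root.findByTag('ar')` returns the tagged Entities):

```javascript
// Sketch: enable or disable a set of tagged entities when entering/leaving
// an XR mode. In PlayCanvas, the list would come from:
//   var arEntities = this.app.root.findByTag('ar');
function setEnabledByTag(entities, enabled) {
    entities.forEach(function (e) {
        e.enabled = enabled;
    });
}
```

On entering AR, the manager would call this with the 'ar'-tagged set and true, and with the non-AR set and false; leaving the session does the reverse.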

Adding VR

This was a little trickier than expected as we didn't have a complete example of the needed functionality and it also had to work with the existing AR functionality.

The goal was for the user to be able to move around the holo video and also to see controllers that matched the VR input devices being used.

Our Starter Kit: VR has the scripts and functionality to interact with objects, teleport and move around an environment. We can tag entities in the scene with 'pickable' for the VR object picker logic in object-picker.js to test against when the VR input device moves or the select button is pressed.

Pickable And Teleportable Tags

Whether it is an object that we can teleport to or interact with is dependent on the other tags on the Entity.

In this case, the aim was to be able to teleport around the video so an Entity with a box render mesh was added to represent the area and 'pickable' and 'teleportable' tags were added too.

Next up was handling how the controllers should look in VR. The starter kit uses cubes to represent the controllers as they are meant to be replaced with something else by the developer.

VR Controllers

In my case, I wanted to use skinned hands or the representations of the VR controllers instead. Max (who built the PlayCanvas WebXR integration) created a project that does just that: WebXR Controller/Hand Models. And it was just a matter of merging the code and assets together.

WebXR Hand Tracking

Projected skybox

The skybox was obtained from Poly Haven and converted to a cube map with our texture tool. Donovan wrote a shader that projected the cubemap so there was a flat floor that the user could move around in.

It's a nice and easy effect that can be applied in similar scenes without having to build a model or geometry. See the scene without the effect applied (left) and with it (right):

Infinite SkyboxGround Projected Skybox

The shader code is applied by overriding the global engine chunk in projected-skybox-patch.js on application startup.

World Space UI in VR

In VR, there's no concept of 'screen space' for user interfaces so the playback/exit controls would need to be added somewhere in the world.

It was decided the controls should be placed near the holo-video and would always face the user as, generally, that is where their focus would be.


This was done by simply having UI buttons in world space as offset child Entities of a 'pivot' Entity. The pivot Entity is positioned at the feet of the holo-video and can be rotated to face the VR camera.

Setting Up UI In Editor

There's a script on the pivot Entity that gets a copy of the VR camera position and sets the Y value to be the same as the pivot Entity's. It then uses that position to look at so that the UI controls always stay parallel to the floor.
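The Y-matching step can be sketched as a pure function (the name is illustrative; the actual script runs this each frame and passes the result to the pivot Entity's lookAt):

```javascript
// Build the look-at target for the UI pivot: the camera's position
// with its Y replaced by the pivot's own Y. The pivot then only yaws
// towards the user, so the controls stay parallel to the floor.
function uiPivotTarget(cameraPos, pivotPos) {
    return { x: cameraPos.x, y: pivotPos.y, z: cameraPos.z };
}

// per frame, roughly: const t = uiPivotTarget(camera.getPosition(), pivot.getPosition());
//                     pivot.lookAt(t.x, t.y, t.z);
```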

The other common place to put UI controls would be somewhere relative to a tracked controller, such as on the left hand/controller. I decided against this because it's not guaranteed that the VR device will have two hands/controllers (Google Cardboard, for example, has none).

As the 'floor' is just a projected skybox, a solution was needed to render the shadows of the holo-video onto the scene.

Shadow 'catcher' material

Gustav provided a material shader that would sample the shadow map and make any area that doesn't have a shadow fully transparent.

To make this a bit easier to see, I've shown where the plane would be positioned below. Anywhere the floor plane is white would be fully transparent, as there is no shadow being cast there.
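Conceptually (this is an illustrative fragment stored as a chunk string, not Gustav's actual material shader, and the shadow-term name is hypothetical), the catcher drives opacity purely from how shadowed the surface point is:

```javascript
// Illustrative end-of-fragment logic for a shadow 'catcher' material:
// fully lit areas become fully transparent, shadowed areas render as
// dark, semi-opaque pixels over the projected skybox floor.
const shadowCatcherEndPS = `
    float shadow = dShadowFactor;            // hypothetical: 1.0 = fully lit
    float opacity = 1.0 - shadow;            // lit -> transparent
    gl_FragColor = vec4(vec3(0.0), opacity); // black, shadow-strength alpha
`;
```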

Shadow Receiver Quad
Final Shadow Effect

Other tutorials used

Other functionality in the experience was taken from our tutorial/demo projects section and slightly modified for this project.

These include:

  • Orbit Camera for the non-XR camera controls. The orbit camera controls are disabled when the camera entity is disabled so that the camera doesn't move while in an XR session.
  • Video Textures for the Microsoft video on the information dialog. It was modified to apply the video texture directly to the Element on the Entity it is attached to.

Although not PlayCanvas related, it is worth shouting out: the awesome QR code (that is displayed if the device is not XR compatible) is generated with Amazing-QR. It's able to create colorful and animated QR codes that are more interesting and attractive than the typical black and white versions.

QR Code

Issues found

There were a couple of issues found while this project was being developed. We will be searching for solutions in the near future. For now, we've worked around them in a couple of ways.

In VR, clustered lighting with shadows enabled causes a significant framerate drop. As the shadows in the project come from the directional light, which is processed outside the clustered lighting system, clustered lighting shadows can be disabled with no visual change.

The demo uses screen space UI in AR and there's an issue with accuracy of UI touch/mouse events when trying to press UI buttons. This is because, when the user enters AR, the engine uses a projection matrix that matches the device camera so that objects are rendered correctly relative to the real world.

Unfortunately, the screen-to-world projections don't use the projection matrix directly and instead use the FOV properties on the camera component. This mismatch is what causes the inaccuracy.

My workaround is to calculate the relevant camera values from the projection matrix on the first AR render frame and apply that back to the camera component. The code can be seen here in xr-manager.js.
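The recovery step can be sketched as pure math. This is a minimal sketch assuming a standard symmetric perspective matrix in column-major order (as pc.Mat4 stores its data); the function name is illustrative, not the actual xr-manager.js code:

```javascript
// For a symmetric perspective projection, data[5] = 1 / tan(fovY / 2)
// and data[0] = 1 / (aspect * tan(fovY / 2)). Recover both values so
// they can be written back to the camera component's fov/aspectRatio.
function cameraValuesFromProjection(data) {
    const fov = 2 * Math.atan(1 / data[5]) * (180 / Math.PI); // vertical FOV, degrees
    const aspectRatio = data[5] / data[0];
    return { fov, aspectRatio };
}
```

Running this once on the first AR render frame and applying the results back to the camera component brings the screen-to-world projections back in line with what is actually rendered.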

Wrapping up

If you have reached here, thank you very much for reading and we hope you have found some useful takeaways that you can use in your own projects!

Useful links:

We would love to get your thoughts and feedback so come join the conversation on the PlayCanvas forum!

glTF 2.0 Import Arrives in the PlayCanvas Editor

· 4 min read

We are excited to announce a major update for the PlayCanvas Editor: glTF 2.0 import. This new feature allows users to easily import and use 3D models created in other applications such as Blender and SketchUp, as well as from digital asset stores like Sketchfab, directly into the PlayCanvas Editor.

Model by Loïc Norgeot and mosquito scan by Geoffrey Marchal for Sketchfab licensed under CC BY 4.0
Author: Sketchfab License: CC-BY-4.0 Source: Mosquito in Amber

glTF (GL Transmission Format) is a file format developed by The Khronos Group for 3D models that is quickly becoming the industry standard. It is an open format that is designed to be efficient and easy to use, making it the perfect choice for use in the PlayCanvas Editor.

The PlayCanvas Editor and run-time now support the full glTF 2.0 specification, including 100% of ratified glTF extensions (such as sheen, transmission, volume and iridescence). This means that developers can import and use high-quality 3D models and take advantage of the latest advancements in the glTF format to create truly stunning interactive experiences.

One of the major benefits of glTF import is the ability for users to edit glTF materials in the PlayCanvas Editor's powerful Material Inspector. Here you can see the famous Stanford Dragon imported from GLB with refraction properties edited live in the Inspector:

The dragon model based on the one from Morgan McGuire's Computer Graphics Archive. Original dragon mesh data based on a Stanford Scan © 1996 Stanford University.

Once a glTF model is imported into the editor, all of its materials are available to be tweaked and customized. This added flexibility and control will greatly enhance the workflow of developers and allow them to tweak the appearance of assets without having to fire up Blender.

Another great benefit of the new glTF import feature is its integration with PlayCanvas' Template system. The PlayCanvas Template system allows developers to create reusable and modular components that can be trivially reused across multiple scenes. With the integration of glTF import, developers can now import their 3D models and scenes, and then directly edit the hierarchy, attaching scripts and other component types to the imported scene nodes. This will allow them to create complex and interactive 3D scenes quickly and easily, with minimal coding.

Additionally, the glTF import feature is also integrated with PlayCanvas' animation system. The PlayCanvas animation system allows developers to create and control animations on their entities and characters. When importing animated glTF/GLB, developers can now quickly set up an Animation State Graph to build simple loops or complex transitions. This will allow them to create more dynamic and interactive 3D scenes, with minimal effort. Check out how it can be done in just a few seconds:

CC0: Low poly fox by PixelMannen. CC-BY 4.0: Rigging and animation by @tomkranis on Sketchfab.
glTF conversion by @AsoboStudio and @scurest

In short, glTF import is a major addition to the PlayCanvas Editor, and will greatly enhance the workflow of our users. It allows developers to:

  • Faithfully import glTF/GLB files from many different applications and stores.
  • Edit the materials and hierarchy of imported glTF/GLB files.
  • Import glTF/GLB animations and quickly configure loops and transitions.

We are thrilled to bring this new feature to our users and we can't wait to see the amazing projects that will be created with it. If you have any questions or feedback, please let us know on our community forum.

Thank you for choosing PlayCanvas, and happy creating!

PlayCanvas Review of 2022

· 6 min read
Steven Yau
Partner Relations Manager

Happy New Year to you all!

As we begin 2023, let’s take a moment to look back at last year’s highlights for PlayCanvas, the powerful WebGL engine and platform for creating interactive web content.

From new features and improvements to exciting projects and partnerships, PlayCanvas has had a busy and successful year. In this review, we will cover some of the key developments and achievements of the platform, and how they have helped to advance the capabilities and potential of WebGL-based content creation.

The fantastic work done by you

One of the most exciting aspects of PlayCanvas is seeing the amazing projects and work created by you!

From games and interactive experiences to architectural visualizations and simulations, the PlayCanvas community is constantly pushing the boundaries of what is possible with WebGL.

To celebrate this work, we've created a showcase video with the standout projects and work from 2022.

PlayCanvas Showcase 2022

We are looking to do more of these in 2023 so don't be shy! Share with us and the rest of the community on Twitter, forums and Discord.

We also wanted to take a deeper dive into the creative process and workflows behind these projects.

To do this, we reached out to a selection of developers who have used PlayCanvas to create fantastic content across e-commerce, WebAR, games and the metaverse.

In these Developer Spotlights, developers share their experience with PlayCanvas, the challenges and solutions they encountered during development, and the unique insights and approaches they brought to their projects.

These interviews provide valuable insights and inspiration for other PlayCanvas users and anyone interested in WebGL-based content creation.

Graphics Engine Enhancements

This year, we've been laser-focused on adding WebGPU support and glTF 2.0 spec compliance to the PlayCanvas graphics engine, and we're thrilled with the progress we've made.

With Google Chrome set to enable WebGPU by default in 2023, we're excited to be at the forefront of the future of interactive 3D content on the web, and we can't wait to see what WebGPU will allow developers to create.

WebGPU Grabpass
WebGPU Grabpass
WebGPU Clustered Lighting
WebGPU Clustered Lighting

In addition to WebGPU support, we've also added support for all ratified glTF 2.0 extensions to the PlayCanvas engine, complete with Editor support for iridescence and dynamic refraction. These features allow developers to create even more realistic and visually stunning 3D content.

glTF Asset Viewer

But we didn't stop there! We also released Editor support for Clustered Lighting and Area Lights, which allow developers to easily incorporate hundreds of dynamic lights into their projects. And as it turns out, our users have already been using these new features to add extra flair and fidelity to their projects.

Space Rocks
Space Rocks
Pool Demo
Pirron Pool

glTF Viewer 2.0 with AR support

We released a major update to the Model Viewer, taking it to version 2.0! This update not only improved the user experience, but also added a host of new features.

The most notable new feature is AR support with WebXR (Android) and USDZ export (iOS). This allows users to view glTF models in AR directly from the browser.

glTF Viewer AR on iOS
glTF Viewer AR on Android

We've also made the UI more streamlined and mobile-friendly, grouping related functionality together for easier use. Rendering has been improved with the 'High Quality Rendering' option, which enables supersampling to smooth out jagged edges along polygons and high resolution reflections for more realistic rendering.

glTF Viewer Lamborghini Urus

Tools Updates

We've been continuously improving the Editor, making it even more powerful and user-friendly for our developers.

These include:

  • Infrastructure upgrades across the board with benefits to all users including:
    • Faster download speeds for published build zips across the world.
    • Faster asset delivery with up to 50% improvement in loading projects in the Editor and the Launch Tab.
    • Zero downtime deployment for services.
  • More powerful Scene Hierarchy Search that searches components and script names.
  • Creating the Texture Tool to inspect textures and convert HDRIs to/from cubemaps (also open source!).
  • Adding GitHub sign-in.

The project dashboard has gotten a huge refresh and can be accessed in-Editor. It includes searching and sorting of the project list as well as being able to manage teams and settings without leaving the Editor!

Project Dashboard

Version Control also got some major features this year, including the addition of the Item History and Graph View, which make it easier to track changes to your projects. And looking ahead to this year, we're planning to make some of our REST API public, so developers can automate flows for CI and tools.

Version Control History

Thank You

As we wrap up our 2022 review of PlayCanvas, we want to take a moment to thank all of our users for their continued support and for the amazing projects and work they have created with PlayCanvas.

Your creativity and innovation inspire us to continue improving and expanding the capabilities of our WebGL engine and platform.

We can't wait to see what the new year brings and the incredible projects and work that our users will create with PlayCanvas. Whether you are new to PlayCanvas or a seasoned pro, we hope that you will continue to be a part of our community and push the boundaries of what is possible with WebGL-based content creation.

Thank you again, and we look forward to seeing what you will accomplish in the new year!

PCUI Framework Migrated to TypeScript

· 2 min read

PCUI is the open source, front-end framework for building amazing web-based tools like the PlayCanvas Editor, glTF Viewer, Texture Tool and more!

glTF Viewer PCUI Interface

Today, we are excited to announce the release of PCUI version 2.10.0! This new release includes a number of significant updates and improvements that will make building web tools with PCUI even easier and more efficient.

One of the biggest changes in this release is the migration of the entire source code from JavaScript to TypeScript. This will provide a number of benefits to developers, including improved type checking, better code completion and IntelliSense in IDEs, and easier maintenance and refactoring of code.

PCUI API Reference

In addition to the source code migration, we have also released a new API reference manual built with Typedoc. This will make it easier for developers to understand and use the various APIs and components available in PCUI.

TypeScript developers will also be pleased to know that we have improved the TypeScript declarations in this release, making it even easier to use PCUI in a TypeScript project.

Finally, we want to highlight the open source nature of PCUI. We believe in the power of the open source community to build great software together, and we encourage open source developers to get involved with the project. Whether you want to contribute code, report issues, or just provide feedback, we welcome your participation. Explore the PCUI repo today on GitHub!

Thank you for using PCUI, and we hope you enjoy the new release!

Porting Unreal Scenes to the Browser with PlayCanvas - Developer Spotlight with Leonidas Maliokas

· 5 min read
Steven Yau
Partner Relations Manager

Welcome to Developer Spotlight, a new series of blog articles where we talk to developers about how they use PlayCanvas and showcase the fantastic work they are doing on the web.

Today we are excited to be joined by Leonidas Maliokas, a freelance web and games developer for Solar Games.

He will show us how Solar Games ported a metaverse experience from Unreal to PlayCanvas in the video presentation below. Specific areas covered are:

  • Converting scenes and assets from Unreal to PlayCanvas
  • Runtime and load-time optimization
  • Lighting and post processing
  • Multiplayer with Colyseus
  • Ready Player Me avatar integration
  • Spatial-aware audio chat with Agora

Presentation Slides

Hi Leonidas, welcome to Developer Spotlight! Tell us about yourself and your studio

Hey, I’m Leonidas from Solar Games (formerly known as Pirron 1)! I’ve been working with interactive 3D websites since 2012. I used to work as a civil engineer before turning my hobby and passion for gamedev into a full time job using PlayCanvas.

Alongside PlayCanvas contracts of all sorts, like product configurators, games and promotional sites, I’ve been researching how to extend the PlayCanvas engine and editor, adding open world terrains, special effects and easy-to-use networked controllers to match features normally found in native-only game engines. That research led to founding Solar Games.

We offer Uranus Tools for PlayCanvas, a collection of plug and play scripts to supercharge your PlayCanvas creation pipeline. You can find out more about our company’s services at

We are also working on Aritelia, a procedurally generated open world social MMO in PlayCanvas. This is still in development but you can already give it a try with the pre-alpha tech demonstration that was released last year.

Why did you choose PlayCanvas?

It was actually an easy choice for us: reviewing the mainstream WebGL libraries and platforms, PlayCanvas stood out for:

  • Offering an integrated editor and publishing solution. Even after all these years, the ability to easily share projects and builds and collaborate with your colleagues in real-time is something unique to PlayCanvas.
  • The PlayCanvas team is very productive and professional in the way it moves the platform forward.
  • The open source PlayCanvas engine provides a very effective and easy to use API.

What were the initial challenges and how did the team overcome them?

The main challenge was the lack of certain features and tools. For example, things that you’d take for granted in a native game engine, like a terrain system, post effects, automatic instancing and level of detail, were missing.

The good news was that, even before the PlayCanvas Editor API was officially released, it was always possible to extend the PlayCanvas Editor. We were able to write our own editor extensions and quickly make them productive in our development pipeline.

Other developers and companies became interested in our extensions and we started offering them in our company’s asset store.

How is building an HTML5 game/experience different from a native game/experience?

Several concepts like rendering, resource loading, game logic and state management are quite similar. But there are some unique concepts when it comes to web-based experiences that can be challenging.

In particular, download times, different display sizes and pixel ratios, a broad spectrum of device specs, and also platform and browser compatibility.

Taking into account these factors is mandatory when building a high-quality HTML5 experience.

What are the team's favorite features of PlayCanvas?

Our favorite feature is the editor, by far. The fact that it is collaborative in real time makes PlayCanvas the best tool for teams to work together. Also, the fact that PlayCanvas has version control integrated is pretty cool! Something else I would add is that PlayCanvas provides a very clean API to work with. Seriously, not only HTML5 devs but also native game devs should give PlayCanvas a try. It’s a great tool to quickly be productive!

Other than that:

  • Asset pipelines like enabling texture compression.
  • The engine API and the constant addition of new features by the PlayCanvas team.
  • The community - many greetings to everybody!

What is on the feature wishlist for PlayCanvas this year?

  • Having the PlayCanvas WebGPU renderer available to play with.
  • Full support of the new node based shader editor.
  • Asset variants for specific platforms e.g. serve smaller textures on mobile.

How do you see HTML5 games/experiences evolve over the next few years?

It’s definitely an exciting time for developers and companies working with HTML5 content. The technology has moved forward, with standards and frameworks more robust and powerful than ever, and the devices running HTML5 experiences are increasingly capable.

The metaverse is already leveraging HTML5 to deploy worlds and experiences across traditional web2 and newer web3 websites.

Pixel streaming is the only real contender to what HTML5 can offer. I would definitely welcome pixel streaming becoming a viable option since it’s a great concept, but right now I don’t see that happening soon.

There are so many opportunities around HTML5 and I see a very positive future for everyone involved.

Thank you very much for your time and we look forward to your presentation

Thank you for this opportunity to showcase our work!

Useful links:

Stay tuned for more Developer Spotlights in the future!