Implementing Cloth Simulation in WebGL

The PlayCanvas WebGL game engine integrates with ammo.js – a JavaScript/WebAssembly port of the powerful Bullet physics engine – to enable rigid body physics simulation. We have recently been working out how to extend PlayCanvas’ capabilities by using soft body simulation. The aim is to allow developers to easily set up characters to use soft body dynamics.

Here is an example of a character with and without soft body cloth simulation running in PlayCanvas:


Want to know how it was done? Read on!

Step 1: Create a soft body dynamics world

By default, PlayCanvas’ rigid body component system creates an ammo.js dynamics world that only supports generic rigid bodies. Cloth simulation requires a soft body dynamics world (btSoftRigidDynamicsWorld). Currently, there’s no easy way to override this, so for the purpose of these experiments, a new, parallel soft body dynamics world is created and managed by the application itself. Eventually, we may make the type of the internal dynamics world selectable, or maybe even allow multiple worlds to be created, but for now, this is how the demo was structured.
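As a sketch of how that parallel world can be wired together, the ammo.js soft body classes compose roughly as follows. This assumes an ammo.js build compiled with soft body support; the function name is ours, not part of the engine:

```javascript
// Build a btSoftRigidDynamicsWorld alongside the engine's own world.
// `Ammo` is the loaded ammo.js module (soft-body-enabled build).
function createSoftBodyWorld(Ammo) {
    // Soft body simulation needs its own collision configuration...
    const collisionConfig = new Ammo.btSoftBodyRigidBodyCollisionConfiguration();
    const dispatcher = new Ammo.btCollisionDispatcher(collisionConfig);
    const broadphase = new Ammo.btDbvtBroadphase();
    const solver = new Ammo.btSequentialImpulseConstraintSolver();
    // ...and a dedicated soft body solver.
    const softBodySolver = new Ammo.btDefaultSoftBodySolver();

    const world = new Ammo.btSoftRigidDynamicsWorld(
        dispatcher, broadphase, solver, collisionConfig, softBodySolver
    );

    // Gravity must be set on both the world and its soft body world info.
    const gravity = new Ammo.btVector3(0, -9.81, 0);
    world.setGravity(gravity);
    world.getWorldInfo().set_m_gravity(gravity);
    return world;
}
```

The application then steps this world itself each frame (e.g. from an update callback), independently of the rigid body component system.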

Step 2: Implement CPU skinning

PlayCanvas performs all skinning on the GPU. However, we need skinned vertex positions on the CPU in order to update the soft body anchors (btSoftBody::Anchor) so that they match the character’s animation. CPU skinning may be supported in future PlayCanvas releases.
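The CPU side of this is standard linear blend skinning: each vertex is transformed by every bone it is weighted to and the results are blended. A minimal sketch, using flat 16-element column-major matrices (the helper names are ours):

```javascript
// Transform a point [x, y, z] by a column-major 4x4 matrix.
function transformPoint(m, p) {
    const [x, y, z] = p;
    return [
        m[0] * x + m[4] * y + m[8]  * z + m[12],
        m[1] * x + m[5] * y + m[9]  * z + m[13],
        m[2] * x + m[6] * y + m[10] * z + m[14]
    ];
}

// Linear blend skinning for one vertex: blend the vertex position
// transformed by each influencing bone matrix, weighted per bone.
function skinVertex(position, boneIndices, boneWeights, boneMatrices) {
    const result = [0, 0, 0];
    for (let i = 0; i < boneIndices.length; i++) {
        const w = boneWeights[i];
        if (w === 0) continue;
        const p = transformPoint(boneMatrices[boneIndices[i]], position);
        result[0] += p[0] * w;
        result[1] += p[1] * w;
        result[2] += p[2] * w;
    }
    return result;
}
```

Running this over the anchor vertices each frame gives the world-space targets that the soft body anchors are pinned to.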

Step 3: Patch shaders to support composite simulated and non-simulated mesh rendering

Soft body meshes generate vertex position and normal data in world space, so in order to render the dynamically simulated (cloth) parts of character meshes correctly, we have to patch in support by overriding the current PlayCanvas vertex transform shader chunk. In a final implementation, no patching should be necessary, as we would probably add built-in support for composite simulated and non-simulated mesh rendering.
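The gist of the patch is to make the transform chunk skip the model matrix when vertices arrive already in world space. The chunk below is an illustrative sketch, not the engine’s actual transformVS source, and WORLD_SPACE_VERTS is a define invented for this example:

```javascript
// Replacement vertex transform chunk (GLSL stored as a JS string).
// When WORLD_SPACE_VERTS is defined, the soft body solver has already
// placed vertices in world space, so the model matrix becomes identity.
const softBodyTransformVS = [
    'mat4 getModelMatrix() {',
    '#ifdef WORLD_SPACE_VERTS',
    '    return mat4(1.0); // vertices are pre-transformed by the solver',
    '#else',
    '    return matrix_model;',
    '#endif',
    '}',
    'vec4 getPosition() {',
    '    return matrix_viewProjection * getModelMatrix() * vec4(vertex_position, 1.0);',
    '}'
].join('\n');
```

In PlayCanvas, a string like this would be assigned as a chunk override on the affected materials so that only the simulated meshes pick it up.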

Step 4: Implement render meshes to soft body meshes conversion

PlayCanvas character meshes cannot be used directly by the soft body mesh creation functions (btSoftBodyHelpers::CreateFromTriMesh), so some conversion is required; the PlayCanvas vertex iterator was used to access and convert the mesh data. Eventually, this conversion could be done on asset import into the PlayCanvas Editor.
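One part of that conversion is welding duplicate positions: render meshes split vertices along UV and normal seams, and if those duplicates are passed to the soft body solver as-is, the cloth tears apart at the seams. A small sketch of the welding step (helper names are ours):

```javascript
// Weld near-identical positions so the soft body solver sees a
// connected triangle mesh rather than one split along UV seams.
// `positions` is a flat [x, y, z, ...] array, `indices` a triangle list.
function weldMesh(positions, indices, tolerance = 1e-4) {
    const keyOf = (x, y, z) =>
        `${Math.round(x / tolerance)},${Math.round(y / tolerance)},${Math.round(z / tolerance)}`;
    const seen = new Map();   // quantised position key -> welded index
    const remap = [];         // original vertex index -> welded index
    const welded = [];        // welded flat positions

    for (let i = 0; i < positions.length; i += 3) {
        const key = keyOf(positions[i], positions[i + 1], positions[i + 2]);
        if (!seen.has(key)) {
            seen.set(key, welded.length / 3);
            welded.push(positions[i], positions[i + 1], positions[i + 2]);
        }
        remap.push(seen.get(key));
    }

    return {
        vertices: welded,
        indices: indices.map((i) => remap[i])
    };
}
```

The resulting flat vertex and index arrays are the shape of input that CreateFromTriMesh expects.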

Step 5: Implement per-bone attachments

PlayCanvas currently doesn’t have a way to attach objects to specific character bones via the Editor (it’s on our roadmap for the coming months!). Therefore, per-bone attachments were implemented in order to attach simplified rigid body colliders to different parts of the character, preventing the cloth from intersecting the character mesh. We are using simplified colliders instead of the full skinned character mesh because they run much faster.
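The attachment itself is simple: each frame, a collider’s world transform is recomputed as its bone’s world matrix times a fixed local offset authored for that collider. A minimal sketch with column-major 4x4 matrices (function names are ours):

```javascript
// Column-major 4x4 matrix multiply: result = a * b.
function mat4Multiply(a, b) {
    const out = new Array(16).fill(0);
    for (let col = 0; col < 4; col++) {
        for (let row = 0; row < 4; row++) {
            for (let k = 0; k < 4; k++) {
                out[col * 4 + row] += a[k * 4 + row] * b[col * 4 + k];
            }
        }
    }
    return out;
}

// Per-bone attachment: the collider follows the bone, offset by a
// constant local transform authored when the collider was set up.
function updateBoneAttachment(boneWorldMatrix, localOffset) {
    return mat4Multiply(boneWorldMatrix, localOffset);
}
```

The returned matrix is then pushed into the corresponding kinematic rigid body before the soft body world is stepped.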

If you are feeling adventurous, you can find the prototype source code for the example above in this PlayCanvas project:

It is a prototype implementation and so expect many changes (some of which are mentioned above) in a final implementation.

Want to try soft body dynamics on your own character? Here’s how:

Step 1: Fork the PlayCanvas prototype project.

Step 2: Mark out what parts of the character you want simulated:

This is done by painting colors into the character mesh vertices – the example character was downloaded from Mixamo, and imported into Blender:

Black = fully simulated, White = not simulated
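The painted colors can then be turned into per-vertex simulation weights when the mesh is converted. This helper is illustrative, not the project’s actual code, and assumes flat RGBA data in the 0-255 range:

```javascript
// Map painted vertex colors to simulation weights:
// black (0) = fully simulated cloth, white (255) = pinned to the
// skinned mesh, grey values blend between the two.
function colorsToSimWeights(vertexColors) {
    const weights = [];
    for (let i = 0; i < vertexColors.length; i += 4) {
        const luminance =
            (vertexColors[i] + vertexColors[i + 1] + vertexColors[i + 2]) / (3 * 255);
        weights.push(1 - luminance); // 1 = simulated, 0 = follows animation
    }
    return weights;
}
```

Vertices with weight 0 become fully constrained anchors, while weight-1 vertices are left entirely to the solver.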

Step 3: Import the character into the PlayCanvas editor and set up collision: 

On this character, only colliders for the legs and body are needed.

What’s Next

We are really excited about developing this technology in the coming months. We will focus on these areas:

  • Take learnings from the prototype and add good support for soft body dynamics in PlayCanvas
  • Create easy-to-use tools for PlayCanvas developers to import and set up characters with soft body dynamics
  • Further optimize and improve quality

We would love to get your thoughts and feedback so come join the conversation on the PlayCanvas forum!

Introducing the PlayCanvas Localization System

Are you shipping your PlayCanvas app or game in just one language? You may be preventing international users from enjoying it! Today, we are happy to announce the arrival of a localization system built right into the PlayCanvas Editor!

PlayCanvas-powered Bitmoji Party localized into English, Spanish and French

The system works in tandem with PlayCanvas’ text element component and it’s super-easy to use. The text element interface now provides a ‘Localized’ property and when checked, you can enter a Key instead of a Text string.

New text element properties for enabling localized text

The Key is the string used to look up a localized string based on the user’s currently selected locale. The localized string data is stored in JSON assets and is documented, along with the rest of the system, here. You can even preview your localized User Interface by choosing a locale in the Editor Settings panel:
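Conceptually, the lookup is a key-to-string map per locale, with a fallback locale for missing entries. The sketch below illustrates the idea only; the actual JSON asset format is described in the documentation linked above, and the data here is invented:

```javascript
// Illustrative localization data: one key -> string table per locale.
const i18n = {
    'en-US': { 'menu.play': 'Play', 'menu.quit': 'Quit' },
    'fr-FR': { 'menu.play': 'Jouer' }
};

// Resolve a key for the current locale, falling back first to a
// default locale and finally to the key itself.
function localize(key, locale, fallback = 'en-US') {
    const table = i18n[locale];
    if (table && key in table) return table[key];
    const fb = i18n[fallback];
    return (fb && key in fb) ? fb[key] : key;
}
```

Falling back to the key itself means an untranslated string is still visible on screen, which makes missing entries easy to spot during testing.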

Locale setting in the Scene Settings panel

We look forward to playing your newly localized games!

Arm and PlayCanvas Open Source Seemore WebGL Demo

Cambridge/Santa Monica, August 1 2019 – Arm and PlayCanvas are announcing the open sourcing of the renowned Seemore WebGL demo. First published in 2016, the graphical technical demo has been completely rebuilt from the ground up to deliver even more incredible performance and visuals. With it, developers can learn to build their projects in the most optimised way, as well as incorporate some of its performant features and components into their own projects.



“I’m so excited to be relaunching the Seemore demo. Open sourcing it in partnership with Arm will bring a host of benefits to the WebGL development community,” said Will Eastcott, CEO of PlayCanvas. “It’s an incredible learning resource that provides many clean, easy to follow examples of some very advanced graphical techniques. Sharing this project publicly is going to help move web graphics forwards for everybody.”

“PlayCanvas and Arm have always strived to push the boundaries of graphics and the original demo is a testament to that,” said Pablo Fraile, Director of Developer Ecosystems at Arm. “It’s encouraging to see how PlayCanvas have advanced mobile web rendering performance since the original demo. This re-release provides a unique resource into graphics for the mobile web that is both easy to follow and incorporate into your own projects.”

The Seemore demo was originally created as a graphical showcase for the mobile browser and takes full advantage of Arm Mali GPUs. It has been upgraded to utilize the full capabilities of WebGL 2, the latest iteration of the web graphics API. Some of the main features of the demo include:

  • Physical shading with image based lighting and box-projected cube maps.
  • Stunning refraction effects.
  • HDR lightmaps.
  • Interpolated pre-baked shadow maps as a fast substitute for real time shadow-mapping.
  • ETC2 texture compression to ensure that highly detailed textures occupy only a small amount of system memory.
  • Draw call batching.
  • Asynchronous streaming of assets to ensure the demo loads in mere seconds (approximately 5x faster than the original version).
  • Fully GPU-driven mesh animation.

Meet the PlayCanvas team: Maksims Mihejevs


Today we are talking to Maks, a Russian from Latvia and Senior Engineer at PlayCanvas!

How did you get into the video games industry?

I started making games when I was 13 years old and always knew what I wanted. A long journey but here I am, making game development better with PlayCanvas.

Can you briefly describe your role at PlayCanvas?

I’m a Full-stack developer and love to be involved in anything specific or generic. Making PlayCanvas service work fast and scale well is what makes me feel good.

What is your favourite aspect of PlayCanvas’ service?

It is in the browser (1-click-away), and JavaScript.

Where do you see web based gaming in the future?

There are so many ways that web gaming can be moved forward that we can’t even see where it will be in a few years; we can only guess. The most important thing is well-connected and social games, where just by sharing a link you can invite your friends to challenge your record or even play with you in real time.

How is PlayCanvas going to change the way people make games?

Collaboration, and the fact you can make games straight away and test them out in minutes on hundreds of users, like your Twitter followers. It’s something so powerful. We can’t predict what users will come up with, being so accelerated by those features.

Can you describe one interesting thing about yourself?

I do care about what’s going on around me and will always get obsessed by the things I work on; I want to get as much as possible from my efforts.


The Quick Fire round (this is where things get a little interesting)

Zelda or Final fantasy?


COD or Battlefield?


Mario or Sonic?


Favourite game of all time?

Tough question. Whichever one has the most meta-gaming (UO or EVE, for instance).

Greatest Gaming Achievement?


Virtual Reality and the future of Web Based Gaming

On Thursday 19th of June, we will be showcasing some of our recent work with the amazing and exciting Oculus Rift Development Kit. In the build-up to this event, we hope to convey why Virtual Reality and revolutionary hardware from Oculus VR are set to be a part of our future at PlayCanvas. Playing a game in VR is one thing. Making a game in VR… now that really is the future.

What is the Oculus Rift?

First developed by Palmer Luckey when he was just 18 years old, the Oculus Rift has gone through two evolutions of its development kit, among other improvements, making it arguably the most promising virtual reality system to date. It is a low latency, head-mounted display that receives two independently rendered images on a screen viewed through stereoscopic lenses.

Why Virtual reality?

Many virtual reality experiences target immersion, where user interaction can open the door to a reality (even if only partially) that is not their own. However, the technology behind the Oculus lets the user into a much deeper experience. Where extreme latency and narrow fields of vision prevented previous virtual reality technologies from being immersive, they often proved successful in creating nausea instead. As humans are sensitive to latencies as low as 20 milliseconds, it is important for the technology in question to be as precise and fast as possible.

Where previous devices usually left users craning their necks and grabbing thin air in disbelief, the Oculus takes over 1,000 readings per second and so far is effective enough to trick the mind and simulate a physical presence. The VR industry is now closer to ‘telexistence’ than ever before. Mark Zuckerberg, the current owner of the technology (following an acquisition worth $2 billion by Facebook), described its potential: “Imagine enjoying a courtside seat at a game, studying in a classroom of students and teachers all over the world, or consulting with a doctor face-to-face—just by putting on goggles in your home.” The Oculus promises that in (hopefully few) years to come, gamers may be able to act and react naturally in what is still a virtual setting.

The Oculus Rift and PlayCanvas

Dave wearing Oculus Rift head mounted display

Oculus Rift support coming soon to PlayCanvas

Imagine a future where you open your Internet browser, select a VR ready online game, enable your Oculus headset and transport yourself into the game immediately. Mind blowing? Potentially yes. This is why we here at PlayCanvas are committed to intertwining the paths of both WebGL and HTML5 technologies with the capabilities of the Oculus Rift. It’s crazy to think that VR games could be played and developed by simply opening your favourite web browser.

Firefox is already thinking about VR on the web, and Chrome is too. Be sure that when VR support fully comes to the web, PlayCanvas will be ready to help you get there faster. Game developers have so many challenges to overcome while developing their games that adding VR (or head-mounted display) support could become just another one of those features you’d love to try but never quite have time for. However, with PlayCanvas it’s simple. Just drop the OculusCamera script onto your camera and we’ll do all the magic to make your game render ready for the headset.

Hyper-realistic gaming experiences should not be limited to core gaming platforms. When web-based gaming can involve products like the Oculus, we’re opening up a whole new class of immersive gaming experiences: the best features of the web (low-friction, accessible and shareable) combined with the best features of the Oculus (immersive, high-end experiences). It’s a brave new world!

Dave Evans (CTO at PlayCanvas) will be showcasing some of our work with the Oculus Rift on June 19th at the Scenario Bar. Check out the Event link here and maybe we will see you there!