PlayCanvas Review of 2022

Happy New Year to you all! 

As we begin 2023, let’s take a moment to look back at last year’s highlights for PlayCanvas, the powerful WebGL engine and platform for creating interactive web content. 

From new features and improvements to exciting projects and partnerships, PlayCanvas has had a busy and successful year. In this review, we will cover some of the key developments and achievements of the platform, and how they have helped to advance the capabilities and potential of WebGL-based content creation.

The fantastic work done by you!

One of the most exciting aspects of PlayCanvas is seeing the amazing projects and work created by you! 

From games and interactive experiences to architectural visualizations and simulations, the PlayCanvas community is constantly pushing the boundaries of what is possible with WebGL.

To celebrate this work, we’ve created a showcase video with the standout projects and work from 2022.

PlayCanvas Showcase 2022

We are looking to do more of these in 2023 so don’t be shy! Share with us and the rest of the community on Twitter, forums and Discord.

We also wanted to take a deeper dive into the creative process and workflows behind these projects. 

To do this, we reached out to a selection of developers who have used PlayCanvas to create fantastic content across e-commerce, WebAR, games and the metaverse.

In these Developer Spotlights, developers share their experience with PlayCanvas, the challenges and solutions they encountered during development, and the unique insights and approaches they brought to their projects. 

These interviews provide valuable insights and inspiration for other PlayCanvas users and anyone interested in WebGL-based content creation.

Graphics Engine Enhancements

Over the past year, we’ve been laser-focused on adding WebGPU support and glTF 2.0 spec compliance to the PlayCanvas graphics engine, and we’re thrilled with the progress we’ve made.

With Google Chrome set to enable WebGPU by default in 2023, we’re excited to be at the forefront of the future of interactive 3D content on the web, and we can’t wait to see what WebGPU will allow developers to create.
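
For the curious, here is a rough sketch of what requesting a WebGPU device with a WebGL2 fallback can look like from the engine API. Treat it as a sketch only: the option names may change as WebGPU support matures, and the helper script paths are placeholders.

```javascript
// A rough sketch, not production code: request a WebGPU device and fall back
// to WebGL2 where it is unavailable. The helper script paths are placeholders.
const canvas = document.getElementById('application-canvas');

pc.createGraphicsDevice(canvas, {
    deviceTypes: [pc.DEVICETYPE_WEBGPU, pc.DEVICETYPE_WEBGL2],
    glslangUrl: 'lib/glslang/glslang.js', // assumed locations of the shader translation helpers
    twgslUrl: 'lib/twgsl/twgsl.js'
}).then((device) => {
    console.log(`Graphics device created: ${device.deviceType}`);
    // ...create the application against this device
});
```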

In addition to WebGPU support, we’ve also added support for all ratified glTF 2.0 extensions to the PlayCanvas engine, complete with Editor support for iridescence and dynamic refraction. These features allow developers to create even more realistic and visually stunning 3D content.
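
Because these extensions are handled by the engine’s glTF loader, no special code is needed to use them. A minimal sketch of loading a glb through a container asset (the asset path is a placeholder):

```javascript
// A minimal sketch: load a glb as a container asset and add it to the scene.
// Materials using ratified extensions (iridescence, refraction, etc.) are
// parsed automatically by the glTF loader. 'assets/gem.glb' is a placeholder.
app.assets.loadFromUrl('assets/gem.glb', 'container', (err, asset) => {
    if (err) {
        console.error(err);
        return;
    }
    const entity = asset.resource.instantiateRenderEntity();
    app.root.addChild(entity);
});
```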

But we didn’t stop there! We also released Editor support for Clustered Lighting and Area Lights, which allow developers to easily incorporate hundreds of dynamic lights into their projects. And as it turns out, our users have already been using these new features to add extra flair and fidelity to their projects.
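
As a rough illustration of what this looks like from script (exact defaults and the supported light type/shape combinations may vary between engine releases):

```javascript
// A rough illustration only; exact defaults vary between engine releases.
app.scene.clusteredLightingEnabled = true;   // light scenes with hundreds of omni/spot lights
app.scene.lighting.maxLightsPerCell = 32;    // tune the clustering budget for the scene

// An area light: the light component's 'shape' turns a punctual light into an area light
const areaLight = new pc.Entity('area-light');
areaLight.addComponent('light', {
    type: 'omni',
    shape: pc.LIGHTSHAPE_RECT,               // rect, disk and sphere shapes are available
    color: new pc.Color(1, 0.85, 0.7),
    range: 8,
    intensity: 2
});
app.root.addChild(areaLight);
```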

glTF Viewer 2.0 with AR support

We released a major update to the Model Viewer, taking it to version 2.0! This update not only improved the user experience, but also added a host of new features. 

The most notable new feature is AR support with WebXR (Android) and USDZ export (iOS). This allows users to view glTF models in AR directly from the browser. 
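
On Android, this is built on the engine’s WebXR manager. A minimal sketch of starting an AR session, assuming `cameraEntity` is the scene’s active camera entity:

```javascript
// A minimal sketch of starting an immersive AR session through the engine's
// XR manager. 'cameraEntity' is assumed to be the scene's active camera entity.
if (app.xr.supported && app.xr.isAvailable(pc.XRTYPE_AR)) {
    app.xr.start(cameraEntity.camera, pc.XRTYPE_AR, pc.XRSPACE_LOCALFLOOR, {
        callback: (err) => {
            if (err) console.error('Failed to start the AR session:', err);
        }
    });
}
```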

We’ve also made the UI more streamlined and mobile-friendly, grouping related functionality together for easier use. Rendering has been improved with the ‘High Quality Rendering’ option, which enables supersampling to smooth out jagged edges along polygons, as well as high-resolution reflections for more realistic rendering.

Tools Updates

We’ve been continuously improving the Editor, making it even more powerful and user-friendly for our developers.

These improvements include:

  • Infrastructure upgrades across the board, with benefits for all users, including:
    • Faster download speeds for published build zips across the world.
    • Faster asset delivery with up to 50% improvement in loading projects in the Editor and the Launch Tab. 
    • Zero downtime deployment for services.
  • More powerful Scene Hierarchy Search that searches components and script names.
  • A new Texture Tool to inspect textures and convert HDRIs to/from cubemaps (also open source!).
  • GitHub sign-in.

The project dashboard has received a huge refresh and can now be accessed in-Editor. It adds searching and sorting of the project list, and lets you manage teams and settings without leaving the Editor!

Version Control also got some major features in 2022, including the addition of Item History and the Graph View, which make it easier to track changes to your projects. And looking ahead to 2023, we’re planning to make parts of our REST API public, so developers can automate flows for CI and tools.
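
As a purely illustrative sketch of the kind of CI step this could enable, here is what triggering a build export from a script might look like. The endpoint, parameters and token handling below are assumptions about what a public REST API could look like, not its final shape.

```javascript
// Purely illustrative: endpoint, parameters and token handling are assumptions,
// not the final public API. Runs on Node 18+ (built-in fetch).
async function triggerExportBuild() {
    const response = await fetch('https://playcanvas.com/api/apps/download', {
        method: 'POST',
        headers: {
            'Authorization': `Bearer ${process.env.PLAYCANVAS_API_TOKEN}`,
            'Content-Type': 'application/json'
        },
        body: JSON.stringify({
            project_id: 123456,       // placeholder project id
            name: 'Nightly CI Build',
            scenes: [1000001]         // placeholder scene id
        })
    });
    const job = await response.json();
    console.log('Export job started:', job);
}
```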

Thank You

As we wrap up our 2022 review of PlayCanvas, we want to take a moment to thank all of our users for their continued support and for the amazing projects and work they have created with PlayCanvas. 

Your creativity and innovation inspire us to continue improving and expanding the capabilities of our WebGL engine and platform.

We can’t wait to see what the new year brings and the incredible projects and work that our users will create with PlayCanvas. Whether you are new to PlayCanvas or a seasoned pro, we hope that you will continue to be a part of our community and push the boundaries of what is possible with WebGL-based content creation. 

Thank you again, and we look forward to seeing what you will accomplish in the new year!

Porting Unreal Scenes to the Browser with PlayCanvas – Developer Spotlight with Leonidas Maliokas

Welcome to Developer Spotlight, a new series of blog articles where we talk to developers about how they use PlayCanvas and showcase the fantastic work they are doing on the web.

Today we are excited to be joined by Leonidas Maliokas, a freelance web and games developer for Solar Games.

He will show us how Solar Games ported a metaverse experience from Unreal to PlayCanvas in the video presentation below. Specific areas covered are:

  • Converting scenes and assets from Unreal to PlayCanvas
  • Runtime and load-time optimization
  • Lighting and post processing
  • Multiplayer with Colyseus
  • Ready Player Me avatar integration
  • Spatial-aware audio chat with Agora

Presentation Slides

Hi Leonidas, welcome to Developer Spotlight! Tell us about yourself and your studio!

Hey, I’m Leonidas from Solar Games (formerly known as Pirron 1)! I’ve been working with interactive 3D websites since 2012. I used to work as a civil engineer before turning my hobby and passion for gamedev into a full time job using PlayCanvas.

Alongside PlayCanvas contracts of all sorts, like product configurators, games and promotional sites, I’ve been researching how to extend the PlayCanvas engine and Editor. Adding open-world terrains, special effects and easy-to-use networked controllers to match features normally found only in native game engines is what led to founding Solar Games.

We offer Uranus Tools for PlayCanvas, a collection of plug and play scripts to supercharge your PlayCanvas creation pipeline. You can find out more about our company’s services at https://solargames.io.

We are also working on Aritelia, a procedurally generated open world social MMO in PlayCanvas. This is still in development but you can already give it a try with the pre-alpha tech demonstration that was released last year.

Why did you choose PlayCanvas? 

It was actually an easy choice for us: after reviewing the mainstream WebGL libraries and platforms, PlayCanvas stood out for:

  • Offering an integrated editor and publishing solution. Even after all these years, the ability to easily share projects and builds and collaborate with your colleagues in real-time is something unique to PlayCanvas.
  • The PlayCanvas team is very productive and professional in the way it moves the platform forward.
  • The open source PlayCanvas engine provides a very effective and easy to use API.

What were the initial challenges and how did the team overcome them? 

The main challenge was the lack of certain features and tools. For example, things that you’d take for granted in a native game engine, like a terrain system, post effects, automatic instancing and level of detail, were missing.

The good news was that, even before the PlayCanvas Editor API was officially released, it had always been possible to extend the PlayCanvas Editor. We were able to write our own editor extensions and quickly make them productive in our development pipeline.

Other developers and companies became interested in our extensions and we started offering them in our company’s asset store.

How is building an HTML5 game/experience different from a native game/experience? 

Several concepts like rendering, resource loading, game logic and state management are quite similar. But there are some unique concepts when it comes to web-based experiences that can be challenging. 

In particular, download times, different display sizes and pixel ratios, a broad spectrum of device specs, and also platform and browser compatibility.

Taking these factors into account is mandatory when building a high-quality HTML5 experience.
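
As a small, generic illustration (not Leonidas’ own code), a typical PlayCanvas setup for handling varying display sizes and pixel ratios might look like this:

```javascript
// A small sketch of typical device-handling setup in PlayCanvas, covering the
// display-size and pixel-ratio concerns mentioned above.
app.setCanvasFillMode(pc.FILLMODE_FILL_WINDOW);   // resize the canvas with the window
app.setCanvasResolution(pc.RESOLUTION_AUTO);      // match the rendering resolution to the canvas

// Cap the device pixel ratio so older phones don't render at full native resolution
app.graphicsDevice.maxPixelRatio = Math.min(window.devicePixelRatio, 2);

window.addEventListener('resize', () => app.resizeCanvas());
```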

What are the team’s favorite features of PlayCanvas?

Our favorite feature is the editor, by far. The fact that it is collaborative in real time makes PlayCanvas the best tool for teams to work together. Also, the fact that PlayCanvas has version control integrated is pretty cool! Something else I would add is that PlayCanvas provides a very clean API to work with. Seriously, not only HTML5 devs but also native game devs should give PlayCanvas a try. It’s a great tool to quickly be productive!

Other than that:

  • Asset pipelines like enabling texture compression.
  • The engine API and the constant addition of new features by the PlayCanvas team.
  • The community – many greetings to everybody!

What is on the feature wishlist for PlayCanvas this year? 

  • Having the PlayCanvas WebGPU renderer available to play with.
  • Full support of the new node based shader editor.
  • Asset variants for specific platforms e.g. serve smaller textures on mobile.

How do you see HTML5 games/experiences evolve over the next few years? 

It’s definitely an exciting time for developers and companies working with HTML5 content. The technology has moved forward, with standards and frameworks more robust and powerful than ever, and the devices running HTML5 experiences are more capable than ever.

The metaverse is already leveraging HTML5 to deploy worlds and experiences across traditional web2 and newer web3 websites.

Pixel streaming is the only real contender to what HTML5 can offer. I would definitely welcome pixel streaming becoming a viable option, since it’s a great concept, but right now I don’t see that happening soon.

There are so many opportunities around HTML5 and I see a very positive future for everyone involved.

Thank you very much for your time and we look forward to your presentation! 

Thank you for this opportunity to showcase our work!


Stay tuned for more Developer Spotlights in the future!

Web AR Experiences – Developer Spotlight with Animech

Welcome to the third instalment of Developer Spotlight, a series of blog articles where we talk to developers about how they use PlayCanvas and showcase the fantastic work they are doing on the Web.

Today we are excited to be joined by Staffan Hagberg, CMO of Animech.

Hi Staffan, welcome to Developer Spotlight! Tell us about yourself and Animech!

Animech was founded back in 2007, in the city of Uppsala, Sweden. With a mix of 3D artists, engineers, developers, and UI/UX experts, we have a team of 40 people and all the competence in-house. The studio started in the early days of real-time 3D. It was a mix of CAD engineers and developers who realized the power of visualization for selling complex products in the life sciences segment. 

Since then, we have visualized pretty much anything you can think of online and offline. We’ve worked in VR, AR, MR, phones, tablets, desktops, and pretty much any other device that has a browser. We have developed VR applications for cars, the first real-time 3D configurator in native WebGL ever developed, one of the world’s first configurators for Oculus Rift Devkit and much more.

We have also visualized experiences for hotel safes, medical instruments and lab products for 7 of the 10 largest life science companies, as well as built 3D converters from Unreal to glTF and a bunch of custom tools specially built for PlayCanvas. 

Our core business is real-time 3D. We push the boundaries every day trying to invent new ways of using 3D, where our solution makes the difference.

Bathroom Planner for Iconic Nordic Rooms

Why did Animech choose PlayCanvas?

After an extensive search for a WebGL-based engine, we evaluated a few and selected PlayCanvas for its performance, out-of-the-box features, its extensibility and its valuable editor. Our customers expect the highest level of visual quality along with a smooth browsing experience – without the need for an app or plugins. PlayCanvas truly helps us deliver.

From our artists’ perspective, it was (and still is) the most artist-friendly WebGL editor out there, with the added bonuses that it is open source and supports many important features, such as PBR, tonemapping, render layers, etc.

Did your team face any initial challenges? How did you overcome them?

It’s always challenging when customers have high quality and performance expectations. Though, at the same time, that is what drives us. Being able to create stunning 3D experiences linked to real business value is a unique opportunity and challenge. Adding AR to that process helps you to stand out against competitors. 

Our particular challenge was to dynamically create an AR model of a procedurally generated mesh as a generic function. Our solution was to create a SaaS service that can take whatever 3D object you’re looking at in PlayCanvas and, on the fly, create AR models for both iOS and Android devices (ARKit or ARCore).
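
As a hypothetical illustration of the browser-side hand-off on iOS (the conversion service itself is Animech’s own and not shown here), a generated USDZ file can be opened in AR Quick Look like this:

```javascript
// A hypothetical illustration of handing a generated USDZ file to iOS AR Quick Look.
// 'usdzUrl' would come from a conversion service like the one described above;
// the service API itself is not shown.
function openUsdzInQuickLook(usdzUrl) {
    // Safari opens .usdz files in AR Quick Look when linked from an <a rel="ar">
    // element that contains an <img> child.
    const anchor = document.createElement('a');
    anchor.rel = 'ar';
    anchor.href = usdzUrl;
    anchor.appendChild(document.createElement('img'));
    document.body.appendChild(anchor);
    anchor.click();
    anchor.remove();
}
```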

You’ve built several Web AR experiences. Can you tell us a little about them and how important you think Web AR is today?

We have been early adopters of both AR and VR, both as standalone applications and on the web. We believe it’s important to use AR not as a gimmick, but as an application that provides real value for the user. For example, looking at how that greenhouse would look in your actual backyard or similar. In that sense, Web AR will get more and more important, both as something that stands out but also as something that provides value for users.

Why do you think that your clients want Web AR in their experiences?

To offer something more to their customers – both in marketing value and actual value. To help users make smarter, more informed decisions.

We have also developed our own web based 3D converter that takes our PlayCanvas 3D models to glTF and USD on the fly. It is a server side solution that takes everything we develop to AR.

How is building a web experience different from a native experience?

You must optimize for both loading time and performance. The application could be run on a wide range of devices, from phones that are several years old to high-end desktops.

The application is accessible to a wider audience since they don’t need to install anything.

What are the team’s favorite features of PlayCanvas?

As a team consisting of both 3D artists and developers, PlayCanvas’ online editor provides a fantastic way to collaborate, prepare and preview our projects before pairing the solution with a stunning web UI or deploying it as a standalone viewer.

Our 3D artists also enjoy how robust and easy to use the editor is, and how its design promotes collaboration. Powerful material settings (per-texture UV and color channel, vertex colors, blend types, depth test/write, etc.), flexible texture compression and fast responses from the team when we report bugs and request features are also great.

What is on the feature wish list for PlayCanvas this year?

As the future for 3D on the web continues to evolve, we are excited to see support for more accessible 3D formats, such as the glTF standard by the Khronos Group, which PlayCanvas are advocating for as well.

Beyond this, here are some things we look forward to:

  • Node-based shader editor
  • Support for editor extensions
  • Post processing (HDR bloom, chromatic aberration, SSAO, motion blur, color grading, eye adaptation, etc.)
  • More customizable asset import options
  • Reflection probes
  • Material instances (see Unreal Engine)
  • Debug visualization (see Unreal Engine’s View Modes)
  • Expose currently hidden options in the editor (detail maps, etc.)

How do you see AR and 3D e-commerce evolve over the next few years?

The possibilities are enormous. The question is when people will actually start using AR. It has been around for many years, and lots of interesting solutions and demos have been built, but the real value of AR has not reached the masses yet.

I think we are closing in on that, though. Just the other day I was about to buy a new espresso machine. One supplier had an AR model in their online store, with which I could see that it looked good and covered my needs, all from just one static USDZ file. It is such an easy way of helping your customer make the right decision. Imagine how much value you add if you can see configured 3D models in AR and really see the potential of what you are about to buy.

The next phase would be to configure and change your 3D model directly in AR mode, which would make the experience even stronger.

As the graphics quality gets better and better online and the fashion industry keeps on digitizing their customer journey, AR will probably be the best and easiest way of trying on fashion products like bags, watches, jewelry and clothes. It will reduce faulty orders on a massive scale if you can do a virtual fitting before buying stuff online.

Animech helps our customers to get what they want. Simply put: we empower people to make smart decisions through intelligent visualization.

Thank you, Staffan! Is there anything else you’d like to share?

You can visit our website here. You can also follow us on Twitter and check out our other projects here.