Steamshot 13 – my WebXR entry to the 2024 js13kgames competition

I entered this years js13kgames competition. If you don’t know you have to create a game which when zipped has a file size no bigger than 13kb!! You can’t download online assets like images, sounds, libraries or fonts. Like last year I targeted the WebXR category. This allows an exception to the no libraries rule and allows the developer to use an external library: A-Frame, Babylon.js or Three.js. I chose Three.js.

You need to setup an npm project. Here’s my template. To use it, download a zip. Unzip, open the folder using VSCode. Then use:

npm install.

npm run start to start a test server. and

npm run build to create a distribution version in the dist folder and a zipped version in the zipped folder. It will also give a file size report.

The first challenge is creating a game that fits the theme for the year. In 2024 the theme was Triskaidekaphobia. That is the fear of the number 13. I decided to create a shooting gallery where the user must shoot any ball with the number 13 on it.

Initially I created a proxy of the environment in code.

export class Proxy{
    constructor( scene ){
        this.scene = scene;

        const geo1 = new THREE.CylinderGeometry( 0.25, 0.25, 3 );
        const mat1 = new THREE.MeshStandardMaterial( { color: 0x999999 } );
        const mat2 = new THREE.MeshStandardMaterial( { color: 0x444444, side: THREE.BackSide, wireframe: false } );

        const column = new THREE.Mesh( geo1, mat1 );
        
        for ( let x = -6; x<=6; x+=2 ){
            const columnA = column.clone();
            columnA.position.set( x, 1.5, -20);
            scene.add( columnA );
        }

        const geo2 = new THREE.PlaneGeometry( 15, 25 );
        geo2.rotateX( -Math.PI/2 );
        const floor = new THREE.Mesh( geo2, mat1 );
        floor.position.set( 0, 0, -12.5 );
        //scene.add( floor );

        const geo3 = new THREE.BoxGeometry( 15, 0.6, 0.6 );
        const lintel = new THREE.Mesh( geo3, mat1 );
        lintel.position.set( 0, 3.3, -20 );
        scene.add( lintel );

        const geo4 = new THREE.BoxGeometry( 15, 3.3, 36 );
        const room = new THREE.Mesh( geo4, mat2 );
        room.position.set( 0, 1.65, -10 );
        scene.add( room );
    }
}

I was aiming for a look like this –

A bit of a tall-order but you have to have a goal!!

The next step was creating the balls that move toward the player.

Just a simple class.

export class Ball{
    static states = { DROPPING: 1, ROTATE: 2, FIRED: 3 };
    static canvas = document.createElement('canvas');
    static geometry = new THREE.SphereGeometry( 0.5 );
    
    constructor( scene, num, minus = false, xPos = -1, speed = 0.1 ){
        if (Ball.canvas.width != 256 ){
		    Ball.canvas.width = 256;
            Ball.canvas.height = 128;
        }
        const context = Ball.canvas.getContext('2d');

        if (num == 13){
            context.fillStyle = "#000";
        }else if (minus){
            context.fillStyle = "#f00";
        }else{
            context.fillStyle = "#0f0";
        }

        this.num = num; 
        this.speed = speed;

        context.fillRect(0, 0, 256, 128);

        context.fillStyle = "#fff";
        context.font = "48px Arial";
        context.textAlign = "center";
        context.textBaseline = "middle";
        context.fillText(String(num), 128, 64 );

        const tex = new THREE.CanvasTexture( Ball.canvas );

        const material = new THREE.MeshStandardMaterial( { map: tex, roughness: 0.1 } );

        this.mesh = new THREE.Mesh( Ball.geometry, material );
        this.mesh.position.set( xPos, 4, -20 );
        this.mesh.rotateY( Math.PI/2 );

        this.state = Ball.states.DROPPING;

        scene.add( this.mesh )
    }

    update(game){
        switch(this.state){
            case Ball.states.DROPPING:
                this.mesh.position.y -= 0.1;
                if (this.mesh.position.y <= 1.6){
                    this.state = Ball.states.ROTATE;
                    this.mesh.position.y = 1.6;
                }
                break;
            case Ball.states.ROTATE:
                this.mesh.rotateY( -0.1 );
                console.log( this.mesh.rotation.y );
                if (this.mesh.rotation.y < -Math.PI/2.1){
                    this.state = Ball.states.FIRED;
                }
                break;
            case Ball.states.FIRED:
                this.mesh.position.z += this.speed;
                break;
        }

        if (this.mesh.position.z > 2){
            this.mesh.material.map.dispose();
            if (game) game.removeBall( this );
        }
    }
}

I created a proxy gun.

export class Gun extends THREE.Group{
    constructor(){
        super();

        this.createProxy();
    }

    createProxy(){
        const mat = new THREE.MeshStandardMaterial( { color: 0xAAAA22 } );
        const geo1 = new THREE.CylinderGeometry( 0.01, 0.01, 0.15, 20 );
        const barrel = new THREE.Mesh( geo1, mat ); 
        barrel.rotation.x = -Math.PI/2;
        barrel.position.z = -0.1;

        const geo2 = new THREE.CylinderGeometry( 0.025, 0.025, 0.06, 20 );
        const body = new THREE.Mesh( geo2, mat ); 
        body.rotation.x = -Math.PI/2;
        body.position.set( 0, -0.015, -0.042 );

        const geo3 = new THREE.BoxGeometry( 0.02, 0.08, 0.04 );
        const handle = new THREE.Mesh( geo3, mat ); 
        handle.position.set( 0, -0.034, 0);

        this.add( barrel );
        this.add( body );
        this.add( handle );
    }
}

and a Bullet

import { Ball } from "./ball.js";

export class Bullet{
    constructor( game, controller ){
        const geo1 = new THREE.CylinderGeometry( 0.008, 0.008, 0.07, 16 );
        geo1.rotateX( -Math.PI/2 );
        const material = new THREE.MeshBasicMaterial( { color: 0xFFAA00  });
        const mesh = new THREE.Mesh( geo1, material );

        const v = new THREE.Vector3();
        const q = new THREE.Quaternion();

        mesh.position.copy( controller.getWorldPosition( v ) );
        mesh.quaternion.copy( controller.getWorldQuaternion( q ) );

        game.scene.add( mesh );
        this.tmpVec = new THREE.Vector3();
        this.tmpVec2 = new THREE.Vector3();

        this.mesh = mesh;
        this.game = game;
    }

    update( dt ){
        let dist = dt * 2;
        let count = 0;

        while(count<1000){

            count++;
            if (dist > 0.5){
                dist -= 0.5;
                this.mesh.translateZ( -0.5 );
            }else{
                this.mesh.translateZ( -dist );
                dist = 0;
            }

            this.mesh.getWorldPosition( this.tmpVec );

            let hit = false;

            this.game.balls.forEach( ball => {
                if (!hit){
                    if (ball.state == Ball.states.FIRED ){
                        ball.mesh.getWorldPosition( this.tmpVec2 );
                        const offset = this.tmpVec.distanceTo( this.tmpVec2 );
                        if ( offset < 0.5 ){
                            hit = true;
                            ball.hit(this.game );
                            this.game.removeBullet( this );
                        }
                    }
                }
            });

            if (dist==0 || hit) break;
        }

        this.mesh.translateZ( dt * -2 );

        if ( this.mesh.position.length() > 20 ) this.game.removeBullet();
    }
}

Now I had a working basic game. Time to create the eye-candy.

First a score and timer display. WebXR does not allow the developer to use the DOM. You can’t simply create a div, position it and update its content using JavaScript. I decided to create a mechanical counter mechanism.

Each segment uses CylinderGeometry. To create the map with the numbers on I used the CanvasTexture class, this creates a Texture from an HTML Canvas, so you can use HTML Canvas drawing commands to ‘paint’ the texture.

export class Counter extends THREE.Group{
    static texture;
    static types = { SCORE: 0, TIMER: 1 };

    constructor( scene, pos = new THREE.Vector3(), rot = new THREE.Euler ){
        super();

        this.scale.set( 1.5, 1.5, 1.5 );

        scene.add( this );
        this.position.copy( pos );
        this.rotation.copy( rot );

        if ( Counter.texture == null ){
            const canvas = document.createElement('canvas');

            canvas.width = 1024;
            canvas.height = 64;

            const context = canvas.getContext( '2d' );

            context.fillStyle = "#000";
            context.fillRect( 0, 0, 1024, 64 );

            context.textAlign = "center";
            context.textBaseline = "middle";
            
            context.font = "48px Arial";

            context.fillStyle = "#fff";

            const inc = 1024/12;
            const chars = [ "0", "1", "2", "3", "4", "5", "6", "7", "8", "9", ":", " " ];
            let x = inc/2;

            chars.forEach( char => {
                context.setTransform(1, 0, 0, 1, 0, 0);
                context.rotate( -Math.PI/2 );
                context.translate( 0, x );
                context.fillText( char, -32, 34 );
                x += inc;
            });

            Counter.texture = new THREE.CanvasTexture( canvas );
            Counter.texture.needsUpdate = true;
        }

        const r = 1;
        const h = Math.PI * 2 * r;
        const w = h/12;

        const geometry = new THREE.CylinderGeometry( r, r, w );
        geometry.rotateZ( -Math.PI/2 );
        const material = new THREE.MeshStandardMaterial( { map: Counter.texture } );

        const inc = w * 1.1;
        const xPos = -inc * 2.5;
        const zPos = -r * 0.8;

        for( let i=0; i<5; i++ ){
            const mesh = new THREE.Mesh( geometry, material );
            mesh.position.set( xPos + inc*i, 0, zPos );
            this.add( mesh );
        }

        this.type = Counter.types.SCORE;

        this.displayValue = 0;
        this.targetValue = 0;
    }

    set score(value){
        if ( this.type != Counter.types.SCORE ) this.type = Counter.type.SCORE;
        this.targetValue = value;
    }

    updateScore( ){
        const inc = Math.PI/6;
        let str = String( this.displayValue );
        while ( str.length < 5 ) str = "0" + str;
        const arr = str.split( "" );
        
        this.children.forEach( child => {
            const num = Number(arr.shift());
            if (!isNaN(num)){
                child.rotation.x = -inc*num - 0.4;
            }
        });
    }

    updateTime( ){
        const inc = Math.PI/6;
        let secs = this.displayValue;
        let mins = Math.floor( secs/60 );
        secs -= mins*60;
        let secsStr = String( secs );
        while( secsStr.length < 2 ) secsStr = "0" + secsStr;
        let minsStr = String( mins );
        while( minsStr.length < 2 ) minsStr = "0" + minsStr;
        let timeStr = minsStr + ":" + secsStr;
        let arr = timeStr.split( "" );
        
        this.children.forEach( child => {
            const num = Number(arr.shift());
            if (isNaN(num)){
                child.rotation.x = -inc*10 - 0.4;
            }else{
                child.rotation.x = -inc*num - 0.4;
            }
        });
    }

    set seconds(value){
        if ( this.type != Counter.types.TIMER ) this.type = Counter.types.TIMER;
        this.targetValue = value;
        this.update( 0 );
    }

    get time(){
        let secs = this.targetValue;
        let mins = Math.floor( secs/60 );
        secs -= mins*60;
        let secsStr = String( secs );
        while( secsStr.length < 2 ) secsStr = "0" + secsStr;
        let minsStr = String( mins );
        while( minsStr.length < 2 ) minsStr = "0" + minsStr;
        return minsStr + ":" + secsStr;
    }

    update( dt ){
        if ( this.targetValue != this.displayValue ){
            if ( this.targetValue > this.displayValue ){
                this.displayValue++;
            }else{
                this.displayValue--;
            }
        }

        switch( this.type ){
            case Counter.types.SCORE:
               this.updateScore();
                break;
            case Counter.types.TIMER:
                this.updateTime();
                break
        }
    }
}

Another challenge was creating an environment map. Which is essential when using MeshStandardMaterial with a roughness less than 1. As soon as it is smooth it reflects a map which by default is black. Resulting in very dark renders of shiny objects. With only 13k to play with you can’t simply load a bitmap texture. Instead you have to generate an environment map at runtime. A WebGLRenderer does not have to write to the screen. If you set a WenGLRenderTarget you can render to that. For a environment map you need it to be compiled in a special way. You can use a PMREMGenerator and the compileEquirectangularShader method.

if (this.scene.environment == null){
            this.renderTarget = new THREE.WebGLRenderTarget(1024, 512);
            this.renderer.setSize( 1024, 512 );
            this.renderer.setRenderTarget( this.renderTarget );
            this.renderer.render( this.scene, this.camera );

            const pmremGenerator = new THREE.PMREMGenerator( this.renderer );
            pmremGenerator.compileEquirectangularShader();
            const envmap = pmremGenerator.fromEquirectangular( this.renderTarget.texture ).texture;
            pmremGenerator.dispose();

            this.scene.environment = envmap;
            this.renderer.setRenderTarget( null );
            this.resize();
        }

I also need a way of rendering wood. I decided to use noise like this CodePen example.

The NoiseMaterial class does the heavy lifting.

export class NoiseMaterial extends THREE.MeshStandardMaterial{
    constructor( type, options ){
		super( options );

		if ( this.noise == null ) this.initNoise();

		switch( type ){
			case 'oak':
				this.wood( 0x261308, 0x110302 );
				break;
			case 'darkwood':
				this.wood( 0x563308, 0x211302 );
				break;
			case 'wood':
			default:
        		this.wood();
				break;
		}

		this.onBeforeCompile = shader => {
			for (const [key, value] of Object.entries(this.userData.uniforms)) {
				shader.uniforms[key] = value;
			}
			shader.vertexShader = shader.vertexShader.replace( '#include <common>', `
				varying vec3 vModelPosition;
				#include <common>
				`);
			shader.vertexShader = shader.vertexShader.replace( '#include <begin_vertex>', `
				vModelPosition = vec3(position);
				#include <begin_vertex>
				`);
			shader.fragmentShader = shader.fragmentShader.replace( 
				'#include <common>', 
				this.userData.colorVars ) ;		
		  	shader.fragmentShader = shader.fragmentShader.replace( 
				'#include <color_fragment>', 
				this.userData.colorFrag ) 	
		  }
    }

	wood( light = 0x735735, dark = 0x3f2d17 ){
		this.userData.colorVars = `
			uniform vec3 u_LightColor;
			uniform vec3 u_DarkColor;
			uniform float u_Frequency;
			uniform float u_NoiseScale;
			uniform float u_RingScale;
			uniform float u_Contrast;

			varying vec3 vModelPosition;

			${this.noise}

			#include <common>
		`

		this.userData.colorFrag = `
			float n = snoise( vModelPosition.xxx )/3.0 + 0.5;
			float ring = fract( u_Frequency * vModelPosition.x + u_NoiseScale * n );
			ring *= u_Contrast * ( 1.0 - ring );

			// Adjust ring smoothness and shape, and add some noise
			float lerp = pow( ring, u_RingScale ) + n;
			diffuseColor.xyz *= mix(u_DarkColor, u_LightColor, lerp);
		`

		const uniforms = {};
		uniforms.u_time = { value: 0.0 };
		uniforms.u_resolution = { value: new THREE.Vector2() };
		uniforms.u_LightColor = { value: new THREE.Color(light) };
		uniforms.u_DarkColor = { value: new THREE.Color(dark) };
		uniforms.u_Frequency = { value: 55.0 };
		uniforms.u_NoiseScale = { value: 2.0 };
		uniforms.u_RingScale = { value: 0.26 };
		uniforms.u_Contrast = { value: 10.0 };

		this.userData.uniforms = uniforms;
	}

	initNoise(){
		this.noise = `
			//
			// Description : Array and textureless GLSL 2D/3D/4D simplex 
			//               noise functions.
			//      Author : Ian McEwan, Ashima Arts.
			//  Maintainer : stegu
			//     Lastmod : 20110822 (ijm)
			//     License : Copyright (C) 2011 Ashima Arts. All rights reserved.
			//               Distributed under the MIT License. See LICENSE file.
			//               https://github.com/ashima/webgl-noise
			//               https://github.com/stegu/webgl-noise
			// 

			vec3 mod289(vec3 x) {
			return x - floor(x * (1.0 / 289.0)) * 289.0;
			}

			vec4 mod289(vec4 x) {
			return x - floor(x * (1.0 / 289.0)) * 289.0;
			}

			vec4 permute(vec4 x) {
				return mod289(((x*34.0)+1.0)*x);
			}

			// Permutation polynomial (ring size 289 = 17*17)
			vec3 permute(vec3 x) {
			return mod289(((x*34.0)+1.0)*x);
			}
			
			float permute(float x){
				return x - floor(x * (1.0 / 289.0)) * 289.0;;
			}

			vec4 taylorInvSqrt(vec4 r){
			return 1.79284291400159 - 0.85373472095314 * r;
			}

			float snoise(vec3 v){ 
			const vec2  C = vec2(1.0/6.0, 1.0/3.0) ;
			const vec4  D = vec4(0.0, 0.5, 1.0, 2.0);

			// First corner
			vec3 i  = floor(v + dot(v, C.yyy) );
			vec3 x0 =   v - i + dot(i, C.xxx) ;

			// Other corners
			vec3 g = step(x0.yzx, x0.xyz);
			vec3 l = 1.0 - g;
			vec3 i1 = min( g.xyz, l.zxy );
			vec3 i2 = max( g.xyz, l.zxy );

			//   x0 = x0 - 0.0 + 0.0 * C.xxx;
			//   x1 = x0 - i1  + 1.0 * C.xxx;
			//   x2 = x0 - i2  + 2.0 * C.xxx;
			//   x3 = x0 - 1.0 + 3.0 * C.xxx;
			vec3 x1 = x0 - i1 + C.xxx;
			vec3 x2 = x0 - i2 + C.yyy; // 2.0*C.x = 1/3 = C.y
			vec3 x3 = x0 - D.yyy;      // -1.0+3.0*C.x = -0.5 = -D.y

			// Permutations
			i = mod289(i); 
			vec4 p = permute( permute( permute( 
						i.z + vec4(0.0, i1.z, i2.z, 1.0 ))
					+ i.y + vec4(0.0, i1.y, i2.y, 1.0 )) 
					+ i.x + vec4(0.0, i1.x, i2.x, 1.0 ));

			// Gradients: 7x7 points over a square, mapped onto an octahedron.
			// The ring size 17*17 = 289 is close to a multiple of 49 (49*6 = 294)
			float n_ = 0.142857142857; // 1.0/7.0
			vec3  ns = n_ * D.wyz - D.xzx;

			vec4 j = p - 49.0 * floor(p * ns.z * ns.z);  //  mod(p,7*7)

			vec4 x_ = floor(j * ns.z);
			vec4 y_ = floor(j - 7.0 * x_ );    // mod(j,N)

			vec4 x = x_ *ns.x + ns.yyyy;
			vec4 y = y_ *ns.x + ns.yyyy;
			vec4 h = 1.0 - abs(x) - abs(y);

			vec4 b0 = vec4( x.xy, y.xy );
			vec4 b1 = vec4( x.zw, y.zw );

			//vec4 s0 = vec4(lessThan(b0,0.0))*2.0 - 1.0;
			//vec4 s1 = vec4(lessThan(b1,0.0))*2.0 - 1.0;
			vec4 s0 = floor(b0)*2.0 + 1.0;
			vec4 s1 = floor(b1)*2.0 + 1.0;
			vec4 sh = -step(h, vec4(0.0));

			vec4 a0 = b0.xzyw + s0.xzyw*sh.xxyy ;
			vec4 a1 = b1.xzyw + s1.xzyw*sh.zzww ;

			vec3 p0 = vec3(a0.xy,h.x);
			vec3 p1 = vec3(a0.zw,h.y);
			vec3 p2 = vec3(a1.xy,h.z);
			vec3 p3 = vec3(a1.zw,h.w);

			//Normalise gradients
			vec4 norm = taylorInvSqrt(vec4(dot(p0,p0), dot(p1,p1), dot(p2, p2), dot(p3,p3)));
			p0 *= norm.x;
			p1 *= norm.y;
			p2 *= norm.z;
			p3 *= norm.w;

			// Mix final noise value
			vec4 m = max(0.6 - vec4(dot(x0,x0), dot(x1,x1), dot(x2,x2), dot(x3,x3)), 0.0);
			m = m * m;
			return 42.0 * dot( m*m, vec4( dot(p0,x0), dot(p1,x1), 
											dot(p2,x2), dot(p3,x3) ) );
			}
		`
	}
}

I created panelling, a gun and a ceiling with lights and air conditioning pipes.

When creating the panelling and the gun I made use of the ThreeJS Shape class and ExtrudeGeometry.

I did some play testing and the frame rate was suffering on my Quest2. I decided to swap the panelling for a texture. Which I created by using a WebGLRenderTarget.

            const panelling = new Panelling( 1.5, 1.5, true );
            panelling.position.set( 0.12, 1.34, -1 );
            this.scene.add( panelling );

            const renderTarget = new THREE.WebGLRenderTarget(128, 128, {
                generateMipmaps: true, 
                wrapS: THREE.RepeatWrapping,
                wrapT: THREE.RepeatWrapping,
                minFilter: THREE.LinearMipmapLinearFilter
            });
            this.camera.aspect = 1.2;
            this.camera.updateProjectionMatrix();
            this.renderer.setRenderTarget( renderTarget );
            this.renderer.render( this.scene, this.camera );
            
            const map = renderTarget.texture;
            map.repeat.set( 30, 6 );
            this.panels[0].panelling.material.map = map;
            this.panels[1].panelling.material.map = map;
            this.scene.remove( panelling );

With some play testing I adjusted the bullet speed to be slow enough that balls to the left or right were quite difficult to hit. Then I added varying ways for the balls move. Starting just straight and ending up going up and down on a swivel. I also added a leaderboard and styled the various panels using css.

I submitted the game.

Having submitted the game I started work on a ThreeJS Path Editor. The paths I used to create complex shapes using ExtrudeGeometry involved drawing on graph paper there had to be a better way.

It’s all ready for next years competition. Source code here.

Nik Lever September 2024

The ThreeJS Primer Discount Sale!

To celebrate the launch of my beginners guide to ThreeJS e-book, The ThreeJS Primer. All my Udemy ThreeJS courses are Udemy best price. Click the links to grab yourself a bargain.

Model viewer: Web 3D made easy

Learn to write JavaScript code while having fun making 3D web games using the most popular Open Source WebGL library ThreeJS

The Beginners Guide to 3D Web Game Development with ThreeJS

Learn to write JavaScript code while having fun making 3D web games using the most popular Open Source WebGL library ThreeJS

Learn to Create WebXR, VR and AR, experiences with ThreeJS

Learn how to create VR and AR experiences that work directly from the browser, using the latest API from Google and Amazon and our favourite Open Source WebGL library, ThreeJS

Learn GLSL Shaders from Scratch

Learn how to harness the power of the GPU in your web pages by learning to code GLSL shaders.

Create a 3D Multi-Player Game using ThreeJS and SocketIO

Learn how to use nodeJS, socketIO and ThreeJS to create a 3d multi-player game

Create a 3D Car Racing Game with ThreeJS and CannonJS

Learn to combine the physics engine CannonJS and ThreeJS to create a fun car racing game

Create a 3D RPG Game with ThreeJS

Learn how to harness the ThreeJS library to create a 3D RPG game

The ThreeJS Primer

New to ThreeJS then this FREE course is for you, a video version of the e-book.

Nik’s February Shader Course SALE!!!

Want to learn to create Shaders? Well you’ve picked the right time. My courses that contain lectures on coding Shaders are all Udemy best price for the next few days.

A Complete Guide to Unity’s Universal Render Pipeline

As Unity gradually switches to URP from the Built-in Render Pipeline, it’s time to learn the new techniques from the author of Unity’s URP e-books.

https://www.udemy.com/course/unity-urp/?couponCode=FEB24_BEST

Learn to write Unity Compute Shaders

Learn to harness the power of the GPU for processing intensive jobs.

https://www.udemy.com/course/compute-shaders/?couponCode=FEB24_BEST

Learn Unity Shaders from Scratch

Learn the black-art of Unity shaders in this comprehensive course on HLSL. NOW with URP Shaders and Shader Graph

https://www.udemy.com/course/learn-unity-shaders-from-scratch/?couponCode=FEB24_BEST

Learn GLSL Shaders from Scratch

Learn how to harness the power of the GPU in your web pages by learning to code GLSL shaders.

https://www.udemy.com/course/learn-glsl-shaders-from-scratch/?couponCode=FEB24_BEST

Udemy Unity courses at sale price

Megaphone with Promo speech bubble banner. Loudspeaker. Label for business, marketing and advertising. Vector on isolated background. EPS 10.

To celebrate finishing the second draft of my new Unity DOTS e-book, I’m having a sale of my Udemy Unity courses. Use the coupon code DEC23_BEST, or click the links below, to get the best price on Udemy for these courses over the next few days.

Unity DOTS is an acronym for Data-oriented Technology Stack, featuring:

  • The Jobs System – a simple way and safe way to bring multi-threading to your code
  • The Burst compiler – a straight to native assembler compiler
  • Unity’s Data-oriented Design (DoD) implementation Entity Component System (ECS).
  • And several other packages for Collections, Mathematics, Rendering and Physics

Switching from Object Oriented Programming (OOP) to DoD can result in massive performance improvements. Expect a DOTS course in late 2024.

The DOTS e-book will be my third e-book for Unity. My previous two are

Introduction to the Universal Render Pipeline for Advanced Creators

The Universal Render Pipeline Cookbook: Recipes for Shaders and Visual Effects

Here are my discounted courses:

A Complete Guide to Unity’s Universal Render Pipeline

As Unity gradually switches to URP from the Built-in Render Pipeline, it’s time to learn the new techniques from the author of Unity’s URP e-books.

https://www.udemy.com/course/unity-urp/?couponCode=DEC23_BEST

Learn to write Unity Compute Shaders

Learn to harness the power of the GPU for processing intensive jobs.

https://www.udemy.com/course/compute-shaders/?couponCode=DEC23_BEST

Learn Unity Shaders from Scratch

Learn the black-art of Unity shaders in this comprehensive course on HLSL. Including Universal Render Pipeline (URP) Shaders.

https://www.udemy.com/course/learn-unity-shaders-from-scratch/?couponCode=DEC23_BEST

Three.JS using NPM and vite

For all of my courses I include the Three.JS library I used at the time I was writing and recording the course. This ensures the code matches the library so no further installation is required other than downloading and unzipping a zip file from Udemy or GitHub or cloning and forking a repo from GitHub. But another approach is to use a package manager. By far the most popular is NPM, Node Package Manager and in this article we’ll look at using this approach. 

Caption: The Node.JS download page

To start you will need Node.JS installed on your PC, Mac or Linux device. If you haven’t got Node.JS installed then click the Node.JS link above or enter nodejs.org in your browser address bar. Download the installer for your device and install. NPM comes with the install. 

Caption: VSCode home page

If you haven’t got VSCode installed then install that as well. It is my recommended code editor these days. Either click the link above or enter https://code.visualstudio.com/ in your browser address bar.

Open VSCode and choose Open. 

Caption: VSCode Open

Navigate to a new folder where your project files will be stored. You’ll need to agree to trust the authors, but since that is you there is no problem. Use menu: Terminal > New Terminal. Enter 

npm install three

Notice you now have a node_modules folder and two new files package.json and package-lock.json.

Caption: Folders and files created

package.json looks like this. three is listed as a dependency. package-lock.json is created and edited by npm and should not be touched.

{
   "dependencies": {
      "three": "^0.157.0"
   }
}

three is the Three.JS library which you’ll find in the node_modules/three folder.

Now we’re going to add the build tool vite. Enter

npm install -D vite

Several new folders are added to the node_modules folder including one called vite. The others are dependencies that vite relies on. 

Open package.json and add

"type": "module",
"scripts": {
   "dev": "vite",
   "build": "vite build"
},

This will allow you to launch a dev server and package a completed project for distribution. 

You could place your project files at the root of the folder. But most developers prefer to keep things tidy by adding content to folders. Create a src folder and a public folder and create a new file called vite.config.js add this code to the file.

export default {
   root: "src",
   publicDir: "../public",
   build: {
      outDir: "../build"
   }
};

Now vite will look in src for any html or js files, in public for assets and package for distribution to the build folder. Note the public and build paths are relative to the src path. 

To see an example using npm and vite download this repo

Caption: GitHub Code button dropdown

Just click the green Code button and choose Download ZIP. Unzip to a folder of your choice and open the folder in VSCode. To install the dependencies enter

npm install

The package.json file is scanned for dependencies and the node_modules folder is populated with all the packages needed. Recall the scripts we added to package.json. Use

npm run dev
Caption: vite dev server running on port 5173

ctrl+click (PC) or cmd+click (Mac) the localhost link to launch the dev server in your browser.

Caption: Example app running in the vite dev server in the browser

Just a simple example of a Three.JS app created using vite as a build tool.

Take a look at src/index.html. Notice the script. Notice we can import the core Three.JS library from three. 

<script type="module">
   import * as THREE from "three";
   import { OrbitControls } from "three/addons/controls/OrbitControls.js";
   import { GUI } from "three/addons/libs/lil-gui.module.min.js";
   import { GLTFLoader } from 'three/addons/loaders/GLTFLoader.js';
   import { DRACOLoader } from 'three/addons/loaders/DRACOLoader.js';
   import { RGBELoader } from 'three/addons/loaders/RGBELoader.js';

three will be converted into node_modules/three/build/three.module.js  and three/addon becomes node_modules/three/examples/jsm. Why? Take a look at package.json in the three folder. 

"exports": {
   ".": {
      "import": "./build/three.module.js",
      "require": "./build/three.cjs"
   },
   "./examples/fonts/*": "./examples/fonts/*",
   "./examples/jsm/*": "./examples/jsm/*",
   "./addons/*": "./examples/jsm/*",
   "./src/*": "./src/*",
   "./nodes": "./examples/jsm/nodes/Nodes.js"
},

Notice exports. The default export for three, “.”, when used as an import is ./build/three.module.js. Used as a require, something used when creating a nodejs app, uses the classic javascript version ./build.three.cjs. ./addons/* becomes ./examples/jsm/* .

Back to the index.html file. Find the loadGLTF function, line 93. 

function loadGLTF(){
   const loader = new GLTFLoader( );
   const dracoLoader = new DRACOLoader();
   
   dracoLoader.setDecoderPath( 'draco-gltf/' );

   loader.setDRACOLoader( dracoLoader );

   // Load a glTF resource
   loader.load(
      // resource URL
      'motorcycle.glb',

Notice setDecoderPath is draco-gltf. Since this is not an import, for vite to find it correctly it must be in the public folder. 

Caption: Contents of the public folder

It is simply copied from node_modules/three/examples/jsm/libs/draco/gltf. You can see this folder also contains the glb loaded, motorcycle.glb, and the environment map, venice_sunset_1k.hdr. 

For the last step enter

npm run build

Notice a new folder is created, build

Caption: The build folder

A new index.html is created loading the js file in the assets folder. You might find you need to add a dot before the forward slash. 

src=”/assets/index…”

Becomes

src=”./assets/index…”

The contents of the public folder are copied to the build folder. The main script in the assets folder is bundled and minified. The single script now contains the Three.JS library and all the other imports in the index.html file in the src folder. 

Caption: Open with Live Server

If you have Live Server installed then you can run the app by right clicking on build/index.html and choosing Open with Live Server. 

Using npm and vite is a great way to create your Three.JS apps. I hope this short article helps you get started. 

WebGL in a nutshell

Close up of fresh hazelnuts against white background

In this article we’ll look at using WebGL to display a Quad, a rectangle, that fills the window. If you want to code-along then check out the CodePen-start link. Here is the final version.

/1

It’s a very simple shader just using uv to blend the colours. This article isn’t about the shader, it’s about getting the results of the shader on screen. You’ll learn about attributes, buffers, elements and programs. Let’s get started.

Before we can use WebGL in a browser we need a canvas and a context. To create this we’ll use a new function, setupWebGL. We create a canvas element and append it to the body. Then we get the webgl context. This could return null in which case we throw an error. These days most browsers on most devices do support webgl. 

function setupWebGl() {
   canvas = document.createElement("canvas");
   document.body.appendChild(canvas);
   const gl = canvas.getContext("webgl");
   if (gl == null) throw "WebGl not Supported";
   return gl;
}

Back in the init method we call the function. 

gl = setupWebGl();

By default a canvas is sized at 300 x 150. We want it to fill the screen to do that we’ll need a resize method. If a canvas has been created then set its width to window.innerWidth and its height to window.innerHeight. 

function onWindowResize() {
   if (canvas){
      canvas.width = window.innerWidth;
      canvas.height = window.innerHeight;
   }
}

In the init method add an event listener and also directly call this function. 

window.addEventListener( 'resize', onWindowResize );
onWindowResize();

Now we need to define a quad that will fill the canvas area. For that we need some vertices. The minimum we need to define a quad is the position of 4 vertices. But in our shader we’re also going to use a uv value.

Vertices - Position
Vertices – Position
Vertices - Uv
Vertices – Uv

We’re going to define an array of objects with position and uv properties. Each of these properties will be a simple array. Picturing a Quad, see images above. We start with the vertex at the bottom left corner giving this the position 0, 0 and the same values for uv. Then we move to the top left, this has position value 0 for x and window inner height for y. The uv for this vertex is 0, 1. The next vertex is top right, with position values of the window width and height and uv 1, 1. And finally the bottom right with position values of window width, 0 and uv of 1, 0.

const vertices = [
   { position: [0, 0], uv: [0, 0] },
   { position: [0, window.innerHeight], uv: [0, 1] },
   { position: [window.innerWidth, window.innerHeight], uv: [1, 1] },
   { position: [window.innerWidth, 0], uv: [1, 0] }
];

WebGL can draw points, lines and triangles. To render our quad we need to define two triangles by defining the indices of the vertices. When we do this we’re creating what WebGL describes as an Element. Let’s do that using another function, createQuadElement. First we define the indices.

Vertices – Indices
Triangle using vertex indices 0, 1 and 2
Triangle using vertex indices 0, 2 and 3

Bottom-left, top-left, top-right for triangle one and bottom-left, top-right, bottom-right for triangle two.

WebGL is all about buffers. This is how we pass data from the CPU to the GPU. Before we can pass any data we need to inform WebGL which buffer we’re passing the data to. We do this using bindBuffer. When we do this we need to inform WebGL what type of data we’re passing so it knows where to store it. For indices we use ELEMENT_ARRAY_BUFFER. And the second parameter is the CPU based buffer. Now WebGL is ready to receive data. This uses the WebGL bufferData method. Again we specify the target type, then the data, here we convert the JS array to an unsigned 16-bit integer array, this is the format that WebGL expects to store indices. The last parameter is the usage value. STATIC_DRAW means the data is going to be defined once and then used multiple times. You use DYNAMIC_DRAW when the data is likely to be updated regularly. This helps the GPU when allocating memory.We return an object with length and indexBuffer properties.

function createQuadElement() {
   const indices = [0, 1, 2, 0, 2, 3];

   //Create indices buffer
   const indexBuffer = gl.createBuffer();

   gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);
   gl.bufferData(
      gl.ELEMENT_ARRAY_BUFFER,
      new Uint16Array(indices),
      gl.STATIC_DRAW
   );

   return {
      length: indices.length,
      indexBuffer
   };
}

Back to the init method. Add a call to this method after defining the vertices.

glRect = createQuadElement();

OK, so now gl is a webgl context for a canvas and glRect is a WebGL element defining two triangles by indices. But at this stage WebGL doesn’t have any data about the triangles other than indices. It will need to know the vertex positions and for the shader we’re going to create it will need to know about the uv values. This involves a two stage process. First we define how we transform the vertex position values to canvas space in the vertex shader and what colour to use for each pixel in the fragment shader. For this we need a new function, setupProgram. A program in WebGL is a combination of a compiled vertex shader and fragment shader. After creating a program you can add attributes to the program. For our vertex shader we will have a vec2 uniform that will contain the screen width and height. An attribute defining the vertex position and another the uv. We need to pass an interpolated version of the uv to the fragment shader so we add a varying. The main function passes the uv value. Then we create a coord value.

Normalized Device Coordinates – NDC

Remember normalized device coordinates? To be on screen the x, y and z values must all be in the range -1 to 1. Position is a window coordinate.

Screen Position to NDC – Step 1

At 0, 0 we want to convert this to -1, -1 and at window top right this should be 1, 1. If we divide position by screen_size then all positions on screen are in the range 0 to 1.

Screen Position to NDC – Step 2

Multiply this by 2 and now we have a range of 0 to 2.

Screen Position to NDC – Step 3

Subtract one and we’re in the range -1 to 1.

Screen Position to NDC – Step 4

We set the z and w values to 1. Z could be -1 or 0. But if it is less than -1 or greater than 1 it would be clipped and you’d get a blank canvas.

function setupProgram() {
const vertexSource = `
   uniform vec2 screen_size;
   attribute vec2 position;
   attribute vec2 uv;

   varying vec2 vUv;
 
   void main () {
      vUv = uv;
      vec2 coord = 2.0 * (position / screen_size) - 1.0;
      gl_Position = vec4(coord.xy, 1, 1);
   }
`;

The fragment shader is super simple. We define a precision. Define the varying uv. And in the main function use vUv for the red and green channels. Remember in the fragment shader the value of vUv will be an interpolated value of all the vertices in the triangle based on the fragments location in the triangle. Blue is set to 0 and alpha to 1. Then we call compileShaders to create the program. All would be well if compileShaders existed, time to create this function.

const fragmentSource = `
   precision mediump float;

   varying vec2 vUv;

   void main () {
      gl_FragColor = vec4(vUv, 0, 1);  
   }
`;

   return compileShaders(vertexSource, fragmentSource);
}

Let’s keep things simple. Let’s split the task into making two shaders and then making a program from the compiled shaders. The makeShader function we’re going to write needs two parameters. The first will be the shader type and the second the text source. Our makeProgram function will take the compiled shaders and return a program. Because in this example there is only one program we’ll add use program to tell WebGL to make the newly created program the active one.

function compileShaders(vertexSource, fragmentSource) {
   const vertexShader = makeShader(gl.VERTEX_SHADER, vertexSource);
   const fragmentShader = makeShader(gl.FRAGMENT_SHADER, fragmentSource);
   const program = makeProgram(vertexShader, fragmentShader);
   gl.useProgram(program);
   return program;
}

OK. So now we need makeShader and makeProgram. Let’s start with makeShader. Remember this takes two parameters; the type, vertex or fragment and the source. We use the WebGL method createShader passing the type. Then we pass the source. Now we compile the shader. Better check all went well. The WebGL method getShaderParameter returns true if compilation was successful when used with the query flag compile status. If this is false then we tidy up by deleting the shader. Showing a console warn and throwing an error. If all went well then we return the shader.

function makeShader(type, source) {
   const shader = gl.createShader(type);
   gl.shaderSource(shader, source);
   gl.compileShader(shader);
   if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
      gl.deleteShader(shader);
      console.warn(source);
      throw "Shader is Invalid";
   }
   return shader;
}

Now we have the shaders time to create the WebGL program. The makeProgram function takes the compiled shaders as parameters. We first create a new program using the gl method createProgram. Then we attach the two shaders one at a time using the attachShader method of the WebGL context. To complete the process of creating a program we also need to use the WebGL method linkProgram that finalizes the creation of the data on the GPU. Like creating a shader we should check all went well. We do this using getProgramParameter passing the program and the constant LINK_STATUS. If this returns false then we get the problem using getProgramInfoLog and pass this to console warn. And throw an error.

function makeProgram(vertexShader, fragmentShader) {
   const program = gl.createProgram();
   gl.attachShader(program, vertexShader);
   gl.attachShader(program, fragmentShader);
   gl.linkProgram(program);
   if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
      console.warn(gl.getProgramInfoLog(program));
      throw "Unable to link Program";
   }
   return program;
}

Back in the init method we can add

program = setupProgram();

We getting close. But currently the program uses a uniform and two attributes. Currently we haven’t passed this data to the program. To pass the uniform we’ll use another function set2fUniform with three parameters; the program, a uniform name and the values to pass. The 2f refers to two floats or a vec2. We need a pointer to the memory location of this uniform. We get this using getUniformLocation. Since initially the uniform does not exist this method both returns an existing uniform and creates a new one. Then to populate a vec2 we use the WebGL method uniform2f, passing the location and 2 float values. We could use values[0] comma values[1]. But the rest parameter symbol three dots converts the values array into just that syntax.

function set2fUniform(program, uniformName, values) {
   const uniformLocation = gl.getUniformLocation(program, uniformName);
   gl.uniform2f(uniformLocation, ...values);
}

Now we can define the screen size parameter, back in the init method add

set2fUniform(program, "screen_size", [window.innerWidth, window.innerHeight]);

It just remains to set the vertex attributes position and uv for the program. For this we’ll use another function, createAttribute. We’ll use three parameters; program, the attribute name and the vertices array. First we extract an array that only consists of the named property using the JavaScript Array method map. This iterates through the array and returns the value that matches vertex name. We need the size of a single item in this new array. Then we create a new buffer. We get the location of the attribute on the GPU using the WebGL method getAttribLocation. Just like uniforms this has a dual purpose it can return the position of an existing attribute or create a new one. Now we prepare the location by calling enableVertexAttribArray. We bind the buffer, using the constant type ARRAY_BUFFER. The next WebGL call is to vertexAttribPointer. This describes to the GPU how to use the currently bound buffer. It takes the location, the size of each item in the array, the type of data, whether to normalize the data to a range based on the type. Since this does not apply to floats we set it to false. Parameter five is the stride, you can put gaps between each element, for this example the data is tightly packed so stride is 0. And the last parameter is an offset value to the first item, again this is 0 for our purposes.

function createAttribute(program, name, vertices) {

   const values = vertices.map(vertex => vertex[name]);
   const size = values[0].length;

   const buffer = gl.createBuffer();
   const attributeLocation = gl.getAttribLocation(program, name);

   gl.enableVertexAttribArray(attributeLocation);
   gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
   gl.vertexAttribPointer(
      attributeLocation,
      size, // Size
      gl.FLOAT, // Type
      false, // Normalize
      0, // Stride
      0 // Offset
   );
...

Now we create a helper attribute. This is so we can update the position attribute at run time if the window changes size. It has the values array, the buffer and a refresh method as properties. The refresh method binds the buffer, then passes data using the bufferData method of the WebGL context. For this we need to convert the values into a typed array, each element in the array must be a single float not an array. JavaScript has a useful method flat which converts an array of arrays into a simple array. Now we have this helper we can call the refresh method to actually pass the CPU data to the GPU.

...
   const attribute = {
      values,
      buffer,
      refresh() {
         gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
         gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(values.flat()), gl.STATIC_DRAW);
      }
   };

   attribute.refresh();

   return attribute;
}

Back to the init method add

position = createAttribute(program, "position", vertices);
const uv = createAttribute(program, "uv", vertices);

To create the attributes. 

We just need one more function. This time to draw the glRect element. Or any other element that is made up of triangles. We simply call the WebGL method drawElements. For this example the type is TRIANGLES. We pass the length property of the element, remember that’s the number of indices. The index type and the buffer.

function drawElement(element) {
   gl.drawElements(
      gl.TRIANGLES,
      element.length,
      gl.UNSIGNED_SHORT,
      element.indexBuffer
   );
}

In the resize method. Add

if (glRect) drawElement(glRect);
Viewport not set

The shader is just in the bottom left. By default the gl viewport will be 300 x 150, the canvas size at the time the context was created. In the resize method add

if (gl){
   gl.viewport(0, 0, window.innerWidth, window.innerHeight);
}

Much better, but there’s still a problem. If the window changes size then the uniform screen_size will have the wrong values and the attribute position will also be wrong. 

If program exists then call the set2fUniform function passing the program, screen_size and the window size. If position exists then update the values array and call its refresh method.

if (program){
   set2fUniform( program, "screen_size", [window.innerWidth, window.innerHeight]);
}

if (position){
   position.values[0] = [0, 0];
   position.values[1] = [0, window.innerHeight];
   position.values[2] = [window.innerWidth, window.innerHeight];
   position.values[3] = [window.innerWidth, 0];
   position.refresh();
}

Now if the window size changes the data on the GPU is updated. Great work. I hope this short introduction to WebGL helps you on your shading journey.

For more shading advice check out my courses

  1. This article is based on this from theodo.com ↩︎

My Udemy Unity courses are best price through to Tuesday

Unity have just contracted me to write their DOTS, Data Orientated Technology Stack e-book. To celebrate I thought I’d have a sale of my Unity courses. They’ll be at the best price on Udemy through to Tuesday. Time to grab a bargain!

A Complete Guide to Unity’s Universal Render Pipeline

As Unity gradually switches to URP from the Built-in Render Pipeline, it’s time to learn the new techniques from the author of Unity’s URP e-books.

https://www.udemy.com/course/unity-urp/?couponCode=SEP23_BEST

Learn to write Unity Compute Shaders

Learn to harness the power of the GPU for processing intensive jobs.

https://www.udemy.com/course/compute-shaders/?couponCode=SEP23_BEST

Learn Unity Shaders from Scratch

Learn the black-art of Unity shaders in this comprehensive course on HLSL.

https://www.udemy.com/course/learn-unity-shaders-from-scratch/?couponCode=SEP23_BEST

js13kgame competition 2023 diary

I’m entering the js13kgames competition this year. Here’s my diary.

Here’s the code on GitHub.
And here’s the game.

I’m semi-retired having worked with real-time 3d for nearly 30 years. I create video courses mainly teaching game programming.

js13kgames post-mortem page

13th August

The theme of this years competition is announced. 13th century. I gave it some thought and decided on a quest for the Holy Grail. I’m doing a WebXR game using ThreeJS. I’ve already created a working project framework using npm and webpack with help from Matt McKenna.

With a 13k size limit Blender models are a no-no. All assets need to be created in code. I fiddled with the ThreeJS Editor and came up with this as the player character.

Sir Coadalot in the ThreeJS Editor

Downloading this as a JSON file is 12K uncompressed. Let’s remake it in code.

14th August

Big day. Before the competition theme was announced I’d been working on the key components I thought my game would need. A VRButton. I’d already created one for my WebXR course. But I made a tweak to it for displaying the VR Cardboard icon from an svg string.

vr-cardboard.svg icon

I had created the most basic 3D physics engine. If you look in the source you’ll find it in the src>SimplePhysics folder. Just 3 files,

SPWorld.jsThis is where rigid bodies are added and collisions calculated
SPBody.jsA single rigid body
SPCollider.jsA SPBody instance has a single collider which can only be a Sphere or a AABB ( Axis Aligned Bounding Box )
SimplePhysics demo

Minified and zipped it comes in under 2K.

If you want to see it in action and you’ve downloaded the repo then rename index.js as index-game.js and rename index-sp.js as index.js. If you’ve got the game running that’s npm run start then you can see it in a browser using localhost:8080. The physics isn’t perfect to say the least but needs must when the entire game budget is only 13k.

My first step in creating the game was to change a sphere into my player character. The downloaded json file from the ThreeJS editor game the necessary geometries, material colours and mesh positions and orientations. Here’s the code to create the knight.

createModel(){
  const gSkirt = new THREE.CylinderGeometry(0.4, 0.6, 0.5, 32, 1, true );
  const gHead = new THREE.SphereGeometry(0.4, 24, 10);
  const pHelmet = [
    new THREE.Vector2(0.5, 0),
    new THREE.Vector2(0.5, 0.2),
    new THREE.Vector2(0.45, 0.2),
    new THREE.Vector2(0.4, 0.3),
    new THREE.Vector2(0.3, 0.4),
    new THREE.Vector2(0, 0.5),
  ];
  const gHelmet = new THREE.LatheGeometry(pHelmet, 12);
  const pTunic = [
    new THREE.Vector2(0.45, 0),
    new THREE.Vector2(0.43, 0.1),
    new THREE.Vector2(0.4, 0.2),
    new THREE.Vector2(0.32, 0.3),
    new THREE.Vector2(0.16, 0.4),
    new THREE.Vector2(0.05, 0.5),
  ];
  const gTunic = new THREE.LatheGeometry(pTunic, 12);
  const gBelt = new THREE.CylinderGeometry(0.45, 0.45, 0.2, 32, 1, false);

  const mSkirt = new THREE.MeshStandardMaterial( { color: 15991041 } );
  const mHead = new THREE.MeshStandardMaterial( { color: 16373422 } );
  const mHelmet = new THREE.MeshStandardMaterial( { color: 0xC7C7C7 } );
  const mTunic = new THREE.MeshStandardMaterial( { color: 16777215 } );
  const mBelt = new THREE.MeshStandardMaterial( { color: 12615993 } );

  const root = new THREE.Group();
  const skirt = new THREE.Mesh( gSkirt, mSkirt );  
  skirt.matrix.fromArray(
    [1,0,0,0,0,1,0,0,0,0,1,0,0,0.25,0,1]
  );
  root.add(skirt);
  const head = new THREE.Mesh( gHead, mHead ); 
  head.matrix.fromArray(
   [1,0,0,0,0,1,0,0,0,0,1,0,0,1.3466628932086855,0,1]
  );
  root.add(head);
  const helmet = new THREE.Mesh( gHelmet, mHelmet );
  helmet.matrix.fromArray(
   [1,0,0,0,0,1,0,0,0,0,1,0,0,1.4010108612494776,0,1]
  );
  root.add(helmet);
  const tunic = new THREE.Mesh( gTunic, mTunic );
  tunic.matrix.fromArray(
    [1,0,0,0,0,1,0,0,0,0,1,0,0,0.6106004423389476,0,1]);
  root.add(tunic);
  const belt = new THREE.Mesh( gBelt, mBelt );
  belt.matrix.fromArray(
    [1.2,0,0,0,0,1,0,0,0,0,1,0,-0.04,
     0.5495005511829094,0,1]
  );
  root.add(belt);

  root.traverse( object => {
    if ( object.matrixAutoUpdate ){
      object.matrix.decompose( object.position, object.quaternion, object.scale );
     }
   });

  return root;
}
Sir Coadalot

I also created a castle tower in code. I added my JoyStick control for testing on the desktop. Put it all together and had this – not bad for day 1

August 15th

I worked on animations for the player character today. Given the tight 13k budget. Using a 3D content creator like Blender and exporting as a GLB is a none starter. So I used the ThreeJS Editor, carefully moving and rotating the sword root object into various poses then writing down its position and rotation.

Inspector panel in the ThreeJS Editor

Having got a set of keyframes. I created a JS object.

const config1 = {
  duration: 0.4,
  times: [0, 0.1, 0.3],
  pos:[{ x:0, y:0, z:0 }, { x:-0.261, y:0.522, z:0.201 }, { x:-0.293, y:0.722, z:0.861 }],
  rot:[{ x:0, y:0, z:0 }, { x:21.69, y:13.79, z:-9.18 }, { x:-2.23, y:4.21, z:175.94 }]
}

And a function to convert this into a ThreeJS AnimationClip

createAnim(name, config){
  const pvalues = [], qvalues = [];
  const v = new THREE.Vector3(), q = new THREE.Quaternion(), e = new THREE.Euler();
  const d2r = Math.PI/180;

  for(let i=0; i<config.times.length; i++){
    const pos = config.pos[i];
    const rot = config.rot[i];
    v.set(pos.x, pos.y, pos.z).toArray( pvalues, pvalues.length );
    e.set(rot.x*d2r, rot.y*d2r, rot.z*d2r);
    q.setFromEuler(e).toArray( qvalues, qvalues.length );
  }

  const pos = new THREE.VectorKeyframeTrack( '.position', config.times, pvalues );
  const rot = new THREE.QuaternionKeyframeTrack( '.quaternion', config.times, qvalues );

  return new THREE.AnimationClip( name, config.duration, [ pos, rot ] );
}

I used a little test code to see it in action.

Sir Coadalot and sword

Of course the player needs an enemy. Meet the Black knight. Just the same with different material colours and one point on the helmet LatheGeometry points array changed.

August 16th

Today I coded the castle walls and towers. Added a DebugControls class to allow keyboard entry when testing using the WebXR emulator on a desktop. I also added some bad guys. Super primitive AI they just move toward the player character. The bad news is I’ve only got 1k left to complete the game. Something might have to go!!! Here’s a screengrab from my Quest2

August 17th

Today I refactored the game. Removed the BasicUI. Removed the OBJParser and the Rock OBJ String. Instead I create a rock using an IcosahedronGeometry instance then randomly perturb the vertex positions.

class Rock extends THREE.Mesh{
  constructor(radius=0.5){
    const geometry = new THREE.IcosahedronGeometry(radius, 8, 6);
    geometry.translate( 0, radius, 0 );
    const vertices = geometry.getAttribute('position');
    for(let i=0; i<vertices.array.length; i++){
      vertices.array[i] += (Math.random()-0.5) * 0.35;
    }
    vertices.needsUpdate = true;
    const material = new THREE.MeshPhongMaterial( {color: 0xaaaaaa } );
    super(geometry, material);
  }
}

I limited the scene to one tree type. This gained me 2K. I was unfeasibly happy by this. That’s what happens with this competition! And makes it fun.

I updated the castles, created Player and Enemy classes that extend the Knight class so I can create the models using the Knight class but have different behaviour for the Player and an Enemy. And I created some new props.

Props

August 18th

Today I setup patrolling for the bad guys. Just a four cornered path and the enemy moves around this path unless the player is within 10 world units. I also started work on the introduction panel and gameover panel. No way in the byte allowance I can use a custom font. That would blow the budget straightaway.

Patrolling

August 19th

Main thing today was making the sword functional. I added an Object3D to the end of the sword. In the Player update method I do a check using the physics engine to see if this object position intersects any colliders. If the ThreeJS object associated with the physics body has the name ‘Gate’ or ‘Enemy’, I call methods of the object. For Gate that is the method openGate. I have a problem though I only have 33 bytes left. I did some checking, removing the sfx increases the bytes to 330. But removing the CollisionEffect increases the remaining bytes to over 2K. All assets are nearly complete. So 2K should be enough. Looks like I need to simplify the CollisionEffect.

Opening Gate

August 20th

A week into the competition and the game is developing well. I was travelling today so didn't do much. I created a ForceField that is visible for 10 seconds after a Shield pickup. It uses an InstancedMesh. An InstancedMesh takes geometry and material just like a Mesh, plus a third parameter, count: the number of duplicates of the geometry to render. To position and orientate each instance you use the setMatrixAt method, passing an index and a matrix. Here's the update method showing how the motion of the shields is handled.

update(dt){
  this.time += dt;

  const PI2 = Math.PI * 2;
  const inc = PI2/ForceField.count;  // angular gap between shields in a row
  let index = 0;

  for(let row=0; row<ForceField.rows; row++){
    const n = (row % 2) ? 1 : -1;    // alternate rows orbit in opposite directions
    const y = (ForceField.height/ForceField.rows) * row;
    for(let i=0; i<ForceField.count; i++ ){
      const t = (this.time * n) % PI2;
      // place each shield on a circle around the player
      const z = Math.sin(t+i*inc) * ForceField.radius;
      const x = Math.cos(t+i*inc) * ForceField.radius;
      this.obj.position.set(x,y,z);
      this.obj.rotation.set(0,t,0);
      this.obj.updateMatrix();
      this.meshes.setMatrixAt( index++, this.obj.matrix );
    }
  }

  // flag the instance matrices for re-upload to the GPU
  this.meshes.instanceMatrix.needsUpdate = true;
}
ForceField
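For completeness, here's roughly how such an InstancedMesh might be constructed. The geometry, material and static values below are assumptions, not the game's actual code:

class ForceField{
  static rows = 3;
  static count = 8;   // shields per row
  static height = 1.8;
  static radius = 1.2;

  constructor( scene ){
    this.time = 0;
    this.obj = new THREE.Object3D();  // scratch object used to build each matrix

    const geometry = new THREE.PlaneGeometry( 0.3, 0.4 );
    const material = new THREE.MeshBasicMaterial( { color: 0x00aaff, transparent: true, opacity: 0.6, side: THREE.DoubleSide } );

    this.meshes = new THREE.InstancedMesh( geometry, material, ForceField.rows * ForceField.count );
    scene.add( this.meshes );
  }
}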

August 21st

Travelling again today so didn't achieve much. The main thing was rewriting the CollisionEffect as an InstancedMesh, rather than extending the custom class GPUParticleSystem. That gained nearly 1700 bytes. Well worth it.

August 22nd-23rd

Lots of debugging. I now have the basis of a game, with lots of fine-tuning still to do. I have 384 bytes left, but a bit of tidying up might gain me enough to add some sound.

September 4th

I was away for the last few days with my daughter and the grandkids, so didn't get anything done! I did a session of debugging yesterday, added a little sound and, with 23 bytes left, submitted!

13K is a serious limit and restricted what I could add as gameplay, but I really enjoyed working within this restriction. I'm particularly happy with the physics engine. Looking forward to next year's theme.

Disappointingly, there was a bug on the js13kgames site that made my game unplayable on the voting site, so it received no votes and came last in the WebXR category!!! The problem was a cross-origin issue that meant the Three.js library wouldn't load from the path provided by the organisers. Frustrating after spending many hours creating the game. Heigh-ho; nevertheless, I enjoyed the challenge.

The Universal Render Pipeline Cookbook: Recipes for Shaders and Visual Effects

The Universal Render Pipeline Cookbook cover

The latest cookbook I've written for Unity is now live. It's all about Universal Render Pipeline (URP) effects and is available to download for free. The e-book provides 12 recipes for popular visual effects that can be applied to a wide range of games, art styles, and platforms. Get ready to cook up Renderer Features, GPU-instantiated meshes, decals, volumetric materials, and more. You can use it alongside my other Unity e-book, Introduction to the Universal Render Pipeline for advanced Unity creators, which offers a wealth of information about how to use URP for creators who have developed projects with the Built-in Render Pipeline.

To celebrate the launch of the new e-book, my Udemy course “The Complete Guide to Unity’s Universal Render Pipeline (URP)” is available for less than $10 until 9th July 2023.

Here’s a handy overview of the recipes you’ll find in the book.

1. Stencils

Renderer Features provide you with ample opportunities to experiment with lighting and effects. This recipe focuses on Stencils, using only the bare minimum of required code. If you work alongside the sample project, open the sample scene via Scenes > Renderer Features > SmallRoom – Stencil in the Editor.

The sample project uses the magnifying glass over a desk example: the aim is to make the lens of the magnifying glass let you see through the desk, like an x-ray image. The approach uses a combination of Layer Masks, shaders, and Renderer Features.

Renderer Features are a great way to achieve dramatic custom effects or gameplay possibilities.

Download the sample from GitHub.

Stencils in action: As the magnifying glass moves over the desk, it can see through the drawers to reveal what’s inside.

2. Instancing

Exchanging data between CPU and GPU is a major bottleneck in the rendering pipeline. If you have a model that needs to be rendered many times using the same geometry and material, then Unity provides some great tools to do so, which are covered in the cookbook’s instancing chapter.

This recipe uses a field full of grass to illustrate the concept of instancing. It uses the SRP Batcher, GPU instancing, RenderMeshPrimitives, and ComputeBuffers.

A field of grass rendered using an SRP Batcher-compatible material

3. Toon and outline shading

Often used together, toon and outline shaders present two distinct challenges. The toon shader takes the color that would be created using a URP-compatible Lit Shader and ramps the output rather than allowing continuous gradients, thereby requiring a custom lighting model.

The example in this recipe uses Shader Graph. However, Shader Graph doesn’t support custom lighting, so there’s no node available to directly access the Main and Additional Lights. Instead, you can leverage a custom node to access those lights.

Check out the Toon and outline shading recipe to get the full details.

One scene, three different looks: Standard shading (left), with post-processing (center), and per-material shading (right)

4. Ambient Occlusion

Ambient Occlusion is a post-processing technique available from Unity 2020.2. This effect darkens creases, holes, intersections, and surfaces that are close to one another. In the real world, such areas tend to block out or occlude ambient light, thereby appearing darker.

See how you can implement a real-time Screen Space Ambient Occlusion (SSAO) effect as a Renderer Feature using URP.

Screen Space Ambient Occlusion

5. Decals

Decals are a great way to insert overlays onto a surface. They’re often used to add visuals such as bullet holes or tire treads to the game environment as the player interacts with the scene.

If you want to follow along with this recipe, you’ll work with URP Decal Projector properties, creating the material, and even adding a decal with code.

A new Decal Projector in action

6. Water

The water recipe is created in Shader Graph to make the steps more accessible. It’s built in three stages:

  • Creating the water color
  • Moving tiled normal maps to add wavelets to the surface
  • Adding moving displacement to the vertex positions to create a swell effect

While this recipe forms the basis of a simple water shader, you can enhance it using Caustic Reflections, Refraction, and Foam.

Simple water shader in motion

7. LUT for color grading

Using LUT Textures is an efficient way to create dramatic color grading, and this approach can be useful in many games. The recipe works through a single filter, but the steps employed apply to any of them.

Using Color Lookup to create grading effects

8. Lighting

Lighting with URP is similar to using the Built-in Render Pipeline. The main difference is where to find the settings.

This chapter in the cookbook covers related recipes for real-time lighting and shadows, including baked and mixed lighting using the GPU Progressive Lightmapper, Light Probes, and Reflection Probes. You’ll pick up enough instruction for a five-course meal!

A few things to keep in mind about shaders and color space: When using lighting in URP, you have a choice between a Lit Shader and Simple Lit Shader, which is largely an artistic decision. If you want a realistic render, you can use the Lit Shader, but if you want a more stylized render, you can use Simple Lit for stellar results.

The diorama scene mixing baked and real-time lighting

9. Shadows

In URP, shadow settings are defined using a Renderer Data object and a URP Asset. You can use these assets to define the fidelity of your shadows.

The URP Asset

This recipe includes tips for: Main Light and Shadow Resolution, Shadow Cascades, baking lights, and more.

Texel size by scale setting: In the top-left image, texel size is set to 0.5; in the top-right image, 0.2; in the bottom-left image, 0.1; and in the bottom-right image, 0.05.

10. Light Probes

Light Probes save the light data at a particular position within an environment when you bake the lighting by clicking Generate Lighting in the Window > Rendering > Lighting panel. This ensures that the illumination of a dynamic object moving through an environment reflects the lighting levels of the baked objects around it: the object appears dark in a dark area and brighter in a well-lit one.

Follow this recipe to find out how to position Light Probes with a code-based approach in order to speed up your editing, how to use Reflection Probes in your scene, and how to blend them.

The robot inside and outside of the cave, with lighting level affected by Light Probes

11. Screen Space Refraction

Screen Space Refraction uses the current opaque texture created by the render pipeline as the source texture to map pixels to the model being rendered. This recipe is about deforming the UVs used to sample that image.

Learn how to use a normal map to create refraction effects as well as tint a refraction effect.

An example of Screen Space Refraction

12. Volumetrics

This is a recipe for using ray marching to render a 3D texture. Unity supports 3D textures, which are an array of images placed in a grid on a single texture, rather like a Texture Atlas, except that each image is the same size. Using a 3D UV value, you can source a texel from the grid of images, with UV.Z selecting which image in the grid to sample.

You can also use Houdini when creating the 3D texture. Alternatives to a 3D texture include using multilayered Perlin noise, or prebaking a tileable noise texture using Unity.

A cloud with ray marching

More resources

The cover image shown here is from PRINCIPLES, an adventure game from COLOPL Creators, the technology brand of COLOPL, Inc., the developer behind the Shironeko Project series and Quiz RPG: The World of Mystic Wiz.

There are many advanced resources available for free from Unity. As mentioned at the beginning of the blog post, the e-book Introduction to the Universal Render Pipeline for advanced Unity creators is a valuable resource for helping experienced Unity developers and technical artists migrate their projects from the Built-in Render Pipeline to the URP.

All of the advanced e-books and articles are available from the Unity best practices hub. E-books can also be found on the advanced best practices documentation page.

The Complete Guide to Unity’s Universal Render Pipeline (URP)

And don’t forget: my Udemy URP course is available for less than $10 until 9th July.