I entered this year’s js13kgames competition. If you don’t know it, you have to create a game which, when zipped, has a file size no bigger than 13kb! You can’t download online assets such as images, sounds, libraries or fonts. Like last year, I targeted the WebXR category. This allows an exception to the no-libraries rule: the developer may use one external library, A-Frame, Babylon.js or Three.js. I chose Three.js.
You need to set up an npm project. Here’s my template. To use it, download a zip, unzip it and open the folder using VSCode. Then use:
npm install
npm run start to start a test server, and
npm run build to create a distribution version in the dist folder and a zipped version in the zipped folder. It will also give a file size report.
The first challenge is creating a game that fits the theme for the year. In 2024 the theme was Triskaidekaphobia, fear of the number 13. I decided to create a shooting gallery where the user must shoot any ball with the number 13 on it.
Initially I created a proxy of the environment in code.
import * as THREE from 'three';

export class Proxy{
    constructor( scene ){
        this.scene = scene;
        // Columns across the back of the gallery
        const geo1 = new THREE.CylinderGeometry( 0.25, 0.25, 3 );
        const mat1 = new THREE.MeshStandardMaterial( { color: 0x999999 } );
        const mat2 = new THREE.MeshStandardMaterial( { color: 0x444444, side: THREE.BackSide, wireframe: false } );
        const column = new THREE.Mesh( geo1, mat1 );
        for ( let x = -6; x <= 6; x += 2 ){
            const columnA = column.clone();
            columnA.position.set( x, 1.5, -20 );
            scene.add( columnA );
        }
        const geo2 = new THREE.PlaneGeometry( 15, 25 );
        geo2.rotateX( -Math.PI/2 );
        const floor = new THREE.Mesh( geo2, mat1 );
        floor.position.set( 0, 0, -12.5 );
        //scene.add( floor );
        const geo3 = new THREE.BoxGeometry( 15, 0.6, 0.6 );
        const lintel = new THREE.Mesh( geo3, mat1 );
        lintel.position.set( 0, 3.3, -20 );
        scene.add( lintel );
        // Inside-out box acts as the room
        const geo4 = new THREE.BoxGeometry( 15, 3.3, 36 );
        const room = new THREE.Mesh( geo4, mat2 );
        room.position.set( 0, 1.65, -10 );
        scene.add( room );
    }
}
I was aiming for a look like this –
A bit of a tall order, but you have to have a goal!
The next step was creating the balls that move toward the player.
Just a simple class.
import * as THREE from 'three';

export class Ball{
    static states = { DROPPING: 1, ROTATE: 2, FIRED: 3 };
    static canvas = document.createElement('canvas');
    static geometry = new THREE.SphereGeometry( 0.5 );

    constructor( scene, num, minus = false, xPos = -1, speed = 0.1 ){
        if ( Ball.canvas.width != 256 ){
            Ball.canvas.width = 256;
            Ball.canvas.height = 128;
        }
        // Draw the number onto a shared canvas: black for 13, red for minus scores, green otherwise
        const context = Ball.canvas.getContext('2d');
        if ( num == 13 ){
            context.fillStyle = "#000";
        }else if ( minus ){
            context.fillStyle = "#f00";
        }else{
            context.fillStyle = "#0f0";
        }
        this.num = num;
        this.speed = speed;
        context.fillRect( 0, 0, 256, 128 );
        context.fillStyle = "#fff";
        context.font = "48px Arial";
        context.textAlign = "center";
        context.textBaseline = "middle";
        context.fillText( String(num), 128, 64 );
        const tex = new THREE.CanvasTexture( Ball.canvas );
        const material = new THREE.MeshStandardMaterial( { map: tex, roughness: 0.1 } );
        this.mesh = new THREE.Mesh( Ball.geometry, material );
        this.mesh.position.set( xPos, 4, -20 );
        this.mesh.rotateY( Math.PI/2 );
        this.state = Ball.states.DROPPING;
        scene.add( this.mesh );
    }

    update( game ){
        switch( this.state ){
            case Ball.states.DROPPING:
                this.mesh.position.y -= 0.1;
                if ( this.mesh.position.y <= 1.6 ){
                    this.state = Ball.states.ROTATE;
                    this.mesh.position.y = 1.6;
                }
                break;
            case Ball.states.ROTATE:
                this.mesh.rotateY( -0.1 );
                if ( this.mesh.rotation.y < -Math.PI/2.1 ){
                    this.state = Ball.states.FIRED;
                }
                break;
            case Ball.states.FIRED:
                this.mesh.position.z += this.speed;
                break;
        }
        // Past the player: tidy up the texture and let the game remove the ball
        if ( this.mesh.position.z > 2 ){
            this.mesh.material.map.dispose();
            if ( game ) game.removeBall( this );
        }
    }
}
I created a proxy gun.
import * as THREE from 'three';

export class Gun extends THREE.Group{
    constructor(){
        super();
        this.createProxy();
    }

    createProxy(){
        const mat = new THREE.MeshStandardMaterial( { color: 0xAAAA22 } );
        // Barrel
        const geo1 = new THREE.CylinderGeometry( 0.01, 0.01, 0.15, 20 );
        const barrel = new THREE.Mesh( geo1, mat );
        barrel.rotation.x = -Math.PI/2;
        barrel.position.z = -0.1;
        // Body
        const geo2 = new THREE.CylinderGeometry( 0.025, 0.025, 0.06, 20 );
        const body = new THREE.Mesh( geo2, mat );
        body.rotation.x = -Math.PI/2;
        body.position.set( 0, -0.015, -0.042 );
        // Handle
        const geo3 = new THREE.BoxGeometry( 0.02, 0.08, 0.04 );
        const handle = new THREE.Mesh( geo3, mat );
        handle.position.set( 0, -0.034, 0 );
        this.add( barrel );
        this.add( body );
        this.add( handle );
    }
}
and a Bullet
import * as THREE from 'three';
import { Ball } from "./ball.js";

export class Bullet{
    constructor( game, controller ){
        const geo1 = new THREE.CylinderGeometry( 0.008, 0.008, 0.07, 16 );
        geo1.rotateX( -Math.PI/2 );
        const material = new THREE.MeshBasicMaterial( { color: 0xFFAA00 } );
        const mesh = new THREE.Mesh( geo1, material );
        // Spawn at the controller's world position and orientation
        const v = new THREE.Vector3();
        const q = new THREE.Quaternion();
        mesh.position.copy( controller.getWorldPosition( v ) );
        mesh.quaternion.copy( controller.getWorldQuaternion( q ) );
        game.scene.add( mesh );
        this.tmpVec = new THREE.Vector3();
        this.tmpVec2 = new THREE.Vector3();
        this.mesh = mesh;
        this.game = game;
    }

    update( dt ){
        // Move forward in 0.5 unit steps, checking for a hit at each step
        let dist = dt * 2;
        let count = 0;
        while( count < 1000 ){
            count++;
            if ( dist > 0.5 ){
                dist -= 0.5;
                this.mesh.translateZ( -0.5 );
            }else{
                this.mesh.translateZ( -dist );
                dist = 0;
            }
            this.mesh.getWorldPosition( this.tmpVec );
            let hit = false;
            this.game.balls.forEach( ball => {
                if ( !hit ){
                    if ( ball.state == Ball.states.FIRED ){
                        ball.mesh.getWorldPosition( this.tmpVec2 );
                        const offset = this.tmpVec.distanceTo( this.tmpVec2 );
                        if ( offset < 0.5 ){
                            hit = true;
                            ball.hit( this.game );
                            this.game.removeBullet( this );
                        }
                    }
                }
            });
            if ( dist == 0 || hit ) break;
        }
        if ( this.mesh.position.length() > 20 ) this.game.removeBullet( this );
    }
}
Now I had a working basic game. Time to create the eye-candy.
First, a score and timer display. In an immersive WebXR session you can’t use the DOM: you can’t simply create a div, position it and update its content using JavaScript. So I decided to create a mechanical counter mechanism.
Each counter segment uses a CylinderGeometry. To create the map with the numbers on it I used the CanvasTexture class. This creates a Texture from an HTML canvas, so you can use HTML canvas drawing commands to ‘paint’ the texture.
import * as THREE from 'three';

export class Counter extends THREE.Group{
    static texture;
    static types = { SCORE: 0, TIMER: 1 };

    constructor( scene, pos = new THREE.Vector3(), rot = new THREE.Euler() ){
        super();
        this.scale.set( 1.5, 1.5, 1.5 );
        scene.add( this );
        this.position.copy( pos );
        this.rotation.copy( rot );
        if ( Counter.texture == null ){
            // Draw the digit strip 0-9 plus ':' and a blank, rotated so it wraps around a cylinder
            const canvas = document.createElement('canvas');
            canvas.width = 1024;
            canvas.height = 64;
            const context = canvas.getContext( '2d' );
            context.fillStyle = "#000";
            context.fillRect( 0, 0, 1024, 64 );
            context.textAlign = "center";
            context.textBaseline = "middle";
            context.font = "48px Arial";
            context.fillStyle = "#fff";
            const inc = 1024/12;
            const chars = [ "0", "1", "2", "3", "4", "5", "6", "7", "8", "9", ":", " " ];
            let x = inc/2;
            chars.forEach( char => {
                context.setTransform( 1, 0, 0, 1, 0, 0 );
                context.rotate( -Math.PI/2 );
                context.translate( 0, x );
                context.fillText( char, -32, 34 );
                x += inc;
            });
            Counter.texture = new THREE.CanvasTexture( canvas );
            Counter.texture.needsUpdate = true;
        }
        // Five cylinders, one per digit, all sharing the digit-strip texture
        const r = 1;
        const h = Math.PI * 2 * r;
        const w = h/12;
        const geometry = new THREE.CylinderGeometry( r, r, w );
        geometry.rotateZ( -Math.PI/2 );
        const material = new THREE.MeshStandardMaterial( { map: Counter.texture } );
        const inc = w * 1.1;
        const xPos = -inc * 2.5;
        const zPos = -r * 0.8;
        for( let i=0; i<5; i++ ){
            const mesh = new THREE.Mesh( geometry, material );
            mesh.position.set( xPos + inc*i, 0, zPos );
            this.add( mesh );
        }
        this.type = Counter.types.SCORE;
        this.displayValue = 0;
        this.targetValue = 0;
    }

    set score( value ){
        if ( this.type != Counter.types.SCORE ) this.type = Counter.types.SCORE;
        this.targetValue = value;
    }

    updateScore(){
        const inc = Math.PI/6;
        let str = String( this.displayValue );
        while ( str.length < 5 ) str = "0" + str;
        const arr = str.split( "" );
        this.children.forEach( child => {
            const num = Number( arr.shift() );
            if ( !isNaN(num) ){
                child.rotation.x = -inc*num - 0.4;
            }
        });
    }

    updateTime(){
        const inc = Math.PI/6;
        let secs = this.displayValue;
        let mins = Math.floor( secs/60 );
        secs -= mins*60;
        let secsStr = String( secs );
        while( secsStr.length < 2 ) secsStr = "0" + secsStr;
        let minsStr = String( mins );
        while( minsStr.length < 2 ) minsStr = "0" + minsStr;
        let timeStr = minsStr + ":" + secsStr;
        let arr = timeStr.split( "" );
        this.children.forEach( child => {
            const num = Number( arr.shift() );
            if ( isNaN(num) ){
                // ':' is the 11th character on the strip
                child.rotation.x = -inc*10 - 0.4;
            }else{
                child.rotation.x = -inc*num - 0.4;
            }
        });
    }

    set seconds( value ){
        if ( this.type != Counter.types.TIMER ) this.type = Counter.types.TIMER;
        this.targetValue = value;
        this.update( 0 );
    }

    get time(){
        let secs = this.targetValue;
        let mins = Math.floor( secs/60 );
        secs -= mins*60;
        let secsStr = String( secs );
        while( secsStr.length < 2 ) secsStr = "0" + secsStr;
        let minsStr = String( mins );
        while( minsStr.length < 2 ) minsStr = "0" + minsStr;
        return minsStr + ":" + secsStr;
    }

    update( dt ){
        // Step displayValue toward targetValue one unit per call so the drums appear to spin
        if ( this.targetValue != this.displayValue ){
            if ( this.targetValue > this.displayValue ){
                this.displayValue++;
            }else{
                this.displayValue--;
            }
        }
        switch( this.type ){
            case Counter.types.SCORE:
                this.updateScore();
                break;
            case Counter.types.TIMER:
                this.updateTime();
                break;
        }
    }
}
Another challenge was creating an environment map, which is essential when using a MeshStandardMaterial with a roughness less than 1. As soon as the surface is smooth it reflects an environment map, which by default is black, resulting in very dark renders of shiny objects. With only 13k to play with you can’t simply load a bitmap texture; instead you have to generate an environment map at runtime. A WebGLRenderer does not have to write to the screen. If you set a WebGLRenderTarget it can render to that. For an environment map the result needs to be compiled in a special way: you can use a PMREMGenerator and its compileEquirectangularShader method.
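As a rough sketch of that runtime approach (the renderer and scene names, and the little stand-in room below, are illustrative rather than the game’s actual code):

// Generate an environment map at runtime with PMREMGenerator
const pmremGenerator = new THREE.PMREMGenerator( renderer );
pmremGenerator.compileEquirectangularShader();

// A tiny stand-in scene: a dark background and a bright panel acting as a light source
const envScene = new THREE.Scene();
envScene.background = new THREE.Color( 0x444444 );
const panel = new THREE.Mesh(
    new THREE.PlaneGeometry( 2, 2 ),
    new THREE.MeshBasicMaterial( { color: 0xffffff } )
);
panel.position.set( 0, 3, 0 );
panel.rotation.x = Math.PI/2;
envScene.add( panel );

// fromScene returns a pre-filtered, mipmapped render target; its texture becomes the environment
scene.environment = pmremGenerator.fromScene( envScene, 0.04 ).texture;
pmremGenerator.dispose();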
I created panelling, a gun and a ceiling with lights and air conditioning pipes.
When creating the panelling and the gun I made use of the ThreeJS Shape class and ExtrudeGeometry.
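For example, a simple panel outline could be drawn as a Shape and extruded something like this (the dimensions are illustrative, not the game’s actual values):

const shape = new THREE.Shape();
shape.moveTo( -1, 0 );
shape.lineTo( -1, 1.5 );
shape.lineTo( 1, 1.5 );
shape.lineTo( 1, 0 );
shape.lineTo( -1, 0 );
const geometry = new THREE.ExtrudeGeometry( shape, {
    depth: 0.05,
    bevelEnabled: true,
    bevelThickness: 0.02,
    bevelSize: 0.02
} );
const panel = new THREE.Mesh( geometry, new THREE.MeshStandardMaterial( { color: 0x555555 } ) );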
I did some play testing and the frame rate was suffering on my Quest 2, so I decided to swap the panelling for a texture, which I created using a WebGLRenderTarget.
With some play testing I adjusted the bullet speed to be slow enough that balls to the left or right were quite difficult to hit. Then I added varying ways for the balls to move, starting with a straight line and ending up with balls going up and down on a swivel. I also added a leaderboard and styled the various panels using CSS.
I submitted the game.
Having submitted the game, I started work on a ThreeJS Path Editor. The paths I used to create complex shapes with ExtrudeGeometry involved drawing on graph paper; there had to be a better way.
It’s all ready for next year’s competition. Source code here.
To celebrate the launch of my beginner’s guide to ThreeJS e-book, The ThreeJS Primer, all my Udemy ThreeJS courses are at the Udemy best price. Click the links to grab yourself a bargain.
Learn how to create VR and AR experiences that work directly from the browser, using the latest API from Google and Amazon and our favourite Open Source WebGL library, ThreeJS
Want to learn to create Shaders? Well, you’ve picked the right time. My courses that contain lectures on coding Shaders are all at the Udemy best price for the next few days.
To celebrate finishing the second draft of my new Unity DOTS e-book, I’m having a sale of my Udemy Unity courses. Use the coupon code DEC23_BEST, or click the links below, to get the best price on Udemy for these courses over the next few days.
For all of my courses I include the Three.JS library I used at the time of writing and recording the course. This ensures the code matches the library, so no further installation is required other than downloading and unzipping a zip file from Udemy or GitHub, or cloning and forking a GitHub repo. But another approach is to use a package manager. By far the most popular is NPM, the Node Package Manager, and in this article we’ll look at using this approach.
To start you will need Node.JS installed on your PC, Mac or Linux device. If you haven’t got Node.JS installed then click the Node.JS link above or enter nodejs.org in your browser address bar. Download the installer for your device and install. NPM comes with the install.
If you haven’t got VSCode installed then install that as well. It is my recommended code editor these days. Either click the link above or enter https://code.visualstudio.com/ in your browser address bar.
Open VSCode and choose Open.
Navigate to a new folder where your project files will be stored. You’ll need to agree to trust the authors, but since that is you there is no problem. Use menu: Terminal > New Terminal. Enter
npm install three
Notice you now have a node_modules folder and two new files package.json and package-lock.json.
package.json looks like this. three is listed as a dependency. package-lock.json is created and edited by npm and should not be touched.
{
  "dependencies": {
    "three": "^0.157.0"
  }
}
three is the Three.JS library which you’ll find in the node_modules/three folder.
Next we need a build tool; I use Vite. Install it as a dev dependency with npm install --save-dev vite. This will allow you to launch a dev server and package a completed project for distribution.
You could place your project files at the root of the folder, but most developers prefer to keep things tidy by adding content to folders. Create a src folder and a public folder, then create a new file called vite.config.js and add this code to the file.
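A minimal vite.config.js along these lines matches the description that follows (the exact options are assumed rather than copied from the repo):

import { defineConfig } from 'vite';

export default defineConfig({
    root: 'src',             // look in src for index.html and js files
    publicDir: '../public',  // static assets, relative to the root above
    build: {
        outDir: '../build',  // distribution output, relative to the root above
        emptyOutDir: true
    }
});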
Now Vite will look in src for any html or js files, in public for assets, and will package for distribution to the build folder. Note the public and build paths are relative to the src path.
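For the npm run dev and npm run build commands used below to work, package.json also needs a scripts section, with vite listed as a dev dependency, something like this (version numbers are illustrative):

{
  "scripts": {
    "dev": "vite",
    "build": "vite build"
  },
  "dependencies": {
    "three": "^0.157.0"
  },
  "devDependencies": {
    "vite": "^4.0.0"
  }
}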
To see an example using npm and vite download this repo.
Just click the green Code button and choose Download ZIP. Unzip to a folder of your choice and open the folder in VSCode. To install the dependencies enter
npm install
The package.json file is scanned for dependencies and the node_modules folder is populated with all the packages needed. Recall the scripts we added to package.json. Use
npm run dev
Then ctrl+click (PC) or cmd+click (Mac) the localhost link to open the app in your browser.
Just a simple example of a Three.JS app created using vite as a build tool.
Take a look at src/index.html and notice the script tag. We can import the core Three.JS library from three.
<script type="module">
import * as THREE from "three";
import { OrbitControls } from "three/addons/controls/OrbitControls.js";
import { GUI } from "three/addons/libs/lil-gui.module.min.js";
import { GLTFLoader } from 'three/addons/loaders/GLTFLoader.js';
import { DRACOLoader } from 'three/addons/loaders/DRACOLoader.js';
import { RGBELoader } from 'three/addons/loaders/RGBELoader.js';
three will be resolved to node_modules/three/build/three.module.js and three/addons becomes node_modules/three/examples/jsm. Why? Take a look at package.json in the three folder.
Notice exports. The default export for three, “.”, when used as an import is ./build/three.module.js. Used as a require, something you’d use when creating a Node.js app, it resolves to the classic JavaScript version ./build/three.cjs. ./addons/* becomes ./examples/jsm/*.
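Abridged, the relevant part of that exports map looks like this (other entries omitted):

"exports": {
    ".": {
        "import": "./build/three.module.js",
        "require": "./build/three.cjs"
    },
    "./addons/*": "./examples/jsm/*"
}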
Back to the index.html file. Find the loadGLTF function, line 93.
function loadGLTF(){
    const loader = new GLTFLoader( );
    const dracoLoader = new DRACOLoader();
    dracoLoader.setDecoderPath( 'draco-gltf/' );
    loader.setDRACOLoader( dracoLoader );
    // Load a glTF resource
    loader.load(
        // resource URL
        'motorcycle.glb',
Notice setDecoderPath is 'draco-gltf/'. Since this is not an import, for Vite to find it correctly it must be in the public folder.
It is simply copied from node_modules/three/examples/jsm/libs/draco/gltf. You can see the public folder also contains the loaded glb, motorcycle.glb, and the environment map, venice_sunset_1k.hdr.
For the last step enter
npm run build
Notice a new folder is created, build.
A new index.html is created loading the js file in the assets folder. You might find you need to add a dot before the forward slash.
src="/assets/index…"
Becomes
src="./assets/index…"
The contents of the public folder are copied to the build folder. The main script in the assets folder is bundled and minified. The single script now contains the Three.JS library and all the other imports in the index.html file in the src folder.
If you have Live Server installed then you can run the app by right clicking on build/index.html and choosing Open with Live Server.
Using npm and vite is a great way to create your Three.JS apps. I hope this short article helps you get started.
In this article we’ll look at using WebGL to display a Quad, a rectangle that fills the window. If you want to code along then check out the CodePen-start link. Here is the final version.
It’s a very simple shader just using uv to blend the colours. This article isn’t about the shader, it’s about getting the results of the shader on screen. You’ll learn about attributes, buffers, elements and programs. Let’s get started.
Before we can use WebGL in a browser we need a canvas and a context. To create these we’ll use a new function, setupWebGl. We create a canvas element and append it to the body. Then we get the webgl context. This could return null, in which case we throw an error; these days most browsers on most devices do support WebGL.
function setupWebGl() {
    canvas = document.createElement("canvas");
    document.body.appendChild(canvas);
    const gl = canvas.getContext("webgl");
    if (gl == null) throw "WebGl not Supported";
    return gl;
}
Back in the init method we call the function.
gl = setupWebGl();
By default a canvas is sized at 300 x 150. We want it to fill the screen, so we’ll need a resize method. If a canvas has been created then set its width to window.innerWidth and its height to window.innerHeight.
function onWindowResize() {
    if (canvas){
        canvas.width = window.innerWidth;
        canvas.height = window.innerHeight;
    }
}
In the init method add an event listener and also directly call this function.
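For example, assuming init is where the rest of the setup happens:

window.addEventListener("resize", onWindowResize);
onWindowResize();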
Now we need to define a quad that will fill the canvas area. For that we need some vertices. The minimum we need to define a quad is the position of 4 vertices. But in our shader we’re also going to use a uv value.
We’re going to define an array of objects with position and uv properties. Each of these properties will be a simple array. Picture a Quad, see the images above. We start with the vertex at the bottom left corner, giving this the position 0, 0 and the same values for uv. Then we move to the top left; this has position value 0 for x and the window inner height for y. The uv for this vertex is 0, 1. The next vertex is top right, with position values of the window width and height and uv 1, 1. And finally the bottom right, with position values of window width, 0 and uv of 1, 0.
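Based on that description the array might look like this (the variable name is assumed):

vertices = [
    { position: [0, 0], uv: [0, 0] },                                   // bottom left
    { position: [0, window.innerHeight], uv: [0, 1] },                  // top left
    { position: [window.innerWidth, window.innerHeight], uv: [1, 1] },  // top right
    { position: [window.innerWidth, 0], uv: [1, 0] }                    // bottom right
];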
WebGL can draw points, lines and triangles. To render our quad we need to define two triangles by defining the indices of the vertices. When we do this we’re creating what WebGL describes as an Element. Let’s do that using another function, createQuadElement. First we define the indices.
Bottom-left, top-left, top-right for triangle one and bottom-left, top-right, bottom-right for triangle two.
WebGL is all about buffers. This is how we pass data from the CPU to the GPU. Before we can pass any data we need to inform WebGL which buffer we’re passing the data to. We do this using bindBuffer. When we do this we need to tell WebGL what type of data we’re passing so it knows where to store it. For indices we use ELEMENT_ARRAY_BUFFER, and the second parameter is the CPU-side buffer we just created. Now WebGL is ready to receive data, using the WebGL bufferData method. Again we specify the target type, then the data; here we convert the JS array to an unsigned 16-bit integer array, the format WebGL expects for indices. The last parameter is the usage value. STATIC_DRAW means the data is going to be defined once and then used multiple times. You use DYNAMIC_DRAW when the data is likely to be updated regularly. This helps the GPU when allocating memory. We return an object with length and indexBuffer properties.
function createQuadElement() {
    const indices = [0, 1, 2, 0, 2, 3];
    //Create indices buffer
    const indexBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);
    gl.bufferData(
        gl.ELEMENT_ARRAY_BUFFER,
        new Uint16Array(indices),
        gl.STATIC_DRAW
    );
    return {
        length: indices.length,
        indexBuffer
    };
}
Back to the init method. Add a call to this method after defining the vertices.
glRect = createQuadElement();
OK, so now gl is a WebGL context for a canvas and glRect is a WebGL element defining two triangles by indices. But at this stage WebGL doesn’t have any data about the triangles other than indices. It will need to know the vertex positions, and for the shader we’re going to create it will need to know about the uv values. This involves a two-stage process. First we define how we transform the vertex position values to canvas space in the vertex shader, and what colour to use for each pixel in the fragment shader. For this we need a new function, setupProgram. A program in WebGL is a combination of a compiled vertex shader and fragment shader. After creating a program you can add attributes to it. For our vertex shader we will have a vec2 uniform that will contain the screen width and height, an attribute defining the vertex position and another for the uv. We need to pass an interpolated version of the uv to the fragment shader, so we add a varying. The main function passes the uv value on. Then we create a coord value.
Remember normalized device coordinates? To be on screen the x, y and z values must all be in the range -1 to 1. Position is a window coordinate.
At 0, 0 we want to convert this to -1, -1 and at window top right this should be 1, 1. If we divide position by screen_size then all positions on screen are in the range 0 to 1.
Multiply this by 2 and now we have a range of 0 to 2.
Subtract one and we’re in the range -1 to 1.
We set the z and w values to 1. Z could be -1 or 0. But if it is less than -1 or greater than 1 it would be clipped and you’d get a blank canvas.
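Putting that together, the vertex shader source, embedded as a JS string, might read as follows (a sketch; the names come from the description above):

const vertexSource = `
uniform vec2 screen_size;

attribute vec2 position;
attribute vec2 uv;

varying vec2 vUv;

void main(){
    vUv = uv;
    // window coordinates -> normalized device coordinates (-1 to 1)
    vec2 coord = ( position / screen_size ) * 2.0 - 1.0;
    gl_Position = vec4( coord, 1.0, 1.0 );
}`;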
The fragment shader is super simple. We define a precision, define the varying uv, and in the main function use vUv for the red and green channels. Remember, in the fragment shader the value of vUv will be interpolated across the triangle based on the fragment’s location within it. Blue is set to 0 and alpha to 1. Then we call compileShaders to create the program. All would be well if compileShaders existed; time to create this function.
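Again as a sketch, with mediump precision assumed, the fragment shader and a setupProgram that hands both sources to compileShaders might look like this:

const fragmentSource = `
precision mediump float;

varying vec2 vUv;

void main(){
    gl_FragColor = vec4( vUv.x, vUv.y, 0.0, 1.0 );
}`;

function setupProgram() {
    return compileShaders( vertexSource, fragmentSource );
}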
Let’s keep things simple and split the task into making two shaders, then making a program from the compiled shaders. The makeShader function we’re going to write needs two parameters: the first will be the shader type and the second the text source. Our makeProgram function will take the compiled shaders and return a program. Because in this example there is only one program, we’ll also call useProgram to tell WebGL to make the newly created program the active one.
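As a sketch, compileShaders could read:

function compileShaders( vertexSource, fragmentSource ) {
    const vertexShader = makeShader( gl.VERTEX_SHADER, vertexSource );
    const fragmentShader = makeShader( gl.FRAGMENT_SHADER, fragmentSource );
    const program = makeProgram( vertexShader, fragmentShader );
    // Only one program in this example, so make it active straight away
    gl.useProgram( program );
    return program;
}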
OK. So now we need makeShader and makeProgram. Let’s start with makeShader. Remember, this takes two parameters: the type, vertex or fragment, and the source. We use the WebGL method createShader, passing the type. Then we pass the source. Now we compile the shader. Better check all went well. The WebGL method getShaderParameter returns true if compilation was successful when used with the COMPILE_STATUS query flag. If this is false then we tidy up by deleting the shader, show a console warning and throw an error. If all went well then we return the shader.
function makeShader(type, source) {
    const shader = gl.createShader(type);
    gl.shaderSource(shader, source);
    gl.compileShader(shader);
    if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
        gl.deleteShader(shader);
        console.warn(source);
        throw "Shader is Invalid";
    }
    return shader;
}
Now we have the shaders it’s time to create the WebGL program. The makeProgram function takes the compiled shaders as parameters. We first create a new program using the gl method createProgram. Then we attach the two shaders one at a time using the attachShader method of the WebGL context. To complete the process we also need to use the WebGL method linkProgram, which finalizes the creation of the data on the GPU. Like creating a shader, we should check all went well. We do this using getProgramParameter, passing the program and the constant LINK_STATUS. If this returns false then we get the problem using getProgramInfoLog, pass it to console.warn, and throw an error.
function makeProgram(vertexShader, fragmentShader) {
    const program = gl.createProgram();
    gl.attachShader(program, vertexShader);
    gl.attachShader(program, fragmentShader);
    gl.linkProgram(program);
    if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
        console.warn(gl.getProgramInfoLog(program));
        throw "Unable to link Program";
    }
    return program;
}
Back in the init method we can add
program = setupProgram();
We’re getting close. The program uses a uniform and two attributes, but currently we haven’t passed this data to it. To pass the uniform we’ll use another function, set2fUniform, with three parameters: the program, a uniform name and the values to pass. The 2f refers to two floats, a vec2. We need a pointer to the memory location of this uniform, which we get using getUniformLocation, passing the program and the uniform name. Then to populate a vec2 we use the WebGL method uniform2f, passing the location and two float values. We could use values[0] comma values[1], but the spread syntax, three dots, converts the values array into just that.
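A sketch of set2fUniform following that description:

function set2fUniform( program, name, values ) {
    // Find where the named vec2 uniform lives in the linked program
    const location = gl.getUniformLocation( program, name );
    // Spread the two-element array into the two float arguments
    gl.uniform2f( location, ...values );
}

// e.g. set2fUniform( program, "screen_size", [ window.innerWidth, window.innerHeight ] );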
It just remains to set the vertex attributes, position and uv, for the program. For this we’ll use another function, createAttribute, with three parameters: the program, the attribute name and the vertices array. First we extract an array that only consists of the named property using the JavaScript Array method map; this iterates through the vertices and returns the value that matches the attribute name. We need the size of a single item in this new array. Then we create a new buffer. We get the location of the attribute on the GPU using the WebGL method getAttribLocation. Now we prepare the location by calling enableVertexAttribArray. We bind the buffer, using the target ARRAY_BUFFER. The next WebGL call is vertexAttribPointer. This describes to the GPU how to use the currently bound buffer. It takes the location, the size of each item in the array, the type of data, and whether to normalize the data to a range based on the type; since this does not apply to floats we set it to false. Parameter five is the stride: you can put gaps between each element, but for this example the data is tightly packed so the stride is 0. And the last parameter is an offset to the first item, again 0 for our purposes.
Now we create a helper attribute object, so we can update the position attribute at run time if the window changes size. It has the values array, the buffer and a refresh method as properties. The refresh method binds the buffer, then passes data using the bufferData method of the WebGL context. For this we need to convert the values into a typed array; each element in the array must be a single float, not an array. JavaScript has a useful method, flat, which converts an array of arrays into a simple array. Now we have this helper we can call the refresh method to actually pass the CPU data to the GPU.
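A sketch of createAttribute and the helper it returns, following the description above (details may differ from the CodePen):

function createAttribute( program, name, vertices ) {
    // Pull out just this attribute's values, e.g. all the position arrays
    const values = vertices.map( vertex => vertex[name] );
    const size = values[0].length;
    const buffer = gl.createBuffer();
    const location = gl.getAttribLocation( program, name );
    gl.enableVertexAttribArray( location );
    gl.bindBuffer( gl.ARRAY_BUFFER, buffer );
    // Describe how the GPU should read the currently bound buffer for this attribute
    gl.vertexAttribPointer( location, size, gl.FLOAT, false, 0, 0 );
    const attribute = {
        values,
        buffer,
        refresh() {
            gl.bindBuffer( gl.ARRAY_BUFFER, this.buffer );
            // Flatten [[x,y],[x,y],...] into a Float32Array before uploading
            gl.bufferData( gl.ARRAY_BUFFER, new Float32Array( this.values.flat() ), gl.STATIC_DRAW );
        }
    };
    attribute.refresh();
    return attribute;
}

In init this would be called once for each attribute, keeping a reference to the position helper for later: position = createAttribute( program, "position", vertices ); createAttribute( program, "uv", vertices );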
We just need one more function, this time to draw the glRect element, or any other element made up of triangles. We bind the element’s index buffer, then call the WebGL method drawElements. For this example the mode is TRIANGLES. We pass the length property of the element, remember that’s the number of indices, then the index type and an offset of 0 into the index buffer.
function drawElement(element) {
    // Make sure this element's indices are the bound ELEMENT_ARRAY_BUFFER before drawing
    gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, element.indexBuffer);
    gl.drawElements(
        gl.TRIANGLES,
        element.length,
        gl.UNSIGNED_SHORT,
        0
    );
}
In the resize method add
if (glRect) drawElement(glRect);
The quad only shows in the bottom left. By default the gl viewport is 300 x 150, the canvas size at the time the context was created. In the resize method add
if (gl){
    gl.viewport(0, 0, window.innerWidth, window.innerHeight);
}
Much better, but there’s still a problem. If the window changes size then the uniform screen_size will have the wrong values and the attribute position will also be wrong.
If program exists then call the set2fUniform function, passing the program, screen_size and the window size. If position exists then update its values array and call its refresh method, as sketched below.
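Assuming the position helper returned by createAttribute was stored in a variable called position, the end of the resize handler might read:

if (program) set2fUniform( program, "screen_size", [ window.innerWidth, window.innerHeight ] );
if (position) {
    // Rebuild the window-space corner positions and re-upload them to the GPU
    position.values = [
        [0, 0],
        [0, window.innerHeight],
        [window.innerWidth, window.innerHeight],
        [window.innerWidth, 0]
    ];
    position.refresh();
}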
Unity have just contracted me to write their DOTS (Data-Oriented Technology Stack) e-book. To celebrate, I thought I’d have a sale of my Unity courses. They’ll be at the best price on Udemy through to Tuesday. Time to grab a bargain!
A Complete Guide to Unity’s Universal Render Pipeline
As Unity gradually switches to URP from the Built-in Render Pipeline, it’s time to learn the new techniques from the author of Unity’s URP e-books.
The theme of this year’s competition has been announced: 13th century. I gave it some thought and decided on a quest for the Holy Grail. I’m doing a WebXR game using ThreeJS. I’ve already created a working project framework using npm and webpack with help from Matt McKenna.
With a 13k size limit Blender models are a no-no. All assets need to be created in code. I fiddled with the ThreeJS Editor and came up with this as the player character.
Downloading this as a JSON file is 12K uncompressed. Let’s remake it in code.
14th August
Big day. Before the competition theme was announced I’d been working on the key components I thought my game would need. First, a VRButton. I’d already created one for my WebXR course, but I made a tweak to it so it displays the VR Cardboard icon from an svg string.
I had also created a very basic 3D physics engine. If you look in the source you’ll find it in the src > SimplePhysics folder. Just 3 files:
SPWorld.js
This is where rigid bodies are added and collisions calculated
SPBody.js
A single rigid body
SPCollider.js
An SPBody instance has a single collider, which can only be a Sphere or an AABB (Axis-Aligned Bounding Box)
Minified and zipped it comes in under 2K.
If you want to see it in action and you’ve downloaded the repo, rename index.js to index-game.js and rename index-sp.js to index.js. If you’ve got the game running (that’s npm run start) then you can see it in a browser at localhost:8080. The physics isn’t perfect, to say the least, but needs must when the entire game budget is only 13k.
My first step in creating the game was to change a sphere into my player character. The JSON file downloaded from the ThreeJS editor gave the necessary geometries, material colours and mesh positions and orientations. Here’s the code to create the knight.
createModel(){
    // Geometries built from primitives and lathed profiles, taken from the ThreeJS Editor JSON
    const gSkirt = new THREE.CylinderGeometry( 0.4, 0.6, 0.5, 32, 1, true );
    const gHead = new THREE.SphereGeometry( 0.4, 24, 10 );
    const pHelmet = [
        new THREE.Vector2( 0.5, 0 ),
        new THREE.Vector2( 0.5, 0.2 ),
        new THREE.Vector2( 0.45, 0.2 ),
        new THREE.Vector2( 0.4, 0.3 ),
        new THREE.Vector2( 0.3, 0.4 ),
        new THREE.Vector2( 0, 0.5 ),
    ];
    const gHelmet = new THREE.LatheGeometry( pHelmet, 12 );
    const pTunic = [
        new THREE.Vector2( 0.45, 0 ),
        new THREE.Vector2( 0.43, 0.1 ),
        new THREE.Vector2( 0.4, 0.2 ),
        new THREE.Vector2( 0.32, 0.3 ),
        new THREE.Vector2( 0.16, 0.4 ),
        new THREE.Vector2( 0.05, 0.5 ),
    ];
    const gTunic = new THREE.LatheGeometry( pTunic, 12 );
    const gBelt = new THREE.CylinderGeometry( 0.45, 0.45, 0.2, 32, 1, false );
    // Material colours are the decimal values from the exported JSON
    const mSkirt = new THREE.MeshStandardMaterial( { color: 15991041 } );
    const mHead = new THREE.MeshStandardMaterial( { color: 16373422 } );
    const mHelmet = new THREE.MeshStandardMaterial( { color: 0xC7C7C7 } );
    const mTunic = new THREE.MeshStandardMaterial( { color: 16777215 } );
    const mBelt = new THREE.MeshStandardMaterial( { color: 12615993 } );
    const root = new THREE.Group();
    // Each mesh matrix comes straight from the editor JSON
    const skirt = new THREE.Mesh( gSkirt, mSkirt );
    skirt.matrix.fromArray(
        [1,0,0,0,0,1,0,0,0,0,1,0,0,0.25,0,1]
    );
    root.add( skirt );
    const head = new THREE.Mesh( gHead, mHead );
    head.matrix.fromArray(
        [1,0,0,0,0,1,0,0,0,0,1,0,0,1.3466628932086855,0,1]
    );
    root.add( head );
    const helmet = new THREE.Mesh( gHelmet, mHelmet );
    helmet.matrix.fromArray(
        [1,0,0,0,0,1,0,0,0,0,1,0,0,1.4010108612494776,0,1]
    );
    root.add( helmet );
    const tunic = new THREE.Mesh( gTunic, mTunic );
    tunic.matrix.fromArray(
        [1,0,0,0,0,1,0,0,0,0,1,0,0,0.6106004423389476,0,1]
    );
    root.add( tunic );
    const belt = new THREE.Mesh( gBelt, mBelt );
    belt.matrix.fromArray(
        [1.2,0,0,0,0,1,0,0,0,0,1,0,-0.04,
        0.5495005511829094,0,1]
    );
    root.add( belt );
    // Copy each matrix into position/quaternion/scale so matrixAutoUpdate keeps working
    root.traverse( object => {
        if ( object.matrixAutoUpdate ){
            object.matrix.decompose( object.position, object.quaternion, object.scale );
        }
    });
    return root;
}
I also created a castle tower in code. I added my JoyStick control for testing on the desktop. Put it all together and had this – not bad for day 1
August 15th
I worked on animations for the player character today. Given the tight 13k budget, using a 3D content creator like Blender and exporting as a GLB is a non-starter. So I used the ThreeJS Editor, carefully moving and rotating the sword root object into various poses and then writing down its position and rotation.
Having got a set of keyframes, I created a JS object to hold them.
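Purely as an illustration (the property names and pose values here are invented, not the game’s actual data), it might be shaped something like this:

// Hypothetical keyframe data for the sword: times in seconds with matching poses
const swordKeys = {
    attack: {
        times: [ 0, 0.2, 0.4 ],
        positions: [ [0.3, 0.9, -0.2], [0.3, 1.2, -0.5], [0.3, 0.9, -0.2] ],
        rotations: [ [0, 0, 0], [-1.4, 0, 0], [0, 0, 0] ]
    }
};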
Of course the player needs an enemy. Meet the Black Knight. Just the same, but with different material colours and one point on the helmet LatheGeometry points array changed.
August 16th
Today I coded the castle walls and towers, and added a DebugControls class to allow keyboard input when testing with the WebXR emulator on a desktop. I also added some bad guys: super primitive AI, they just move toward the player character. The bad news is I’ve only got 1k left to complete the game. Something might have to go! Here’s a screengrab from my Quest 2.
August 17th
Today I refactored the game. I removed the BasicUI, the OBJParser and the Rock OBJ string. Instead I create a rock using an IcosahedronGeometry instance, then randomly perturb the vertex positions.
class Rock extends THREE.Mesh{
    constructor( radius = 0.5 ){
        const geometry = new THREE.IcosahedronGeometry( radius, 8, 6 );
        geometry.translate( 0, radius, 0 );
        const vertices = geometry.getAttribute('position');
        for( let i = 0; i < vertices.array.length; i++ ){
            vertices.array[i] += ( Math.random() - 0.5 ) * 0.35;
        }
        vertices.needsUpdate = true;
        const material = new THREE.MeshPhongMaterial( { color: 0xaaaaaa } );
        super( geometry, material );
    }
}
I limited the scene to one tree type. This gained me 2K. I was unfeasibly happy about this; that’s what happens with this competition, and it’s what makes it fun.
I updated the castles, created Player and Enemy classes that extend the Knight class so I can create the models using the Knight class but have different behaviour for the Player and an Enemy. And I created some new props.
August 18th
Today I set up patrolling for the bad guys. Just a four-cornered path; the enemy moves around this path unless the player is within 10 world units. I also started work on the introduction panel and game-over panel. There’s no way within the byte allowance that I can use a custom font. That would blow the budget straightaway.
August 19th
The main thing today was making the sword functional. I added an Object3D to the end of the sword. In the Player update method I use the physics engine to check whether this object’s position intersects any colliders. If the ThreeJS object associated with the physics body has the name ‘Gate’ or ‘Enemy’, I call methods on that object; for Gate that is the method openGate. I have a problem though: I only have 33 bytes left. I did some checking; removing the sfx increases the remaining bytes to 330, but removing the CollisionEffect increases them to over 2K. All assets are nearly complete, so 2K should be enough. Looks like I need to simplify the CollisionEffect.
August 20th
A week into the competition and the game is developing well. I was travelling today so didn’t do much. I created a ForceField that is visible for 10 secs after a Shield pickup. It uses an InstancedMesh. An InstancedMesh takes geometry and material just like a Mesh; in addition it has a third parameter, count, the number of duplicates of the geometry. To position and orientate each instance you use the setMatrixAt method, passing an index and a matrix. Here’s the update method showing how the motion of the shields is handled.
update( dt ){
    this.time += dt;
    const PI2 = Math.PI * 2;
    const inc = PI2/ForceField.count;
    let index = 0;
    for( let row = 0; row < ForceField.rows; row++ ){
        const n = ( row % 2 ) ? 1 : -1;
        const y = ( ForceField.height/ForceField.rows ) * row;
        for( let i = 0; i < ForceField.count; i++ ){
            const t = ( this.time * n ) % PI2;
            const r = ( this.time * -1 ) % PI2;
            const z = Math.sin( t + i*inc ) * ForceField.radius;
            const x = Math.cos( t + i*inc ) * ForceField.radius;
            this.obj.position.set( x, y, z );
            this.obj.rotation.set( 0, t, 0 );
            this.obj.updateMatrix();
            this.meshes.setMatrixAt( index++, this.obj.matrix );
        }
    }
    this.meshes.instanceMatrix.needsUpdate = true;
}
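For context, the constructor would create the instanced mesh along these lines (the shield geometry, material and this.obj initialisation are assumed, not the game’s actual values):

// Hypothetical construction to pair with the update method above
const geometry = new THREE.BoxGeometry( 0.4, 0.5, 0.05 );
const material = new THREE.MeshStandardMaterial( { color: 0x3366ff } );
this.meshes = new THREE.InstancedMesh( geometry, material, ForceField.count * ForceField.rows );
this.obj = new THREE.Object3D(); // scratch object used to compose each instance matrix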
August 21st
Travelling again today so didn’t achieve much. The main thing was rewriting the CollisionEffect as an InstancedMesh, rather than extending the custom GPUParticleSystem class. Gained nearly 1700 bytes. Well worth it.
August 22nd-23rd
Lots of debugging. I now have the basis of a game. Lots of fine tuning to do. I have 384 bytes left, but a bit of tidying up might gain me enough to add some sound.
4th September
I was away for the last few days with my daughter and the grandkids, so didn’t get anything done! I did a session of debugging yesterday, added a little sound and, with 23 bytes left, submitted!
13K is a serious limit and restricted what I could add as gameplay, but I really enjoyed working within this restriction. I’m particularly happy with the physics engine. Looking forward to next year’s theme.
Disappointingly, there was a bug on the js13kgames site which made my game unplayable on the voting site, so it received no votes and came last in the WebXR category! The problem was a cross-origin issue, meaning the Three.JS library wouldn’t load from the path provided by the organisers. Frustrating after spending many hours creating the game. Heigh-ho, nevertheless I enjoyed the challenge.