The tool I use for accessing WebGL functionality in the browser is called three.js. It's a package that simplifies the process of doing 3d work in the browser - and to do this, it renders into a canvas element, which we'll append to the page through code later.
It's important to understand that three.js simply gives us an interface to work with WebGL, which is an API for rendering 2d and 3d objects on the web. That's why we'll import three.js next. You can do this through npm, or pull everything straight from a CDN like Skypack, which is what the imports below do. The packages we're going to want here are as follows:
npm i three
npm i open-simplex-noise
npm install three-orbitcontrols
import * as THREE from "//cdn.skypack.dev/[email protected]";
import { OrbitControls } from "//cdn.skypack.dev/[email protected]/examples/jsm/controls/OrbitControls.js";
import openSimplexNoise from '//cdn.skypack.dev/open-simplex-noise';
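If you installed the packages through npm rather than importing them from Skypack, the equivalent imports would look roughly like this - a sketch that assumes a bundler or dev server which resolves bare module specifiers, and note that the noise package exposes the 4d noise function as a named export (makeNoise4D), which can differ slightly from the default-export object used in the CDN version:
// npm-style imports (assumes a bundler; export shapes may vary by package version)
import * as THREE from "three";
import { OrbitControls } from "three/examples/jsm/controls/OrbitControls.js";
import { makeNoise4D } from "open-simplex-noise";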
Next up is the standard three.js setup. We need to:
- create a new scene, for our 3d objects to sit in
- create a camera, so we can look at our scene
- create a renderer, and set its size so we don't get weird fuzzy shapes
- add in our orbit controls, so we can click and drag our object and move it around
// Scene
let scene = new THREE.Scene();
// Camera
let camera = new THREE.PerspectiveCamera( 75, innerWidth / innerHeight, 0.1, 1000 );
camera.position.set(1.5, -0.5, 6);
// Renderer
let renderer = new THREE.WebGLRenderer({antialias: true, alpha: true});
renderer.setSize( innerWidth, innerHeight );
// Append our renderer to the webpage. Basically, this appends the `canvas` to our webpage.
document.body.appendChild( renderer.domElement );
// Orbit controls, so we can click and drag to move around our object
new OrbitControls(camera, renderer.domElement);
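One optional extra, not part of the original walkthrough: if the window can be resized, it's common to update the camera and renderer to match, so the canvas doesn't end up stretched or fuzzy:
// Optional: keep the camera and canvas in sync with the window size
window.addEventListener("resize", () => {
  camera.aspect = innerWidth / innerHeight;
  camera.updateProjectionMatrix();
  renderer.setSize(innerWidth, innerHeight);
});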
First up is our sphere. Each object in three.js consists of two parts - the geometry, which is the vertices and triangles that make up the shape, and the material, which defines the colors, patterns, and surface features of those vertices. Combining a geometry and a material gives us a mesh we can add to the scene.
Since we ultimately want to manipulate all of our vertices, I am going to store them separately in a positionData array too. We will use three.js's built in Vector3 class to store each set of 3d coordinates.
// Create our geometry
let sphereGeometry = new THREE.SphereGeometry(1.5, 100, 100);
// This section is about accessing our geometry vertices and their locations
sphereGeometry.positionData = [];
let v3 = new THREE.Vector3();
for (let i = 0; i < sphereGeometry.attributes.position.count; i++){
    // Read vertex i into our reusable Vector3, then store a clone of it
    // (a clone, because v3 itself gets overwritten on every iteration)
    v3.fromBufferAttribute(sphereGeometry.attributes.position, i);
    sphereGeometry.positionData.push(v3.clone());
}
// A `normal` material uses the surface normals of an object (the directions its faces point in) to calculate its color
let sphereMesh = new THREE.MeshNormalMaterial();
// Combine both, and add it to the scene.
let sphere = new THREE.Mesh(sphereGeometry, sphereMesh);
scene.add(sphere);
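At this point nothing is drawn yet - the render call comes later, inside the animation loop. If you want to sanity-check the setup so far, a single render will show the (still static) sphere:
// One-off render just to confirm the sphere shows up (the animation loop replaces this later)
renderer.render(scene, camera);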
Now, one caveat here: I decided to make my sphere a little more customizable, and to do that, I used shaders. When we call MeshNormalMaterial, it actually does something a bit unusual for the web - it uses something called shaders to calculate the color of each pixel. There are two types of shaders: fragment shaders, which essentially determine the colors of the object, and vertex shaders, which determine the positions of the vertices on that shape. These shaders are written in GLSL, or OpenGL Shading Language - so not Javascript. I'm not going to go into detail about how this language works, but it's a little bit more like C than Javascript.
Instead of using MeshNormalMaterial, we can use ShaderMaterial and build our own shaders.
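To get a feel for what that means in its simplest form, here's a tiny, self-contained ShaderMaterial - purely illustrative, and not the material we'll actually use - with the GLSL written inline as strings:
// Illustrative only: the smallest possible custom material.
// ShaderMaterial provides projectionMatrix, modelViewMatrix and position for us.
let demoMaterial = new THREE.ShaderMaterial({
    vertexShader: `
        void main() {
            // Vertex shader: decides where each vertex ends up on screen
            gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
        }
    `,
    fragmentShader: `
        void main() {
            // Fragment shader: decides the color of each pixel
            gl_FragColor = vec4(1.0, 0.0, 0.5, 1.0);
        }
    `,
});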
We will use the normal material's own shaders - so the same effect will occur, but having them in our code means we can update them later, for example to change the colors. We can pass Javascript variables into the shader in real time using uniforms, which are a special type of variable in GLSL.
That means we define our GLSL in the HTML, and pull it in with a Javascript selector. Note: I haven't made any real changes to these shaders compared to MeshNormalMaterial - the only difference is that I am passing in a color as a uniform, which means we can change this value from Javascript if we want. I'll only show the fragment shader here, but both can be found in the full source code. Notice that I define uniform vec3 colorA - that's the variable we will be using from our Javascript!
<script id="fragment" type="text/glsl">
uniform vec3 colorA;
#define NORMAL
#if defined( FLAT_SHADED ) || defined( USE_BUMPMAP ) || defined( TANGENTSPACE_NORMALMAP )
varying vec3 vViewPosition;
#endif
#include <packing>
#include <uv_pars_fragment>
#include <normal_pars_fragment>
#include <bumpmap_pars_fragment>
#include <normalmap_pars_fragment>
#include <logdepthbuf_pars_fragment>
#include <clipping_planes_pars_fragment>
void main() {
#include <clipping_planes_fragment>
#include <logdepthbuf_fragment>
#include <normal_fragment_begin>
#include <normal_fragment_maps>
gl_FragColor = vec4( normalize( normal ) * colorA + 0.5, 1.0 );
#ifdef OPAQUE
gl_FragColor.a = 1.0;
#endif
}
</script>
A normal shader calculates the color of a pixel with the calculation normalize( normal ) * 0.5 + 0.5. As such, we can swap out the first 0.5 for a custom color - that being our uniform colorA. We can then add both the vertex and fragment shaders to our Javascript like so:
let sphereMesh = new THREE.ShaderMaterial({
    uniforms: {
        colorA: {type: 'vec3', value: new THREE.Vector3(0.5, 0.5, 0.5)},
    },
    vertexShader: document.getElementById('vertex').textContent,
    fragmentShader: document.getElementById('fragment').textContent,
});
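Because colorA is a uniform, we can change it from Javascript at any point after the material is created. As a hypothetical example (not part of the original demo), something like this could be called once per frame to shift the color over time:
// Hypothetical helper: oscillate the color by updating the uniform each frame
function updateColor(timeMs) {
    const s = 0.5 + 0.25 * Math.sin(timeMs / 1000); // drifts between 0.25 and 0.75
    sphereMesh.uniforms.colorA.value.set(s, 0.5, 1.0 - s);
}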
Finally, the animation. We create a 4d noise function (three spatial dimensions plus time), seeded with the current time, and a clock so we know how much time has elapsed on each frame:
// 4d noise, seeded so each page load looks a little different
let noise = openSimplexNoise.makeNoise4D(Date.now());
// Clock to track elapsed time for the animation
let clock = new THREE.Clock();
renderer.setAnimationLoop( () => {
    // Get the elapsed time
    let t = clock.getElapsedTime();
    sphereGeometry.positionData.forEach((p, idx) => {
        // Create noise for each point in our sphere
        let setNoise = noise(p.x, p.y, p.z, t * 1.05);
        // Copy the original point and push it outwards along its own direction:
        // newPosition = p + p * setNoise, i.e. p * (1 + setNoise)
        v3.copy(p).addScaledVector(p, setNoise);
        // Update the position attribute with the displaced point
        sphereGeometry.attributes.position.setXYZ(idx, v3.x, v3.y, v3.z);
    })
    // Some housekeeping so that the sphere looks "right"
    sphereGeometry.computeVertexNormals();
    sphereGeometry.attributes.position.needsUpdate = true;
    // Render the sphere onto the page again.
    renderer.render(scene, camera);
})
Now our sphere will start morphing! I repeated this for the plane behind the sphere, too. I used a BoxGeometry here, with just a basic material that makes it look like a wireframe. The code for that bit, along with everything else, is in the full source code.
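For reference, a rough sketch of what that plane could look like - the sizes, segment counts, and position here are guesses rather than the original values, and the same noise-displacement loop used for the sphere would be repeated for this geometry:
// A flat, heavily-subdivided box behind the sphere, drawn as a wireframe
let boxGeometry = new THREE.BoxGeometry(20, 20, 0.5, 30, 30, 1);
let boxMaterial = new THREE.MeshBasicMaterial({ color: 0x444444, wireframe: true });
let box = new THREE.Mesh(boxGeometry, boxMaterial);
box.position.set(0, 0, -5); // push it behind the sphere
scene.add(box);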
Making 3d shapes on the web is a great frontend skill to have. Although a lot can be done in CSS and HTML, some effects can only be achieved through 3d, and three.js provides the perfect platform to do that on. I hope you've enjoyed this quick guide to creating a 3d morphing sphere in three.js and Javascript. If you'd like more Javascript content, you can read all my other stuff here.