<START:TRANSMISSION_0002>
From the get-go of working on this new site update, I wanted to incorporate an image of myself, combined with WebGL and Three.js, to add some intrigue.
I have some portraits of myself but I’ve grown a bit tired of them, so I needed to either schedule a photoshoot with a friend, or get creative.
Hey AI
I started playing around with Midjourney, which, in case you’re not familiar, is one of a few popular ways to generate imagery from simple text prompts. Under the hood it’s doing some state-of-the-art stuff, most of which I don’t understand, but overall the results have blown me away.
This is where it struck me: I’d combine what I was learning in WebGL with Midjourney and create an interactive avatar.
Creating the image
Generating an image with Midjourney means learning the “art of the prompt”. The prompt is the text you write that steers what image gets made.
I used a source image of myself, combined with the prompt: “man smiling in daylight outdoor garden, blue sky and clouds, wide, full body, colorful, film photography, realism, bokeh --q 2 --v 4”
The output kiiiiiinda looked like me, so I chose the top left quadrant and had Midjourney generate more variations.
After some refinement, the AI was keying in on the right features that fit my likeness.
Editing
Next I worked on editing and refining the image: isolating the figure with a mask, removing aberrations, adjusting color, upscaling, photo compositing, and retouching.
The WebGL-ification
Because this isn’t a 3D model, just a flattened image, if I was going to import it into WebGL and get anything worth a damn, I’d need to create what’s called a normal map: a texture map that can be overlaid on a rasterized image to create the illusion of dimension.
The normal map looks like this:
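If you’re wondering how a flat RGB image can fake depth, here’s a rough sketch of the math a renderer effectively runs per pixel. This is plain TypeScript for illustration only, not anything from the site’s code: the normal map’s RGB channels decode into a surface direction, and that direction is compared against the light direction to decide how bright the pixel should be.

  // Rough sketch of how a normal map texel turns into shading.
  // For illustration only; three.js does the equivalent per-fragment in its shaders.

  type Vec3 = { x: number; y: number; z: number };

  // A normal map stores XYZ surface directions in RGB, remapped from [-1, 1] to [0, 255].
  function decodeNormal(r: number, g: number, b: number): Vec3 {
    return {
      x: (r / 255) * 2 - 1,
      y: (g / 255) * 2 - 1,
      z: (b / 255) * 2 - 1,
    };
  }

  // Simple diffuse shading: brightness falls off as the surface tilts away from the light.
  function diffuse(normal: Vec3, lightDir: Vec3): number {
    const dot = normal.x * lightDir.x + normal.y * lightDir.y + normal.z * lightDir.z;
    return Math.max(dot, 0);
  }

  // The bluish areas of a normal map (around 128, 128, 255) face straight out of the image,
  // so they catch the most light when the light source sits directly in front of them.
  const facingOut = decodeNormal(128, 128, 255);
  console.log(diffuse(facingOut, { x: 0, y: 0, z: 1 })); // ~1.0, i.e. fully lit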
So I’m using react-three-fiber, which is built on top of three.js. The image and the normal map are merged together onto a flattened plane geometry. I have three light sources in the scene, and I’ve tied the position of the light source to the movement of the cursor. So as you move the cursor around, you get this subtle change in light cascading over the simulated dimension of the texture mapping. After many weeks of tinkering, I’ve netted out a pretty fun result.
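For anyone curious what that wiring looks like, here’s a stripped-down sketch in react-three-fiber. It’s not my actual component, and the texture paths, plane size, and light values are placeholders, but it shows the shape of the idea: a plane whose material gets the portrait as its color map plus the normal map, and a light whose position follows the pointer.

  // Stripped-down sketch of the setup; paths and values are placeholders, not the site's real code.
  import { Canvas, useLoader, useFrame } from '@react-three/fiber';
  import { TextureLoader, PointLight } from 'three';
  import { useRef } from 'react';

  function Portrait() {
    // The flattened portrait and the normal map generated from it.
    const [colorMap, normalMap] = useLoader(TextureLoader, [
      '/textures/portrait.jpg',
      '/textures/portrait-normal.jpg',
    ]);

    const light = useRef<PointLight>(null!);

    // Tie the light's position to the cursor so it sweeps across the simulated relief.
    useFrame(({ pointer }) => {
      light.current.position.set(pointer.x * 3, pointer.y * 3, 2);
    });

    return (
      <>
        <ambientLight intensity={0.4} />
        <pointLight ref={light} intensity={1.5} />
        <mesh>
          <planeGeometry args={[3, 4]} />
          <meshStandardMaterial map={colorMap} normalMap={normalMap} />
        </mesh>
      </>
    );
  }

  export default function App() {
    return (
      <Canvas camera={{ position: [0, 0, 5] }}>
        <Portrait />
      </Canvas>
    );
  }

My actual scene has its three lights and a lot more tuning, but that cursor-driven light over a normal-mapped plane is the core of the effect.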
You can play with the result here: newnew.john.design/exp/john-gl
This is one of many experiments that I’ve made in the process of developing my updated site. You can freely navigate around the work-in-progress version of the site and see some of these yourself. I’ll be deep diving some of them in future transmissions.
In the future, I hope to play with more of the 3D rendering aspects of three.js.
Here are some helpful links:
Midjourney: https://midjourney.com/home/
React Three Fiber: https://docs.pmnd.rs/react-three-fiber/getting-started/introduction/
Three.js: https://threejs.org/
Normal Maps: https://cpetry.github.io/NormalMap-Online/
Thanks for nerding out with me.
–John
<END:TRANSMISSION_0002>