1 Aug 2025
JS: The terrain work has been a really exciting part of the project so far. A lot of the inspiration came from Nazia’s desert journey, documented in her videos and 360° photos of the environment. The first step was to recreate a terrain similar to those photographs. We started with photogrammetry, hoping to build something directly from the visuals, but unfortunately there just wasn’t enough data to get a usable result.
After that, we discussed Unity’s terrain builder, but Isaac pointed out some compatibility issues with web-based platforms. So we shifted gears again and gave InfraWorks a try. We downloaded a sample terrain and ran a test import. The outcome wasn’t perfect, but it definitely showed promise, and that small win gave us the push we needed to keep going. On a team call, we decided to look for terrain data with the right topography and detail, something closer to what we saw in Nazia’s images. We started scouting areas in West Texas, near the Chihuahuan Desert, and identified a few spots that looked promising.
Now we wait to see what those imports bring. Fingers crossed! 
Videos and images, Moroccan desert, 2025
18 Jul 2025
IH: For Salt Lines, my work has involved building a boid system that can render many thousands of boids within a WebGPU environment, so the experience can be distributed over the web with no download.
I started by naively for-looping over the boids to see how far that would scale. Performance wasn’t great, but we had flocking, avoidance, and cohesion working. Then I moved the simulation to the Burst compiler and suddenly had several thousand boids running at 100+ FPS. The next milestone was teaching them to follow a target, and then offsetting each boid so they could form a shape: a ship, a whale, or a lion. That was a big win.
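For anyone curious, here’s a rough sketch of what that kind of Burst-compiled step looks like in Unity. This isn’t our production code, and the struct, field names, and constants are purely illustrative, but it shows separation (avoidance), cohesion, and alignment folded into one parallel job:

```csharp
using Unity.Burst;
using Unity.Collections;
using Unity.Jobs;
using Unity.Mathematics;

// Hypothetical sketch of a Burst-compiled boid step (not the actual project code).
[BurstCompile]
struct BoidStepJob : IJobParallelFor
{
    [ReadOnly] public NativeArray<float3> positions;
    [ReadOnly] public NativeArray<float3> velocities;
    public NativeArray<float3> newVelocities;
    public float deltaTime;
    public float neighbourRadius;

    public void Execute(int i)
    {
        float3 separation = float3.zero;
        float3 cohesion = float3.zero;
        float3 alignment = float3.zero;
        int neighbours = 0;

        // Naive O(n^2) neighbour scan, but each boid runs on its own worker thread.
        for (int j = 0; j < positions.Length; j++)
        {
            if (j == i) continue;
            float3 offset = positions[j] - positions[i];
            float dist = math.length(offset);
            if (dist > neighbourRadius) continue;

            separation -= offset / math.max(dist * dist, 1e-4f); // push away from close neighbours
            cohesion   += positions[j];                          // pull toward the group centre
            alignment  += velocities[j];                         // match the group heading
            neighbours++;
        }

        float3 steer = separation;
        if (neighbours > 0)
            steer += (cohesion / neighbours - positions[i]) + alignment / neighbours;

        newVelocities[i] = math.normalizesafe(velocities[i] + steer * deltaTime);
    }
}
```

Scheduled with something like `new BoidStepJob { ... }.Schedule(boidCount, 64)`, Burst vectorizes the inner loop, which is roughly where the jump to 100+ FPS came from.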
Exporting to WebGPU, however, broke Burst, so I had to step back and rebuild everything using shaders. I converted the CPU-bound logic into a ping-pong shader system, where the GPU computes all the boid states into textures. Suddenly the few thousand boids I had were running at hundreds of FPS.
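The C# side of that ping-pong loop is conceptually very simple. This is a stripped-down sketch rather than the real implementation (the material, texture size, and property names are assumptions): two float textures hold the boid state, and every frame the update shader reads one and writes the other before they swap:

```csharp
using UnityEngine;

// Hypothetical sketch: boid state (e.g. position/velocity packed into texels)
// lives in two float RenderTextures that are swapped every frame.
public class BoidPingPong : MonoBehaviour
{
    public Material updateMaterial; // shader that advances the boid state by one step
    public int textureSize = 256;   // 256 x 256 texels = 65,536 boids

    RenderTexture stateA, stateB;
    bool readFromA = true;

    void Start()
    {
        stateA = CreateStateTexture();
        stateB = CreateStateTexture();
    }

    RenderTexture CreateStateTexture()
    {
        var rt = new RenderTexture(textureSize, textureSize, 0, RenderTextureFormat.ARGBFloat);
        rt.filterMode = FilterMode.Point; // exact texel reads, no filtering
        rt.Create();
        return rt;
    }

    void Update()
    {
        var src = readFromA ? stateA : stateB;
        var dst = readFromA ? stateB : stateA;

        updateMaterial.SetFloat("_DeltaTime", Time.deltaTime);
        Graphics.Blit(src, dst, updateMaterial); // the GPU computes the next boid states
        readFromA = !readFromA;

        // The boid rendering pass then samples dst to place each instance.
    }
}
```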
I pushed the system as far as I could, and I was able to get up to 65,500 boids rendering at over 100 FPS in the web browser. That was a huge victory.
Now the goal is to reintroduce mesh-based offsets into the shader system, so the swarm can form shapes again and follow a central game object. Eventually, we want users to be able to manipulate that object themselves, guiding a massive swarm across the landscape in VR.
WebGPU is so new that it comes with some strange limits. Unity’s terrain system immediately hits WebGPU’s texture/shader cap, which I believe is set by the browser. A single diffuse layer works, but as soon as you add normal or specular maps, it stops rendering. It’s one of those challenges that comes with pushing bleeding-edge technology.
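In practice that means keeping the terrain to a single diffuse-only layer for now. A minimal sketch of that configuration (the component and field names here are just for illustration) looks something like this:

```csharp
using UnityEngine;

// Hypothetical sketch: assign one diffuse-only TerrainLayer, which is the
// configuration that still renders for us under WebGPU.
public class SingleLayerTerrain : MonoBehaviour
{
    public Terrain terrain;
    public Texture2D diffuse; // albedo only

    void Start()
    {
        var layer = new TerrainLayer
        {
            diffuseTexture = diffuse,
            tileSize = new Vector2(15f, 15f)
            // normalMapTexture and maskMapTexture deliberately left unassigned
        };
        terrain.terrainData.terrainLayers = new[] { layer };
    }
}
```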
15 Jul 2025
NP: The second project we’ve started is called Salt Lines, an immersive VR experience inspired by my recent art residency in the Moroccan desert. I imagined a massive point cloud, or swarm, made up of blue cubes swirling above the minimal desert landscape. We’re taking this initial concept and bringing it to life using Isaac’s programming and Justin’s 3D design skills. It’s been great to collaborate on developing the idea and really think through how to create organic movement, transformation and interaction.
During research on ‘swarms’, I came across two references that are super interesting. The first is a YouTube video called “Swarm Intelligence: The Power of the Collective Mind”, and the second is a series of photographs of starling swarms by Danish photographer Søren Solkær. Both are definitely worth checking out.
Above: Images/Video created with Midjourney
Early concept sketch
Early concept sketch
Early concept sketch
Early concept sketch
12 Jul 2025
NP: It’s been great to regroup after the successful completion of the Displacement Doorway project last year! … We’re coming together again across three different time zones: Austin, Atlanta and Stockholm. We have a couple of projects on the go.

The first is a quick installation with a Kinect (revisiting a project from my Master’s program), where we’re mapping a point cloud to a live feed. Isaac helped update the Processing code and get it all running with the Kinect2. It’s about connecting text with spatial data - literally poetry in motion! We’re organizing a live dance performance to demo this in action; more to come on that … here are a few images of the WIP -