October 28, 2024

In the spring of 2024, I took a special relativity class at my high school.

It was a really fun time (shoutout to Dr. V), and out of the experience came a project idea: something something relativity rendering??

This blog post is the first in what will be a series detailing the evolution of this project.

The vision: a realtime, interactive special relativity renderer with fancy modern optics; a proof-of-concept relativistic game engine with the features to make something sophisticated, something beyond a teaching tool.

There are games that operate under special relativity physics: Velocity Raptor and A Slower Speed of Light. They’re great! But their engines have substantial limitations.

Velocity Raptor has limited support for moving objects, and no acceleration. The player is represented with a physically impossible rigid body, and there’s no scene lighting.

A Slower Speed of Light, in exchange for 3d rendering and good direct illumination optics, has completely static scene geometry. Only the observer can move.

More recently, a number of non-interactive (or nearly-but-not-quite interactive) raytracers have emerged for both special and general relativity, like Astray or Non-linear Monte Carlo Ray Tracing for Visualizing Spacetime.

Science communication is the unifying theme of every visualizer of relativity that I’ve encountered. Relativity is unintuitive, and it’s useful to see Terrell rotation or redshifting or black holes with your own eyes.

None of the visualization tools I’ve surveyed deal with relativistic mechanics. This is understandable, since they’re intended to communicate the most unintuitive aspects of special relativity visually, and it’s hard to do that with mechanics.

But I want something more: a physically absurd thought experiment made real, realistically. I want to see a train 2 lightseconds long moving at 3/5c, and then have that train collide with a wall. In short: a reasonably featureful physics simulation and renderer you can build a game around, one that conforms to the framework of special relativity, if not our reality.

How hard can it be?

I began work on this project over the summer between high school and college. I knew that I would need Vulkan for its hardware raytracing extensions. Having learned from my prior experiences using Vulkan in Rust, I opted to use vulkano instead of ash. It still caused me a lot of headaches (I had to fork vulkano), and there’s still around a thousand lines of boilerplate, but it’s strictly an improvement.

Summer turns into fall, and I’m off to college! As a first-year CCS Computing major, I get to work on an independent programming project for my CMPTGCS 1L class for my first two quarters!! A perfect opportunity to realize a relativistic renderer :D

The first step to realization was realizing that rigid bodies cannot exist under special relativity. No object on the order of a lightsecond in size can be rigid, since that would require a speed of sound faster than the speed of light. You know what that means? It’s softbody time!!!

This put some constraints on my engine: I’m not interested in dealing with 3d softbodies or 4d worldlines. So, 2 spatial dimensions and 1 temporal dimension it is! Keeps things simpler.

I decided to start with the softbody physics.

I went with a simple physics scheme: represent softbodies as a grid of point-mass particles connected by springs — but with the twist that the simulation timestep can be no greater than the time it takes light to travel between two adjacent particles. This way, information cannot propagate through the softbody faster than c. That does mean the speed of sound is c, but hey, no objects on this scale can exist IRL; we left plausibility behind a long time ago.
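To make those two rules concrete, here’s a minimal Rust sketch in units where c = 1 (the function names and numbers here are mine for illustration, not the engine’s actual code):

```rust
/// Largest allowed timestep: light must need at least one full step to cross
/// the gap between adjacent lattice particles, so no force can propagate
/// through the softbody faster than c.
fn max_timestep(spacing: f64, c: f64) -> f64 {
    spacing / c
}

/// Hooke's-law force magnitude for a spring of rest length `rest` stretched
/// (or compressed) to length `len`, with stiffness `k`.
fn spring_force(k: f64, rest: f64, len: f64) -> f64 {
    k * (len - rest)
}
```

With a lattice spacing of 0.01 lightseconds and c = 1, `max_timestep(0.01, 1.0)` gives a timestep of 0.01 seconds: in one tick, information moves at most one cell.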

Collisions happen when particles come within about a light-timestep of each other. Since particles can move at up to the speed of light, and my timestep is fixed at a pretty high value, I need to catch fast-moving particles before they jump past each other.
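One way to catch them (a sketch of the idea, not my shader code verbatim) is a swept test: assume straight-line motion within a step, and check the two particles’ closest approach over the whole step rather than just their positions at its end.

```rust
/// Squared minimum distance between two particles over one timestep `dt`,
/// assuming straight-line motion during the step. Positions `p1`/`p2`,
/// velocities `v1`/`v2` are 2D, matching the engine's two spatial dimensions.
fn min_dist_sq(p1: [f64; 2], v1: [f64; 2], p2: [f64; 2], v2: [f64; 2], dt: f64) -> f64 {
    let dp = [p2[0] - p1[0], p2[1] - p1[1]]; // relative position
    let dv = [v2[0] - v1[0], v2[1] - v1[1]]; // relative velocity
    let dv2 = dv[0] * dv[0] + dv[1] * dv[1];
    // Time of closest approach, clamped into this step.
    let t = if dv2 > 0.0 {
        (-(dp[0] * dv[0] + dp[1] * dv[1]) / dv2).clamp(0.0, dt)
    } else {
        0.0
    };
    let dx = dp[0] + dv[0] * t;
    let dy = dp[1] + dv[1] * t;
    dx * dx + dy * dy
}
```

Two particles flying head-on at each other will report a closest approach of zero even if, at the end of the step, they’ve already passed through one another.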

The first step was implementing a numerical integrator (rk4) to turn forces into velocities. Since we would be working with hundreds of thousands of particles, the implementation needed to be GPU-based. In fact, pretty much all of the everything needs to be GPU-based. So, I implemented rk4 as a series of 5 compute shaders chained together, converting each particle’s instantaneous forces into instantaneous velocities.
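If you haven’t seen rk4 before, here’s the whole idea in a few lines. This scalar Rust version is just for illustration — the engine splits the stages across compute shaders, but the math is the same:

```rust
/// One classic Runge-Kutta 4 step for dy/dt = f(t, y): sample the derivative
/// four times across the step and take a weighted average.
fn rk4_step(f: impl Fn(f64, f64) -> f64, t: f64, y: f64, dt: f64) -> f64 {
    let k1 = f(t, y);
    let k2 = f(t + dt / 2.0, y + dt * k1 / 2.0);
    let k3 = f(t + dt / 2.0, y + dt * k2 / 2.0);
    let k4 = f(t + dt, y + dt * k3);
    y + dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0
}
```

For example, integrating dy/dt = y from y(0) = 1 with a single dt = 0.1 step lands within about 1e-7 of the exact answer e^0.1 — far better than Euler’s method at the same step size.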

As for detecting collisions, that’s a trickier proposition. Once I know there’s a collision I can just add it to the pile of forces to integrate. But how do I detect a collision without looping over every particle?? Lucky for me, Sebastian Lague has a GPU-friendly solution for exactly this problem. So, I adapted his GPU implementations of a spatial lookup grid and bitonic merge sort.
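The gist of a spatial lookup grid: bucket particles into cells the size of the collision radius, then compare each particle only against the 3×3 block of cells around it. Here’s a CPU sketch using a hash map (the actual GPU version builds a sorted key array with bitonic merge sort instead of hashing, but the idea is the same):

```rust
use std::collections::HashMap;

/// Returns index pairs (i, j) with i < j of particles closer than `radius`,
/// using a uniform grid so we never compare all pairs.
fn close_pairs(positions: &[[f64; 2]], radius: f64) -> Vec<(usize, usize)> {
    let cell = |p: [f64; 2]| -> (i64, i64) {
        ((p[0] / radius).floor() as i64, (p[1] / radius).floor() as i64)
    };
    // Bucket every particle into its cell.
    let mut grid: HashMap<(i64, i64), Vec<usize>> = HashMap::new();
    for (i, &p) in positions.iter().enumerate() {
        grid.entry(cell(p)).or_default().push(i);
    }
    // Only check the 3x3 neighborhood around each particle's cell.
    let mut pairs = Vec::new();
    for (i, &p) in positions.iter().enumerate() {
        let (cx, cy) = cell(p);
        for dx in -1..=1 {
            for dy in -1..=1 {
                if let Some(others) = grid.get(&(cx + dx, cy + dy)) {
                    for &j in others {
                        if j > i {
                            let (ex, ey) = (positions[j][0] - p[0], positions[j][1] - p[1]);
                            if ex * ex + ey * ey < radius * radius {
                                pairs.push((i, j));
                            }
                        }
                    }
                }
            }
        }
    }
    pairs
}
```

Each lookup touches only the handful of particles in nine cells, so the whole pass is roughly linear in particle count instead of quadratic.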

Not bad!

But we’re still using Newtonian physics. Special relativity, it turns out, doesn’t affect the softbody physics engine very much. Calculations can all be performed with respect to the ground frame, and we still have F = dp/dt, a = dv/dt, and v = dx/dt. The only difference is that we need a different formula to convert from force to acceleration: instead of a = F/m, we use a = 1/(γm0) * (F − (v·F)v/c²).
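In code, with units where c = 1 (so γ = 1/√(1 − v²)), that formula looks like:

```rust
/// Relativistic acceleration of a particle with rest mass `m0`, velocity `v`,
/// under force `f`, in units where c = 1. Follows from F = dp/dt with
/// momentum p = gamma * m0 * v.
fn accel(m0: f64, v: [f64; 2], f: [f64; 2]) -> [f64; 2] {
    let v2 = v[0] * v[0] + v[1] * v[1];
    let gamma = 1.0 / (1.0 - v2).sqrt();
    let vdotf = v[0] * f[0] + v[1] * f[1];
    [
        (f[0] - vdotf * v[0]) / (gamma * m0),
        (f[1] - vdotf * v[1]) / (gamma * m0),
    ]
}
```

Sanity checks: at v = 0 this reduces to a = F/m, and for a force parallel to the motion it reduces to the textbook a = F/(γ³m0) — a particle at 0.6c pushed forward accelerates at only 0.512 of its rest-frame rate.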

Let’s see how it runs!

Oh. Hmm…

This is where having a discrete timestep is a problem: the instantaneous acceleration of a particle might be so high that multiplying by my timestep makes the object move faster than light. γ becomes imaginary, and NaNs start propagating. So, I clamp the velocity of particles at 0.9999c.
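The clamp itself is just a rescale; here’s a sketch (the 0.9999 cap is the only real parameter, and this is the blunt fix, not a pretty one):

```rust
/// Rescale a velocity so its magnitude never reaches c (= 1 here),
/// keeping gamma real and the NaNs at bay.
fn clamp_speed(v: [f64; 2], max_speed: f64) -> [f64; 2] {
    let speed = (v[0] * v[0] + v[1] * v[1]).sqrt();
    if speed > max_speed {
        let s = max_speed / speed;
        [v[0] * s, v[1] * s]
    } else {
        v
    }
}
```

Direction is preserved; only the magnitude gets capped, and sub-cap velocities pass through untouched.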

At low velocities this works pretty well. But high speed collisions create this…

That actually looks pretty neat… still, not quite what I’m going for.

As it turns out, it’s the finite timestep striking again! While softbodies with infinite tensile strength could probably be modelled with a more sophisticated simulator, at high enough speeds it’s easier to just let these already unrealistically strong bonds break.
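Breaking a bond amounts to giving each spring a strain limit and dropping it once that’s exceeded. A sketch (the threshold value here is illustrative, not my engine’s actual tuning):

```rust
/// A spring that snaps: returns the Hooke's-law force while the strain
/// |len - rest| / rest stays within `break_strain`, and None once the
/// bond has broken (after which it should stop being simulated).
fn spring_force_breakable(k: f64, rest: f64, len: f64, break_strain: f64) -> Option<f64> {
    if (len - rest).abs() / rest > break_strain {
        None // bond broken: no more force, ever
    } else {
        Some(k * (len - rest))
    }
}
```

Returning `Option` makes the broken state explicit at the call site, instead of silently applying a zero force.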

This is how we get the video I teased this blog post with. Putting it here to save you the scroll :)

But don’t worry, this hasn’t made everything hopelessly brittle. As configured, objects don’t start breaking until you collide at over 0.3c.

We have relativistic mechanics :DDD

Wait, what about the rendering?

Well, for now, I’ve just been rendering the measured reality of the ground frame, with the particles drawn as points. Next blog post, I’ll implement a more sophisticated raytraced renderer, so do stay tuned for that (I have an RSS feed!).

Til next time :D