Update 161: 02/25/2021

DESTRUCTION SYSTEM

I've been working more on my destruction simulation script, which I created last May.
This is the primary script I use for level design in Desolus, as it pre-fractures meshes so they can be used with the game's black holes.
As I mentioned previously, all of the meshes are procedurally generated from within the destruction script, and then reassembled accordingly.
This script was very robust and turned out to be a great level design tool. However, it was very slow to run in the editor due to the sheer volume of computation required.
In short, the script loops through the 2,000 meshes which comprise the cathedral, and the 30,000 meshes which are the fractured “chunks” of those meshes.
To determine which chunks are within the boundaries of the green boxes, an algorithm computes the bounds and compares them to each chunk’s vertices.
It deletes the chunks which are in bounds and then reassembles the meshes back together with the remaining chunks.
Given the number of meshes and vertices, this used to be quite slow running on a single C# thread.
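To make the bottleneck concrete, here is a minimal sketch of what that single-threaded bounds check might look like. The class and method names are illustrative, not Desolus' actual code; it assumes the deletion boxes have been converted to world-space `Bounds`:

```csharp
// Hypothetical sketch of the single-threaded approach: test every chunk's
// vertices against each deletion bounds, and keep only the chunks that survive.
using System.Collections.Generic;
using UnityEngine;

public static class ChunkCuller
{
    // Returns the chunks that have no vertex inside any deletion bounds.
    public static List<MeshFilter> CullChunks(List<MeshFilter> chunks, List<Bounds> deletionBounds)
    {
        var remaining = new List<MeshFilter>();
        foreach (var chunk in chunks)
        {
            bool inside = false;
            foreach (var vertex in chunk.sharedMesh.vertices)
            {
                // Mesh vertices are in local space; transform to world space first.
                var worldVertex = chunk.transform.TransformPoint(vertex);
                foreach (var bounds in deletionBounds)
                {
                    if (bounds.Contains(worldVertex)) { inside = true; break; }
                }
                if (inside) break;
            }
            if (!inside) remaining.Add(chunk);
        }
        return remaining;
    }
}
```

With tens of thousands of chunks and millions of vertices, this triple loop on the main thread is exactly the kind of work that grinds the editor to a halt.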
I decided it was in need of a speedup!
---
UNITY JOBS SYSTEM

I've been glancing at Unity's Jobs System for quite some time, but haven't had the chance to use it until now.
After spending time on it over the last several days, I ported my destruction simulation to the Jobs System.
It was fairly complex to port. The Jobs System only accepts unmanaged types, so you can't give it a native array of arrays or anything similar.
Only unmanaged types are supported: primitives, or structs composed entirely of primitives.
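A quick sketch of what that constraint means in practice (the struct name is mine, for illustration):

```csharp
// Only blittable/unmanaged data can cross into a job.
using Unity.Collections;

public struct ChunkVertex   // unmanaged: only primitive fields, so jobs accept it
{
    public float x, y, z;   // vertex position
    public int meshIndex;   // which source mesh this vertex belongs to
}

public static class JobDataExample
{
    public static NativeArray<ChunkVertex> Allocate(int vertexCount)
    {
        // OK: a NativeArray of an unmanaged struct.
        return new NativeArray<ChunkVertex>(vertexCount, Allocator.TempJob);
    }

    // NOT OK: NativeArray<Vector3[]> — a managed array element type
    // fails NativeArray's unmanaged-type constraint.
}
```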
I had to redo all the internal logic for iterating over the meshes, so they are processed all at once rather than one by one.
This was tricky because I had to take ALL the meshes' vertices and plop them into the Jobs System as a single native array.
Each vertex had to be tagged with an index indicating which mesh it belonged to.
After the bounds computations are done, the data has to be reassembled using a fairly complex loop.
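The flattening idea above can be sketched as a parallel job. This is an assumed reconstruction, not the actual Desolus code: all vertices from all meshes sit in one flat array, a parallel array maps each vertex back to its mesh, and the job flags meshes that have a vertex inside any deletion box:

```csharp
// Sketch: one flattened vertex array for ALL meshes, processed in parallel.
using Unity.Burst;
using Unity.Collections;
using Unity.Jobs;
using Unity.Mathematics;

[BurstCompile]
public struct BoundsCheckJob : IJobParallelFor
{
    [ReadOnly] public NativeArray<float3> vertices;    // all meshes' vertices, flattened
    [ReadOnly] public NativeArray<int> meshIndices;    // vertices[i] belongs to mesh meshIndices[i]
    [ReadOnly] public NativeArray<float3> boundsMin;   // one axis-aligned box per deletion volume
    [ReadOnly] public NativeArray<float3> boundsMax;

    // One flag per mesh. Multiple threads may write the same slot, but they
    // all write the same value (1), so the result is still deterministic.
    [NativeDisableParallelForRestriction]
    public NativeArray<int> meshInBounds;

    public void Execute(int i)
    {
        float3 v = vertices[i];
        for (int b = 0; b < boundsMin.Length; b++)
        {
            if (math.all(v >= boundsMin[b]) && math.all(v <= boundsMax[b]))
            {
                meshInBounds[meshIndices[i]] = 1;   // mark this vertex's mesh for deletion
                return;
            }
        }
    }
}
```

After `Complete()` is called on the scheduled handle, the `meshInBounds` flags drive the reassembly loop back on the main thread.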
Additionally, I had to reprogram all of the script's methods as Editor Coroutines, because the Jobs System runs asynchronously on worker threads.
The coroutine approach was necessary so the Jobs System could finish computing its data before the script performs MonoBehaviour tasks. Previously, all of the methods ran on the main thread.
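The coroutine-plus-job handshake might look something like this. It's a sketch under the assumption that the `com.unity.editorcoroutines` package is installed; the menu item and method names are made up for illustration:

```csharp
// Sketch: an editor coroutine that yields until the scheduled job finishes,
// keeping the editor responsive while worker threads churn.
using System.Collections;
using Unity.Jobs;
using Unity.EditorCoroutines.Editor;
using UnityEditor;

public static class DestructionRunner
{
    [MenuItem("Tools/Run Destruction Simulation")]
    public static void Run()
    {
        EditorCoroutineUtility.StartCoroutineOwnerless(RunSimulation());
    }

    static IEnumerator RunSimulation()
    {
        // ... build the flattened NativeArrays and schedule the job here ...
        JobHandle handle = default; // placeholder for job.Schedule(...)

        // Yield each editor tick until the worker threads are done.
        while (!handle.IsCompleted)
            yield return null;

        handle.Complete(); // required before reading job output on the main thread

        // ... now it's safe to do MonoBehaviour/Mesh work with the results ...
    }
}
```

The key detail is that `handle.Complete()` must still be called even after `IsCompleted` turns true, since it both synchronizes and transfers ownership of the job's data back to the main thread.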
The results of all this effort were fantastic!
The script now runs through a combined total of 16.3 million vertices and 32.7 thousand meshes in FOUR SECONDS!
This used to take TWENTY MINUTES.
Combined with the Burst compiler, the Jobs System parallelized the work and made it super speedy.
I'm extremely excited, because the time spent on this optimization will pay for itself several times over!
If you're looking to get started with the Jobs System, I would recommend:
---