Topic: Desolus: A Surreal First Person Puzzle Game
Mark Mayers
« Reply #380 on: December 22, 2019, 12:27:22 PM »

It looks like this project has been around for a while, but I'm just finding it now. It looks fantastic! Definitely going to be keeping an eye on this one.

Desolus Twitter: @DesolusDev Website: http://www.desolus.com DevLog: On TIG!
clickbecause
« Reply #381 on: January 24, 2020, 02:46:20 PM »

Hi Mark,

Amazing work! I stumbled on this devlog while doing research on GPU based "portal" implementations. I'm working on a project that revolves around some similar ideas (periods of time instead of alternate universes). Would you be willing to talk a bit about how your universes are organized? For example, are both universes located in the same physical space and only rendered differently? Or are they actually discrete terrains that are offset in world space? Are they both present in the same Unity scene file?

I've been throwing tons of ideas at the wall and found a few promising solutions, but you are soooo much further ahead of me and I would love to learn from your experience implementing this idea. Thanks!
Mark Mayers
« Reply #382 on: January 26, 2020, 03:06:22 PM »

Quote from: clickbecause on January 24, 2020, 02:46:20 PM

Hi Mark,

Amazing work! I stumbled on this devlog while doing research on GPU based "portal" implementations. I'm working on a project that revolves around some similar ideas (periods of time instead of alternate universes). Would you be willing to talk a bit about how your universes are organized? For example, are both universes located in the same physical space and only rendered differently? Or are they actually discrete terrains that are offset in world space? Are they both present in the same Unity scene file?

I've been throwing tons of ideas at the wall and found a few promising solutions, but you are soooo much further ahead of me and I would love to learn from your experience implementing this idea. Thanks!


Hey! Thanks for reaching out.

I've talked a bit about how the rendering works in Desolus in previous posts, such as this one from last March.

The very first prototype of the alternate-universe mechanic, which is the basis of the game, dates back to May 2016; it was a simple stencil buffer mask done with shaders.

However, recently I've been moving over to Unity's Scriptable Render Pipeline and completely building one from scratch.
The new pipeline is almost finished (I'll post about it over the next week or two), but it changes the rendering system a bit.
I talked a bit about my motivations to switch to SRP in the last DevLog entry.

Over the years, though, the portal rendering in Desolus has become quite sophisticated as the game's technical needs grew.

I'm not sure exactly how deep down the rabbit hole you would like to go, but I would start by researching shader tutorials on the stencil buffer.
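
To make that concrete, here's a minimal ShaderLab sketch of the stencil idea (illustrative only, not the actual Desolus shaders): the portal surface writes a reference value into the stencil buffer, and geometry from the other universe renders only where that value was written.

Code: [Select]
// Pass on the portal surface: write the stencil, output no color.
Shader "Hypothetical/PortalSurface"
{
    SubShader
    {
        Pass
        {
            ColorMask 0        // stencil only; leave the color buffer untouched
            Stencil
            {
                Ref 1
                Comp Always
                Pass Replace   // mark these pixels as 'inside the portal'
            }
        }
    }
}

// Geometry belonging to the other universe then tests instead of writes:
//
//     Stencil
//     {
//         Ref 1
//         Comp Equal     // draw only where the portal marked the screen
//     }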
Mark Mayers
« Reply #383 on: January 26, 2020, 08:01:07 PM »

Update 150: 01/26/2020

SCRIPTABLE RENDER PIPELINE, PART II

(PART I)

Over the last month I've made a huge amount of progress in creating my own Scriptable Render Pipeline for Desolus.

At the beginning of January, I completed my lighting model and added shadows to Desolus.
Since then I have been porting all of my shaders over to HLSL, and have been building the rest of my SRP.  

---

FINISHING OFF THE LIGHTING



Earlier this month I finished off my BRDF lighting model for Desolus.
The lighting model is superior to Unity's standard shader, as I'm using my own custom BRDF which I discussed in the previous post.

You can see the gradient of metallic and specular values of materials in the above image.
Compared to last update, I fixed a few mathematical errors in my code which led to incorrect lighting.

Although the function is tweaked with art-directed values (like the small amount of Fresnel), it's still primarily physically based.
This physically based rendering workflow allows me to create a diverse set of materials, simply by tweaking the metallic and specular properties.
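
For anyone curious what such a function looks like, here's a hedged HLSL sketch of a metallic/specular-parameterized BRDF (a generic GGX-plus-Lambert textbook form, not my exact function or its art-directed tweaks):

Code: [Select]
// Hedged sketch of a metallic/specular-parameterized BRDF: GGX specular
// plus Lambert diffuse. Illustrative only.
float3 EvaluateBRDF(float3 albedo, float metallic, float specular,
                    float roughness, float3 N, float3 L, float3 V)
{
    float3 H = normalize(L + V);
    float NdotL = saturate(dot(N, L));
    float NdotV = saturate(dot(N, V));
    float NdotH = saturate(dot(N, H));
    float VdotH = saturate(dot(V, H));

    // Metals tint reflections with their albedo; dielectrics use a small F0.
    float3 F0 = lerp((0.08 * specular).xxx, albedo, metallic);

    // GGX normal distribution term.
    float a  = roughness * roughness;
    float a2 = a * a;
    float d  = NdotH * NdotH * (a2 - 1.0) + 1.0;
    float D  = a2 / (3.14159265 * d * d + 1e-7);

    // Schlick Fresnel term.
    float3 F = F0 + (1.0 - F0) * pow(1.0 - VdotH, 5.0);

    // Schlick-GGX visibility, with the 1/(4 NdotL NdotV) factor folded in.
    float k = a * 0.5;
    float vis = 0.25 / ((NdotL * (1.0 - k) + k) * (NdotV * (1.0 - k) + k) + 1e-7);

    float3 specularTerm = D * F * vis;
    float3 diffuseTerm  = albedo * (1.0 - metallic) / 3.14159265;

    return (diffuseTerm + specularTerm) * NdotL;
}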

---

CREATING SHADOWS



Implementing shadows was a large amount of work (about three days of programming).
However, I was able to base my implementation on Catlike Coding's SRP series.

Again, I would highly recommend Catlike Coding's SRP tutorial series, since Unity's own SRP documentation is practically non-existent.

These are high-quality cascaded shadows, with tweakable resolution and shadow filtering.
Because they still use parts of Unity's API, they integrate with the existing shadow settings, which can be configured per directional light.

---
Mark Mayers
« Reply #384 on: February 25, 2020, 12:58:25 PM »

Update 151: 02/25/2020

SCRIPTABLE RENDER PIPELINE, PART III

Most of my time in January and February 2020 was spent on further implementation of the scriptable render pipeline.
Fortunately, it's almost done!

Below is a step-through of how everything in the game is rendered so far.

---

SHADOW MAPS



Rendering shadows in games can be fairly complex, but fortunately I had an ample amount of reference material.

Desolus is currently using an 8K-resolution shadow map with four cascades.
Additionally, I am using a 7x7 percentage-closer filtering (PCF) algorithm to smooth the shadows.

You can read more about implementing shadows in the reference link.

Inspecting the shadow maps in Unity's frame debugger is pretty neat; they're almost like architectural prints.
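
For reference, the core of a 7x7 PCF loop in HLSL looks roughly like this (a sketch assuming a hardware comparison sampler, not my exact code):

Code: [Select]
// Illustrative 7x7 percentage-closer filtering (PCF).
float SampleShadowPCF7x7(Texture2D shadowMap, SamplerComparisonState cmpSampler,
                         float3 shadowCoord, float2 texelSize)
{
    float sum = 0.0;
    // Average 49 comparison samples in a 7x7 neighborhood around the pixel.
    for (int y = -3; y <= 3; y++)
    for (int x = -3; x <= 3; x++)
    {
        float2 offset = float2(x, y) * texelSize;
        sum += shadowMap.SampleCmpLevelZero(cmpSampler,
                                            shadowCoord.xy + offset,
                                            shadowCoord.z);
    }
    return sum / 49.0;
}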

---

DRAWING GEOMETRY



After the shadow pass, the screen is cleared and I draw regular geometry.

Drawing geometry in a Scriptable Render Pipeline is considerably more efficient in terms of draw calls, thanks to Unity's 'SRP Batcher.'

As you can see from this screenshot, I am drawing a considerable amount of modular geometry on screen.
Normally, this would result in a large number of draw calls.
However, because of Unity's SRP batching system, the draw calls are heavily optimized to reduce CPU load.

You can see the lighting of my BRDF model at work here. I gave the architecture in Desolus a slightly metallic property.

Additionally, instead of Unity's fog model, I opted to use Inigo Quilez's fog.
This model is significantly better: Unity's fog is based on view-space depth rather than world-space distance, so rotating the camera distorts the fog color.
With a world-space distance model, it doesn't.
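
The core of the world-space distance fog is tiny. A sketch in the style of Quilez's articles, not my exact shader:

Code: [Select]
// World-space distance fog: because it uses the true radial distance from
// the camera (not view-space depth), the result is stable under rotation.
float3 ApplyDistanceFog(float3 color, float3 fogColor,
                        float3 positionWS, float3 cameraPositionWS,
                        float density)
{
    float dist = length(positionWS - cameraPositionWS); // distance, not depth
    float fogAmount = 1.0 - exp(-dist * density);       // exponential falloff
    return lerp(color, fogColor, fogAmount);
}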

---

DRAWING INSTANCED GEOMETRY (PARTICLES)



Desolus uses my own customized implementation of TC Particles, a system written by my friend Arthur Brussee, with whom I worked on Manifold Garden.

I have been using the particle system since the very first builds of Desolus.

TC Particles is highly performant and allows for GPU particles which run in a compute shader.
Although the system is a bit old (it was created in 2014), it was considerably ahead of its time: Unity was still using CPU particles back then.

I briefly contemplated migrating to Unity's VFX graph (as it also supports GPU particles).
However, the VFX graph is DEEPLY coupled with Universal Render Pipeline and High Definition Render Pipeline, and I would rather not deal with porting it.

Instead, I opted to port TC Particles from Unity's old pipeline to my own custom SRP.

For anyone else porting similar systems, there are a few critical differences between Unity's old API and the new SRP API.

  • Functions like OnRenderImage() and Camera.OnPreCull() no longer work.
    If you're designing your own SRP, you must decide explicitly where in the pipeline your functions are called.
    Honestly, this is a good thing: the order of events is considerably less ambiguous, and you have manual control.

  • Unity's 'Graphics' functions no longer work.
    For example, instead of using Graphics.DrawMeshInstancedIndirect, you must use the CommandBuffer version instead.
    For each camera, pass in your Scriptable Render Pipeline's command buffer, and have it call this function if the object isn't culled (see the sketch after this list).

  • All shaders must be written in HLSL instead of CG, but you probably already knew this.
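
As an illustration of the second point, here's a sketch of issuing the indirect instanced draw through a command buffer; the frustum-culling helper and the names are mine, not TC Particles code:

Code: [Select]
using UnityEngine;
using UnityEngine.Rendering;

// Sketch of replacing Graphics.DrawMeshInstancedIndirect with the
// CommandBuffer equivalent inside a custom SRP. Names are illustrative.
public static class ParticleDrawer
{
    public static void Draw(CommandBuffer cmd, Camera camera,
                            Mesh mesh, Material material,
                            Bounds bounds, ComputeBuffer argsBuffer)
    {
        // Skip the draw if the particle bounds are outside this camera's frustum.
        Plane[] planes = GeometryUtility.CalculateFrustumPlanes(camera);
        if (!GeometryUtility.TestPlanesAABB(planes, bounds))
            return;

        // The old Graphics.DrawMeshInstancedIndirect call becomes a command
        // recorded into the pipeline's command buffer instead.
        cmd.DrawMeshInstancedIndirect(mesh, 0, material, 0, argsBuffer);
    }
}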

TC Particles processes everything in a compute shader, draws instanced geometry from C#, and then renders the result in a shader.
Fortunately, the existing code was modular enough that the port was primarily a matter of swapping over the API as described above.

Overall, the port took several days. However, the code was written well enough that the port was clean and successful.

---

POST PROCESSING: COLOR CORRECTION



Another challenge with a Scriptable Render Pipeline is getting post-processing working.
To begin, I looked into Unity's own Post-Processing Stack; however, I opted to implement my own, as the post-processing in Desolus is fairly minimal.

After all of the normal and procedural geometry is drawn, I begin rendering my post-processing stack for Desolus.
Currently, there are only two effects (Color Correction and Bloom), as the Fog is done within the fragment shader.

Despite the relative simplicity of these effects, there were a few challenges.
As I mentioned previously, in SRP the Unity graphics functions such as OnRenderImage() no longer work. Additionally, there is no Graphics.Blit function.

A reasonable substitute for Graphics.Blit is CommandBuffer.Blit.
However, due to restrictions with Unity's Blit (specifically around the stencil buffer, as discussed previously), I created my own 'manual' Blit function.
This function simply binds the camera's color/depth textures to a material, sets the render target, and draws a static triangle mesh with the image-effect material.

You can see the Blit in action in the frame debugger, under 'Draw Mesh.'
One draw is used to copy the current camera's color/depth buffer to a temporary texture with the image effect, and another to copy back to the render target.
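
Here's roughly what such a manual Blit looks like in C# (names like BlitUtil and fullscreenTriangle are illustrative, not my actual code):

Code: [Select]
using UnityEngine;
using UnityEngine.Rendering;

// Sketch of a 'manual' blit: bind the source texture to the effect
// material, set the destination as the render target, and draw a
// fullscreen triangle.
public static class BlitUtil
{
    static readonly int MainTexId = Shader.PropertyToID("_MainTex");

    public static void ManualBlit(CommandBuffer cmd,
                                  RenderTargetIdentifier source,
                                  RenderTargetIdentifier destination,
                                  Material effectMaterial,
                                  Mesh fullscreenTriangle)
    {
        cmd.SetGlobalTexture(MainTexId, source);    // bind source for the shader
        cmd.SetRenderTarget(destination);           // draw into the destination
        // A single oversized triangle covers the screen with no clipping seams.
        cmd.DrawMesh(fullscreenTriangle, Matrix4x4.identity, effectMaterial, 0, 0);
    }
}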

After creating this, porting the color correction script was fairly simple. I've been using ColorCorrection.cs for ages, and the Desolus colors are graded very precisely.
Porting it involved transferring the old C# code to my new SRP post-processing stack and copying over the animation curve values.
Additionally, I translated the old Unity CG shader into HLSL.

It worked well! The port ended up being super clean, with very little code changed.

---

POST PROCESSING: BLUR AND BLOOM



The bloom in Desolus was a bit more challenging to port than the color correction effect.
After a direct port of Unity's 'Optimized Bloom' turned out to be considerably bugged, I almost gave up.

After a bit of research, I began implementing my own bloom shader. I successfully implemented it in my SRP, but the results were sub-par compared to my old bloom.
Additionally, from a production standpoint, I knew that yet again I would have to spend a considerable amount of time on color grading.
The Desolus color-grading process takes me about 10-15 hours in total, as it involves meticulously adjusting color curves and bloom values.

Knowing this, I got a bit of rest and tackled the problem again.
With a bit of focus, I ended up successfully porting both the old CG shader to HLSL and the old C# code to the SRP API.

The bloom shader itself is a 'fast' Gaussian blur which progressively downsamples the image over a set number of iterations.
I have my bloom set to 8 iterations, with a bilinear downsample for each texture.
This results in a large number of 'Draw Mesh' calls from the custom Blit, but since Desolus is meant for desktop PCs, the performance impact is negligible.
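
Here's a hedged sketch of that downsample chain. For brevity it uses CommandBuffer.Blit where my pipeline uses the manual Blit described earlier, the pass indices are illustrative, and a full implementation would also upsample back through the chain:

Code: [Select]
using UnityEngine;
using UnityEngine.Rendering;

// Sketch of a progressive-downsample bloom chain (the 'fast' blur approach).
public class BloomSketch
{
    const int Iterations = 8;

    public void Render(CommandBuffer cmd, RenderTargetIdentifier source,
                       RenderTargetIdentifier destination,
                       Material bloomMaterial, int width, int height)
    {
        var ids = new int[Iterations];
        RenderTargetIdentifier current = source;

        // Downsample: each step halves the resolution; bilinear filtering
        // during the blit does part of the blurring for free.
        for (int i = 0; i < Iterations; i++)
        {
            width = Mathf.Max(1, width / 2);
            height = Mathf.Max(1, height / 2);
            ids[i] = Shader.PropertyToID("_BloomMip" + i);
            cmd.GetTemporaryRT(ids[i], width, height, 0,
                               FilterMode.Bilinear, RenderTextureFormat.DefaultHDR);
            cmd.Blit(current, ids[i], bloomMaterial, 0); // pass 0: downsample
            current = ids[i];
        }

        // Composite the blurred result over the original image.
        cmd.Blit(current, destination, bloomMaterial, 1); // pass 1: combine

        for (int i = 0; i < Iterations; i++)
            cmd.ReleaseTemporaryRT(ids[i]);
    }
}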

Overall, this was also a clean port, albeit a bit more difficult.

---

FINAL IMAGE (SO FAR)



Although the basics of my SRP seem to be complete, I am still lacking a few critical features.

From a visual standpoint, there is no volumetric lighting.
My next set of work will be porting my existing volumetric lighting system over to my SRP, which could be challenging.

Also, most importantly, there is no rendering set up for portals between universes yet.
Although I (theoretically) perfected my approach in the old Unity pipeline, it's going to take a bit of work to implement in SRP.

Fortunately, I have total control over the rendering process, so I won't have to fight Unity.

---
amasinton
« Reply #385 on: February 28, 2020, 03:47:37 PM »

Quote from: Mark Mayers on February 25, 2020, 12:58:25 PM

Fortunately, I have total control over the rendering process, so I won't have to fight Unity.

Thank you for this in-depth discussion of how you built your SRP in Unity.  So many insights!  And it looks so so so good!  

With your experience and skillset you have really been able to seize the many, many good things 2019.3 offers, adapting your project to take advantage of the excellent performance and control opportunities of the new system.  You don't have to fight Unity and bend it to your will as much anymore, because you have so much more control.  Very well done, indeed!

It seems like 2019.3 is a turning-point where Unity now favors users with deep engine and rendering knowledge over users with art and design knowledge.  In order to take advantage of the power that Unity offers, you now need to be able to create your own SRP, or you need to be able to re-structure your architecture to use ECS/Burst, etc.  You sort of make your own engine within the engine.  Is that concerning or liberating to you?

Thanks for your thoughts and keep up the inspiring work!  
oahda
« Reply #386 on: February 29, 2020, 01:20:46 AM »

Aaaaa looking forward to reading this a bit later!!

Game is so pretty <3

Mark Mayers
« Reply #387 on: March 09, 2020, 07:01:46 PM »

Quote from: amasinton on February 28, 2020, 03:47:37 PM

Thank you for this in-depth discussion of how you built your SRP in Unity.  So many insights!  And it looks so so so good!  

With your experience and skillset you have really been able to seize the many, many good things 2019.3 offers, adapting your project to take advantage of the excellent performance and control opportunities of the new system.  You don't have to fight Unity and bend it to your will as much anymore, because you have so much more control.  Very well done, indeed!

It seems like 2019.3 is a turning-point where Unity now favors users with deep engine and rendering knowledge over users with art and design knowledge.  In order to take advantage of the power that Unity offers, you now need to be able to create your own SRP, or you need to be able to re-structure your architecture to use ECS/Burst, etc.  You sort of make your own engine within the engine.  Is that concerning or liberating to you?

Thanks for your thoughts and keep up the inspiring work!  

Hey! Thank you so much for your enthusiasm!

I think SRP is honestly an absolute godsend for games like Desolus.
The rendering in Desolus is too specific and niche; I really had to write it myself.
Instead of creating convoluted solutions that work around Unity's pre-existing architecture, I can build my own specific ones.

Fortunately, with the Universal Render Pipeline and High-Definition Render Pipeline, I think others will benefit in other ways.
URP and HDRP are built on the public SRP framework, which means there are fewer 'black box' situations.
You can also extend URP or HDRP yourself, with features like custom render passes.

Although it's ostensibly more complex, I appreciate the flexibility and choices Unity is giving developers.
(That being said, URP and HDRP have a longgg way to go before they are production-ready.)

Quote from: oahda on February 29, 2020, 01:20:46 AM

Aaaaa looking forward to reading this a bit later!!

Game is so pretty <3

Mark Mayers
« Reply #388 on: April 04, 2020, 02:10:51 PM »

Update 152: 04/04/2020

SCRIPTABLE RENDER PIPELINE, PART IV

At the end of March, I finished the work on the Desolus Scriptable Render Pipeline! (For the most part.)

There are two things left to talk about: portal rendering and performance.

---

PORTAL RENDERING AND BANANA JUICE

I won't be going into portal rendering in detail, but I will talk about a specific problem.

Banana juice.

What is banana juice? Watch this talk on portal rendering, by Valve.



https://youtu.be/riijspB9DIQ?t=1066

Banana juice is normally solved by using an oblique projection matrix, which essentially clips everything between the camera and a specified plane.

However, this is only useful if you have one camera per portal, or render a portal camera manually.
Desolus only has one camera for each universe, and uses the stencil buffer to render portals.
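
For anyone using the one-camera-per-portal approach, Unity exposes the oblique trick directly. A quick sketch (Camera.CalculateObliqueMatrix is the real API; the helper wrapper is mine):

Code: [Select]
using UnityEngine;

// The standard oblique near-plane trick mentioned above, for the
// one-camera-per-portal approach (which Desolus does NOT use).
public static class ObliqueClip
{
    // 'portalPlane' is the portal's plane in world space, facing the camera.
    public static void Apply(Camera cam, Plane portalPlane)
    {
        // Move the plane into the camera's view space...
        Matrix4x4 v = cam.worldToCameraMatrix;
        Vector3 normalVS = v.MultiplyVector(portalPlane.normal);
        Vector3 pointVS  = v.MultiplyPoint(-portalPlane.normal * portalPlane.distance);
        Vector4 clipPlane = new Vector4(normalVS.x, normalVS.y, normalVS.z,
                                        -Vector3.Dot(pointVS, normalVS));

        // ...then skew the projection so the near plane matches the portal.
        cam.projectionMatrix = cam.CalculateObliqueMatrix(clipPlane);
    }
}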

To render objects correctly, instead of using an oblique projection matrix, I generate a per-pixel portal mask using the stencil buffer and the depth buffer.



Both the stencil and depth buffers are used to generate a color buffer portal mask.
This mask is used with Scriptable Render Pipeline, to show which universe the camera should display.

There are two types of portal masks:
  • One generated without the depth buffer
  • One generated with the depth buffer

With these two masks, I can determine which pixels an object should discard in its fragment shader (see the sketch below).
In the below image, the black pixels are the 'banana juice' which will later be discarded, allowing for perfect portal rendering.
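
A sketch of what that per-pixel discard looks like in HLSL (the texture names are illustrative, and _ScreenParams is assumed to be bound by the pipeline):

Code: [Select]
Texture2D _PortalMaskNoDepth;   // mask built without the depth buffer
Texture2D _PortalMaskWithDepth; // mask built with the depth buffer
SamplerState sampler_PointClamp;

float4 Frag(float4 positionCS : SV_POSITION) : SV_Target
{
    float2 uv = positionCS.xy / _ScreenParams.xy;
    float maskA = _PortalMaskNoDepth.Sample(sampler_PointClamp, uv).r;
    float maskB = _PortalMaskWithDepth.Sample(sampler_PointClamp, uv).r;

    // Where the two masks disagree, this fragment lies between the camera
    // and the portal plane in the wrong universe -- banana juice. Drop it.
    if (maskA != maskB)
        discard;

    return float4(1, 1, 1, 1); // normal shading would continue here
}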



It gets a bit complicated. I'll write an in-depth article on the portal rendering some day revealing all of my secrets.

---

PERFORMANCE

Gathering some baseline performance metrics, the game now performs on average almost 70% faster!

This is a HUGE performance gain. We are talking about:

Scene in the old pipeline, at 4K resolution: 35 frames per second.
Same scene in the new Desolus SRP, at 4K resolution: 60 frames per second.

This is also before the game is fully optimized, so I will see further performance gains down the line!

---

WHAT'S NEXT?

I still have to integrate volumetric lighting back into the game.
To that end, I have been working with Raphael Ernaelsten (creator of Aura) to help adapt Aura 2 to the Scriptable Render Pipeline.

However, my primary focus has now shifted back to level design, architecture, and content creation.
An exciting end to a long journey!

---
Mark Mayers
« Reply #389 on: April 25, 2020, 03:35:03 PM »

Update 153: 04/25/2020

ARCHITECTURE AND LEVEL DESIGN



Over the last month, I have primarily been working on creating the city Desolus takes place in.

It's coming along!

---
Mark Mayers
« Reply #390 on: May 30, 2020, 02:57:58 PM »

Update 154: 05/30/2020

BLACK HOLES AND DESTRUCTION



Back in September I created a new prototype for the black holes in Desolus, which creates the effect of destroying architecture.

The idea behind this concept was to make the black holes in Desolus feel like a force of nature, rather than an artificial creation.
Instead of moving individual sections of buildings with black holes, you are tearing off and destroying pieces of the architecture.

The prototype effect was done entirely in a shader, which looks convincing and is an interesting effect.

However, it's impractical from a game design perspective:
  • The effect is performed in a shader by discarding fragments within a certain radius, so the mesh is not actually destroyed.
  • Since the mesh is not actually modified, collision isn't valid, and the player can't interact with these environments.
  • The effect is very expensive, wasting computation on vertices and triangles which are never seen but always render.
  • From a level design perspective, the workflow for setting this effect up was not intuitive or sustainable.
  • More complex destruction beyond a single sphere isn't possible within the limitations of the shader.

Because of these limitations, I investigated a solution for procedurally modifying meshes to convincingly simulate destruction.

I set a few requirements for a tool which would allow me to design the black hole destruction for Desolus:
  • Given a series of points and radii (simulating the game's black holes), the tool computes a localized destruction effect.
  • The tool is essentially instantaneous and built into the Unity editor, as this is a core level design task which is repeated constantly.
  • Destroyed meshes generated by the tool are optimized and combined, as thousands of destroyed pieces are impractical.
  • The level design tool is agnostic to the fracture algorithm, in case it's improved or modified later.

With that in mind, I set off on a month-long adventure to create this tool.

---

VORONOI ALGORITHM



My intuition led me to believe an ideal place to start would be an implementation of Voronoi cells in three dimensions.

I created an algorithm which, given a radius:
  • Creates a list of 3D points generated from a stochastic grid (see the sketch below).
  • Finds the Delaunay triangulation, and the resulting Voronoi diagram, of those points in three dimensions.
  • Instantiates a sphere comprised of meshes created from the Voronoi cells.

The result is a somewhat convincing base for procedural destruction.
You can see the 'rocks' are generated in a pattern which would resemble rubble from a destroyed building.
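
As a sketch of the first step only, here's how generating the stochastic grid of seed points might look in C#. The Delaunay/Voronoi construction itself is a much larger piece of code and is omitted:

Code: [Select]
using System.Collections.Generic;
using UnityEngine;

// Sketch: generate the jittered ('stochastic') grid of 3D points that
// seeds the Voronoi cells. Illustrative, not the actual Desolus code.
public static class VoronoiSeeds
{
    public static List<Vector3> Generate(float radius, int cellsPerAxis, int seed)
    {
        var points = new List<Vector3>();
        var rng = new System.Random(seed);
        float cellSize = (radius * 2f) / cellsPerAxis;

        for (int x = 0; x < cellsPerAxis; x++)
        for (int y = 0; y < cellsPerAxis; y++)
        for (int z = 0; z < cellsPerAxis; z++)
        {
            // One random point per grid cell: regular enough for evenly
            // sized chunks, random enough to avoid visible patterns.
            Vector3 corner = new Vector3(x, y, z) * cellSize - Vector3.one * radius;
            Vector3 jitter = new Vector3((float)rng.NextDouble(),
                                         (float)rng.NextDouble(),
                                         (float)rng.NextDouble()) * cellSize;
            Vector3 p = corner + jitter;
            if (p.magnitude <= radius)   // keep only points inside the sphere
                points.Add(p);
        }
        return points;
    }
}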



---

VORONOI ALGORITHM, PART II: NVIDIA BLAST



Although I was able to implement this fairly quickly, a far more difficult problem is confining the Voronoi cells to a pre-defined shape.
I decided to look at computational geometry papers for a solution.

One of the best papers I could find on the subject is NVIDIA's 2013 paper, 'Real Time Dynamic Fracture with Volumetric Approximate Convex Decompositions.'
This paper was extraordinarily helpful, but I realized that implementing a practical solution myself in Unity would take more time than I had available.

Rather than reinvent the wheel, I opted to implement an extension of NVIDIA Blast for use in Unity.
I am fairly positive NVIDIA Blast is, in some capacity, a modernized and production-ready implementation of that 2013 paper.

I may or may not use NVIDIA Blast as a final back-end tool for Voronoi destruction, since it has a few issues.
However, for the time being it's very viable to work with.  

Using NVIDIA Blast's Voronoi fracture algorithm, I was able to write an editor tool in Unity where you can procedurally fracture any mesh.



---

VORONOI ALGORITHM, PART III: MESH ISLANDS VS. RANDOM



Something which drastically improved the quality of the mesh fracture algorithm was first using a decomposition algorithm to break the mesh into individual components.

For example, architectural meshes (like my cathedral towers) are broken into windows, walls, buttresses, etc.
This allows a significantly more realistic and readable form of destruction than simply random Voronoi.

The algorithm I created:
  • Breaks a mesh into its individual components (islands).
  • Analyzes the volume of each island and determines the number of chunks it should be fractured into, based on a set chunk density (see the sketch below).
  • For each island, fractures it into the computed number of chunks using our Voronoi algorithm. Setting a chunk density produces Voronoi chunks which are consistently sized.

Additionally, since the mesh islands are not destroyed or modified significantly, this makes recombining the meshes for optimization easy.
Both the mesh islands and chunks can be kept in the resulting GameObject, to determine the optimal mesh for recombination.
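
The chunk-count heuristic is simple. A sketch, with the bounding-box volume standing in for a proper volume computation and 'chunkDensity' as an illustrative parameter:

Code: [Select]
using UnityEngine;

// Sketch: fracture each island into a number of chunks proportional to
// its volume, so chunks come out consistently sized across islands.
public static class ChunkBudget
{
    // 'chunkDensity' is chunks per cubic unit (illustrative name).
    public static int ChunksForIsland(Mesh island, float chunkDensity)
    {
        // Approximate the island's volume by its axis-aligned bounds.
        Vector3 size = island.bounds.size;
        float volume = size.x * size.y * size.z;
        return Mathf.Max(1, Mathf.RoundToInt(volume * chunkDensity));
    }
}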



For optimization during recombination, I initially planned on using an octree algorithm.
My idea was to recursively subdivide the meshes with Voronoi, to form an octree hierarchy for optimization.
However, the island approach actually worked considerably better in both a visual and a programmatic sense.
A more complex algorithm doesn't necessarily mean better results.

---

IMPORTANCE OF THE 32 BIT MESH BUFFER



If you made a 65,536-vertex mesh and a 4,294,967,295-vertex mesh fight, who do you think would win?

As a side note, something absolutely critical to this algorithm is Unity's 32-bit meshes.
By default, Unity meshes use a 16-bit index buffer and can only have 65,536 vertices, which is a huge limitation.

For this algorithm to work, a single mesh is desirable; the island decomposition seems to give the best results when a single mesh is the input.
Without 32-bit meshes, computation on high-poly models, such as the architecture in Desolus, would not be feasible (see the snippet below).
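
The index-format switch itself is a one-liner. Mesh.indexFormat is the real Unity API; the combine helper around it is a simplified sketch:

Code: [Select]
using UnityEngine;
using UnityEngine.Rendering;

// Sketch: opt a mesh into Unity's 32-bit index buffer before combining.
public static class MeshCombine
{
    public static Mesh CombineTo32Bit(CombineInstance[] instances)
    {
        var mesh = new Mesh();
        // Raise the 65,536-vertex ceiling before writing geometry into it.
        mesh.indexFormat = IndexFormat.UInt32;
        mesh.CombineMeshes(instances);
        return mesh;
    }
}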

---

LOCALIZED FRACTURES, AND OPTIMIZATION



The other component of this mesh destruction tool is the interface with Desolus's black holes.
I wrote a script which determines mesh destruction and optimization for in-game usage.

My vision for this tool was to set a series of points/radii (spheres) to determine how meshes should be destroyed.
The tool uses pre-fractured meshes, so in-editor computation time stays low, and an optimization process recombines meshes after evaluating destruction.

MESH PRE-FRACTURE COMPUTATION
  • For each input GameObject, combine all sub-meshes of that object into a single 32-bit mesh.
  • Break the combined mesh into its individual components (islands).
  • Analyze the volume of each island and determine the number of chunks it should be fractured into, based on a set chunk density.
  • For each island, fracture it into the computed number of chunks using our Voronoi algorithm.
  • Save all data of the fractured GameObject into a Prefab, so it can be accessed later. We want to keep both the islands and their chunks.
  • Assign a 'proxy' script to the input GameObject, which stores its fractured prefab variant.

LOCAL MESH FRACTURE (defined with a series of spheres)
  • For each input GameObject, instantiate its fractured variant from the proxy script, which we will use for computation.
  • For each island component in the fractured object, evaluate whether the island is within the spheres (see the bounds-test sketch after this list):
    -For each point/radius in our point list, run the bounds computation to determine if the island is within the spheres or not.
    -If an island is completely encompassed by the spheres, keep the full island for the final mesh.
    -If the island is only partially encompassed by the spheres, run the bounds computation on each of the island's chunks.
  • Take our islands and chunks and combine them into optimized meshes, one for each 'half' determined by our bounds tests.
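
Here's a sketch of that bounds test. Bounds.SqrDistance is the real Unity call; the classification wrapper around it is illustrative:

Code: [Select]
using UnityEngine;

// Sketch of the per-island bounds test against the destruction spheres.
public enum Containment { Outside, Partial, Full }

public static class SphereBoundsTest
{
    // Classify an island's bounds against a list of spheres (center, radius).
    public static Containment Classify(Bounds bounds,
                                       (Vector3 center, float radius)[] spheres)
    {
        bool touchesAny = false;
        foreach (var (center, radius) in spheres)
        {
            // Squared distance from the sphere center to the closest
            // point on the box; zero or small means they overlap.
            if (bounds.SqrDistance(center) <= radius * radius)
                touchesAny = true;
        }
        if (!touchesAny)
            return Containment.Outside;

        // If every corner of the bounds is inside some sphere, the island
        // is fully encompassed and can be kept whole.
        foreach (Vector3 corner in Corners(bounds))
        {
            bool inside = false;
            foreach (var (center, radius) in spheres)
                if ((corner - center).sqrMagnitude <= radius * radius)
                    inside = true;
            if (!inside)
                return Containment.Partial; // recurse into this island's chunks
        }
        return Containment.Full;
    }

    static Vector3[] Corners(Bounds b)
    {
        Vector3 min = b.min, max = b.max;
        return new[]
        {
            new Vector3(min.x, min.y, min.z), new Vector3(max.x, min.y, min.z),
            new Vector3(min.x, max.y, min.z), new Vector3(max.x, max.y, min.z),
            new Vector3(min.x, min.y, max.z), new Vector3(max.x, min.y, max.z),
            new Vector3(min.x, max.y, max.z), new Vector3(max.x, max.y, max.z)
        };
    }
}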

---

CONCLUSION

With this tool, I can create infinite permutations of these destroyed buildings, each optimized for use in-game.
All I have to do is manipulate a series of points and radii, and I can specify exactly how I want the destruction to happen.

This workflow makes designing destruction intuitive and infinitely reusable, and keeps the variant intact if I change a mesh's fracture algorithm or art.

---
Schrompf
« Reply #391 on: May 31, 2020, 03:14:13 AM »

Impressive write-up, thank you! I imagine splitting up a triangle soup into Voronoi cells is difficult, no? You're basically doing Constructive Solid Geometry there. Does Blast do this, or did you really write a CSG operation from scratch?

Either way, I'm impressed.
Mark Mayers
« Reply #392 on: May 31, 2020, 12:25:55 PM »

Quote from: Schrompf on May 31, 2020, 03:14:13 AM

Impressive write-up, thank you! I imagine splitting up a triangle soup into Voronoi cells is difficult, no? You're basically doing Constructive Solid Geometry there. Does Blast do this, or did you really write a CSG operation from scratch?

Either way, I'm impressed.

Hey thanks!

The first part of the Voronoi algorithm was 100% from scratch, but making the Voronoi cells appear as the solid interior of the mesh uses Blast's algorithm.

My initial design was to create the Voronoi cells first, and then do a Boolean intersection of the cells with a given mesh.
However, there aren't any great CSG libraries for Unity which produce accurate results, and writing my own from scratch would have been out of scope for this project.
This is one of the primary reasons why I decided to use Blast as a computational back end.
Although I could write a lot of these geometry algorithms myself, I can't compete with the army of academics at NVIDIA.

There are still a few issues with Blast, one being that I think the project may no longer be in active development.
There haven't been any updates to the official repository since September 2019, which is right when Unreal launched their Chaos engine.

Since Blast is officially ONLY an Unreal plugin, this pretty much rendered it obsolete the day Chaos launched.
I was lucky to get Blast working in Unity at all; I used a C# wrapper over Blast's DLLs, which I manually compiled and imported into Unity.

I think in the future I may switch to a different computational back end, but for the time being this suits my purpose.
amasinton
« Reply #393 on: June 10, 2020, 01:38:54 PM »

My goodness this is impressive!

You're doing in Unity what I have to do in my modeling app and then import into Unity - namely the Voronoi fracturing of buildings.  All of my fractured geometry in my game is therefore "baked" because it's actually a separate mesh (or hundreds of meshes, because I sometimes want each fragment to move individually) from the intact whole.  My approach is really rigid and makes changing things tough.  I like your approach much better - although I don't think I understand it very well.  It seems like you can just break things up any time and any way you want.  That's magic.  Well done!

It's also interesting to see that we both share a similar approach to fracturing the buildings in that they are fractured by structural island - walls, buttresses, windows, etc.  I agree that this does make a more visually satisfying result after destruction.  It's not 100% realistic, but it has enough of a suggestion of realism that the eye and mind accept immediately.  Just enough detail where it counts!

Finally, thanks for the reply way back in March to my questions about the direction Unity seems to be going in.  You gave a well-reasoned, measured response - something that is rare at any time.  I appreciate that.

Mark Mayers
« Reply #394 on: June 12, 2020, 06:03:17 PM »

Quote from: amasinton on June 10, 2020, 01:38:54 PM

You're doing in Unity what I have to do in my modeling app and then import into Unity - namely the Voronoi fracturing of buildings.  ....  It seems like you can just break things up any time and any way you want.

Yea exactly! I knew that I could have prebaked fractured meshes using an external modeling program, but the options are definitely limited with this approach.

With the tool I created, I can have infinite permutations of fractured meshes computed in-editor.
I just hit one button and it creates a destroyed version of the mesh, with the type of destruction I need for level design.

Quote from: amasinton on June 10, 2020, 01:38:54 PM

It's also interesting to see that we both share a similar approach to fracturing the buildings in that they are fractured by structural island - walls, buttresses, windows, etc.  I agree that this does make a more visually satisfying result after destruction.  It's not 100% realistic, but it has enough of a suggestion of realism that the eye and mind accept immediately.  Just enough detail where it counts!

For sure, fracturing by structural islands definitely gives a more visually pleasing result.
I won't be able to do a true physically based destruction simulation within the limits of current real-time technology, but I can get something which is good enough!

Quote from: amasinton on June 10, 2020, 01:38:54 PM

Finally, thanks for the reply way back in March to my questions about the direction Unity seems to be going in.  You gave a well-reasoned, measured response - something that is rare at any time.  I appreciate that.

Haha, like I mentioned, I think Unity is a bit rough around the edges right now, but they are heading in the right direction.
Mark Mayers
« Reply #395 on: June 27, 2020, 10:27:50 PM »

Update 155: 06/27/2020

MORE SHADERS!

After I completed the destruction framework (in the last post), I took a bit of a break to do more shader programming.
I still had to port my sky shaders over to my custom Scriptable Render Pipeline, and I managed to do so!



Version of the sky and lighting system during sunset.

---



I've always loved the aurora borealis, so I created a stylized version with shaders.

---

These shaders have existed in Desolus for quite some time, but I had to rewrite them from scratch to be compatible with my custom SRP. Finally got around to it.

I've been working primarily on level design now, but more on that later!


Mark Mayers
« Reply #396 on: June 27, 2020, 10:40:55 PM »

(Also a quick meta DevLog update).

I had to painstakingly re-upload my old gifs, as they seem to have been removed from my old hosting website (ugh).
Fortunately, I realized TIGSource supports Vimeo embeds, which are of vastly higher quality than any gif.

Vimeo also seems to have the correct (or nearly correct) colors for Desolus, which is fantastic.
Every other video hosting website I've tried distorts the colors due to compression.

I'll be using Vimeo going forward; I hope you enjoy some lovely vids.






---
Mark Mayers
« Reply #397 on: September 11, 2020, 01:01:01 PM »

Update 156: 09/11/2020

EVOLUTION OF BLACK HOLES IN DESOLUS

Desolus changed dramatically across its various prototypes before the game entered production in 2017. However, the sole constant aspect of the game has been the thematic focus on black holes as a game mechanic. As you read through the game’s DevLog, you can see how these changes happened over time.

In this DevLog entry, I’ll focus on the current (and final) implementation of the black hole in Desolus, which is the game's core mechanic.  





---

CONTEXTUALIZING WITHIN THE GAME’S DESIGN

One criticism I received is that Desolus was too abstract, too alien, devoid of narrative context. After receiving this constructive criticism, I pivoted my game’s design and theming to something less abstract, while still retaining its unique elements.

As the game’s narrative and vision solidified around ‘Explore a city of Gothic architecture, which is torn between universes’ at the end of 2018, I had to revisit a lot of the game’s mechanics. My vision for the game is that you are exploring a city which has gone through a dimensional cataclysm. The black holes, the portals, the alternate universes: these are all natural occurrences. I wanted these elements to feel like hurricanes, tornadoes, and earthquakes, but on a cosmic scale.



The Great Day of His Wrath, John Martin 1851–1853
 

I realized that, in the context of the game's current design, many of the previous elements felt contrived or out of place. Everything felt alien and abstract, as if it were intentionally placed rather than being a natural occurrence.

To avoid the cliches of the first-person puzzle genre, there needed to be no cubes and switches, no portal gun, no GLaDOS, no test chambers with a locked door and key. Many first-person puzzle games are still stuck in 2007, and I believe the genre can move forward into new territory by abandoning these elements.
  


The black hole in Desolus, September 2018: Version with the black hole 'gun' which is now removed.


With this goal in mind, I combed through various elements of the game which needed revision. The first thing I eliminated was the ‘gun,’ which was in previous prototypes of the game. This element was no longer needed, as the black holes are a natural force rather than an artificial one. The black hole is mechanically similar, but streamlined and with narrative context.

However, the black holes in Desolus needed to feel like a natural destructive force. Previously, black holes switched specific pieces of architecture between universes, but not others. This felt artificial, as if some being had specifically chosen which objects were linked to a black hole.



The black hole in Desolus, September 2019: Streamlined version, with natural destruction effect.


Instead of entire specific buildings, I opted for black holes ripping out huge chunks of architecture between universes. This feels more like the architecture is being physically ripped through space and time, which is considerably more interesting and thematically appropriate.

However, I knew I could do better by taking advantage of the procedural mesh destruction in Unity I created earlier.

---

VFX AND SHADER ANALYSIS

In a previous post, I discussed my new workflow for procedural mesh destruction in Desolus.

I took advantage of pre-fracturing meshes procedurally in Unity, and integrated that tool into the VFX. The goal with this shader was to make it seem like the black hole is manipulating individual pieces of the building, rather than the entire building at once. Although we're entirely in the realm of science fiction and fantasy, I do think this results in a more 'realistic' effect.

STEP 1: Baking a chunk's center point into the mesh UVs.



Using a VFX technique I found on Twitter, it's possible to bake pivot data into a mesh's UVs. This allows an object to be modified in a vertex shader by its pivot, rather than by individual vertices. With this, you can move entire 'chunks' of an object at a time.

Taken from this example of baking pivots in UV data, "We've now essentially created a new coordinate expression, what I would call "fragment position", we can use this to pass unique values to separate fragments instead of distorting the vertices of the entire mesh as a whole."

After a mesh is pre-fractured, my modified procedural destruction script updates the mesh's UVs with the center point of each destroyed chunk. You can see from the above image that the data is there. I baked the X and Y coordinates into UV1 and the Z coordinate into UV2, and pulled them out in the vertex shader.
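
A sketch of that baking step in C# (illustrative, not my exact script; the chunk-ordering assumption is noted in the comments):

Code: [Select]
using System.Collections.Generic;
using UnityEngine;

// Sketch: bake each chunk's center (pivot) into UV channels, X/Y into UV1
// and Z into UV2. Channel numbers are Unity's zero-based SetUVs indices.
public static class PivotBaker
{
    // 'chunks' holds each fractured chunk's mesh; the combined mesh is
    // assumed to list the chunks' vertices in the same order.
    public static void BakePivots(Mesh combined, List<Mesh> chunks)
    {
        var uv1 = new List<Vector2>(combined.vertexCount);
        var uv2 = new List<Vector2>(combined.vertexCount);

        foreach (Mesh chunk in chunks)
        {
            Vector3 pivot = chunk.bounds.center;  // chunk's center point
            for (int i = 0; i < chunk.vertexCount; i++)
            {
                uv1.Add(new Vector2(pivot.x, pivot.y)); // X and Y in UV1
                uv2.Add(new Vector2(pivot.z, 0f));      // Z in UV2
            }
        }

        combined.SetUVs(1, uv1);
        combined.SetUVs(2, uv2);
    }
}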

STEP 2: Transformations of position/rotation, given a chunk's center point.



After the pivot data is baked, it's possible to manipulate chunks in a vertex shader.
All positions/rotations are modified based on the chunk's pivot value, rather than on individual vertices.

This allows for interesting and complex effects, which are super performant since it's done entirely on the GPU.
For example, the above gif is a simple gravitational attraction shader, offset a bit with some noise.
It looks like a full particle system! But in reality it's just a vertex shader and a mesh.
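
And a sketch of the vertex-shader side, reconstructing the pivot and applying a simple attraction (everything kept in one space for brevity; not the actual Desolus shader):

Code: [Select]
// Sketch: rebuild the chunk's pivot from the baked UVs, then move the
// whole chunk rigidly toward an attractor.
float3 _AttractorPosition;   // black hole position (same space as the mesh)
float  _AttractorStrength;

struct Attributes
{
    float3 positionOS : POSITION;
    float2 uv1 : TEXCOORD1;  // baked pivot X, Y
    float2 uv2 : TEXCOORD2;  // baked pivot Z
};

float3 DisplaceByPivot(Attributes input)
{
    // Reconstruct the chunk's center point from the UV channels.
    float3 pivot = float3(input.uv1.x, input.uv1.y, input.uv2.x);

    // Every vertex in a chunk shares the same pivot, so the chunk moves
    // as one rigid piece instead of the mesh stretching per-vertex.
    float3 toAttractor = _AttractorPosition - pivot;
    float dist = max(length(toAttractor), 0.001);
    float3 offset = (toAttractor / dist) * (_AttractorStrength / (dist * dist));

    return input.positionOS + offset;
}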

STEP 3: Gravitational lens and 'black hole' screen effect.



Previously, I've talked about my gravitational lens shader, which I wrote all the way back in 2015!
This is an updated version of that shader, which is physically based on gravitational lensing. I suggest you read my previous post for more detail.

For the black hole's corona (its outer rim), I created a simple animated flow-map shader.

STEP 4: Putting it all together.



The final steps were integrating all of these new elements into my previous system for the Desolus black holes.

I added a modified version of step 2's vertex shader into my shader for the architecture in Desolus.
It's important to note that the shader also has its own shadow pass and depth pass, so that shadows and volumetric lighting render properly.

---

IN CONCLUSION

I feel all of these changes to the black holes really add the narrative context the game was previously lacking.
By anchoring concepts and design in natural phenomena, I believe players will connect more deeply with the game’s themes, and the game’s world gains depth.
retrophilion
« Reply #398 on: September 11, 2020, 01:40:30 PM »

What are you?! A wizard or something?!
Mark Mayers
« Reply #399 on: September 11, 2020, 01:51:38 PM »

Quote from: retrophilion on September 11, 2020, 01:40:30 PM

What are you?! A wizard or something?!
