Author Topic: PERSONAL SPACE - A story of galactic exploration and interior decorating  (Read 23670 times)
Schrompf
« Reply #20 on: November 11, 2019, 11:22:41 PM »

Go Nuts and Bolts, please.

oahda
« Reply #21 on: November 11, 2019, 11:46:14 PM »

Dang, stole my bad pun. Looking good!

nova++
« Reply #22 on: November 18, 2019, 10:37:32 AM »

Yesterday I finally re-added gas giants to the system generator. It's been neglected for a long time as I focus on interior stuff, but I've been poking at it again. It's like seeing an old friend.


oahda
« Reply #23 on: November 18, 2019, 11:34:23 AM »

Gorgeous. You've done a great job getting the look down! What does it look like as you get very close or even enter the clouds? Or is that not something you can do?

nova++
« Reply #24 on: November 18, 2019, 02:59:40 PM »

Quote from: oahda
> Gorgeous. You've done a great job getting the look down! What does it look like as you get very close or even enter the clouds? Or is that not something you can do?

At present there's not much to look at, but you can fly into them. A proper cloud renderer is something on the bucket list, and eventually you'll be able to go so deep that the sun fades away and your ship implodes. So, uh, don't.



Something of an aspirational design (featured in good old Cosmos). Different cloud layers, towering columns, thunder and lightning... Floating life forms would be lovely too but animals are certainly something for a post-feature-complete world.

Here's a picture from the old thread of cruising inside the atmosphere of one while being a bad driver and going in backwards with the engines still firing (there will be an autopilot for that, later):



I'm still thinking about storms. I have an idea for a hacky way to do them, based on how craters work, but I dunno how it will turn out... we'll see, I guess. That's another thing for later, though.
« Last Edit: November 18, 2019, 03:06:00 PM by NovaSilisko »

oahda
« Reply #25 on: November 18, 2019, 03:10:32 PM »

Brings back good memories of discovering and binging Cosmos over a couple of late nights, probably closer to ten years ago now. I should do that again. Looking forward to seeing what your apple pie will look like once you're done baking!

nova++
« Reply #26 on: November 20, 2019, 03:03:22 PM »

Here's another one. I made some tweaks to how it generates the banding and I'm pretty happy with how it works right now. I'm totally okay with lots of vivid colors in this game.



It has a range of palettes to work through defined as gradients with keyframes:



It picks two of them, then effectively does a random walk through both simultaneously to generate the pixels of a small texture that informs the actual texture generator. The blending is weighted so that it's usually 80-90% one palette and 10-20% the other, but an occasional 50-50 mix is possible, yielding some strange colors.
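For concreteness, that random-walk blend might look something like the sketch below. The palette data and every name here are hypothetical stand-ins; the real generator presumably samples gradient keyframes rather than flat color lists.

```python
import random

def blend_palettes(palette_a, palette_b, n_pixels, bias=0.85, seed=None):
    """Walk two palettes at once and blend them with a weighted mix."""
    rng = random.Random(seed)
    pos_a, pos_b = rng.random(), rng.random()
    pixels = []
    for _ in range(n_pixels):
        # Random-walk both palette positions simultaneously.
        pos_a = min(1.0, max(0.0, pos_a + rng.uniform(-0.1, 0.1)))
        pos_b = min(1.0, max(0.0, pos_b + rng.uniform(-0.1, 0.1)))
        ca = palette_a[int(pos_a * (len(palette_a) - 1))]
        cb = palette_b[int(pos_b * (len(palette_b) - 1))]
        # Mostly 80-90% palette A, but jitter the weight so a rare
        # near-50-50 mix can produce the occasional strange color.
        w = min(1.0, max(0.5, rng.gauss(bias, 0.12)))
        pixels.append(tuple(w * a + (1 - w) * b for a, b in zip(ca, cb)))
    return pixels
```

Since the weight stays in [0.5, 1], every output pixel is a convex combination of the two palettes and never leaves the valid color range.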

Later I'm going to divide them up roughly to match the Sudarsky gas giant classification, which explains various visual aspects of gas giants based on temperature, caused by different components in their atmosphere and clouds: https://en.wikipedia.org/wiki/Sudarsky%27s_gas_giant_classification



It's a bit fuzzy, though. Sudarsky's classification doesn't cover smaller gas/ice giants like Uranus and Neptune, so I'll have to do some more digging to see what might affect their appearance. Not too rigidly, though. I want my purple gas giants to still show up from time to time.
« Last Edit: November 20, 2019, 03:10:58 PM by NovaSilisko »

Schrompf
« Reply #27 on: November 20, 2019, 11:25:45 PM »

And I want to see them, too. I wonder how you want to join this with the inner features of gas giants which you want to implement later. I *desire* alien gas giant worlds full of wonders and miracles. Yet I fear that you'll stick with the colouring scheme and nothing else simply out of production capacity.

I'm so sad that No Man's Sky stopped at colourful module worlds and never tried to add a debris field with Endymion-like space trees connecting the asteroids with giant roots, ship wrecks telling a little story about two lovers that died there centuries ago, gas giants with bubbly squids roaming around cloud towers. They would have had the capacity for it. Instead they settled for "random everything".

nova++
« Reply #28 on: November 21, 2019, 03:00:59 AM »

Quote from: Schrompf
> And I want to see them, too. I wonder how you want to join this with the inner features of gas giants which you want to implement later. I *desire* alien gas giant worlds full of wonders and miracles. Yet I fear that you'll stick with the colouring scheme and nothing else simply out of production capacity.
>
> I'm so sad that No Man's Sky stopped at colourful module worlds and never tried to add a debris field with Endymion-like space trees connecting the asteroids with giant roots, ship wrecks telling a little story about two lovers that died there centuries ago, gas giants with bubbly squids roaming around cloud towers. They would have had the capacity for it. Instead they settled for "random everything".

I'm still doing my best to keep true to my "based on a true story" description. I want to use things like the Sudarsky classification and other physics to inform how it generates things, but I want it to be more vivid and varied, a bit more wild, "larger than life" you could say.

I wish I could generate little stories for the player to discover, in a coherent fashion. Part of the "background plot radiation" is the fact that every structure you come across is long abandoned yet shows no sign of conflict or destruction, only the wear caused by the passing of the ages, leaving the open question of: what happened?

The answer is: I don't know. And I wish I could make the game tell that story bit by bit, by giving you little pieces of story in a way that lets you build your own picture of how things went, without really having a strictly defined answer. Maybe I can figure out how to do that, someday.

The solar systems themselves can tell stories for the more astronomically inclined. The generation isn't nearly there yet, but I would love if someday it could reach the level of complexity that you can mentally work backwards and think about how XYZ feature of the system might have formed billions of years ago.
« Last Edit: November 21, 2019, 03:13:54 AM by NovaSilisko »

nova++
« Reply #29 on: November 27, 2019, 08:02:36 AM »


SCATTERBRAINED II



I've been messing with atmospheric scattering again, on a mission to alleviate the weird issues that were going on before. At first, I set out to rewrite the original functions from scratch - they weren't my code, and I didn't really understand how they worked. So redoing them step by step would be a good way to learn that.

But as I went on, this ended up morphing into a different method entirely. Sort of. Let me try and explain, as simply as I can...

(which means 95% of what you are about to read won't make any sense)

The process for rendering the atmosphere is, at its heart, based around raymarching: walking a point along a ray in steps, and letting each step contribute to the final rendered result. This must be done for every pixel, so as you can imagine, the performance impact of every operation in the pile adds up fast. The actual calculation of Rayleigh/Mie scattering is barely an issue in comparison.

As explained quite well in this post: http://davidson16807.github.io/tectonics.js//2019/03/24/fast-atmospheric-scattering.html

the case of atmospheric scattering is a particularly dense one. As the point is walked through the atmosphere, it needs to take in contributions not just from the atmosphere between the point and the camera, but between the point and the sun. That means the number of operations is [view ray sample count] * [light ray sample count] * [light count], which is... a lot.
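That multiplicative cost can be made concrete with a toy nested march. Everything below is a 1-D stand-in with made-up numbers, purely to show where the operations pile up, not the devlog's actual renderer:

```python
import math

H = 8.0  # scale height for a toy exponential atmosphere (arbitrary units)

def density(h):
    """Atmospheric density at altitude h (1-D stand-in)."""
    return math.exp(-max(h, 0.0) / H)

def march_depth(start_alt, step_len, steps, counter):
    """March upward from start_alt, integrating density (the 'light ray')."""
    depth = 0.0
    for i in range(steps):
        depth += density(start_alt + (i + 0.5) * step_len) * step_len
        counter[0] += 1  # one density evaluation
    return depth

def single_scatter(view_steps, light_steps, atmo_height=50.0):
    """Naive nested march: for every view sample, a full march to the sun."""
    counter = [0]
    total = 0.0
    step = atmo_height / view_steps
    for i in range(view_steps):
        alt = (i + 0.5) * step
        counter[0] += 1  # counts the density(alt) call below
        light_depth = march_depth(alt, step, light_steps, counter)
        total += density(alt) * math.exp(-light_depth) * step
    return total, counter[0]
```

Per pixel, the evaluation count is view_steps * (light_steps + 1), and the whole thing repeats per light, which is exactly why the approximations below are attractive.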

So instead, we turn to approximations, as does the link above. This lets us do a bit of math to calculate the atmosphere along the light ray, rather than doing dozens more samples every time. But this is where things changed with my new version. Helpful as they were, the approximations detailed in the link had a problem. See here:


We (myself and the author of that post) never did quite work out what was going on. The gist is that there was a strange "banding" when viewing the atmosphere from within at low angles, and it was honestly so bad it prevented me from showing many pictures of the atmosphere from certain angles. Something as-yet-unidentified in the math just starts to break down.

Still, I recommend reading the original post to get an understanding of how the original works (integrals are still a bit black magic to me, if I'm being honest).

But as mentioned, this time around, I came up with a different way to do it. I realized that, because the atmosphere is being calculated as a spherically symmetric object, looking in any direction can be condensed down to two values: altitude, and angle. Like this:


Based on this, you can calculate the amount of atmosphere in a ray extending out from a point at a given altitude toward an angle. The ideal result is pretty much the same as the first approximation, although I wasn't sure if it would work out.

It turns out I'm not the first one to think of this (damn it!). A chap named Chapman came up with more or less the same concept decades ago: https://en.wikipedia.org/wiki/Chapman_function

There are a few approximations for it, but my execution is a bit different, and kinda lazy.

Presently, instead of calculating it on the fly every time, I'm precomputing a lookup table: angles from straight up to straight down run along the texture's Y-axis, and altitude through the atmosphere along its X-axis. Each position in the table performs a long and expensive raymarch of its own (you know, the one I said earlier we shouldn't be doing... but it's okay, because this happens only once and gets reused) to produce a reasonably accurate value.

So, if we need the amount of atmosphere along a ray, we simply calculate the ray's angle relative to the surface below the point, take the point's altitude, and grab the corresponding value from the table. Simple.
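The precompute-then-fetch idea can be sketched like this. The radii, scale height, and table resolution are invented for illustration (and the sketch happily marches through the planet itself, which a real one would treat as blocked); it is not the game's shader:

```python
import math

R_PLANET = 1000.0   # hypothetical planet radius
R_ATMO = 1100.0     # hypothetical top of atmosphere
SCALE_H = 20.0      # hypothetical scale height

def density_at(px, py):
    h = math.hypot(px, py) - R_PLANET
    return math.exp(-max(h, 0.0) / SCALE_H) if h < (R_ATMO - R_PLANET) else 0.0

def raymarch_depth(alt, angle, steps=256):
    """The expensive march: total atmosphere along a ray leaving a point
    `alt` above the surface, tilted `angle` away from straight up."""
    px, py = 0.0, R_PLANET + alt
    dx, dy = math.sin(angle), math.cos(angle)
    step = 2.0 * R_ATMO / steps
    depth = 0.0
    for _ in range(steps):
        depth += density_at(px, py) * step
        px += dx * step
        py += dy * step
    return depth

def build_lut(n_alt=16, n_ang=16):
    """Done once: altitude along one axis, angle along the other."""
    max_alt = R_ATMO - R_PLANET
    return [[raymarch_depth(i * max_alt / (n_alt - 1),
                            j * math.pi / (n_ang - 1))
             for j in range(n_ang)] for i in range(n_alt)]

def lut_lookup(lut, alt, angle):
    """The cheap per-pixel fetch (nearest sample; a shader would filter)."""
    n_alt, n_ang = len(lut), len(lut[0])
    i = round(alt / (R_ATMO - R_PLANET) * (n_alt - 1))
    j = round(angle / math.pi * (n_ang - 1))
    return lut[min(i, n_alt - 1)][min(j, n_ang - 1)]
```

Sanity checks match intuition: a horizontal ray at the surface crosses far more atmosphere than a vertical one, and a ray starting near the top of the atmosphere crosses almost none.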

(I did this texture lookup 100% because it was easier than figuring out the math for the approximations. It might perform better than doing the math each time too, but I assure you that wasn't on purpose; it's just a side effect of laziness.)

How well does this work?


Well, shit. What went wrong? It's almost right.

The answer: it's my fault. Because it's a texture, there's an inherent limit to its resolution, but the horizon is a sharp, instant line. That doesn't matter much for the samples used along the rays to the sun, since those quickly average out, but it does matter for the density along the ray from the camera.

That's where the problem comes from. The lookup never quite matches the actual horizon, and that misalignment produces incorrect densities near the horizon, leading to the oddness you see above.

The fix was pretty simple. Previously, the view ray used the same lookup as the light rays. I changed it to accumulate the total amount of atmosphere as it steps along, and to use that instead. And after that?



Much better.

At this point you're probably screaming silently into the void about why it's so damn small and pixelated, and the answer is: because it is. To make debugging much easier, I'm doing this on the CPU first, instead of in a shader. It takes ages to render, but it lets me dig deep into the life of each pixel:


Of course, it's got me a bit worried that the lower resolution and lack of real-time rendering is hiding something, which I won't know until I do an all-up test with it rendered properly as a shader...

Fingers crossed.

Edit:

Finger-crossing worked!


« Last Edit: November 27, 2019, 11:49:26 AM by NovaSilisko »

Schrompf
« Reply #30 on: November 27, 2019, 11:13:34 PM »

Very cool.

*furiously takes notes*

I failed on this like ten years ago, never got the circular math to match every case. It's probably something really stupid TodayMe would spot in an instant.

oahda
« Reply #31 on: November 28, 2019, 07:24:20 AM »

Pretty!

hidingspot
« Reply #32 on: November 28, 2019, 07:27:36 AM »

Looks so nice!

nova++
« Reply #33 on: November 28, 2019, 01:39:14 PM »

Oh yes, I forgot a few more. It looks about the same as the old one, really - just without as many bugs.






There's still one major thing I want to do at some point. Presently, the blending mode is far from ideal... rather than be overlaid the way it is now, it needs to become opaque and effectively saturate, when there's enough air (or more accurately, particulate in the air) to start blocking anything beyond that point.

Compare this less-than-pretty picture:



to the real deal:



They're at different times of day, but like I said, you can see the way the atmosphere becomes more opaque. To say nothing of the differing scatter characteristics between an idealized simplification and a real atmosphere made of stuff.

I need to not think about it so much...

nova++
« Reply #34 on: December 05, 2019, 02:22:34 PM »

It's a bit frustrating working on systems that aren't visually interesting, because I have nothing to show for it.

Nonetheless, I've been starting in earnest on the second complete rewrite of the underlying world code for the game, meaning the framework dealing with celestial bodies, reference frames, orbits, all that sort of stuff. I'm not quite ready to explain the new architecture I'm going for, mainly because I'm not 100% on how it will look at the end of this, but I think I have a fairly clear picture.

One part of it I'm fairly confident in, though, is the overall connective logic of each type of environment - interstellar, interplanetary, and local. Before, it was handled a bit clumsily by a central world-handler object that had to keep a lot of plates spinning at any one time - an example of the "classic me" programming style of a few handler objects with a ton of tasks each, which I'm moving away from these days.

Now, instead, there are three Domains, one for each environment, and each one sort-of nested into the previous. For instance, when the Local domain is active, the Interplanetary domain will change its behavior to be effectively subservient to its subdomain.

What this actually does depends on which domain it is, and it's also not entirely nailed down yet, but a good example is the viewer position. If the interplanetary domain is active, the interstellar viewpoint will align with the interplanetary viewpoint (at vastly different scales, of course), and if the local domain is active, the interstellar viewpoint will align with the local viewpoint. That's the simplest example I can give, or can bring myself to explain without pain.
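As a rough illustration of that viewpoint-alignment idea, here's a toy version of the nested domains. Every class, method, and number below is invented for the sketch; the real architecture is certainly more involved:

```python
# Toy sketch of three nested Domains where outer domains follow the
# viewpoint of the innermost active one, rescaled to their own units.

class Domain:
    def __init__(self, name, scale):
        self.name = name
        self.scale = scale            # metres per unit in this domain
        self.subdomain = None
        self.active = False
        self.viewpoint = (0.0, 0.0, 0.0)

    def innermost_active(self):
        d = self
        while d.subdomain is not None and d.subdomain.active:
            d = d.subdomain
        return d

    def update(self):
        leaf = self.innermost_active()
        if leaf is not self:
            # Become subservient to the active subdomain: align this
            # domain's viewpoint with the leaf's, at this domain's scale.
            f = leaf.scale / self.scale
            self.viewpoint = tuple(c * f for c in leaf.viewpoint)

interstellar = Domain("interstellar", scale=9.46e15)       # light-year-ish
interplanetary = Domain("interplanetary", scale=1.496e11)  # AU-ish
local = Domain("local", scale=1.0)                         # plain metres
interstellar.subdomain = interplanetary
interplanetary.subdomain = local

# Activate the local domain; the outer domains follow its viewpoint.
interplanetary.active = True
local.active = True
local.viewpoint = (2.992e11, 0.0, 0.0)  # 2 AU from the system origin
interstellar.update()
interplanetary.update()
```

The appeal of the shape is that each domain only needs to know about its immediate subdomain, rather than one central handler juggling all three.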

Man, I hate explaining programming stuff. I prefer when I can just spout something vaguely correct and then hold up pretty pictures to distract everyone.

nova++
« Reply #35 on: December 20, 2019, 04:38:33 PM »

RING THINGS

Hey! I'm doing something you can look at again! Ugly prototyping somethings, but still.



After a while away from working on digital things, I'm back on some shader-related work - this time, shadowing for planetary rings. I'm hoping this technique works out, because it's looking very good so far.

An important thing (for me - most people wouldn't notice) is getting an accurate penumbral effect from the shadow, which is to say, the softening of the shadow the further it falls from the object.



I had some ideas of my own for how to do this, but I ended up making something directly based off of the work done here: http://iquilezles.org/www/articles/rmshadows/rmshadows.htm and, by extension, here: https://www.shadertoy.com/view/lsKcDD

I managed to fudge together something that properly ("properly") calculates the softness based on the angular diameter of the sun (possibly inadvertently stumbling upon the actual math? I'm not sure... definitely not... well, maybe? It's not accurate either way but that's due to the nature of this method) and the result is, if nothing else, reasonably convincing:



But it goes further than that. I'm fairly certain I can extend this technique to an arbitrary number of bodies: with some sort of buffer holding basic geometry data, I could then sample a shadow network from any given point. I could potentially even do similar math CPU-side to get a realistic sun brightness value for the local lights... but one thing at a time.

« Last Edit: December 27, 2019, 03:49:54 PM by NovaSilisko »

nova++
« Reply #36 on: December 21, 2019, 02:15:35 PM »

Developing the shader further. Brought back a ring mesh and texture generator that I wrote some time ago, and added a lighting model that I made back then too...





Said lighting model basically approximates every pixel of the ring as an illuminated sphere. It's better than treating it as if it were a solid plane, but it's only one piece of the puzzle. Light interplay in planetary rings is very complicated and affected by things like particles hiding other particles behind them, light bouncing between particles, light passing through particles directly (in the case of icy rings such as Saturn's), particles bouncing light back toward the camera...
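The "every pixel is an illuminated sphere" idea can be approximated very compactly: treat the brightness as the sunlit fraction of the particle's visible disc, which for a sphere reduces to (1 + cos(phase angle)) / 2. This is my guess at a comparable model, not the devlog's actual shader:

```python
def ring_particle_brightness(to_sun, to_camera):
    """Lambert-sphere-style phase term for a ring 'particle'.

    Both arguments are unit vectors from the particle. Backscatter
    (camera on the sun's side) gives 1.0; looking at the unlit side
    gives 0.0.
    """
    cos_phase = sum(s * c for s, c in zip(to_sun, to_camera))
    cos_phase = max(-1.0, min(1.0, cos_phase))
    return (1.0 + cos_phase) / 2.0
```

As the post notes, this is only one piece of the puzzle: it captures none of the inter-particle shadowing, multiple scattering, or translucency that real (especially icy) rings exhibit.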

And all of that depends almost entirely on the composition and density of the ring. A rocky ring will have very little in the way of translucency, for instance, and would appear darker from the backside than an icy ring. It doesn't help that we only really have Saturn's rings to use as a ground truth, beyond the small scattering of photographs from Uranus and Neptune...

michaelplzno
« Reply #37 on: December 22, 2019, 07:01:46 PM »

That's a good planet and all, but when are you going to add some furniture?!?

nova++
« Reply #38 on: December 24, 2019, 06:14:34 PM »

Quote from: michaelplzno
> That's a good planet and all, but when are you going to add some furniture?!?

Soon enough! Something I enjoy about this project is I have a nice simple interior interaction sim and furniture models to go work on when I get sick of working on celestial mechanics and space shaders and everything.

...which, incidentally, hasn't happened yet.



RING THINGS
EPISODE II

RINGS ARE JUST PLANET FURNITURE

I redid the shadowing method completely. The raymarching things I linked previously give a decent visual result, but are very inaccurate about what the light should actually be doing. You can see in my gif version that the darkest part of the shadow (the umbra) always stays the same size, while the soft part (the penumbra) grows around it. That's not really how it works, as the diagram in the same post shows.

So, instead of that, I went back and tried the original method I thought of at the very beginning of this. It's pretty simple, really.

Starting from the pixel on the ring you want to light, you aim a ray at the planet and a ray at the sun, and then on the line between those, you fire X number of rays across the disk of the sun, and use the total that hit as your illumination value.



There's probably a much nicer way to do this when dealing with simple spheres, but my stubborn insistence on using ellipsoidal bodies means I have to go a bit above and beyond...

This isn't totally accurate, as it ignores the effects of curvature on both bodies, but, well, I'd say it looks pretty convincing.

...however...

It does have a particular drawback; one I'm not going to worry too much about yet, but one that should nonetheless be acknowledged: it can't really render the antumbra, the volume within which the planet appears smaller than the sun.

That's not too big of a deal for rings, at least, but it makes the prospect of using this as a more general shadowing system notably more difficult...

Oh well. It'll do.

Another point to bring up is optimization. It's certainly a waste to fire all those samples at the sun from every point in the ring, and there are a few simple things we can do to reduce that by a lot.

First: Is our pixel in the same hemisphere as the lit half of the planet? If so, there's no possible way it can be shadowed, so we don't need to do anything.

Second: We can run a first pair of feelers to the sun's edges, inside and outside. If both hit, our pixel is fully shadowed, and we can skip all the rest. Similarly, if both miss, our pixel is fully lit, and once again we can skip everything else.

Third: As we walk along the sample line, as soon as we start intersecting the planet, all future tests can be skipped and the remaining samples simply dumped onto the hit total. This further reduces the amount of work to do, although it does make the antumbra problem look even worse.
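Putting the sampling scheme and all three early-outs together might look like the sketch below. For simplicity it uses a spherical planet (the devlog insists on ellipsoids), and every helper name and number is made up for illustration:

```python
import math

def _norm(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def _ray_hits_sphere(origin, direction, center, radius):
    ox = tuple(o - c for o, c in zip(origin, center))
    b = sum(o * d for o, d in zip(ox, direction))
    c = sum(o * o for o in ox) - radius * radius
    disc = b * b - c
    return disc >= 0.0 and (-b + math.sqrt(disc)) >= 0.0

def lit_fraction(point, sun_center, sun_radius,
                 planet_center, planet_radius, samples=16):
    # Early-out 1: a point on the sunward hemisphere cannot be shadowed.
    pc = tuple(p - q for p, q in zip(point, planet_center))
    ps = tuple(s - q for s, q in zip(sun_center, planet_center))
    if sum(a * b for a, b in zip(pc, ps)) > 0.0:
        return 1.0
    to_sun = _norm(tuple(s - p for s, p in zip(sun_center, point)))
    to_planet = _norm(tuple(q - p for q, p in zip(planet_center, point)))
    # Axis across the solar disk, lying in the plane that also contains
    # the planet's direction (the "line between those" in the post).
    d = sum(a * b for a, b in zip(to_planet, to_sun))
    perp = tuple(tp - d * ts for tp, ts in zip(to_planet, to_sun))
    n = math.sqrt(sum(c * c for c in perp))
    if n < 1e-9:  # sun dead behind the planet: any perpendicular will do
        helper = (1.0, 0.0, 0.0) if abs(to_sun[0]) < 0.9 else (0.0, 1.0, 0.0)
        perp = _norm(_cross(to_sun, helper))
    else:
        perp = tuple(c / n for c in perp)

    def blocked(t):
        target = tuple(s + e * sun_radius * t for s, e in zip(sun_center, perp))
        ray = _norm(tuple(q - p for q, p in zip(target, point)))
        return _ray_hits_sphere(point, ray, planet_center, planet_radius)

    # Early-out 2: feel both edges of the disk first.
    lo, hi = blocked(-1.0), blocked(1.0)
    if lo and hi:
        return 0.0
    if not lo and not hi:
        return 1.0
    # Partial shadow: walk from the lit edge toward the occluded one.
    # Early-out 3: once inside the silhouette we stay inside, so dump the
    # remaining samples onto the hit total and stop.
    ts = [-1.0 + 2.0 * i / (samples - 1) for i in range(samples)]
    if lo:
        ts.reverse()
    hits = 0
    for idx, t in enumerate(ts):
        if blocked(t):
            hits = len(ts) - idx
            break
    return 1.0 - hits / samples
```

A point deep in the umbra returns 0, an unoccluded point returns 1, and points near the shadow edge land somewhere in between.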

I do enjoy shader work, even if it is hard to debug sometimes. Most of the time.

I fully expect for all this work to turn out to be useless upon discovery of a much better and simpler method.
« Last Edit: December 27, 2019, 03:51:57 PM by NovaSilisko »

nova++
« Reply #39 on: December 27, 2019, 03:48:25 PM »

RING THINGS
EPISODE III

THE THINGS REFUSE TO STOP COMING

Quote from: nova++ (earlier in the thread)
> There's probably a much nicer way to do this when dealing with simple spheres, but my stubborn insistence on using ellipsoidal bodies means I have to go a bit above and beyond...
>
> [...]
>
> I fully expect for all this work to turn out to be useless upon discovery of a much better and simpler method.

So, about that.



ACT I
KNOWING WHEN TO BURN DOWN YOUR CODE FOR THE INSURANCE MONEY

1. I can do it with spheres and it turns out I don't lose much
2. Yep, it did turn out to be useless (ok, not entirely, more on that later)

It works differently now, and far more simply... no more raycasting stuff, no more RealEllipsoids(tm). Instead, all the relevant positions are transformed into a new coordinate system, rescaled so that, to the code, the planet is a simple sphere of radius 1. This enables an extremely simple method of calculating shadows that requires only one calculation, has no stairstepping, and supports the entire shadow range, not just the penumbra/umbra.

All that needs to be done is to get the angular diameters of the sun and planet (how much of the sky each one spans, from the viewpoint of the pixel) and the angular distance between them. From there, the overlap between these two circles (if there is any) is calculated:


Fig. A: a diagram I totally 100% understand, honest

(https://diego.assencio.com/?index=8d6ca3d82151bad815f78addf9b5c1c6)

From the calculated intersect area, we can now determine how much of the sun is covered by the planet. So:

Code:
light = (sunArea - intersectArea) / sunArea
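For reference, the circle-circle intersection ("lens") area from the linked page, plus that light fraction, works out to the following. A sketch with hypothetical names, treating the angular radii as flat circles, which is fine at small angles:

```python
import math

def circle_overlap_area(r1, r2, d):
    """Area of intersection of two circles with radii r1, r2 whose
    centres are distance d apart (the standard lens formula)."""
    if d >= r1 + r2:
        return 0.0                       # disjoint
    if d <= abs(r1 - r2):
        return math.pi * min(r1, r2) ** 2  # one circle inside the other
    a1 = r1 * r1 * math.acos((d * d + r1 * r1 - r2 * r2) / (2 * d * r1))
    a2 = r2 * r2 * math.acos((d * d + r2 * r2 - r1 * r1) / (2 * d * r2))
    tri = 0.5 * math.sqrt((-d + r1 + r2) * (d + r1 - r2)
                          * (d - r1 + r2) * (d + r1 + r2))
    return a1 + a2 - tri

def light_fraction(sun_ang_radius, planet_ang_radius, ang_separation):
    """Fraction of the solar disc left uncovered by the planet."""
    sun_area = math.pi * sun_ang_radius ** 2
    overlap = circle_overlap_area(sun_ang_radius, planet_ang_radius,
                                  ang_separation)
    return (sun_area - overlap) / sun_area
```

No occultation gives 1, a total eclipse gives 0, and a partial overlap gives the smooth in-between values that make the single-calculation shadow work.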

Here's a cross-section of the entire shadow spanning quite a distance behind our test planet:



And properly in action (which looks basically identical to the previous one):





INTERMISSION

...As an aside, this led me to notice an odd property of shadows - or at least of how shadows are commonly rendered - and a weird optical illusion.

Here's another cross section:



Look at the antumbra (the rearmost part, after the umbra [the dark, solid part of the shadow] ends). It has a sort of two-horned look to it, doesn't it? It looks weird.

I really thought I was doing something wrong, but after what felt like hours of tearing my hair out, I finally went with a suggestion I received and rendered a similar scene in Blender.



Damn. Damn. Sure, it's subtler, but it's still there. Either I'm doing it right, or both Blender and I are doing it wrong. But this is where the optical illusion part comes in. If you look at the image as a whole, you can see the "horns" quite clearly, but as you get closer they seem to disappear. Here's another, simpler example.

This is an image of two identical gradients intersecting at 90 degrees:



It looks for all the world to me like there's a dark spike running from the bottom right to the top left, and yet all that happens for any given color is that it goes up and turns right. And again, if you zoom in far enough, it just kind of disappears. I'm curious whether this is some sort of documented optical illusion, and whether it has a name...



ACT III
THE QUEST FOR ACT II AND ALSO MORE SHADOWS

Remember that thing I said I'd get back to later? About the raycasting not being entirely useless? Yeah, that thing. It had one last hurrah as the final piece of my ring puzzle: casting shadows from the rings onto the planet itself.

Which works great, btw. Thanks for asking.



I'm not using the raycast to project the shadow onto the surface per se. I'll work up to that explanation. To start off, though, I need to provide a bit of context on how the ring shadows work, which again is rather simple.

Every pixel of the planet surface knows what direction the light is, so along that line it picks the point that lies on the plane of the rings:



The distance between this point and the center of the planet is then converted into a 0-to-1 range depending on where it lies inside the ring: 0 at the ring's smallest radius, 1 at its largest.



And then that value is simply used to sample the ring texture at that position - specifically, its alpha channel, which represents particle density. Using that alpha value to darken the pixel on the planet, we get a perfect shadow from the ring.
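The whole lookup condenses to a few lines. The ring radii and the 1-D "texture" below are made-up stand-ins, and the ring plane is assumed to be z = 0 through the planet's centre:

```python
import math

RING_INNER = 1.5
RING_OUTER = 2.5
RING_ALPHA = [0.0, 0.3, 0.8, 0.8, 0.4, 0.1, 0.0, 0.0]  # alpha = density

def ring_shadow(surface_point, light_dir):
    """Darkening factor in [0, 1] for a surface point (1 = fully lit).
    light_dir is a vector from the point toward the sun."""
    px, py, pz = surface_point
    dx, dy, dz = light_dir
    if abs(dz) < 1e-9:
        return 1.0      # light runs parallel to the ring plane
    t = -pz / dz        # parameter where the sun line crosses z = 0
    if t <= 0.0:
        return 1.0      # ring plane is behind the point, on the sun side
    rx, ry = px + dx * t, py + dy * t
    r = math.hypot(rx, ry)
    if not (RING_INNER <= r <= RING_OUTER):
        return 1.0      # the sun line misses the ring entirely
    # Map the hit radius to 0..1 across the ring and sample the alpha.
    u = (r - RING_INNER) / (RING_OUTER - RING_INNER)
    alpha = RING_ALPHA[min(int(u * (len(RING_ALPHA) - 1)),
                           len(RING_ALPHA) - 1)]
    return 1.0 - alpha
```

The penumbral version described next is then just this function evaluated for several directions scattered across the solar disc and averaged.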

But that's only part of it. The way I just described will, as mentioned, get you a perfect projection of the shadow onto the surface, but that's not what it would really look like.

Just like the planet's shadows on the ring, the ring's shadow on the planet is softened because the sun has some radius to it; it has its own penumbral effects, just as I described in my earlier post. To handle that, I used a cousin of the raycasting shadow technique: the ring-texture sampling I just described is repeated for a series of points scattered across the solar disk, and all the samples are averaged together, producing a blur that depends on how large the solar disk is from that point:







Fuzzy! All is well.



« Last Edit: December 27, 2019, 03:57:28 PM by NovaSilisko »
