Author Topic: [SOLVED] Help! Lit normals divided by horizontal line (GLSL)?
oahda
« on: December 22, 2015, 06:40:44 PM »

What the heck?

Here's a scene with a normal map. The red dot is the origin of a point light with infinite attenuation. It should light stuff up based on a normal map in every direction from this origin.

For some reason I get this weird diagonal dividing line through the light source. The more I increase the z position of the light, the more this is fixed, but that also removes all the nice, hard lights present in the above example, which is not desirable. And of course I shouldn't have to do this anyway, eh?

So what's going on here?

I'm using the default calculation N•L, which looks like this in the fragment shader:

Code:
clamp(dot(normal, normalize(vec3(gl_FragCoord.xy - posLight, 0))), 0, 1)

Nothing wrong whatsoever here as far as I can tell, and it matches up with other code I've double-checked against.

The diagonal line is rather reminiscent of what can be done with a dot product, so I wonder if that has anything to do with it...

Is this not how I should light a 2D scene with normal maps?

Nothing wrong with the normal map itself, because I get this even when I make the normal map from a flat plane facing the camera.

HELP

Looking into it, the distance of the diagonal from the position of the light (red dot) is apparently directly linked to the z position of the light. If I change z to 100 instead of 0 as above, I get this:



And sure, I could just set it to the radius of the light and it'd match up perfectly, except the intensity of the light also increases as the z value does, making all of the highlights versus shadows blur into one big light that overexposes the entire image. So that's a hacky solution at best.
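Just to illustrate why raising z washes everything out (a rough sketch of the maths, not my actual code):

Code:
// as the light's z grows, the normalized light vector tilts toward +z,
// so its dot with a flat (0, 0, 1) normal heads toward 1 wherever the
// xy offset from the light is small compared to z
vec3 L = normalize(vec3(gl_FragCoord.xy - posLight, 100.0)); // z raised to 100
float ndl = dot(vec3(0.0, 0.0, 1.0), L); // = 100 / length(vec3(offset, 100.0))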

WHERE DID I GO WRONG Cry
« Last Edit: January 16, 2016, 03:58:47 AM by Prinsessa »

oahda
« Reply #1 on: December 23, 2015, 04:46:43 AM »

This is the closest problem I've been able to find, but it's not my problem; I'm just posting it here so that nobody else suggests it: http://gamedev.stackexchange.com/questions/98782/proper-normal-vector-transformations-in-normal-mapping

All my textures are RGBA and there's nothing wrong with the normal map. It gets lit from the right angles and everything. It's just this weird dividing diagonal that I can't understand...

dlan
« Reply #2 on: December 23, 2015, 05:30:53 AM »

Could you post the rest of your shader? I don't see anything wrong with the N•L calculation itself; it could be a mismatch with the input parameters. Since you are using gl_FragCoord directly, I suppose your posLight parameter is bound to screen space.

Anyway, I used the following shader for my own 2D normal mapping: https://github.com/mattdesl/lwjgl-basics/wiki/ShaderLesson6#FragmentShader (the full GLSL code is in the Fragment Shader section, with an explanation for each line of the shader). It could help you isolate the bug in your code.
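Just to illustrate what I mean by "bound to screen space" (the uniform names here are only a guess at your setup):

Code:
uniform vec2 posLight; // must be in the same units as gl_FragCoord.xy, i.e. pixels

// ...
vec2 diff = gl_FragCoord.xy - posLight; // only meaningful if both are in pixels
// if posLight were normalized to [0, 1] instead, it would need scaling first:
// vec2 diff = gl_FragCoord.xy - posLight * resolution;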

oahda
« Reply #3 on: December 23, 2015, 05:42:45 AM »

Yeah, I've read that tutorial too. I actually used the texture and normal map there temporarily for my wall until I made my own the other day. So I'm doing the * 2 - 1 and everything — again, my normals are most likely fine and not the problem. Shrug

There aren't really any other parameters, TBH. I'm just testing right now, so the light's position and everything is hard-coded into the shader ATM. Normally I use gl_FragCoord for the things to light and then calculate the position of the light into the same space by taking the camera's position into account and so on, but for now it's just hard-coded to a specific position on the screen.

The normal variable just reads the RGB value from the normal buffer at the correct UV position (I render all normal maps into their correct position on the screen into a separate buffer from the colour buffer before I move on to the final pass where I apply lights — this is a deferred rendering system) and then normal = normalize(normal * 2 - 1), like I said.

That's it. The code is literally just this (except variable names are a little different; changing them for readability here):

Code:
vec3 normal = texture(normalmap, gl_FragCoord.xy / resolution).rgb;
normal = normalize(normal * 2 - 1);

vec3 light = normalize(vec3(gl_FragCoord.xy - posLight, 0));

result = clamp(dot(normal, light), 0, 1);

It almost feels like the coördinate system is messed up or something (I did export the normals from Blender, which is quite notorious for its swapped y and z semantics), but changing things around, like using normal.xzy instead, doesn't help. It just cuts the light off in a different position. So that's not it either.

But then we come back to the fact that the stuff that does get lit does get lit from the right angle, so it'd be odd if the coördinates had anything to do with it, as more problems would probably arise from that. So I don't think that's it either...

I'm still feeling somewhat suspicious about the dot product since it's a diagonal line...
« Last Edit: December 23, 2015, 06:02:43 AM by Prinsessa »

dlan
« Reply #4 on: December 23, 2015, 09:15:09 AM »

just a detail, but instead of

Code:
gl_FragCoord.xy - posLight

shouldn't it be

Code:
posLight - gl_FragCoord.xy

oahda
« Reply #5 on: December 23, 2015, 09:26:28 AM »

No, that results in inverted normals.

Polly
« Reply #6 on: December 23, 2015, 10:18:13 AM »

Can you post a screenshot of what your normal buffer / pass looks like? I suspect something might be wrong with your tangent conversion.

BorisTheBrave
« Reply #7 on: December 23, 2015, 10:43:36 AM »

Quote
directly linked to the z position of the light
huh? your code implies posLight is a vec2?


Also, how did you set up normalmap? You may want to try
Code:
normal = normalize(normal * 2 - 256);
in case you are using a non-floating point texture.

oahda
« Reply #8 on: December 23, 2015, 01:38:56 PM »

Quote from: BorisTheBrave
Quote
directly linked to the z position of the light
huh? your code implies posLight is a vec2?
It is, but I add a z value myself. The word vec3 is right there in the code.

Quote from: BorisTheBrave
Also, how did you set up normalmap? You may want to try
Code:
normal = normalize(normal * 2 - 256);
in case you are using a non-floating point texture.
Nope, they're all clamped between 0 and 1.

Quote
Can you post a screenshot of what your normal buffer / pass looks like? I suspect something might be wrong with your tangent conversion.
I just draw the normal maps the way they are on file into the places where the objects are. It's like the colour buffer, but only with the normals.

These are some older pictures I already had lying around in my devlog, so they're not a perfect match with the above images (old test graphics), but it works in the same way.



So it's exactly the same as if I'd had a forward rendering system and had read each normal map from a sampler of the raw texture and not from a buffer.

Black simply means "has no normal map; ignore the calculation for this fragment", so there's an if (normal.x != 0 && normal.y != 0 && normal.z != 0) before I do the stuff.

Pit
« Reply #9 on: December 24, 2015, 06:47:29 PM »

Quote
Nothing wrong with the normal map itself, because I get this even when I make the normal map from a flat plane facing the camera.

That's kinda weird. I suppose your coordinate system is set up so that a normal map "facing the camera" means all normals equal (0,0,1)? In that case the dot product should always be exactly zero, since you set the z-component of the other vector to 0.

If the line still appears, there's either a problem somewhere else in the shader (unlikely, according to what you've said about it) or there's indeed something wrong with the normals. In the screenshot it almost looks as if they don't mainly face towards the camera as they should, but towards the bottom-left corner (thus leading to the diagonal, since the top-right part would face away from the light). One scenario where I can imagine this happening is if, when writing the normals, you don't do the (x+1)/2 thing and an x value of 0 ends up as a 0 in the buffer. Then when you do the x*2-1 thing you end up with a -1. Although if all you do is simply copy the normals from the texture into the buffer, it doesn't sound like a lot could go wrong.
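To spell out that scenario (just a sketch of the failure mode, not a claim about your code):

Code:
// a flat normal (0, 0, 1) written to the buffer without the (x+1)/2 encode...
vec3 stored  = vec3(0.0, 0.0, 1.0);
// ...still gets decoded with * 2 - 1 in the lighting pass:
vec3 decoded = stored * 2.0 - 1.0; // = (-1, -1, 1), i.e. facing the bottom-left corner
// everything up and to the right of the light then fails the N•L test, giving a diagonal cut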

So, can you make a screenshot of the scene from the first post where you write the final normal values into the color buffer?
i.e. gl_FragColor = vec4(normal, 1.0)?

(If what I write here is complete bullshit, I blame the fact that it's almost 4am xD )
« Last Edit: December 24, 2015, 06:57:45 PM by Pit »

joseph ¯\_(ツ)_/¯
« Reply #10 on: December 27, 2015, 10:21:18 AM »

Some naive artist-who-doesn't-write-shaders input in case it helps. I could be off base here, just want to share a different perspective:

This looks basically how I expect it should without any information from the 'surface' the normals are projected onto.

On 3D models with normal maps, the map normals are combined in some way (multiplied, I suppose?) with the low-poly geometry's normals in world space, so that in addition to the information in the normal map (how is this pixel angled relative to this surface?) you are also lighting based on the information from the geometry (where is this vertex in space, and how is it angled?).

It looks like you're either not doing whatever the equivalent of this step is, or stomping over it. How did you light this surface _before_ you added normal maps? Is that information still being used in your shader?
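(Something like this is what I understand the step to be in 3D; the names are made up and I'm not a shader person, so take it as a sketch that may not map one-to-one onto a 2D deferred setup:)

Code:
// combine the tangent-space normal from the map with the geometry's own basis
vec3 mapN = texture(normalMap, uv).xyz * 2.0 - 1.0;
mat3 TBN  = mat3(tangent, bitangent, geometryNormal); // per-vertex basis from the mesh
vec3 n    = normalize(TBN * mapN);                    // now in the same space as the light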

oahda
« Reply #11 on: January 10, 2016, 06:16:35 AM »

Yo, sorry for taking so long to get back to this. Had a bit of a timeout over the holidays.

Not sure if any of you are really on to it, I'm afraid, but someone on Twitter apparently managed to recreate the problem, albeit by doing things that I don't seem to have done... Here's the entire conversation, in case anybody can get something out of it...

Note that the tweets containing my name (@avaskoog) are replies to me and not posts by me, and vice versa for their name (@tuan_kuranes).



@avaskoog looks like result lightdir is toward Bottom Left. light dir and normal in same space/orientation? Check  https://www.shadertoy.com/view/4ljSRw



@tuan_kuranes Should differ for each fragment. If light is at (0,0) and I check frag to left it might be (-1,0)-(0,0), to right (1,0)-(0,0)…

@tuan_kuranes The light itself has no direction. I'm only taking the vector between the current fragment and the position of the light.

@tuan_kuranes So the light should effectively just be a big, infinite sphere (tho I cut it off by fading it out within a radius).

@tuan_kuranes (and that "cutting off" is irrelevant bc I get this problem even when I don't do that)



@avaskoog sorry wasn't clear, lightdir  not as in a directionnal light VS point light, but the resulting vec L used in the NdotL equation



@tuan_kuranes Yeah. Should differ with every fragment still, and as you can see _some_ very bright parts still light up in the darkness…



@avaskoog according to NdL theory it really looks like a light dir pointing to BL. Normal is Z up ? screen is lightPos(0.5,0.5) ?



@tuan_kuranes Sorta. ATM exact normals differ for each px but I tried w completely flat normal map facing cam too and got the same problem.



@avaskoog made this  https://www.shadertoy.com/view/Xsd3Rj  using your code, everything seemed ok. So input the culprit (normal or lightpos)



@tuan_kuranes Hm. Sad I'm stumped. Thanks for that effort tho! I wonder what it could be...



@avaskoog multiply your poslight by screensize ( or divide glfragcoord by screensize)

@avaskoog i reproduce same bug on shadertoy if I use lightpos in 0,1 instead of 0,screensize



@tuan_kuranes Lightpos is of course in screen size range as well and not within [0, 1), so should that matter? Plus I normalise the diff.

@tuan_kuranes Lightpos is currently (910, 490) and screen is 1080p. Fragments are within that range as well. How can that be wrong? s:

@tuan_kuranes I think I misunderstood your wording a bit, but yeah. I'm working in screen size range coords.

@tuan_kuranes Tried doing what you said anyway tho and it didn't help so ya. Sad



@avaskoog strange. Made https://www.shadertoy.com/view/Xsd3Rj  with colored debug to diff with your renders, could help find faulty inputs



@tuan_kuranes Yeah, you definitely seem to be on to *something* since you managed to reproduce it. I'll look at that. Thanks again!
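(For reference, the mismatch they reproduced on Shadertoy seems to boil down to something like this, even though my light is already in pixel coordinates; names assumed:)

Code:
vec2 fragPix   = gl_FragCoord.xy;        // pixels, e.g. up to 1920x1080
vec2 lightNorm = vec2(0.47, 0.45);       // light accidentally left in [0, 1]
vec2 L         = fragPix - lightNorm;    // ~ fragPix, so the direction barely depends on the light
// consistent version: put both in the same units first
vec2 lightPix  = lightNorm * resolution;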

oahda
« Reply #12 on: January 10, 2016, 06:19:39 AM »

I'm even failing to reproduce it the way they managed to reproduce it, by modifying stuff at their link... I have no experience with Shadertoy at all...

Polly
« Reply #13 on: January 10, 2016, 07:45:57 AM »

Maybe you should / could post your entire fragment shader; it's a little difficult to guess what you've done wrong from just the snippets you've provided Wink

+ Another guess, perhaps you're not using your position buffer properly? Or in case you don't have / need one ( since it's a 2D game ), perhaps you're doing something wrong calculating the position of fragments?

Sik
« Reply #14 on: January 10, 2016, 07:43:57 PM »

Quote
It is, but I add a z value myself. The word vec3 is right there in the code.

Had to look at this again: you're hardcoding the Z value to 0, then using normalize to make up for it. Wouldn't this result in the X and Y values pointing in a different direction? Last I checked, when you recreate Z from X and Y, you have to explicitly compute it.

I'm not sure if this is the real issue (in fact I'm not even sure it'd give that kind of bug) but you should really look again at how you're handling this.
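(A sketch of the reconstruction I mean, names made up; it assumes the vector was unit length before its z was dropped, which may or may not apply here:)

Code:
vec2  xy = v.xy;
float z  = sqrt(max(0.0, 1.0 - dot(xy, xy))); // recover z so the vector is unit length again
vec3  L  = vec3(xy, z);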

oahda
« Reply #15 on: January 10, 2016, 10:04:31 PM »

I meant the z for the position of the light, not the z for the normals — that one is baked into the normal maps.

Quote
+ Another guess, perhaps you're not using your position buffer properly? Or in case you don't have / need one ( since it's a 2D game ), perhaps you're doing something wrong calculating the position of fragments?
Right now the position of the light is simply hard coded into the shader. I explained how I get the normal values above (look for the buffer images).

Polly
« Reply #16 on: January 11, 2016, 06:17:38 AM »

Alright, not quite sure what you're doing wrong .. but here's a simple example ( using an example diffuse & normal pass from a 2D game )



Vertex Shader ( full-screen quad )

Code:
uniform vec2 mouse;
varying vec3 lightVec;

void main()
{
  // Mouse controlled point-light
  lightVec = (vec3(mouse.x, mouse.y, 1.0) - gl_Vertex.xyz) * vec3(16.0 / 9.0, 1.0, 1.0);

  gl_TexCoord[0] = gl_MultiTexCoord0;
  gl_Position = gl_Vertex;
}

Fragment Shader

Code:
uniform sampler2D diffuse; // Diffuse pass
uniform sampler2D normal; // Normal pass

varying vec3 lightVec;

void main()
{
  vec4 d = texture2D(diffuse, gl_TexCoord[0].st);
  vec3 l = lightVec * inversesqrt(dot(lightVec, lightVec));
  vec3 n = normalize(texture2D(normal, gl_TexCoord[0].st).xyz * 2.0 - 1.0);

  gl_FragColor = d * dot(l, n);
}

Pit
« Reply #17 on: January 11, 2016, 11:24:02 AM »

If you want a temporary fix for the problem, you might try simply adding vec3(0.5) (or vec3(0.5,0.5,0.0)) to the normal; that might give you the correct result xD

Code:
clamp(dot(normal+vec3(0.5), normalize(vec3(gl_FragCoord.xy - posLight, 0))), 0, 1)

In the shadertoy thing you can reproduce the diagonal line by subtracting that from the normals (i.e. making them face to the bottom left)

Code:
float ndl = clamp(dot(lightDir.xyz, normal.xyz-vec3(0.5)), 0., 1.);

oahda
« Reply #18 on: January 12, 2016, 02:09:57 AM »

What's with the inverse square root, Polly? And why multiply it by the light vector?

That would almost seem to imply that I'd forgotten the * 2 - 1, Pit, but we've already established that I haven't. :c Might try and see what happens, tho...

EDIT:
Tried it, and the interesting result is that it "rotates the diagonal"...

« Last Edit: January 12, 2016, 02:27:35 AM by Prinsessa »

Polly
« Reply #19 on: January 12, 2016, 04:34:59 AM »

Quote
What's with the inverse square root, Polly? And why multiply it by the light vector?

I could / should have simply replaced that with "vec3 l = normalize(lightVec)" .. same result Wink
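(i.e. the two forms compute the same unit vector:)

Code:
vec3 l1 = normalize(lightVec);
vec3 l2 = lightVec * inversesqrt(dot(lightVec, lightVec)); // identical to l1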

In any case, if you tried copy-pasting the vertex & fragment shader and still got weird results, I really have no idea what's going on. I've PM'ed you the example ( binary & source ); perhaps you could check if that runs properly on your system.