InfiniteStateMachine
|
|
« Reply #4060 on: May 31, 2015, 04:13:57 PM » |
|
What happens if you try to run that executable on fewer cores?
It doesn't affect the executable itself, only the Make process. So for me, it will kick off up to 8 concurrent compilations instead of having to do them all serially. Surprising that it defaults to one core.
|
|
|
Logged
|
|
|
|
Whiteclaws
|
|
« Reply #4061 on: May 31, 2015, 06:17:49 PM » |
|
I implemented Simplex Noise all by myself, like a big boi
|
|
|
Logged
|
|
|
|
Garthy
|
|
« Reply #4062 on: May 31, 2015, 07:06:49 PM » |
|
What happens if you try to run that executable on fewer cores?
It doesn't affect the executable itself, only the Make process. So for me, it will kick off up to 8 concurrent compilations instead of having to do them all serially. Surprising that it defaults to one core. Probably because it is the safest option. Make and Makefiles have been around a very, very long time. The Makefiles for some projects do not parallelise well- usually because they were written when parallel builds were not a viable or tested option; or the build process for a project is complicated enough that the true set of parallel-safe dependencies are not (or cannot be) properly expressed by the Makefile. Builds with such Makefiles will sometimes work, and sometimes not, with results changing based on how long certain tasks take, what needs to be rebuilt, and so forth. Unless you know that the generated Makefile is safe for a parallel run, running in parallel may not be a good idea. You can use the MAKEFLAGS environment variable to make a certain "-j" option the default to ensure default parallel builds. I think I even did this myself a bit back for a few days until I discovered how many things I was using with non-parallel-safe Makefiles, and how much of a pain it was to remember to turn the option off versus explicitly turning it on when I knew it was safe.
|
|
|
Logged
|
|
|
|
InfiniteStateMachine
|
|
« Reply #4063 on: June 01, 2015, 04:16:45 PM » |
|
What happens if you try to run that executable on fewer cores?
It doesn't affect the executable itself, only the Make process. So for me, it will kick off up to 8 concurrent compilations instead of having to do them all serially. Surprising that it defaults to one core. Probably because it is the safest option. Make and Makefiles have been around a very, very long time. The Makefiles for some projects do not parallelise well- usually because they were written when parallel builds were not a viable or tested option; or the build process for a project is complicated enough that the true set of parallel-safe dependencies are not (or cannot be) properly expressed by the Makefile. Builds with such Makefiles will sometimes work, and sometimes not, with results changing based on how long certain tasks take, what needs to be rebuilt, and so forth. Unless you know that the generated Makefile is safe for a parallel run, running in parallel may not be a good idea. You can use the MAKEFLAGS environment variable to make a certain "-j" option the default to ensure default parallel builds. I think I even did this myself a bit back for a few days until I discovered how many things I was using with non-parallel-safe Makefiles, and how much of a pain it was to remember to turn the option off versus explicitly turning it on when I knew it was safe. Ah that makes sense, it is quite an old system and certainly predates the advent of parallel processing. I rarely use make these days, if at all. Most of the more modern build systems i've used default to max threads unless specified.
|
|
|
Logged
|
|
|
|
Code_Assassin
|
|
« Reply #4064 on: June 02, 2015, 02:13:26 AM » |
|
I just want to say that shapes are awesome, and if you don't agree - go home.
|
|
|
Logged
|
|
|
|
oahda
|
|
« Reply #4065 on: June 02, 2015, 03:04:43 AM » |
|
I am very much in love with geometry indeed.
|
|
|
Logged
|
|
|
|
Boreal
Level 6
Reinventing the wheel
|
|
« Reply #4066 on: June 04, 2015, 11:59:26 AM » |
|
Finally making some headway on my Haskell process/shader/whatever EDSL stuff. Implemented a suite of functions for going from rose trees to directed acyclic graphs and back; Haskell makes trees extremely easy, but graphs are more difficult. I also stopped trying to find a generic solution to register allocation and realized it was tightly coupled to the compilation step, so you get something like this:
Mantle IL (minimal register usage): Expression tree -> Allocate registers with reuse -> Compress to DAG -> Compile -> GPU bytecode
Vulkan SPIR-V (LLVM, single static assignment): Expression tree -> Compress to DAG -> Allocate registers uniquely -> Compile -> LLVM bytecode
|
|
|
Logged
|
|
|
|
Boreal
Level 6
Reinventing the wheel
|
|
« Reply #4067 on: June 07, 2015, 11:41:53 AM » |
|
Refactoring has never been so beautiful as it is in Haskell. I still think I can simplify the `forestToDagM` monadic function a little more using `mapM`, though.
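Boreal's `forestToDagM` isn't shown in the thread, so this is only a generic sketch of the same idea: hash-consing a rose tree into a DAG with a State monad, where identical subtrees collapse into one shared node. All names here (`Tree`, `Dag`, `intern`) are made up for illustration:

```haskell
import qualified Data.Map.Strict as Map
import Control.Monad.State

-- A rose tree with string labels (a stand-in for the real node type).
data Tree = Node String [Tree] deriving (Eq, Ord)

-- The DAG is just a map from each distinct subtree to its node id.
type Dag = Map.Map Tree Int

-- Intern a subtree: identical subtrees get the same id, which is
-- exactly what "compress to DAG" means.
intern :: Tree -> State (Dag, Int) Int
intern t@(Node _ kids) = do
  mapM_ intern kids                    -- intern the forest of children first
  (dag, next) <- get
  case Map.lookup t dag of
    Just i  -> pure i                  -- shared subtree: reuse its node
    Nothing -> do
      put (Map.insert t next dag, next + 1)
      pure next

main :: IO ()
main = do
  let shared = Node "x" []
      tree   = Node "root" [shared, shared, Node "y" [shared]]
      (_, (_, n)) = runState (intern tree) (Map.empty, 0)
  print n  -- 3 distinct nodes, although the tree has 5
```

The `mapM_` over the children is where a `mapM`-style cleanup naturally lives: the recursion over the forest is just a monadic map.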
|
|
« Last Edit: June 07, 2015, 11:50:27 AM by Boreal »
|
Logged
|
|
|
|
Code_Assassin
|
|
« Reply #4068 on: June 07, 2015, 12:35:29 PM » |
|
snip
what is this lol
|
|
|
Logged
|
|
|
|
Boreal
Level 6
Reinventing the wheel
|
|
« Reply #4069 on: June 07, 2015, 01:15:45 PM » |
|
|
|
|
Logged
|
|
|
|
Layl
|
|
« Reply #4070 on: June 07, 2015, 01:41:28 PM » |
|
Monads are the thing I never knew I wanted. Rust Result<_, _>, Option<_> and try!() <3
|
|
|
Logged
|
|
|
|
Boreal
Level 6
Reinventing the wheel
|
|
« Reply #4071 on: June 07, 2015, 01:58:03 PM » |
|
They're also great for asynchronous calls.
For instance, C++17 has a proposal to make `std::future` monadic by giving it a `next()` method that acts like monadic bind.
Add applicatives (wait for multiple futures) and monoids (wait for fastest future) on top of that, and futures become very powerful.
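The C++ proposal itself isn't shown here, but the idea translates directly to a toy Haskell future built on an MVar. The names `Future`, `async`, `await` and `andThen` are invented for this sketch (the real-world Haskell equivalent is the `async` package); `andThen` plays the role of the bind-like `next()` above:

```haskell
import Control.Concurrent
import Control.Concurrent.MVar

-- A minimal future: a computation running in another thread,
-- whose result can be awaited once it is ready.
newtype Future a = Future (MVar a)

-- Start a computation asynchronously.
async :: IO a -> IO (Future a)
async act = do
  box <- newEmptyMVar
  _ <- forkIO (act >>= putMVar box)
  pure (Future box)

-- Block until the result is available (readMVar does not empty the box,
-- so a future can be awaited more than once).
await :: Future a -> IO a
await (Future box) = readMVar box

-- Bind-like chaining: run the next step when the first future completes,
-- without blocking the caller.
andThen :: Future a -> (a -> IO b) -> IO (Future b)
andThen f k = async (await f >>= k)

main :: IO ()
main = do
  f <- async (pure (6 * 7))
  g <- f `andThen` (\n -> pure (n + 1))
  await g >>= print   -- prints 43
```

The applicative and monoid extensions mentioned above would sit on top of this: waiting on several futures at once, or racing them and keeping the fastest.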
|
|
|
Logged
|
|
|
|
diegzumillo
|
|
« Reply #4072 on: June 07, 2015, 03:40:21 PM » |
|
Might not seem too exciting, but these are my first steps in shader programming and I'm super excited! I'm following Unity Cookie's tutorial, and during the lambert shader tutorial I got bored of what we were doing and decided to see if I could do something interesting. So I made a little fake subsurface light-scattering effect. Shaders have always been black magic to me. I'm happy to see this thing unfolding.
|
|
|
Logged
|
|
|
|
gimymblert
|
|
« Reply #4073 on: June 07, 2015, 03:51:49 PM » |
|
what did you use to fake it?
|
|
|
Logged
|
|
|
|
diegzumillo
|
|
« Reply #4074 on: June 07, 2015, 03:55:53 PM » |
|
At the intersection between light and shadow I added color. Then you have to adjust the colors and parameters in the material to get the right look.

float3 sssDir = _SSSStrength * _SSS.rgb * pow(1 - abs(dot(normalDirection, lightDirection)), _SSSSharpness);

edit: Just noticed that using the cross product is a bit smarter
|
|
« Last Edit: June 07, 2015, 04:01:58 PM by diegzumillo »
|
Logged
|
|
|
|
gimymblert
|
|
« Reply #4075 on: June 07, 2015, 04:25:32 PM » |
|
oh okay, that makes sense. That's very smart, you might be onto something! Try varying the sharpness based on the mesh curvature, and try adding a slight blue tint (or any color; blue is for skin) to the light as the SSS spreads!
|
|
|
Logged
|
|
|
|
ProgramGamer
|
|
« Reply #4076 on: June 07, 2015, 06:49:29 PM » |
|
I need to learn shaders.
|
|
|
Logged
|
|
|
|
diegzumillo
|
|
« Reply #4077 on: June 07, 2015, 08:14:51 PM » |
|
oh okay, that makes sense. That's very smart, you might be onto something! Try varying the sharpness based on the mesh curvature, and try adding a slight blue tint (or any color; blue is for skin) to the light as the SSS spreads!

I just know the math; I'm an educated idiot :D But I'll definitely come back to this shader project. For now I still have lots of stuff to learn. And I just spent 3 hours trying to find the dumbest bug in the world. Jesus, these shader fellas are sensitive.
|
|
|
Logged
|
|
|
|
oahda
|
|
« Reply #4078 on: June 07, 2015, 11:54:12 PM » |
|
Dag and nat mean day and night in Danish. wat r u do

Also, what's std::future? Never heard of it. .-.
|
|
|
Logged
|
|
|
|
gimymblert
|
|
« Reply #4079 on: June 08, 2015, 08:47:45 AM » |
|
oh okay, that makes sense. That's very smart, you might be onto something! Try varying the sharpness based on the mesh curvature, and try adding a slight blue tint (or any color; blue is for skin) to the light as the SSS spreads!

I just know the math; I'm an educated idiot :D But I'll definitely come back to this shader project. For now I still have lots of stuff to learn. And I just spent 3 hours trying to find the dumbest bug in the world. Jesus, these shader fellas are sensitive.

To be frank, I don't know much about shader languages either. I focus on the big concepts and leave the confusing parts intact: I generally modify an existing shader used as a template and focus only on the relevant parts, even though the shader ends up completely different. I can make shaders for light and texture effects, but AO / post-processing? Those are still confusing, and will be until I have a need to look at them. Basically I hijack shader code, look at the input (textures and variables) and the output (the returned color), and then modify the inside so that the output is what I want.
|
|
|
Logged
|
|
|
|
|