working on something. i've been keeping it kinda hush-hush about what it actually is, but it will be out in the wild very soon.
also, here's something separate from that project. i'm trying to use depth buffers and vertex displacement to map points to geometry, but i'm not having much luck getting them to fit the actual geometry, unfortunately. i'm using an orthographic camera's depth buffer here; perspective just produces even more issues. i know there's a lot i should take into account, but honestly i can't think of where to start.
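for reference, this is roughly the kind of unprojection i'm attempting, as a rough Unity C# sketch (the class and parameter names are just placeholders, and it assumes a linear 0-1 depth value that isn't reversed):

```csharp
using UnityEngine;

// Sketch: reconstruct a world-space point from an orthographic camera's depth buffer.
// Assumes depth01 is a linear [0,1] depth sample at normalized screen coords (u, v).
public static class OrthoDepthUnproject
{
    public static Vector3 DepthToWorld(Camera cam, float u, float v, float depth01)
    {
        // For an orthographic camera, depth is linear between the near and far planes.
        float eyeDepth = Mathf.Lerp(cam.nearClipPlane, cam.farClipPlane, depth01);

        // Half-extents of the orthographic view volume.
        float halfHeight = cam.orthographicSize;
        float halfWidth  = halfHeight * cam.aspect;

        // Map (u, v) in [0,1] to offsets along the camera's right/up axes.
        float x = (u - 0.5f) * 2f * halfWidth;
        float y = (v - 0.5f) * 2f * halfHeight;

        Transform t = cam.transform;
        return t.position + t.right * x + t.up * y + t.forward * eyeDepth;
    }
}
```

the big caveat is that on a lot of platforms the depth buffer is reversed (1 at the near plane), so the sample may need flipping, and perspective needs the non-linear conversion (LinearEyeDepth) plus a proper unprojection, which is probably where the extra issues are coming from.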
been working a lot on this lately. i rewrote the whole rendering system and added distance-based interpolation, so the beam always moves at a constant speed; the resulting image is a lot sharper and the refresh rate is up significantly. i've also been messing with strokes and such to see if i can make it good enough for actual drawing.
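the core of the distance-based interpolation is just resampling the path at equal arc-length steps, something like this (Unity C# sketch, names made up, not the exact code i'm running):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of distance-based resampling: walk the path and emit a point every
// `step` units of arc length, so the beam covers the same distance per sample
// no matter how unevenly the original points are spaced.
public static class BeamResampler
{
    public static List<Vector2> ResampleByDistance(IList<Vector2> path, float step)
    {
        var result = new List<Vector2>();
        if (path.Count == 0 || step <= 0f) return result;

        result.Add(path[0]);
        float carried = 0f; // distance already traveled into the current segment

        for (int i = 1; i < path.Count; i++)
        {
            Vector2 a = path[i - 1];
            Vector2 b = path[i];
            float segLen = Vector2.Distance(a, b);
            if (segLen <= Mathf.Epsilon) continue;

            // First sample in this segment lands `step - carried` in from its start.
            float d = step - carried;
            while (d <= segLen)
            {
                result.Add(Vector2.Lerp(a, b, d / segLen));
                d += step;
            }
            carried = segLen - (d - step); // leftover distance carried into the next segment
        }
        return result;
    }
}
```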
once i finish up work on the new version of electricanvas, which involves proper scope simulation and a vastly improved GUI, i'll be getting back to the 3D work, now that the rendering system has gotten so much more optimized.
i've also been studying shaders a lot lately and really want to find a better way to integrate them with the renderer, possibly rendering the audio within the GPU/shader itself so i can access absurdly complex mesh data and sweep through it without slowdown. i've seen WebGL examples that do audio rendering in roughly the same way, but i still have no idea if shaderlab shaders can even write to audio channels in Unity, and i haven't seen any examples that do it.
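the closest route i can think of isn't a shaderlab shader writing to audio directly, but a compute shader filling a buffer that gets read back on the CPU and fed into OnAudioFilterRead. totally untested sketch; the compute shader asset and its kernel/buffer names are placeholders:

```csharp
using UnityEngine;

// Untested sketch: a compute shader fills a buffer with beam samples (interleaved
// x/y pairs swept over mesh data), the CPU reads it back, and the audio thread
// copies it into the output in OnAudioFilterRead.
public class GpuBeamAudio : MonoBehaviour
{
    public ComputeShader beamCompute;          // placeholder compute shader asset
    const int SampleCount = 4096;

    ComputeBuffer sampleBuffer;
    float[] samples = new float[SampleCount * 2]; // interleaved x, y
    readonly object lockObj = new object();

    void Start()
    {
        sampleBuffer = new ComputeBuffer(SampleCount, sizeof(float) * 2);
    }

    void Update()
    {
        int kernel = beamCompute.FindKernel("CSMain");   // placeholder kernel name
        beamCompute.SetBuffer(kernel, "_Samples", sampleBuffer);
        beamCompute.Dispatch(kernel, SampleCount / 64, 1, 1); // assumes numthreads(64,1,1)

        // Synchronous readback; fine for a sketch, but it stalls the main thread.
        lock (lockObj) sampleBuffer.GetData(samples);
    }

    // Runs on the audio thread: hand the GPU-generated samples to the output.
    void OnAudioFilterRead(float[] data, int channels)
    {
        lock (lockObj)
        {
            for (int i = 0; i < data.Length && i < samples.Length; i++)
                data[i] = samples[i];
        }
    }

    void OnDestroy() => sampleBuffer?.Release();
}
```

if this pans out, AsyncGPUReadback would probably be the better way to get the samples back without stalling the main thread every frame.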
That's some pretty dope stuff – especially the first one; reminds me of the PS1 days.
aaa thank you! i actually got the idea to try it from looking into PS1 stuff, since the precision it worked with was rather low. i did a few more experiments today.