TIGSource Forums » Community » DevLogs » My graphics journey with the N64
Author Topic: My graphics journey with the N64  (Read 13738 times)
deadcast
« on: March 04, 2015, 01:59:14 PM »



Hello. My name is deadcast64 and I'm working on building a 3D engine for a future game with no release date. I probably won't succeed at this project, but you've got to live for something. Right now I'm trying to get a triangle to render, but I really suck at the C language. ;-; So far I can draw a line. :>>>>> I'm building my engine on top of the https://github.com/DragonMinded/libdragon library, which basically lets me plot pixels, draw lines, handle controllers, and play audio. Very lightweight, with no dependencies on any proprietary libs. :D
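For anyone curious what the line drawing boils down to, here's a minimal sketch of a classic Bresenham routine in plain C, writing into a stand-in frame buffer. The buffer and names are mine for illustration, not libdragon's API:

```c
#include <assert.h>
#include <stdint.h>
#include <stdlib.h>

#define FB_W 32
#define FB_H 32
static uint8_t fb[FB_H][FB_W];  /* stand-in for the real frame buffer */

/* Classic integer Bresenham: step along the line, accumulating an error
   term to decide when to advance on each axis. */
void draw_line(int x0, int y0, int x1, int y1)
{
    int dx = abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;
    for (;;) {
        fb[y0][x0] = 1;                      /* plot current pixel */
        if (x0 == x1 && y0 == y1) break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }
        if (e2 <= dx) { err += dx; y0 += sy; }
    }
}
```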

Here are my machine specs if you're interested:

NEC VR4300 64-bit
Reality Coprocessor
4 MB RDRAM

Thank you for reading.
« Last Edit: February 12, 2017, 11:02:45 AM by deadcast »
deadcast
« Reply #1 on: March 05, 2015, 10:36:40 AM »



heh my first seg fault. there will be many many many more of these
deadcast
« Reply #2 on: March 05, 2015, 05:13:07 PM »



helllllo. working on triangle rasterization; so far I can draw a flat-top triangle. next I'll program flat-bottom triangle rendering, so that any sort of triangle can be rendered. I've also been writing unit tests for all the functions so far, so I don't always need to run the program in an emulator.
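For reference, the flat-top case reduces to walking two edges down one scanline at a time; here's a rough host-testable C sketch (the buffer and names are mine, not the project's actual code):

```c
#include <assert.h>
#include <stdint.h>

#define FB_W 16
#define FB_H 16
static uint8_t fb[FB_H][FB_W];  /* stand-in frame buffer */

/* Flat-top triangle: top edge (x0,y0)-(x1,y0) with x0 <= x1, apex (x2,y2)
   below it. Step both edge x-intercepts by their inverse slopes (dx/dy)
   each scanline and fill the span between them. */
void fill_flat_top(float x0, float x1, float y0, float x2, float y2)
{
    float inv_left  = (x2 - x0) / (y2 - y0);
    float inv_right = (x2 - x1) / (y2 - y0);
    float xl = x0, xr = x1;
    for (int y = (int)y0; y <= (int)y2; y++) {
        for (int x = (int)xl; x <= (int)xr; x++)
            fb[y][x] = 1;
        xl += inv_left;
        xr += inv_right;
    }
}
```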

gimymblert
« Reply #3 on: March 05, 2015, 05:19:03 PM »

it's not mario 64 :(
deadcast
« Reply #4 on: March 05, 2015, 05:19:51 PM »

btw here's the github repo for the project: https://github.com/calebhc/nEgg
deadcast
« Reply #5 on: March 06, 2015, 11:00:06 PM »



yay! now i can render a triangle with any arbitrary vertices. : )
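The usual trick for arbitrary triangles is to sort the vertices by y and split at the middle vertex's height into a flat-bottom half and a flat-top half. A sketch of the split computation (the types and names are mine, purely illustrative):

```c
#include <assert.h>
#include <math.h>

typedef struct { float x, y; } vec2;

static void swap_v(vec2 *a, vec2 *b) { vec2 t = *a; *a = *b; *b = t; }

/* Sort v0..v2 by y, then interpolate along the long edge (v0-v2) to find
   the x of the split vertex at v1's height. The caller then renders a
   flat-bottom triangle (v0, v1, split) and a flat-top one (v1, split, v2). */
float split_x(vec2 v0, vec2 v1, vec2 v2)
{
    if (v1.y < v0.y) swap_v(&v0, &v1);
    if (v2.y < v0.y) swap_v(&v0, &v2);
    if (v2.y < v1.y) swap_v(&v1, &v2);
    float t = (v1.y - v0.y) / (v2.y - v0.y);
    return v0.x + t * (v2.x - v0.x);
}
```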
gimymblert
« Reply #6 on: March 07, 2015, 11:42:40 AM »

YES the black triangle!

http://coffeeonthekeyboard.com/sumos-black-triangle-371/
deadcast
« Reply #7 on: March 07, 2015, 01:39:06 PM »


haha that was great. :) not quite ready for models yet, but getting there.
deadcast
« Reply #8 on: March 08, 2015, 05:29:13 PM »



small lil update. specifying vertices to be rendered now takes the screen's resolution into account. this basically means the origin of the cartesian system is the center of the screen. thanks
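Concretely, that transform is just an offset by half the resolution. A tiny sketch, where the 320×240 screen size and the y-up convention are my assumptions, not details from the post:

```c
#include <assert.h>

#define SCREEN_W 320  /* assumed resolution */
#define SCREEN_H 240

/* Centered coordinates (origin mid-screen, y up) to raw frame-buffer
   coordinates (origin top-left, y down). */
int to_screen_x(int cx) { return cx + SCREEN_W / 2; }
int to_screen_y(int cy) { return SCREEN_H / 2 - cy; }
```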
gimymblert
« Reply #9 on: March 08, 2015, 06:08:13 PM »

Is it software, or is there a hardware rasterizer? What about the microcode that handles the "3D functions"?
deadcast
« Reply #10 on: March 09, 2015, 10:37:00 AM »

Is it software, or is there a hardware rasterizer? What about the microcode that handles the "3D functions"?

Hey. :) The rendering I've been doing so far has been in the software layer. I actually forgot to mention in my first post that I'm building my engine on top of https://github.com/DragonMinded/libdragon. It's a very lightweight toolchain to get started with.

By 3D functions do you mean like all of the projection/matrix/vector stuff? I'm just writing all of that in C.
gimymblert
« Reply #11 on: March 09, 2015, 10:43:05 AM »

I mean, isn't the N64 supposed to have a form of hardware acceleration to handle triangles and mapping? Devs complained that the microcode used for 3D operations was too precise for games, and had to write their own to boost performance once Nintendo allowed them. Software will only go so far in unleashing the hardware's actual performance, but it's a nice exercise nonetheless.
Superb Joe
« Reply #12 on: March 09, 2015, 11:31:14 AM »

[very deep voice] I'm interested in this.
deadcast
« Reply #13 on: March 09, 2015, 11:44:03 AM »

I mean, isn't the N64 supposed to have a form of hardware acceleration to handle triangles and mapping? Devs complained that the microcode used for 3D operations was too precise for games, and had to write their own to boost performance once Nintendo allowed them. Software will only go so far in unleashing the hardware's actual performance, but it's a nice exercise nonetheless.

Hmm, I'm not totally sure yet. :) I'm still doing a lot of learning about the N64. The way I'm drawing right now is totally in software, but I believe I do have access to writing directly to the hardware via the lib I'm using. Once I get a little more of the "basics" implemented, I want to start moving the drawing commands over to the hardware. It will be fun to test the difference in performance once I do that. B)
gimymblert
« Reply #14 on: March 09, 2015, 02:15:46 PM »

It will be cool no matter what to follow whatever you do with it. I always dreamed of doing things on old hardware, but I currently barely do anything with current hardware. :D
deadcast
« Reply #15 on: March 11, 2015, 11:28:37 AM »

Ahoy everyone! Just thought I'd give an update and say that any visual progress is going to SLOW way down. Since gimymblert made me aware of microcode and the fact that the N64 contains a hardware triangle/rectangle rasterizer, I've decided to halt further rasterization on the CPU. So I'm redoing my current triangle rendering implementation and digging into the internals of the system. I've been spending many hours in the depths of MAME trying to reverse engineer how opcodes are bit-packed and processed in the RDP. I've gathered up a BUNCH of material and I'm slowly trying to absorb it. Last night I made a patched version of MAME so I can watch all of the opcodes being sent from the CPU to the RDP. This will help me understand more of how the data for triangles is constructed.



Thanks all for the comments so far and I hope to eventually show some non-shaded, hardware rasterized triangles soon. :D
gimymblert
« Reply #16 on: March 11, 2015, 01:08:35 PM »

yay!
deadcast
« Reply #17 on: March 15, 2015, 11:49:01 AM »

Well, after way too long I finally got a hardware-rendered triangle! :) It took so long to figure out how to properly format the vertex data for the RDP. It's definitely strange: the RDP doesn't really care about vertices, it just wants inverse slopes and edges.



I'm going to clean up the code for this renderer and hopefully find a more efficient way of converting floating-point numbers to 16.16 fixed-point format. I'd also like to merge this non-shaded triangle rendering code into the libdragon lib. We'll see how that goes.
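For the record, the straightforward (if not the most efficient) conversion is a multiply by 2^16, and the edge setup is just dx/dy in that format. A quick sketch; the function names are mine, not RDP terminology:

```c
#include <assert.h>
#include <stdint.h>

/* Float to s15.16 fixed point: scale by 2^16 and truncate. */
int32_t to_fixed_16_16(float f)
{
    return (int32_t)(f * 65536.0f);
}

/* Inverse slope (dx/dy) of an edge in s15.16 -- roughly the kind of
   per-edge coefficient an RDP triangle command expects instead of
   raw vertices. */
int32_t edge_inv_slope(float x0, float y0, float x1, float y1)
{
    return to_fixed_16_16((x1 - x0) / (y1 - y0));
}
```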

Also, I thought it might be pretty cool someday to actually write a microcode program for the RSP that would take vertex data and format it properly for the RDP. That would probably be a little faster than doing the computation on the CPU. :D
gimymblert
« Reply #18 on: March 15, 2015, 12:15:37 PM »

Huh, I should have linked to relevant sources, it would have saved you time. :(
I'll try to do that next time. Even though I'm a poor programmer, I did spend time reading a lot of docs about it, so I have some references. Which microcode are you using? Silicon Graphics' Fast3D, or Nintendo's ugly Turbo3D? Investing in microcode early can be a huge perf boost worth the effort: you'd jump from about 1666 polys per frame (at 60 fps) to 8000+, I think (at 640×480 "HD"). 10,000 can be achieved, but its low precision makes it ugly (depending on use). Fill rate can be an issue, so disabling the Z-buffer can help; microcode is your friend here too.

You will have the most difficulty with textures though (a 4 KB cache, 32×32 px without mipmaps, 2 KB with), so big images can be a pain. Most programmers use the cartridge ROM almost like RAM, streaming textures in from it. Beware of latency issues with the RDRAM, and everybody seems to avoid 64-bit operations.



But that's a difficult console to program for!

Quote
A challenge is presented by the texture cache's small size of only 4 KB. This leads to developers needing to stretch small textures over a comparatively larger space. The console's bilinear filtering only blurs them further. Due to the design of the renderer, when mipmapping is used, the texture cache is effectively halved to 2 KB.

Toward the end of the Nintendo 64's market cycle, certain developers innovated with new techniques of precomputing their textures, such as the use of multi-layered texturing and heavily clamped, small texture pieces, to simulate larger textures; and with the streaming of precomputed textures into the small texture cache from the large, high speed, cartridge medium.

Examples of this ingenuity are found in Rare's Perfect Dark, Banjo-Tooie, and Conker's Bad Fur Day[citation needed] and in Factor 5's Indiana Jones and the Infernal Machine. Some games use plain colored Gouraud shading instead of texturing on certain surfaces, especially in games with themes not targeting realism (e.g., Super Mario 64).

Quote
The big strength was the N64 cartridge. We use the cartridge almost like normal RAM and are streaming all level data, textures, animations, music, sound and even program code while the game is running. With the final size of the levels and the amount of textures, the RAM of the N64 never would have been even remotely enough to fit any individual level. So the cartridge technology really saved the day.
— Factor 5, Bringing Indy to N64, IGN


Quote
One of the best examples of custom microcode on the Nintendo 64 is Factor 5's N64 port of the Indiana Jones and the Infernal Machine PC game. The Factor 5 team aimed for the high resolution mode of 640×480[8] because of the crispness it added to the visuals. The machine was said to be taxed to the limit while running at 640×480, so they needed performance beyond that which was provided by Nintendo's standard SGI-designed microcode.

The Z-buffer could not be used because it alone consumed the already constrained texture fill rate. To work around the 4 KB texture cache, the programmers came up with custom texture formats and tools to let the artists use the best possible textures. Each texture was analyzed and fitted to best texture format for performance and quality. They took advantage of the cartridge as a texture streaming source to squeeze as much detail as possible into each environment and work around RAM limitations.

They wrote microcode for real-time lighting, because the supplied microcode from SGI is not optimized for this task, and because they wanted to have even more lighting than the PC version had used. Factor 5's microcode allows almost unlimited realtime lighting and significantly boosts the polygon count. In the end, the Nintendo 64 version of the game is said to be more feature-filled than the PC version, and is considered to be one of the most advanced games for the Nintendo 64.


I know nothing really, I just took an interest at some point. I hope this points you toward early best practices.
deadcast
« Reply #19 on: March 15, 2015, 12:36:56 PM »

Quote
Huh I should have link to relevant source it would have save you time :( [...]

Thanks so much for the help, and no worries! :D I was able to find quite a few docs for the N64 lying around on the internet. I was looking at some of the hardware patents yesterday. :p

What Factor 5 did was really impressive! It would be awesome to chat with them about their experience with the console, but I doubt it would be fun for them. I've actually seen that article on Gamasutra; it got me really intrigued for some reason.

Actually, that triangle isn't using any microcode. I just computed the edge coefficients on the CPU and sent them directly to the RDP. It doesn't seem optimal for sure, but it's a start. Here's the code snippet: https://gist.github.com/calebhc/109ea3593125f5001403. The ring buffer commands are implemented in the libdragon library I'm using. That lib doesn't yet support direct RSP communication; I plan on trying to fix that. :D

Yeah, I definitely want to have a really good start before trying to make a simple game engine. I need to worry about efficiency all the time or I'll end up with a 1 fps scene. lol