Author Topic: The grumpy old programmer room  (Read 738274 times)
InfiniteStateMachine
« Reply #5360 on: June 24, 2017, 10:29:46 AM »


Quote
I just wanted to mention that worrying about cache issues at such an early point that it is defining your overall methodology and architecture is, IMHO, right up there with the most premature of premature optimisation. Cache issues are the sort of thing you look into in the 0.001% of your codebase that you have measured and determined is chewing up 10% of the overall execution time. You then come up with a very domain-specific solution to that particular problem. IMHO cache issues should definitely *not* be used to promote the general superiority or inferiority of any particular methodology, because overall performance is going to be based on the actual execution path and data access ordering in that application. Unless you are doing a rewrite of a rewrite of a rewrite, you will never successfully predict this ahead of time.

Please don't say or suggest "do X because cache" unless X is something like "when I'm hand-optimising a frequently-called codepath for a specific set of processors I do Z", and you replace "because cache" with at least a paragraph explaining how the particular access patterns that result are optimal. These generalisations confuse people new to the craft, because they don't yet understand caching issues and are prone to just believing those who appear to demonstrate superior knowledge of the subject matter.

With respect. Gentleman


Quote
While I generally agree with this, it should be mentioned that there are methodologies that make your code hard to optimise later - and writing code that is easy to optimise isn't premature optimization, it's smart.

I feel like once you get to a certain level of experience it almost becomes harder to program in a way that is difficult to optimize. When I think of the times where I've seen this happen, I remember the visitor pattern tending to be involved.

I would say one place I do think about perf up front is rendering, but given the way you have to feed data to the GPU, performance-oriented code pretty much comes naturally.

Quote
Wait till you make planet generation with a fast-moving ship

That sounds to me more like something solved at a higher level rather than optimizing your cache coherency.

gimymblert
« Reply #5361 on: June 24, 2017, 03:09:56 PM »

It's a lot of dumb operations. With procedural generation you need that perf. It's not even my idea: voxel games HAVE to do it just to get decent performance.

Ordnas
« Reply #5362 on: June 24, 2017, 03:19:39 PM »

Donald Knuth said, "Premature optimization is the root of all evil", but is this statement always true?

Suppose that, during the design process, we need to make an open-world game, that we will probably need quadtrees and occlusion culling to support our streaming layer system, and that we should ask ourselves how much VRAM we need and how many CPUs we have... in this scenario, do you think it is a good idea to make optimizations right from the start, because this is just part of the design process? Or is it still too risky and time-consuming?

gimymblert
« Reply #5363 on: June 24, 2017, 03:37:45 PM »

If premature optimization were the root of all evil, people would jump directly to programming instead of modeling the problem first, UML tools would be totally useless, and we would cowboy our way into code.

Those types of aphorisms are meant as warnings against excessive behavior, like YAGNI.

GuiltyGreens
« Reply #5364 on: June 24, 2017, 06:53:43 PM »

Ordnas, that's a good example of when you should be aware of your limits before getting too far into the game. Mike Acton talked about that example for Sunset Overdrive: https://youtu.be/qWJpI2adCcs?t=1004
qMopey
« Reply #5365 on: June 24, 2017, 09:12:51 PM »

The difference between premature optimization and software engineering is knowledge and skill.
Garthy
« Reply #5366 on: June 24, 2017, 09:40:47 PM »

Quote
100% agree Coffee. Cache coherency is something I rarely ever think about when I'm programming.

My experience has been similar. When optimising code, by the time I get close to that level the focus has typically shifted to another area that needs more attention.

Quote
Wait till you make planet generation with a fast-moving ship

That sounds interesting. I have worked on similar projects before.

Quote
it should be mentioned that there are methodologies that make your code hard to optimise later

Personally I can't agree with this as a general rule. However, there are situations where it is true. There are poor methodologies, there are poor applications of good methodologies, a poor approach can impede effective optimisation, and blind adherence to a methodology *during* targeted optimisation can severely limit your options.

If you were actually referring to these or something similar, then I believe we are in agreement.

Quote
and writing code that is easy to optimise isn't premature optimization, it's smart.

Absolutely. Thinking ahead and keeping your options open is excellent planning.

Quote
I feel like once you get to a certain level of experience it almost becomes harder to program in a way that is difficult to optimize

I tend to agree. In addition, I have personally found that much of the time, effective optimisation ends up being analysis of one part of an existing solution and the design of a different solution that you hope will be better. It is not unusual to completely change your general approach for just that part of the solution if it leads to improvement, regardless of the prevailing methodology in the rest of the project.

Quote
That sounds to me more like something solved at a higher level rather than optimizing your cache coherency.

I feel this is very good advice and I would strongly suggest following it.

The greatest gains in optimisation come not from doing the same thing more efficiently and quickly, but from identifying large chunks of redundant or unimportant work and eliminating them.
JWki
« Reply #5367 on: June 25, 2017, 01:13:08 AM »

Quote
If premature optimization were the root of all evil, people would jump directly to programming instead of modeling the problem first, UML tools would be totally useless, and we would cowboy our way into code.

Those types of aphorisms are meant as warnings against excessive behavior, like YAGNI.

Well UML tools ARE totally useless.

EDIT: So here's what makes me a grumpy programmer today:

For one thing, I forgot to push some changes on a private project that I made at home, and now I'm kind of stuck working on it on the road. It's at a very early stage, so I can't really go and work on a subpart of it, because the next things I have to do depend on the changes I made. That's annoying.

Then there's CMake, which I've gotten rid of completely for personal stuff but still have to deal with for uni and work, and it's driving me crazy. I'm trying to compile a project that depends on Qt, and they have a very, very complex CMake structure with dozens of individual scripts. I have to work with VS2017's built-in CMake support, because my VS installation is broken in a way that the external CMake cannot find or work with for some reason, so I have to set variables per .json definition file. And for some reason, whatever I do, the CMake configuration step keeps failing, telling me it can't find my Qt installation. I've already tried everything from configuring to hardcoding the path into the build script, and it keeps failing. Debugging CMake scripts is... well, it's horrible and gruesome and I don't want to have to do it.
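For reference, the kind of thing I've been setting in the .json definition file looks roughly like this (the configuration name and Qt path here are made up, and it still fails for me):

Code: [Select]
{
  "configurations": [
    {
      "name": "x64-Debug",
      "generator": "Ninja",
      "buildRoot": "${env.USERPROFILE}\\CMakeBuilds\\${workspaceHash}\\build\\${name}",
      "variables": [
        { "name": "CMAKE_PREFIX_PATH", "value": "C:/Qt/5.9.1/msvc2017_64" }
      ]
    }
  ]
}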

I fucking hate cmake, there, I said it. Can everybody just abandon it and use something sane please (like premake)?
Garthy
« Reply #5368 on: June 25, 2017, 03:20:39 AM »


Quote
I fucking hate cmake, there, I said it.

cmake is the first genuine cross-realm build tool.

In Heaven they use it to compile pre-existing single libraries with simple source files, no generated source files, no complex dependencies, for projects that require no changes, maintenance, rebuilding, or use any Windows debug libraries or build flags, that produce a single library file output only for use in other projects that are built with another build tool.

In Hell they use it for everything else.
InfiniteStateMachine
« Reply #5369 on: June 26, 2017, 04:38:36 AM »

Quote
It's a lot of dumb operations. With procedural generation you need that perf. It's not even my idea: voxel games HAVE to do it just to get decent performance.

Ah OK. I would need to know more about the problem. Is this similar to Minecraft?

qMopey
« Reply #5370 on: June 26, 2017, 09:16:08 PM »

Quote
In Heaven they use it to compile pre-existing single libraries with simple source files, no generated source files, no complex dependencies, for projects that require no changes, maintenance, rebuilding, or use any Windows debug libraries or build flags, that produce a single library file output only for use in other projects that are built with another build tool.

In Hell they use it for everything else.


So in heaven they use it on projects that don't need it. And we live in hell.

Got it.
Garthy
« Reply #5371 on: June 26, 2017, 09:25:54 PM »


Quote
Quote
In Heaven they use it to compile pre-existing single libraries with simple source files, no generated source files, no complex dependencies, for projects that require no changes, maintenance, rebuilding, or use any Windows debug libraries or build flags, that produce a single library file output only for use in other projects that are built with another build tool.

In Hell they use it for everything else.

So in heaven they use it on projects that don't need it. And we live in hell.

Got it.

OH. MY. GOD.  Waaagh!

Ordnas
« Reply #5372 on: June 27, 2017, 05:17:46 AM »

Quote
Quote
It's a lot of dumb operations. With procedural generation you need that perf. It's not even my idea: voxel games HAVE to do it just to get decent performance.

Ah OK. I would need to know more about the problem. Is this similar to Minecraft?

gimymblert, did you try batching the "cubes" as much as possible and using a texture atlas?

qMopey
« Reply #5373 on: June 27, 2017, 10:38:56 AM »


Quote
Quote
Quote
In Heaven they use it to compile pre-existing single libraries with simple source files, no generated source files, no complex dependencies, for projects that require no changes, maintenance, rebuilding, or use any Windows debug libraries or build flags, that produce a single library file output only for use in other projects that are built with another build tool.

In Hell they use it for everything else.

So in heaven they use it on projects that don't need it. And we live in hell.

Got it.

OH. MY. GOD.  Waaagh!

Hahaha  Screamy Screamy Screamy
ferreiradaselva
« Reply #5374 on: June 27, 2017, 02:55:57 PM »

I abandoned stb_vorbis for libvorbisfile because I thought stb_vorbis didn't have any function to decode at a specific position, which would make it impossible to stream large audio files. Turns out stb_vorbis does have functions to decode at a specific position! Time to erase what I did with libvorbisfile (mostly unsuccessful) and go back to stb_vorbis.
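For anyone else who missed those functions like I did, a minimal seek-and-decode sketch (the file name and buffer size are placeholders):

Code: [Select]
#include <stdio.h>
#include "stb_vorbis.c"   // single-file library, compiled right in

int main(void)
{
    int error = 0;
    stb_vorbis *v = stb_vorbis_open_filename("music.ogg", &error, NULL);
    if (!v) { printf("open failed: %d\n", error); return 1; }

    stb_vorbis_info info = stb_vorbis_get_info(v);

    /* Decode from an arbitrary position, e.g. 10 seconds in. */
    stb_vorbis_seek(v, 10 * info.sample_rate);

    /* Pull one buffer of interleaved 16-bit PCM from that position. */
    short pcm[4096];
    int frames = stb_vorbis_get_samples_short_interleaved(
        v, info.channels, pcm, 4096);
    printf("decoded %d frames per channel at %u Hz\n", frames, info.sample_rate);

    stb_vorbis_close(v);
    return 0;
}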

Update:
Designing an architecture that takes into account both "normal" audio and streamed audio makes me grumpy.

gimymblert
« Reply #5375 on: June 27, 2017, 11:12:01 PM »

Quote
gimymblert, did you try batching the "cubes" as much as possible and using a texture atlas?

That's not the relevant part. I'm not doing voxels (yet?); I'm generating PCG terrain at very high-speed movement, i.e. data with locality and no dependencies, i.e. the same function going through a coherent memory page with similar data and few inputs.
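A minimal sketch of that access pattern (chunk size, noise function, and seed are all invented): one pure function swept linearly over one contiguous block, with no sample depending on any other.

Code: [Select]
#include <cstdint>
#include <cstdio>
#include <vector>

// Stand-in for real terrain noise: the same few inputs for every sample.
static float terrain_noise(int x, int y, uint32_t seed)
{
    uint32_t h = seed ^ ((uint32_t)x * 374761393u) ^ ((uint32_t)y * 668265263u);
    h = (h ^ (h >> 13)) * 1274126177u;
    return (float)(h & 0xFFFF) / 65535.0f;
}

int main()
{
    const int N = 64;                    // one 64x64 chunk of heights
    std::vector<float> height(N * N);    // contiguous, row-major

    // Linear sweep: the prefetcher sees one predictable stream, and no
    // sample depends on any other, so chunks can also be generated on
    // worker threads ahead of a fast-moving ship.
    for (int j = 0; j < N; ++j)
        for (int i = 0; i < N; ++i)
            height[j * N + i] = terrain_noise(i, j, 42u);

    printf("first height: %f\n", height[0]);
}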

Ordnas
« Reply #5376 on: June 28, 2017, 02:41:41 AM »

Quote
Quote
gimymblert, did you try batching the "cubes" as much as possible and using a texture atlas?

That's not the relevant part. I'm not doing voxels (yet?); I'm generating PCG terrain at very high-speed movement, i.e. data with locality and no dependencies, i.e. the same function going through a coherent memory page with similar data and few inputs.

Could this be helpful to you: https://www.shadertoy.com/view/MdX3Rr ? Maybe you could take a similar approach for generating PCG terrain with good performance.  Smiley

gimymblert
« Reply #5377 on: June 28, 2017, 05:17:37 PM »

I had removed the part about GPU computing; I'm not at that level yet, though it might be simpler once I get the knack of it.

But one thing is that my data is not entirely local (well, it is: I generate it such that there is matching data at the edges). Unlike much PCG that uses the data as a heightmap or for placement, I want circulation across tiles. I have a local algorithm that is supposed to handle a simulation of a simulation of persistent NPCs on schedules who travel beyond a "local" tile. I achieve this on paper using correlation rather than causation (hence simulating the simulation) and a high branching factor to filter, at the tile level, a quasi-infinite number of NPCs down to the right NPCs at the right time. It's simpler than it looks, because there is just one key leap of perspective that makes it trivial (hence correlation: the continuity of the NPC is just an emergent property of the underlying model, just like pixels simulate movement by lighting up at intervals without actually moving).

But since I'm dumb, I need to figure out how to properly map a planet sphere so that I can query neighbours across corners and edges. Once I have that, I can map the entire planet with predictable paths; if a path is predictable, then time of travel is predictable (it's a function of length and speed), which means I can have instant pathfinding using a stochastic routine (i.e. choose a random sequence out of a multi-path template).
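One standard option for that sphere mapping (not necessarily what gimymblert has in mind) is a cube-sphere: six square tile grids projected onto the sphere, so neighbour queries stay grid-like except across face edges, which need a small adjacency table, omitted here. A sketch for one face:

Code: [Select]
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

// Project a point on the +Z face of the unit cube onto the unit sphere.
// u and v run over [-1, 1] across the face; the other five faces work
// the same way with the axes permuted.
static Vec3 cube_face_to_sphere(double u, double v)
{
    double len = std::sqrt(u * u + v * v + 1.0);
    return { u / len, v / len, 1.0 / len };
}

int main()
{
    const int N = 8;   // tiles per face edge
    // Centre of tile (i, j) in face coordinates; inside one face,
    // neighbours are just (i +/- 1, j +/- 1).
    for (int j = 0; j < N; ++j)
        for (int i = 0; i < N; ++i) {
            double u = 2.0 * (i + 0.5) / N - 1.0;
            double v = 2.0 * (j + 0.5) / N - 1.0;
            Vec3 p = cube_face_to_sphere(u, v);
            if (i == 0 && j == 0)
                printf("corner tile centre: %.3f %.3f %.3f\n", p.x, p.y, p.z);
        }
}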

Ordnas
« Reply #5378 on: June 29, 2017, 02:11:02 AM »

Quote
I had removed the part about GPU computing; I'm not at that level yet, though it might be simpler once I get the knack of it.

But one thing is that my data is not entirely local (well, it is: I generate it such that there is matching data at the edges). Unlike much PCG that uses the data as a heightmap or for placement, I want circulation across tiles. I have a local algorithm that is supposed to handle a simulation of a simulation of persistent NPCs on schedules who travel beyond a "local" tile. I achieve this on paper using correlation rather than causation (hence simulating the simulation) and a high branching factor to filter, at the tile level, a quasi-infinite number of NPCs down to the right NPCs at the right time. It's simpler than it looks, because there is just one key leap of perspective that makes it trivial (hence correlation: the continuity of the NPC is just an emergent property of the underlying model, just like pixels simulate movement by lighting up at intervals without actually moving).

But since I'm dumb, I need to figure out how to properly map a planet sphere so that I can query neighbours across corners and edges. Once I have that, I can map the entire planet with predictable paths; if a path is predictable, then time of travel is predictable (it's a function of length and speed), which means I can have instant pathfinding using a stochastic routine (i.e. choose a random sequence out of a multi-path template).

It seems rather complicated.

gimymblert
« Reply #5379 on: June 29, 2017, 07:16:10 PM »

It depends. Conceptually it's simple; it's hard to tell because I break the typical view of the object I want to simulate. Also, I haven't fully implemented it and have only worked on simple cases, so who knows how detailed it can get; if you are expecting GTA-level detail, I'm not sure.

But the concept is simple. Let's say you have a simple city generation, and we will ignore traversal. Let's say that city has only homes and workstations. You know that people are at home from 16h to 8h and at work from 8h to 16h; the windows are divergent, you can't be at home and at work at the same time. So let's say you seed homes and workstations with IDs: a home and a workstation with the same ID seed will generate the same NPC, and since the NPC is only active in one place at the given divergent time, it's as if they go from home to work and vice versa. They can only be in one place at a time, because each place generates them at a different time. My attempt is to generalize that idea to every place, even the street tiles, simulating the simulation of movement.
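A minimal sketch of that first trick; the 8h-16h window comes from the paragraph above, and everything else (names, seed values) is invented:

Code: [Select]
#include <cstdint>
#include <cstdio>

// An NPC is fully determined by its seed; nothing is ever stored.
struct Npc { uint32_t id; };
static Npc make_npc(uint32_t seed) { return Npc{ seed }; }

// Divergent windows: the workstation owns 8h-16h, the home owns the rest,
// so the shared-seed NPC can only ever be generated in one place at a time.
static bool at_work(int hour) { return hour >= 8 && hour < 16; }

int main()
{
    uint32_t shared_seed = 42;   // links one home to one workstation
    int hours[] = { 6, 12, 20 };
    for (int hour : hours) {
        Npc npc = make_npc(shared_seed);
        printf("hour %2d: npc %u is at %s\n",
               hour, npc.id, at_work(hour) ? "work" : "home");
    }
}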

The second trick is simple too. Let's say you want to generate a landscape with a river. Rivers flow from mountains to the sea; in general, people generate the mountain first and then try to find the river's path, which means analyzing the mountain, which takes time. But the truth is that the river correlates with the mountain slopes that cause it, so why not generate the river first and then fit the mountain to it? You know the starting point is necessarily at a higher level than the lower points, so placing the river first makes more sense and gives you more freedom; similarly, you can generate the sea at the end point of the river. It's the same for generating cities, like a port city that needs to be close to the sea: generate the condition first, THEN the correlated landscape.
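A toy sketch of that river-first ordering (grid size, drift, and falloff constants are invented): lay the river down with strictly falling elevation, then raise the land with distance from it, so every slope agrees by construction:

Code: [Select]
#include <cstdio>
#include <cstdlib>

int main()
{
    const int W = 16, H = 16;
    float height[H][W];

    // 1. The river first: one column per row, wandering sideways,
    //    elevation strictly falling from source to sea.
    int   river_x[H];
    float river_elev[H];
    int x = W / 2;
    for (int j = 0; j < H; ++j) {
        river_x[j]    = x;
        river_elev[j] = 1.0f - (float)j / H;
        x += rand() % 3 - 1;             // drift -1, 0, or +1
        if (x < 1) x = 1;
        if (x > W - 2) x = W - 2;
    }

    // 2. Then the land: it rises with distance from the river, so every
    //    slope drains toward it by construction, no analysis needed.
    for (int j = 0; j < H; ++j)
        for (int i = 0; i < W; ++i) {
            int d = i > river_x[j] ? i - river_x[j] : river_x[j] - i;
            height[j][i] = river_elev[j] + 0.15f * (float)d;
        }

    printf("river mouth elevation: %f\n", height[H - 1][river_x[H - 1]]);
}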

The third trick is that a path is basically a line connecting point A to point B. It doesn't matter how twisted it is; it can be expressed as a percentage, so knowing the time interval you can express where on the path a character is at a given moment, and you can break the path into smaller intervals. The idea is to build the structure such that, while not knowing the entire path, you can deduce whether an NPC is present on a particular segment. Because we distribute the population through PCG, we treat the population exactly like a path: an interval and a percentage. Each area has a percentage of the whole, which is a smaller interval, so you can deduce the indices (which will be used as seeds) of the NPCs in a zone from that interval, recursively. NPC generation is then simply correlated with the tile its interval lands on.
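A sketch of both halves of that trick (all numbers invented): position on a path as pure arithmetic on time, and a zone's NPC seeds as a sub-interval of the population:

Code: [Select]
#include <cstdio>

// Progress along a path at time t, as a fraction of its length.
static double path_progress(double t, double depart, double length, double speed)
{
    double p = (t - depart) * speed / length;
    return p < 0.0 ? 0.0 : (p > 1.0 ? 1.0 : p);
}

int main()
{
    // A walker who left at t=0 doing 1.5 units/s on a 600-unit path:
    // where they are is pure arithmetic, the path is never walked.
    printf("t=120s -> %.0f%% of the path\n",
           100.0 * path_progress(120.0, 0.0, 600.0, 1.5));

    // The population works the same way: a zone owning the interval
    // [0.35, 0.45) of a quasi-infinite pool holds exactly these indices,
    // each usable directly as a generation seed.
    long long pool = 1000000;
    long long first = (long long)(0.35 * pool);
    long long last  = (long long)(0.45 * pool) - 1;
    printf("zone holds npc seeds %lld..%lld\n", first, last);
}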

If I succeed at making a good, simple implementation of the first example, I will see whether I can generalize it to complex event schedules and stories.
