Author Topic: Does the FPS affect LAG  (Read 876 times)
Cryogenic
« on: April 14, 2019, 02:58:28 PM »

Hi there! I've had this question for a while, since someone told me otherwise. I always believed that higher FPS has no effect on lag at all. Whether you run at 15 FPS or 500 FPS, it shouldn't matter, because the server only takes the player's actions, not the environmental graphics, the filters, the resolution, or anything else related to the graphics quality you're playing at, and not the FPS either.

But this guy says the opposite. He says that higher FPS makes the game lag more, because the server requests every frame like video streaming, so the amount of FPS has a direct impact on the online game's responsiveness. The problem, according to him, would be solved by locking the game at 30 FPS.

Who is right?

Thx in advance
Schrompf
C++ professional, game dev sparetime
« Reply #1 on: April 14, 2019, 11:15:08 PM »

"Lag" in it's original meaning means "network latency". Simply viewed this would mean: no, lag is not affected by the framerate of the rendering. But actually, it is affected to quite some extend.

You, the player, do this: you click, you press keys, you wiggle the gamepad, whatever floats your boat.

The game does not process this immediately. What happens is that the microcontroller in your input device turns these into messages and sends them over to the computer/console. There it's processed and put into some queue and *might* trigger an interrupt, but in reality it mostly sits in a queue until the current application (the game) fetches it.

Your game has an update cycle where it repeatedly checks input sources, calculates how this affects the game characters, moves everything in the game world forward a tiny bit, and sends your stuff over to the server to tell it how the player wants to act. This introduces a delay of at most one frame, where your inputs sit around until the game reaches the point where it processes them.

By now we have an input latency of up to one frame, plus a tiny world-processing offset, because world processing is quick. Today's games spend their computing time mostly on graphics. So... about 16 ms at 60 fps.
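To put numbers on that one-frame wait (a rough Python sketch of my own; the frame rates are just example values):

[code]
# Worst case: your input arrives right after the game polled input,
# so it waits one full frame. Frame time depends directly on fps.
for fps in (30, 60, 144, 500):
    frame_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> input can wait up to {frame_ms:.1f} ms")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 144 fps -> 6.9 ms, 500 fps -> 2.0 ms
[/code]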

The message itself needs time to reach the server. This is the actual network latency part, and it can take anywhere from 5 ms to more than 100 ms. If you're in Europe playing on a server in the USA, you have ~50 ms of network latency simply because of the speed of light in fiber.

The server has a similar update cycle, minus the graphics. It runs in time slices too, and network messages are incorporated only at one point in its cycle. Game servers are artificially slowed down to work at a fixed update rate, usually the same as or lower than a typical game framerate. So your player input sits there until the server gets to that point in its cycle; then it's incorporated and the world is moved forward by one time slice. Then every detail about the game world is sent back to every client. This adds another ~16 ms.
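As an illustration only, a minimal fixed-rate loop might look like this (my own Python sketch; the 30 Hz tick rate and the commented-out helper names are assumptions, not any particular engine's API):

[code]
import time

TICK_RATE = 30                    # assumed server tick rate; real servers vary
TICK = 1.0 / TICK_RATE            # ~33 ms per time slice at 30 Hz

def server_loop(ticks=3):
    pending = []                  # client messages queue up here between ticks
    for _ in range(ticks):
        start = time.perf_counter()
        inputs, pending = pending, []    # drain the queue only once per tick
        # simulate(world, inputs, TICK)  # hypothetical: advance the world one slice
        # broadcast(world)               # hypothetical: send the new state to all clients
        # Sleep off the rest of the slice. A message that arrives right after
        # this tick started sits in 'pending' for up to one full TICK.
        time.sleep(max(0.0, TICK - (time.perf_counter() - start)))
[/code]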

One more serving of actual network latency for the trip back. 5 to >100ms.

Your application at your computer receives this world update after some time, but again it just sits there until your game reaches the point in its cycle where it incorporates network updates. This might add up to one frame.

Then the game builds instructions for the GPU to display all of these game world entities. The GPU adds these instructions to at least one more queue and gets cracking on them one after another. Most of the time it's still busy with the instructions from the last frame, but there's an artificial limit on how many frames' worth of instructions the GPU driver queues up before it blocks. To my knowledge this is usually two frames. One GPU instruction is "show what you rendered", and this is the point where the player's inputs finally affect an image you can see with your eyes.

Right now we're at four frames at most, plus two world-processing times, plus the real network latency of pushing messages back and forth through many computers all over the world. Four frames are roughly 65 ms at 60 fps, world processing is maybe 1 or 2 ms at most, and the actual network latency is 20 to 200 ms. And only the network latency part of this whole chain is actually displayed on screen as "ping".
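Summed up as rough arithmetic (a sketch of my own; the fps, tick rate, ping values and the two-frame GPU queue are assumptions, and everything is worst case):

[code]
def worst_case_latency_ms(client_fps, server_tick_hz, one_way_ping_ms, gpu_frames=2):
    frame = 1000.0 / client_fps
    tick = 1000.0 / server_tick_hz
    return (frame                  # input waits for the client's update point
            + one_way_ping_ms      # message travels to the server
            + tick                 # message waits for the next server tick
            + one_way_ping_ms      # world update travels back
            + frame                # update waits for the client's network point
            + gpu_frames * frame)  # frames queued up by the GPU driver

print(worst_case_latency_ms(60, 60, 10))   # fast connection: ~103 ms, mostly frames
print(worst_case_latency_ms(60, 60, 100))  # slow connection: ~283 ms, mostly network
[/code]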

So, TL;DR: game framerate actually matters, and with a fast internet connection it's even the main source of latency. Fast internet and infrastructure: 50 to 70 ms of latency, nearly all of it caused by game performance. Slow internet: >200 ms of latency, and computer performance doesn't really matter anymore.

These days a lot of less computer-savvy people say "LAG!!" and simply mean "low frame rate", even in single-player games. That meaning of "lag" is obviously directly tied to frame rate.

[edit] restructured to clarify.
Cryogenic
« Reply #2 on: April 15, 2019, 08:08:26 AM »

"Lag" in it's original meaning means "network latency". Simply viewed this would mean: no, lag is not affected by the framerate of the rendering. But actually, it is affected to quite some extend.

You, the player, do this: you click, you press keys, you wiggle the gamepad, whatever floats your boat.

The game does not process this immediatly. What happens is that the micro-CPU in your control device turns these into messages and sends them over to the computer/console. There it's processed and put into some queue and *might* trigger an interrupt, but in reality mostly sits in some queue until the current application (the game) fetches it.

Your game has this update cycle where it repeatedly checks input sources, calculates how this affects the game characters, moves everything in the game world forward a tiny bit, and sends your stuff over to the server to tell how the player might want to act. This introduces a delay of one frame at max where your inputs sit around until the game reaches the point where it processes your stuff.

By now we have an input latency of up to one frame, plus a tiny world processing offset, because world processing is quick. Todays games spend their computing time mostly on graphics. So... 16ms.

The message itsself needs time to reach the server. This is the actual network latency part, and it can take 5ms to >100ms. If you're from Europe playing on a server in the USA, you have ~50ms of network latency simply because of the speed of light in fiber.

The server has a similar update cycle, minus the graphics. It runs in time slices, too, and network messages are incorporated only at one point of its cycle. Game servers are artificially slowed down to work at a fixed update rate, usually same or lower than a usual game framerate. So your player input sits there until the server gets to that point in its cycle, then its incorporated and the world is moved forward by one time slice. Then every details about the game world is sent back to every client. This adds another ~16ms.

One more serving of actual network latency for the trip back. 5 to >100ms.

Your application at your computer receives this world update after some time, but again it just sits there until your game reaches the point in its cycle where it incorporates network updates. This might add up to one frame.

Then builds instructions for the GPU to display all of these game world entities. The GPU adds these instructions to at least one more queue, and gets cracking on it one after another. Most of the time it's still busy with the instructions from the last frame, but there's an artificial limit on how many instructions the GPU driver queues up until it blocks. To my knowledge this is usually two frames. One GPU instruction is "show what you rendered", and this is the point where the player inputs finally affect some image you can see with your eyes.

Right now we're at 4 frames at most, plus two world processing times, plus the real network latency of pushing messages back and forth through many computers all over the world. 4 frames are ~60ms, world processing is maybe 1 or 2ms at most, actual network latency is 20 to 200ms. And only the network latency part of all of this chain is actually displayed at the screen as "ping".

So TL;DR: game framerate actually matters, and with fast internet it's even the main source of latency. Fast internet and infrastructure: 50 to 70ms of latency, nearly all of it caused by game performance. Slow internet: >200ms latency, computer performance doesn't really matter anymore.

In recent times a lot of less-computer-savage people say "LAG!!" and simply mean "low frame rate" even in single player games. So this meaning of "LAG" is obviously directly tied to frame rate.

[edit] restructured to clarify.

That was definitely a good and complete answer. That helps a lot, thanks. So if I understood correctly, the GPU driver is what can affect the latency, because of its queue limit, when you have a fast internet connection, but it doesn't matter when you have a slow connection because the instruction queue's contribution is small by comparison (a matter of milliseconds, of course).

Then, if I have 200 ping in Battlefield V (for example) while running it at 120 FPS (max FPS peak), the lag is coming from the slow internet connection. But if I have 50 ping and my FPS drops to 100 or less, then it's a problem of my GPU/driver not being able to work through the queued frames in time.

That is my understanding of your comment; I don't know if that's what you meant, but it makes sense to me. Thanks again.
Schrompf
« Reply #3 on: April 16, 2019, 12:05:36 AM »

Yes, sort of like this. More precisely:

Your input needs a bit of time until it's processed and its result is sent to the server. Roughly one frame, so this part shrinks when you run at a higher fps.

Then the message goes over the internet and needs time to reach the server. This is roughly half of what games display as "ping".

The server needs a bit until it processes your message and sends you the new game state. Again roughly one frame, but a "server frame": you don't have any say in this and your hardware doesn't matter.

Then the server sends back the result, which needs the other half of the "ping" time.

Then it's at your computer again, where it waits for up to one frame (so your fps matters again), and then sits for up to two frames at the GPU until you (the human) can observe the result.

So: fast internet means low ping, which means the lag is a little bit of server time and mostly your game fps. Slow internet means a long ping, which means your hardware doesn't really matter because the pure communication latency dwarfs the rest of the chain.
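With the same rough numbers as before (my own sketch: an assumed 60 Hz server tick, 25 ms each way, and the worst-case four client-side frames from above):

[code]
# Same connection, different frame caps: only the client-side share changes.
for fps in (30, 60, 144):
    frame = 1000.0 / fps
    total = 4 * frame + 2 * 25 + 1000.0 / 60  # 4 client frames + round trip + one server tick
    print(f"capped at {fps:>3} fps -> roughly {total:.0f} ms worst case")
# 30 fps -> ~200 ms, 60 fps -> ~133 ms, 144 fps -> ~94 ms:
# capping the game at 30 fps adds latency rather than removing it.
[/code]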
Schoq
« Reply #4 on: April 16, 2019, 06:57:40 AM »

Quote from: Cryogenic
So the problem would be solved by locking the game at 30 FPS.
Since everyone who knows this better than me is being super wordy, I just want to highlight specifically that that claim is incredibly false.

Additionally, even if it were true to a tiny degree, the longer reaction times resulting from the lower temporal fidelity of the rendering would probably offset any gain.
Cryogenic
« Reply #5 on: April 16, 2019, 07:49:27 AM »

Quote from: Schrompf on April 16, 2019, 12:05:36 AM
Yes, sort of like this. [...]

Thx again!

Quote from: Schoq on April 16, 2019, 06:57:40 AM
...I just want to highlight specifically that that claim is incredibly false. [...]

So basically you are saying everything Schrompf said is wrong? How so?
Schoq
« Reply #6 on: April 16, 2019, 08:01:10 AM »

?
Schoq
« Reply #7 on: April 16, 2019, 08:07:23 AM »

no?
Cryogenic
« Reply #8 on: April 16, 2019, 08:34:28 AM »

Quote from: Schoq on April 16, 2019, 08:07:23 AM
no?

My bad, I didn't read it correctly. Now I understand!