"Lag" in it's original meaning means "network latency". Simply viewed this would mean: no, lag is not affected by the framerate of the rendering. But actually, it is affected to quite some extend.
You, the player, do this: you click, you press keys, you wiggle the gamepad, whatever floats your boat.
The game does not process this immediately. What happens is that the micro-CPU in your input device turns these into messages and sends them over to the computer/console. There they are processed, put into some queue, and *might* trigger an interrupt, but in reality they mostly sit in a queue until the current application (the game) fetches them.
Your game has this update cycle where it repeatedly checks input sources, calculates how this affects the game characters, moves everything in the game world forward a tiny bit, and sends your stuff over to the server to tell it what the player wants to do. This introduces a delay of at most one frame where your inputs sit around until the game reaches the point where it processes them.
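As a rough sketch of that worst case: here's how long a press sits in the queue until the next frame boundary polls it. This is a toy model, not any real engine's code; it assumes a fixed 60 FPS cycle with polling at t = 0, 16.7, 33.3, ... ms.

```python
import math

FRAME_MS = 1000 / 60  # one 60 FPS time slice, ~16.7 ms

def input_wait_ms(press_ms: float) -> float:
    """How long a press waits until the next frame boundary
    reads it (frames assumed to start at 0, 16.7, 33.3, ...)."""
    next_poll = math.ceil(press_ms / FRAME_MS) * FRAME_MS
    return next_poll - press_ms

# pressed right after a poll: waits almost a whole frame (~15.7 ms)
print(round(input_wait_ms(1.0), 1))
# pressed just before a poll: barely waits at all (~0.7 ms)
print(round(input_wait_ms(16.0), 1))
```

So the wait is anywhere from ~0 to one full frame, which is why "up to one frame" is the honest number.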
By now we have an input latency of up to one frame, plus a tiny world-processing offset, because world processing is quick. Today's games spend their computing time mostly on graphics. So... 16ms.
The message itself needs time to reach the server. This is the actual network-latency part, and it can take anywhere from 5ms to over 100ms. If you're in Europe playing on a server in the USA, you have ~50ms of network latency simply because of the speed of light in fiber.
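That ~50ms figure can be sanity-checked with back-of-the-envelope numbers. Light in glass travels at roughly c/1.5 ≈ 200,000 km/s; the ~7,000 km route length below is my own assumed ballpark for a Europe-to-US-east-coast cable, and real routes are longer and add equipment delay on top.

```python
C_FIBER_KM_S = 300_000 / 1.5  # light in glass: ~200,000 km/s

def one_way_fiber_ms(route_km: float) -> float:
    """Best-case propagation delay along a fiber route."""
    return route_km / C_FIBER_KM_S * 1000

# ~7,000 km assumed for Europe -> US east coast:
print(round(one_way_fiber_ms(7000)))  # -> 35
```

35ms of pure propagation one way, so with realistic routing and hops, ~50ms is about the physical floor.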
The server has a similar update cycle, minus the graphics. It runs in time slices, too, and network messages are incorporated only at one point in its cycle. Game servers are artificially throttled to a fixed update rate, usually the same as or lower than a typical game framerate. So your player input sits there until the server reaches that point in its cycle, then it's incorporated and the world is moved forward by one time slice. Then every detail about the game world is sent back to every client. This adds another ~16ms.
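The same queueing math applies on the server side. A toy helper, with assumed example tick rates (real tick rates vary per game):

```python
def worst_tick_wait_ms(tick_rate_hz: float) -> float:
    """An input arriving just after a tick waits almost one full
    tick interval before the server incorporates it."""
    return 1000 / tick_rate_hz

print(round(worst_tick_wait_ms(60), 1))  # -> 16.7
print(round(worst_tick_wait_ms(20), 1))  # -> 50.0
```

Which is why a server ticking slower than your framerate adds more delay than your own machine does.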
One more serving of actual network latency for the trip back: 5 to >100ms.
Your application on your computer receives this world update after some time, but again it just sits there until your game reaches the point in its cycle where it incorporates network updates. This can add up to one more frame.
Then your game builds instructions for the GPU to display all of these game-world entities. The GPU driver adds these instructions to at least one more queue, and the GPU works through it one command after another. Most of the time it's still busy with the instructions from the last frame, but there's an artificial limit on how many frames' worth of instructions the driver queues up before it blocks. To my knowledge this is usually two frames. One GPU instruction is "show what you rendered", and this is the point where the player's inputs finally affect an image you can see with your eyes.
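That driver queue can be pictured with a tiny simulation (a toy model, not any real driver API): the frame on screen trails the frame the game just submitted by the queue depth.

```python
from collections import deque

def simulate_present_lag(frames_submitted: int, queue_limit: int = 2):
    """Which submitted frame is on screen after each submission,
    with a driver that buffers up to `queue_limit` frames."""
    queue = deque()
    on_screen = None
    history = []
    for frame in range(frames_submitted):
        queue.append(frame)          # game submits frame N
        if len(queue) > queue_limit: # driver full: GPU must present one
            on_screen = queue.popleft()
        history.append(on_screen)
    return history

print(simulate_present_lag(6))  # -> [None, None, 0, 1, 2, 3]
```

With a two-frame queue, the image you see is always two submissions behind what the game just computed.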
Right now we're at 4 frames at most, plus two world-processing times, plus the real network latency of pushing messages back and forth through many computers all over the world. 4 frames are ~60ms, world processing is maybe 1 or 2ms at most, and actual network latency is 20 to 200ms. And only the network-latency part of this whole chain is what's displayed on screen as "ping".
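The whole chain can be summed in one rough worst-case formula. The function and its defaults are my own illustrative assumptions: 2ms of world processing, and one GPU frame before "show" (the driver may queue up to two, which would add another frame).

```python
def worst_case_lag_ms(fps: float, tick_hz: float, rtt_ms: float,
                      gpu_frames: int = 1, world_ms: float = 2.0) -> float:
    frame = 1000 / fps
    return (frame                 # input waits for the next client update
            + 1000 / tick_hz      # message waits for the next server tick
            + frame               # world update waits to be incorporated
            + gpu_frames * frame  # driver queue before the image appears
            + world_ms            # client + server world processing
            + rtt_ms)             # the round trip shown as "ping"

print(round(worst_case_lag_ms(60, 60, 50)))   # -> 119, fast connection
print(round(worst_case_lag_ms(60, 60, 200)))  # -> 269, ping dominates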
So TL;DR: game framerate actually matters, and with fast internet it's even the main source of latency. Fast internet and infrastructure: 50 to 70ms of latency, nearly all of it caused by game performance. Slow internet: >200ms latency, computer performance doesn't really matter anymore.
In recent times a lot of less computer-savvy people say "LAG!!" and simply mean "low frame rate", even in single-player games. So this meaning of "LAG" is obviously directly tied to frame rate.
[edit] restructured to clarify.
That was definitely a good and complete answer. That helps a lot, thx. So if I understood correctly, the GPU driver is the one that can affect the latency because of its queue limit when you have a fast internet connection, but it doesn't matter much when you have a slow connection because the instruction queue's contribution is tiny by comparison (a matter of ms, of course).
Then, if I have 200 ping in Battlefield V (for example) and I'm running it at 120 FPS (max FPS peak), the lag is coming from the slow internet connection. But if I have 50 ping and my FPS drops to 100 or less, then it's a problem of my GPU driver that can't work through the incoming queue in time.
That is my understanding of your comment; I don't know if that's what you meant, but it makes sense to me. Thx again.