But other factors, such as the cost of a system, availability of games (higher-end systems tend to be more difficult to develop for), mainstream appeal of games on a system, and so on can explain this.
In other words, the importance of tech is overrated, and there are stronger factors at play.
You missed the point. "How important is tech?" is a stupid question, because tech development is obviously paramount throughout the industry. It doesn't follow that cutting-edge products are the most profitable (AFAIK they almost never are).
If mainstream products didn't similarly undergo improvement, nobody would buy them. Plus, sales of cutting-edge products can help fund development of cheaper, more accessible ones. More importantly, actual tech from higher-end stuff trickles down into the lower end eventually. You can't tear the two apart and proclaim "tech is overrated". They're part of one drive that isn't ending anytime soon.
This is just my opinion, but while that is true, the overall quality of games has generally declined since the retro era. The simpler systems had much more interesting and varied titles compared to now. Part of that was most likely the low cost and high speed of development, which allowed companies to take risks and build games around ideas rather than just around what sold well last time.
Agreed. Eventually the backwards, stubborn fools who are responsible for funding projects will realize their folly, or a new funding model will come to dominate. In either case, it's true that 'modern' games require more time and money to develop. Much of this could be alleviated by more tools R&D (which a few big players are already heavily invested in).
Tech dev drives the industry; just not always in the right direction, IMO.
The quality of a game is constrained by the tech behind it.
This is false. Good, even great games can be produced using out-of-date, obsolete technology. And awful, terrible games can be produced using state-of-the-art tech. The history of video games is littered with examples of this phenomenon. The original NES is a perfect place to start.
Quality has nothing to do with technology. It is not measured by how many polygons can be pushed, but by the design decisions behind the game. You can make a good game using two pixels and a background color. And you can make an atrocity of a game that no one will want to play using Unreal Engine 3 and a budget of tens of millions of dollars.
Somewhat poor phrasing on my part, but notice I said 'constrained'. It's more useful to say that better tech enhances expressivity: you can make a better game with newer tech, because you're not burdened with the limitations of older tech. You can pull off more even if you set limitations (like a monochrome palette). Or, you can pull off the same with less effort, giving you more leverage in development.
For pushing technology as a driving force, the money simply isn't there. I think that's a lesson this console cycle in particular has driven home.
We'll see. There's a limit to how many times people will buy the same game. Technology and creativity are intertwined, and stagnation in either springs from the same source. I do think the next-gen and near-future PC games will be less than stellar technologically, but to think that trend is terminal is to miss the forest for the trees.