"There are only so many permutations a developer can test while building their game."
On a large enough scale, this is true. On a realistic scale, it is not.
Back when I worked in game development, there were literally hundreds of different physical hardware configurations tested above baseline, where "baseline" meant the minimum spec with a driver and OS base two stable releases back. So if you had an Nvidia card, every WHQL driver release from two back up to current would be tested. If the box said "Windows XP", then vanilla XP, XP with SP1, SP2, and SP3 would all be tested. Whatever a system was missing was generally installed by the redistributables as part of installation.
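To make the combinatorics concrete, here's a minimal sketch of how that matrix multiplies out. Everything in it is invented for illustration; the vendor names, version strings, and card list are placeholders, not the actual data or tooling we used:

```python
# Hypothetical sketch of the "two stable releases back and up" test matrix.
# All version strings below are invented for illustration, not real data.
os_targets = ["Windows XP", "Windows XP SP1", "Windows XP SP2", "Windows XP SP3"]

# Per-vendor driver lists: the current WHQL release plus the two before it.
drivers_by_gpu = {
    "NVIDIA (example card)": ["WHQL N-2", "WHQL N-1", "WHQL N"],
    "ATI (example card)": ["Catalyst N-2", "Catalyst N-1", "Catalyst N"],
}

# One test pass per (OS, GPU, driver) combination at or above minspec.
matrix = [
    (os_ver, gpu, driver)
    for os_ver in os_targets
    for gpu, drivers in drivers_by_gpu.items()
    for driver in drivers
]

print(f"{len(matrix)} configurations for just two cards")  # 4 * 2 * 3 = 24
```

Four OS variants and three drivers per card already means 24 passes for two cards; scale that to dozens of GPUs and CPUs and you see why the tested matrix ran into the hundreds.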
Although heinously expensive, there was a hardware virtualization program that could simulate edge cases for automated testing. Unfortunately, I don't recall its name, but it had a fairly huge database of variables for almost any GPU, CPU, and OS you could test against that would mostly meet or exceed minspec. That's how a bug in Intel's HD Graphics drivers was discovered: technically the iGPU met the title's minimum requirement, but Intel hadn't quite implemented their drivers' interaction with DirectX to spec, so it produced some pretty gnarly graphics corruption.
At any rate, the short answer is "Because, fuck you, you'll buy it anyway and we've learned from the sales data that you don't care if we put anything more than the bare minimum in to make sure it at least loads."
If you want more, demand more. If you're not getting it, vote with your wallet and your patience. I'd love to play the newest Jedi game, but I'm certainly not going to spend $70 on it or put up with the stuttering and low framerates that have been demonstrated. I can wait until it hits a deep discount and has those issues ironed out.
"Just take a look around you, what do you see? Pain, suffering, and misery." -Black Sabbath, Killing Yourself to Live.
“Man was born free, and he is everywhere in chains” -Jean-Jacques Rousseau