I remember a time when games didn't require patches. Or if there was a patch, it was to add hardware support or address some totally obscure scenario. Around 1999-ish the corporate push really set in: products were rushed to market faster and faster, and QA was cut back or liquidated outright. Patches started flooding out after those tighter dev cycles with limited QA; the first ones were small, but they quickly became more frequent. People initially went "wtf", but once customers were trained and comfortable with the process, it became the new standard. "Low investment, high return" was how a former EA exec put it to us in a lecture at a developers convention. We kept buying it, and they all milked it.
IMHO, when it comes to patching: hardware/driver issues are excusable, especially in a market flooded with hardware configurations. Gameplay, scripting, and UI issues are inexcusable, especially given what I've seen in the past few years. Mind you, it's not just games. It's industry-wide in software (my current line of work). It's ridiculous on both sides of the curtain.
You know, selling DLC before you patch the client doesn't exactly convince me you intend to support your shit. -massdev