They also have a vastly different output medium. NTSC is roughly 640x240 at 60 fields per second; since it's interlaced, that winds up being 640x480 at 30 full frames. The amount of silicon needed to drive that is pretty trivial nowadays. And there's absolutely no advantage in rendering faster, since the display device is utterly incapable of doing anything with the extra frames. Their input is also locked to one sample per frame, and the hardware is unlikely to do anything other than that.
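A rough back-of-the-envelope check makes the gap concrete (the PC monitor mode below is just an assumed example, not anything from the console side):

```python
# Rough pixel-throughput comparison between NTSC and a PC display.
# Illustrative numbers only -- the 1920x1080@144 mode is an assumption.

def pixel_rate(width, height, refresh_hz):
    """Pixels per second a display consumes at the given mode."""
    return width * height * refresh_hz

# NTSC: ~640x240 fields arriving 60 times per second.
ntsc = pixel_rate(640, 240, 60)      # 9,216,000 px/s

# An assumed modern PC monitor: 1920x1080 at 144 Hz.
pc = pixel_rate(1920, 1080, 144)     # 298,598,400 px/s

print(f"NTSC: {ntsc:,} px/s")
print(f"PC:   {pc:,} px/s (~{pc / ntsc:.0f}x)")
```

Even a mid-range monitor asks for well over an order of magnitude more pixels per second, which is why "just render at the display rate" stops being trivial on the PC.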
The PC is wildly different -- the output device (your monitor) is often capable of much higher refresh rates and the input sampling can vastly exceed 1/frame (PS/2 mice are more limited unless you tweak things).
I wish people would stop comparing computer-rendered images to film/video, though. They're simply not comparable. When you film or tape something in real life, you don't "lose" information between frames: if an object is moving too fast to capture precisely in a single frame, you get motion blur instead. Our brains are extremely good at seeing a blurred object in successive frames and sorting it out as the proper object moving very, very fast. The same is not true for computer-generated images -- each frame is drawn distinctly, with no blur whatsoever. (Shortly before 3Dfx went under they came out with the "T-buffer" to handle this issue. Screenshots looked like ass, but it wasn't designed for static images; I never saw it in person, but the theory behind it is sound.) So if an object is moving faster than your frame rate can track, it will appear to stutter across the screen. You can simply ensure that nothing moves that fast, but if the user's system gets bogged down to a really low fps, there may not be anything you can do to prevent it.
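The stutter is easy to quantify: with no motion blur, an unblurred object effectively teleports by (speed / fps) pixels between frames. A minimal sketch (the speed and frame rates are made-up illustration values):

```python
# How far an unblurred object "jumps" between frames at various
# frame rates. Numbers are arbitrary examples for illustration.

def per_frame_jump(speed_px_per_sec, fps):
    """Distance in pixels an object skips between successive frames."""
    return speed_px_per_sec / fps

# An object crossing a 640px-wide screen in about a third of a second:
speed = 2000  # px/s, arbitrary example
for fps in (60, 30, 15):
    jump = per_frame_jump(speed, fps)
    print(f"{fps:3d} fps -> {jump:.0f} px jump per frame")
```

At 60 fps the object skips ~33 px per frame; when the system bogs down to 15 fps, the same motion becomes 133 px jumps, and no amount of rendering cleverness short of motion blur hides that.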