Any NVidia chipset from the past couple of years
can decode H.264 in hardware, with minimal CPU usage. Heck, even my lowly fanless GF6200 card here can do it.
The link seems to indicate that only recent (2007+) chipsets can handle higher-resolution video with minimal CPU use. It also seems to indicate that these benefits are more pronounced on mid- to high-end products, not the low end. That means potentially paying close to $100 for a video card. Even the 6200 I had last week was $70.
I'm more interested in MPEG-2 decoding, which "duke" currently does in software here. Over the holidays I may upgrade it to the latest MythTV-svn (bleeding edge), which has good support for the NVidia hardware decoding stuff.
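For anyone who wants to check what their card can actually offload before committing to an upgrade: the NVidia hardware decoding support in the recent MythTV work goes through VDPAU, and you can query the driver's decode capabilities directly. Here's a minimal sketch using the standard libvdpau/X11 entry points; the two profiles queried and the build line are my assumptions about a typical setup, not anything taken from MythTV itself.

/* vdpau_caps.c -- ask the driver which codecs it can decode in hardware.
 * A sketch assuming decode acceleration is exposed via VDPAU.
 * Build (assumed typical): gcc vdpau_caps.c -o vdpau_caps -lvdpau -lX11
 */
#include <stdio.h>
#include <X11/Xlib.h>
#include <vdpau/vdpau.h>
#include <vdpau/vdpau_x11.h>

/* Print whether one decoder profile is supported and its size limits. */
static void report(VdpDecoderQueryCapabilities *query, VdpDevice dev,
                   VdpDecoderProfile profile, const char *name)
{
    VdpBool ok = VDP_FALSE;
    uint32_t level, mbs, w, h;
    if (query(dev, profile, &ok, &level, &mbs, &w, &h) == VDP_STATUS_OK && ok)
        printf("%s: supported, up to %ux%u\n", name, w, h);
    else
        printf("%s: not supported\n", name);
}

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open X display\n"); return 1; }

    VdpDevice dev;
    VdpGetProcAddress *get_proc;
    if (vdp_device_create_x11(dpy, DefaultScreen(dpy), &dev, &get_proc)
            != VDP_STATUS_OK) {
        fprintf(stderr, "driver/GPU does not expose VDPAU\n");
        return 1;
    }

    /* Look up the capability-query entry point from the driver. */
    VdpDecoderQueryCapabilities *query;
    get_proc(dev, VDP_FUNC_ID_DECODER_QUERY_CAPABILITIES, (void **)&query);

    report(query, dev, VDP_DECODER_PROFILE_MPEG2_MAIN, "MPEG-2 Main");
    report(query, dev, VDP_DECODER_PROFILE_H264_HIGH,  "H.264 High");

    XCloseDisplay(dpy);
    return 0;
}

If vdp_device_create_x11 fails, the installed driver or the GPU simply doesn't do VDPAU at all, which as far as I know is the case for pre-GeForce-8 cards like the 6200.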
You might want to look at the AMD/ATI MPEG-2 decode features.
MPEG-2 HD is definitely taxing to decode in software.
The capture system requires very little CPU power, no GPU power, and only moderate disk performance, even for multiple simultaneous record and playback streams. I find it works well even for a single viewing location, and when I add another it's a no-brainer: I get the exact same interface, with the very same recorded and live content, regardless of where I'm sitting.
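To expand on the multiple-viewing-location point: assuming the capture system here is MythTV, as mentioned earlier in the thread, adding a frontend in another room is just a matter of pointing it at the master backend's database. A sketch of ~/.mythtv/mysql.txt, where the host address and password are placeholders for whatever your own backend uses:

DBHostName=192.168.1.10
DBUserName=mythtv
DBPassword=mythtv
DBName=mythconverg
DBPort=0

Every frontend configured this way sees the same recordings, schedule, and live TV sources as the first one.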
I haven't found any way to build your own streaming appliance as cheaply as buying one premade. Certainly not with typical motherboards and CPUs. One option would be to get something like a Popcorn internal board and then run alternate software on it (since it runs some flavor of Linux, I believe).