Quote:
If bunches of related equipment, say computers & attached (USB, Firewire, DVI-D, etc.. ) accessories, all use polarized wiring, then their PSU's will be in phase and maybe that means they'll have lower interference/emissions or some such thing -- pure speculation on my part.


Switch-mode PSUs take their bite equally on each half of the mains cycle, so on a single-phase supply which way round live and neutral are connected makes no difference...

They employ a full-wave rectifier across the mains supply that feeds a big capacitor. The capacitor supplies a switching circuit that chops the rectified raw mains at around 20 kHz and feeds it into a step-down transformer. The capacitor is there to ensure the fast switching circuit has sufficient power in between the crests of the 50/60 Hz mains sine wave.
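
To put some rough numbers on that, here's a quick back-of-the-envelope sketch (Python, with illustrative values I've assumed for mains voltage, load power and bulk capacitance - not measurements from any particular PSU) of how far the bulk capacitor droops between crests of the full-wave rectified mains:

```python
# Rough estimate of bulk-capacitor droop between crests of full-wave
# rectified mains.  All figures below are illustrative assumptions.
import math

V_MAINS_RMS = 230.0    # assumed mains voltage (RMS)
F_MAINS = 50.0         # mains frequency, Hz
P_LOAD = 150.0         # assumed power drawn by the switching stage, W
C_BULK = 470e-6        # assumed bulk capacitance, F

v_peak = V_MAINS_RMS * math.sqrt(2)       # capacitor charges to roughly the crest
t_between_crests = 1.0 / (2 * F_MAINS)    # full-wave rectified: a crest every half cycle

# Worst case: assume the capacitor alone supplies the load for a whole half cycle.
energy_drawn = P_LOAD * t_between_crests

# Solve 0.5*C*(v_peak^2 - v_min^2) = energy_drawn for v_min.
v_min = math.sqrt(v_peak**2 - 2 * energy_drawn / C_BULK)

print(f"charges to ~{v_peak:.0f} V, droops to ~{v_min:.0f} V "
      f"({v_peak - v_min:.0f} V ripple) before the next crest tops it up")
```

With those values the capacitor only sags by roughly 10 V out of ~325 V between crests, which is why the switching stage never sees the mains dip to zero.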

Now, getting to my point here: I suspect you were thinking about power factor (PF)? This relates to where on the mains sine wave you draw your power. The utility companies want us all running with a PF of 1 - an example of such a device is a simple resistive electric heater. With a PF of 1 we present an even load throughout the cycle; anything less than 1 means we are 'unevenly' loading the supply. PF < 1 is bad because delivering the same real power takes more current, which makes the electricity more expensive to generate and distribute, and to the supply it 'looks' as though we are drawing more VA than the watts we actually use.
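
A quick worked example of what PF costs in current terms (Python; the 230 V mains and 1 kW load are assumed figures, just for illustration):

```python
# Power factor = real power (W) / apparent power (VA).  Compare the current
# needed to deliver the same real power at two power factors.
V_RMS = 230.0     # assumed mains voltage
P_REAL = 1000.0   # 1 kW of useful power in both cases

for pf in (1.0, 0.5):
    apparent_power = P_REAL / pf     # VA the supply actually has to provide
    i_rms = apparent_power / V_RMS   # RMS current in the wiring
    print(f"PF {pf}: {apparent_power:.0f} VA, {i_rms:.1f} A RMS")

# PF 1.0: 1000 VA, 4.3 A  -- the resistive heater case
# PF 0.5: 2000 VA, 8.7 A  -- same kW delivered, but twice the current
```

Twice the current for the same kW means four times the I²R loss in the wiring, which is why the utilities care.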

Switch-mode PSUs are inherently bad here - they only draw current at the crests of the sine wave, when the mains voltage exceeds the voltage already on the capacitor. Natively they have a PF of around 0.5, but nowadays internal power factor correction (PFC) circuitry tends to bring them up to around 0.7 or higher.
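
If you want to see where that ~0.5 figure comes from, here's a minimal numerical sketch (Python) of a capacitor-input rectifier feeding a constant-power load. The component values and the 1 ohm source resistance are assumptions I've picked for illustration, not taken from a real PSU. Current only flows in narrow pulses around the crests, and the PF comes out well below 1, around 0.5 for these values:

```python
# Minimal sketch of why a capacitor-input rectifier has a poor power factor:
# current only flows near the crests, when |v_mains| exceeds the bulk-capacitor
# voltage.  Component values are illustrative assumptions.
import math

V_RMS, F = 230.0, 50.0   # assumed mains
C = 470e-6               # assumed bulk capacitor
P_LOAD = 150.0           # assumed constant-power load on the capacitor
R_SRC = 1.0              # assumed source/diode resistance, ohms
DT = 1e-6                # simulation timestep, s

v_peak = V_RMS * math.sqrt(2)
v_cap = v_peak           # start with the capacitor charged
t, samples = 0.0, []

while t < 10 / F:        # run 10 mains cycles; keep only the last one
    v_mains = v_peak * math.sin(2 * math.pi * F * t)
    # Bridge conducts only when the rectified mains exceeds the cap voltage.
    i_rect = max(abs(v_mains) - v_cap, 0.0) / R_SRC
    # Capacitor: charged by the rectifier, discharged by the constant-power load.
    v_cap += (i_rect - P_LOAD / v_cap) * DT / C
    if t >= 9 / F:       # steady-state cycle only
        i_mains = math.copysign(i_rect, v_mains)
        samples.append((v_mains, i_mains))
    t += DT

n = len(samples)
p_real = sum(v * i for v, i in samples) / n
v_rms = math.sqrt(sum(v * v for v, _ in samples) / n)
i_rms = math.sqrt(sum(i * i for _, i in samples) / n)
print(f"power factor ~ {p_real / (v_rms * i_rms):.2f}")   # roughly 0.5 here
```

Active PFC front ends fix this by forcing the input current to follow the shape of the mains voltage instead of spiking at the crests.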