As with so many things, there are multiple levels of 'standards' involved with USB charging.

At the core, the problem is that the basic USB 'standard' was built around a 500 mA maximum current per USB device. At the nominal 5 V bus voltage that is only 2.5 watts. Not enough for many devices, and the reason for so many workarounds and extensions to the standard.

Some of the details were recently discussed in this iPhone-related thread.

Some USB chargers implement the Apple-specification signaling and available current levels. See the other thread for details. Essentially, Apple-style dumb USB chargers come in three versions:
500 mA max (older iPods and such)
1000 mA (iPhone 5 W USB charger)
2000 mA (iPad 10 W USB charger)
Apple devices will properly charge from chargers that can supply the required milliamperes or more. The device will simply consume what it needs, as the charger will never force more current than the device is willing to draw. So an iPhone will happily draw about 1000 mA from an iPad charger.
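
For the curious, here is a rough sketch of how a device might read the Apple-style signaling. The charger holds the D+ and D- lines at roughly 2.0 V or 2.7 V with internal resistor dividers; the levels and tolerance below are the commonly reported ones, so treat them as assumptions rather than spec values.

    # Rough sketch: classify an Apple-style 'dumb' charger from the DC
    # voltages it presents on the USB D+ and D- lines. The ~2.0 V / ~2.7 V
    # levels and the 0.3 V tolerance are assumptions for illustration.
    def apple_charger_limit_ma(d_plus_v, d_minus_v, tol=0.3):
        def near(v, target):
            return abs(v - target) <= tol
        if near(d_plus_v, 2.0) and near(d_minus_v, 2.0):
            return 500    # older iPod class
        if near(d_plus_v, 2.0) and near(d_minus_v, 2.7):
            return 1000   # iPhone 5 W charger class
        if near(d_plus_v, 2.7) and near(d_minus_v, 2.0):
            return 2000   # iPad 10 W charger class
        return None       # not recognized as Apple-style signaling

    print(apple_charger_limit_ma(2.7, 2.0))  # an iPad charger -> 2000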

Correspondingly, an iPad will limit its power draw from an iPhone charger to 1000 mA, as it can tell that the charger is limited to that capacity. It will simply charge more slowly than if it were connected to a proper iPad-spec charger.

Connect an iPad to a powered USB hub or a non-Apple-style charger and it will limit itself to a 500 mA power draw. That is actually less power than the iPad needs when the display is active, so an iPad in that situation will augment the 2.5 watts from the charger with some power from its own battery. The iPad will simply run down more slowly, and will display a 'Not Charging' notice beside the battery icon.
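
The back-of-envelope budget, with the iPad's actual draw as my rough guess rather than a measured figure:

    # Back-of-envelope power budget for an iPad on a generic 500 mA source.
    # The 5 W 'display active' draw is an assumed round number, not a spec.
    bus_voltage_v = 5.0        # nominal USB bus voltage
    source_current_a = 0.5     # generic 500 mA limit
    device_draw_w = 5.0        # assumed draw with the display lit

    from_charger_w = bus_voltage_v * source_current_a          # 2.5 W
    from_battery_w = max(0.0, device_draw_w - from_charger_w)  # 2.5 W
    print(from_charger_w, from_battery_w)

So the battery still drains, just at roughly half the unplugged rate under those assumed numbers.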

Note that USB charging from computers and other 'smart' devices is done differently than from 'dumb' chargers. When charging from smart power sources, the allowable current levels are negotiated via the USB message protocol rather than by pin voltages.
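
Concretely, a USB device declares how much bus current it wants in the bMaxPower field of its configuration descriptor (encoded in 2 mA units), and the host grants or refuses that during enumeration. A minimal sketch using the pyusb library, assuming it is installed, that dumps what attached devices are asking for:

    # List the bus current each attached USB device requests in its
    # active configuration descriptor. Requires the pyusb library.
    import usb.core

    for dev in usb.core.find(find_all=True):
        try:
            cfg = dev.get_active_configuration()
        except usb.core.USBError:
            continue  # no permissions, or device not configured
        # bMaxPower is encoded in units of 2 mA in the USB 2.0 spec
        print("%04x:%04x requests %d mA" %
              (dev.idVendor, dev.idProduct, cfg.bMaxPower * 2))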

Non-Apple-style dumb USB chargers implement one of the non-Apple 'standards', of which there are several. Going from memory (and typing on an iPad):
500 mA with D+ and D- pins open
500 mA with D+ and D- pins shorted
500 mA with D+ and D- pins linked by a resistor to indicate 500 mA of available current
1800 mA 'new China standard' (there may be confusion here with a similar 1500 mA China-spec charger)
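
The 'shorted' case, at least, can be tested for directly: a device drives a small voltage onto D+ and checks whether it shows up on D-, which is essentially the trick the USB Battery Charging spec uses. A sketch under those assumptions, where drive_d_plus() and read_d_minus_v() are hypothetical hardware hooks and the ~0.6 V / 0.3 V levels are approximations:

    # Sketch of the 'are D+ and D- shorted?' probe a charging device can run.
    # drive_d_plus() and read_d_minus_v() are hypothetical hardware hooks;
    # the probe level and threshold are assumptions, not spec-exact values.
    def d_lines_shorted(drive_d_plus, read_d_minus_v):
        drive_d_plus(0.6)              # source a weak ~0.6 V onto D+
        return read_d_minus_v() > 0.3  # D- follows D+ only if they are tied

A dedicated charger that ties D+ to D- passes the test; a normal host port, whose data lines are terminated separately, does not.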

There are USB chargers that can supply other maximum current levels, but they may not have the necessary D+ and D- pin signals to let a given device know the extra power is there.

And of course there will be cheaply made chargers that lie about their current capacities and/or just confuse the device that is trying to charge itself.

Motorola (among others) had/has its own proprietary flavours of USB charging specs, some of which conflict with the other 'standards'. This is why a USB charger from Nokia may not charge a Motorola phone, or vice versa, for example.

The recent European micro-USB charging standard is an attempt to fix the problem. I have not looked at how that specification handles USB charger current capacities in excess of 500 mA.

I don't know if Android manufacturers have coordinated their specs for high-power USB charging (greater than 500 mA), or what those specs may be. I just haven't looked.

It is unclear to me whether there is any mobile/car charger (or even a charger spec) that can actually supply all the power required to fully satisfy an active high-end LTE smartphone with all features singing and dancing, while charging the battery at the same time.