From an engineering perspective, the entropy content of a unit quantity of energy indicates how much useful work can be extracted from it. By work I mean, loosely, moving something: useful work might be turning a drive shaft, for example, as opposed to waste heat, which cannot.
Thermodynamics gives us a relationship between entropy S and energy E (taking some mild liberties):
dS/dE = 1/T
T in this case is the temperature. High entropy content is bad, so we can see that energy capable of reaching a high temperature must have lower entropy and hence be capable of doing more work per unit mass. This follows naturally for chemical fuels: based on their combustion temperature, the Carnot cycle predicts the theoretical maximum efficiency with which they can do work.
efficiency = (T_hot - T_cold) / T_hot
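The Carnot limit above is easy to sketch in code. The temperatures below are illustrative assumptions (a flame temperature around 2200 K, ambient at 300 K), not figures from any particular fuel:

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum theoretical fraction of heat convertible to work:
    (T_hot - T_cold) / T_hot, with temperatures in kelvin."""
    if not (t_hot_k > t_cold_k > 0):
        raise ValueError("need t_hot_k > t_cold_k > 0 (absolute temperatures)")
    return (t_hot_k - t_cold_k) / t_hot_k

# Assumed example: hot reservoir at 2200 K, rejecting heat at 300 K.
print(carnot_efficiency(2200.0, 300.0))  # ~0.86
```

Note that the temperatures must be absolute; using Celsius silently inflates the apparent efficiency.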
This result is used in something called exergy, or availability, analysis, which is based on the Carnot efficiency limit. Exergy is really a wolf (entropy) in sheep's clothing. I won't go into more detail on exergy at this time.
So we can easily figure out the efficiency of chemical fuels and from that either their exergy or entropy density, whichever you prefer. But how about electricity?
We might just say that the best standard electric motor has an efficiency of 95 % and leave it there. But that doesn't really tell us what the fundamental limit is. After all, superconducting electric motors can do better, and do.
I have been digging around in research journals looking for an answer, but have found nothing thus far. So I decided to do some basic analysis myself. The standard model for a metallic conductor -- the Drude model -- states that conduction electrons move freely through a conductor as a free electron gas. A correcting factor, a damping time Tau, is inserted to reflect the collisions electrons have with crystal defects and phonons. Tau can be derived from the conductivity.
Tau = Conductivity*electron mass / (electron density * electron charge^2)
For Copper, Tau = 2.5 x 10^-14 s.
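That copper figure is easy to reproduce from the formula above. The material values below are approximate handbook numbers I'm assuming (conductivity ~5.96 x 10^7 S/m, electron density ~8.5 x 10^28 m^-3):

```python
E_CHARGE = 1.602e-19   # elementary charge, C
E_MASS   = 9.109e-31   # electron mass, kg

def drude_tau(sigma: float, n: float) -> float:
    """Drude relaxation time tau = sigma * m_e / (n * e^2), in seconds.
    sigma: conductivity (S/m); n: conduction electron density (m^-3)."""
    return sigma * E_MASS / (n * E_CHARGE**2)

# Assumed copper values: sigma ~ 5.96e7 S/m, n ~ 8.5e28 m^-3.
print(drude_tau(5.96e7, 8.5e28))  # ~2.5e-14 s
```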
From Tau, we can find the drift velocity of electrons under an electric field,
v = e * electric field * Tau / m
where e is electron charge and m electron mass. And we can relate the temperature of an ideal gas (which electrons are in a pure sense) to the individual kinetic energy of an electron,
0.5 * m * v^2 = 1.5 * k_b * T
where k_b is Boltzmann's constant. Solving for temperature I find that,
T = (m / (3 * k_b)) * (conductivity * electric field / (electron density * e))^2
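Since the conductivity times the electric field is just the current density, the drift velocity is v = J / (n e), and the kinetic-theory relation above gives T = m v^2 / (3 k_b). A quick sketch with an assumed current density of 10^6 A/m^2 (a few amps per square millimetre, typical of household wiring) and the copper electron density from before:

```python
K_B      = 1.381e-23   # Boltzmann constant, J/K
E_CHARGE = 1.602e-19   # elementary charge, C
E_MASS   = 9.109e-31   # electron mass, kg

def drift_velocity(j: float, n: float) -> float:
    """Electron drift speed v = J / (n e), using sigma * E = J.
    j: current density (A/m^2); n: electron density (m^-3)."""
    return j / (n * E_CHARGE)

def effective_temperature(j: float, n: float) -> float:
    """Equipartition temperature of the drift motion: T = m v^2 / (3 k_B)."""
    v = drift_velocity(j, n)
    return E_MASS * v**2 / (3 * K_B)

# Assumed example: copper at J = 1e6 A/m^2, n ~ 8.5e28 m^-3.
print(drift_velocity(1e6, 8.5e28))       # ~7e-5 m/s -- a slow crawl
print(effective_temperature(1e6, 8.5e28))  # ~1e-16 K -- essentially zero
```

The effective temperature of the drift motion comes out vanishingly small, which is the quantitative sense in which the entropy content of electricity is tiny.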
The conductivity times the electric field is the current density in a conductor (usually abbreviated J, and I get the distinct impression I'm doing this backwards). One could relate the current density to the power density (p) and the potential (voltage, V):
J = p/V
However, I think I have again taken a bigger than blog-sized bite, so I'll stop and leave it as an exercise for the reader to realize that the entropy content of electricity is very low indeed. The result that you should take from this is a realization that electricity is the best means of carrying useful work that we have, and probably will ever have.
A comparison of electricity to hydrogen is very illuminating. The 2nd law of thermodynamics is rather explicit. If you are reading about a hydrogen-powered system, take note of its electricity-powered equivalent. In all likelihood, the electrical system is more efficient. And if an electrical system outperforms hydrogen now, it probably always will. I can see now that I probably should have just spewed forth numbers and arguments regarding reversibility rather than doing the analysis, but I do call myself Entropy Production for a reason. Among chemical fuels, hydrogen is king when it comes to an entropy (or exergy) analysis. It can do more work per unit mass than any other fuel (except maybe acetylene). However, it remains just a chemical fuel.
Hydrogen can't hold a flame to Electricity.