Sorry, but I think you're assuming this. In fact, being part of "the industry" myself, I'm pretty sure it's that way...
The industry has, AFAIK, no standard guitar (think: ONE SINGLE GUITAR, not a series of 10 or more, because each piece of wood resonates differently), string gauge, string material and construction, string manufacturer, pick stiffness, pick material, pick manufacturer, picking strength, direction of pick stroke, tuning, pickup height relative to the string, wire length (yes, this also affects it, by adding capacitance and increasing the distance between the "+ and - poles"), ROOM TEMPERATURE, ... that was ever "set" as a standard for measuring output in millivolts. All of these factors affect the reading, and that is exactly why not all manufacturers use it as a measurement. As you can see, this measurement requires quite a bit of control over the conditions to be reliable, and much more to be usable "between manufacturers".
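To give a feel for the cable-capacitance point above: the pickup's inductance and the total capacitance (coil plus cable) form a resonant circuit, and more cable pushes the resonant peak down, which changes what you'd measure. The numbers below are my own illustrative assumptions, not any manufacturer's spec:

```python
import math

# Assumed ballpark values (NOT from any spec sheet) just to show the trend:
PICKUP_INDUCTANCE_H = 3.0       # humbuckers are often quoted in the 3-8 H range
PICKUP_CAPACITANCE_F = 100e-12  # assumed self-capacitance of the coil
CABLE_PF_PER_METER = 100.0      # common ballpark figure for guitar cable

def resonant_freq_hz(cable_length_m: float) -> float:
    """Resonant frequency of the pickup + cable treated as a simple LC circuit."""
    c_total = PICKUP_CAPACITANCE_F + CABLE_PF_PER_METER * 1e-12 * cable_length_m
    return 1.0 / (2.0 * math.pi * math.sqrt(PICKUP_INDUCTANCE_H * c_total))

for length in (1.0, 3.0, 6.0):
    print(f"{length:>4.0f} m cable -> resonance ~ {resonant_freq_hz(length):,.0f} Hz")
```

With these made-up but realistic values, going from a 1 m to a 6 m cable drops the resonant peak by a couple of kHz, which is exactly why an output figure measured through one test rig doesn't transfer to another.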
Nor has the industry set a standard for calibrating these results, such as the one Artie suggested. His idea is of course great, but again, the industry hasn't done it yet. It's like the truck (for laymen: "axle") manufacturers in the early days of skateboarding. Early on, decks were undrilled when you bought them. Why? Because no one had standardized truck baseplates. As soon as that happened, decks started coming pre-drilled...
While the measurements ARE usually representative of relative output between pickups from a single manufacturer, until a standard like the one outlined above is established (there's much more to it than the bit I posted), they're more or less useless as soon as you start looking at another manufacturer's catalog. Ballpark precision at best.