Luminosities are expressed as magnitudes. The ancient astronomers sorted the apparent brightnesses of stars into three groups: bright, intermediate, and faint. These groups were later subdivided into a six-category system, and each category became known as a magnitude. Each magnitude was brighter or fainter than the next by an approximately constant but then-undetermined factor, now known to have been of the order of 2.5. The apparent magnitude of a star reflects its apparent luminosity. The absolute magnitude is defined on a similar scale, being the magnitude the star would have at a standard distance of 10 parsecs: about 2,000,000 astronomical units, roughly ten times the actual distance to the star nearest the Sun. Magnitude scales give brighter stars lower numerical magnitudes and fainter stars higher ones. The brightest stars have apparent magnitudes near 0; the faintest stars visible to the naked eye are of 6th magnitude. Absolute magnitudes range from about +15 (faintest) to about -10 (brightest).
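As a minimal sketch (not part of the original discussion), the following Python snippet illustrates how the modern magnitude scale maps onto brightness ratios, assuming the standard convention that one magnitude step corresponds to a factor of 100**(1/5), about 2.512:

    def brightness_ratio(mag_faint, mag_bright):
        """Return how many times brighter the lower-magnitude star appears."""
        return 100 ** ((mag_faint - mag_bright) / 5)

    # Five magnitudes of difference correspond to a factor of exactly 100.
    print(brightness_ratio(6, 1))   # 100.0
    # Adjacent magnitudes differ by roughly the factor of 2.5 mentioned above.
    print(brightness_ratio(2, 1))   # ~2.512

A difference of five magnitudes (a 1st-magnitude star versus a 6th-magnitude one) therefore corresponds to a brightness ratio of exactly 100, which is why each single step works out to roughly 2.5.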
For stars near the Sun, distances can be determined by the parallax method. Given the apparent magnitude and the distance, the absolute magnitude can then be calculated.
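A minimal sketch of that calculation, assuming the standard relations (not spelled out in the text): the distance in parsecs is the reciprocal of the annual parallax in arcseconds, and the absolute magnitude M follows from the apparent magnitude m through the distance modulus M = m - 5 log10(d / 10 pc):

    import math

    def distance_from_parallax(parallax_arcsec):
        """Distance in parsecs from an annual parallax measured in arcseconds."""
        return 1.0 / parallax_arcsec

    def absolute_magnitude(apparent_mag, distance_pc):
        """Magnitude the star would have if placed at the standard 10 parsecs."""
        return apparent_mag - 5 * math.log10(distance_pc / 10.0)

    # Illustrative round numbers: a parallax of 0.1 arcsecond gives 10 parsecs,
    # at which point apparent and absolute magnitude coincide by definition.
    d = distance_from_parallax(0.1)
    print(d)                           # 10.0
    print(absolute_magnitude(3.0, d))  # 3.0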
Astronomers first plotted apparent magnitude against spectral type (roughly, stellar surface temperature), but found only a disordered scatter of points that seemed to mean nothing. Once distances became available, however, absolute magnitude could be plotted against spectral type, and a quite distinctive pattern appeared, showing that physical laws govern the origin and evolution of stars. The Hertzsprung-Russell diagram (the HR diagram) has become a key interpretive tool, indispensable for discussing the history of stars.
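The following sketch (with purely illustrative values, not real catalogue data) shows how such a plot is conventionally arranged: temperature increases toward the left, and the magnitude axis is inverted so that brighter stars, which have lower magnitudes, sit toward the top:

    import matplotlib.pyplot as plt

    # Hypothetical (surface temperature in K, absolute magnitude) pairs,
    # chosen only to illustrate the layout of the diagram.
    sample_stars = [(30000, -5.0), (10000, 1.0), (6000, 4.8), (4000, 7.5), (3000, 10.0)]
    temps = [t for t, _ in sample_stars]
    abs_mags = [m for _, m in sample_stars]

    fig, ax = plt.subplots()
    ax.scatter(temps, abs_mags)
    ax.invert_xaxis()   # hotter (bluer) stars are conventionally on the left
    ax.invert_yaxis()   # brighter stars (lower magnitude) appear toward the top
    ax.set_xlabel("Surface temperature (K)")
    ax.set_ylabel("Absolute magnitude")
    ax.set_title("Schematic Hertzsprung-Russell diagram")
    plt.show()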