That reference does mention the general definition, that bandwidth is the range of spectrum taken up by a signal, but otherwise it seems to look at the term solely from the point of view of digital signal transmission.

I'm not too happy with the claim that "bandwidth is measured in megabits per second" either. Megabits per second measure the rate at which data is transferred, and although, all else being equal, a higher data rate requires a greater bandwidth, there are other variables in the equation, such as the modulation scheme and the signal-to-noise ratio.

In its purest form, bandwidth is the difference between the lowest and the highest frequencies used to transmit the signal, be it analog or digital, and it is measured in hertz (or kHz, MHz, etc.).
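
To make that concrete, here is a minimal sketch of the calculation (my own illustration, not anything from the reference; the function name and the choice of Python are just for demonstration):

    # Bandwidth is simply the span between the lowest and highest
    # frequencies a signal occupies; all figures here are in Hz.
    def bandwidth_hz(f_low_hz, f_high_hz):
        return f_high_hz - f_low_hz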

The term was in use long before fiber-optic cable was invented, and can refer to a signal sent down a twisted pair of wires, a coaxial cable, a microwave waveguide, or broadcast through the air.

For example, the signal sent down a regular phone line is normally limited to a range of about 300 to 3400Hz, a bandwidth of some 3.1kHz, while a good hi-fi system might have a bandwidth of 20kHz or more.
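
Plugging the phone-line figures into the sketch above confirms the arithmetic:

    # Regular phone line, roughly 300 Hz to 3400 Hz:
    print(bandwidth_hz(300, 3400))   # 3100, i.e. about 3.1 kHz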

For comparison, an AM broadcast radio signal has a bandwidth of 9 or 10kHz, an FM radio signal 200kHz, and a TV signal (U.S. system) about 6MHz.