The MIL-STD-1553 spec states that the impedance on the stub/bus should be 78 ohms. Why do manufacturers create cables with different impedance values and still claim that they meet the spec? For example, I can find 50, 75, 93, and 124 ohms...

This issue has an interesting history. When the MIL-STD-1553 designers optimized the bus design, they settled on an impedance that would permit longer bus runs while preserving signal strength. Impedances other than 78 ohms are not commonly used, but some designers do use them, and there are legacy designs that still require them. It may be difficult, however, to find terminators for anything other than 78 ohms.

Also, the spec defines the cable by its characteristic impedance (Z0) as a range of roughly 70 to 85 ohms at 1 MHz, not a single 78-ohm value, so you have some room to tune the bus for different amplitudes and to minimize waveform distortion; of the examples you list, only 75 ohms falls inside that window. Still, 78 ohms is the accepted standard. Unless your design really pushes the limits (for example, a very long bus run), stay with the industry value of 78 ohms.
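
To make that concrete, here is a minimal sketch in Python (my own helper names and illustrative values, not part of any 1553 tooling) that checks the impedances from the question against an assumed 70-85 ohm cable window and estimates the voltage reflection coefficient you would see when each cable is terminated with a standard 78-ohm terminator:

```python
# Minimal sketch: compare candidate cable impedances against the
# MIL-STD-1553B cable Z0 window (assumed here as 70-85 ohms at 1 MHz)
# and estimate the reflection at a standard 78-ohm terminator.
# All names and values are illustrative.

Z0_MIN, Z0_MAX = 70.0, 85.0   # assumed spec window for cable characteristic impedance
Z_TERM = 78.0                 # common nominal terminator value

def within_spec(z0: float) -> bool:
    """True if the cable's characteristic impedance falls inside the assumed window."""
    return Z0_MIN <= z0 <= Z0_MAX

def reflection_coefficient(z0: float, z_term: float = Z_TERM) -> float:
    """Voltage reflection coefficient at the termination: (Zt - Z0) / (Zt + Z0)."""
    return (z_term - z0) / (z_term + z0)

if __name__ == "__main__":
    for z0 in (50.0, 75.0, 78.0, 93.0, 124.0):
        gamma = reflection_coefficient(z0)
        print(f"Z0 = {z0:6.1f} ohm  in window: {str(within_spec(z0)):5}  "
              f"|reflection| vs 78-ohm terminator: {abs(gamma):.3f}")
```

Running it shows why the mismatch matters: a 75-ohm cable terminated in 78 ohms reflects only about 2% of the incident voltage, while a 50-ohm or 124-ohm cable reflects on the order of 20%, which is exactly the kind of waveform distortion the termination is supposed to suppress.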