With solid-state amplifiers, impedance is not usually an issue unless you go below the rated minimum impedance for that amp. Most all SS amps are good down to 4 ohms. An SS amp behaves essentially like a voltage source: it does not care whether the speaker impedance is 12.5 ohms or 5.5 ohms. It puts a given voltage across whatever impedance is connected, and that impedance determines how many watts are dissipated. You may notice that most SS amps state something like 100 watts @ 4 ohms, 50 watts @ 8 ohms, and 25 watts @ 16 ohms. This is because the amp produces a fixed maximum output voltage, and the conversion of that voltage into watts is set by the speaker impedance (P = V²/Z). 8 ohms is half of 16 ohms, so it doubles the wattage dissipated; 4 ohms is half of 8 ohms, so it doubles the wattage again.
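The halving/doubling pattern above can be sketched in a few lines of Python. The 20 V RMS figure is a hypothetical full-output swing, chosen only so that 4 ohms works out to the 100 W in the example:

```python
# Hedged sketch: an ideal solid-state amp approximates a fixed voltage source.
# 20 V RMS is an assumed full-output swing (20^2 / 4 = 100 W into 4 ohms).
V_RMS = 20.0

def power_into(load_ohms: float, v_rms: float = V_RMS) -> float:
    """Watts dissipated in a resistive load driven at a fixed RMS voltage."""
    return v_rms ** 2 / load_ohms

for z in (16, 8, 4):
    print(f"{z:>2} ohms -> {power_into(z):5.1f} W")
# Each time the impedance halves, the wattage doubles: 25 W, 50 W, 100 W.
```

Real amps depart from this ideal at low impedances (current limiting, heat), which is exactly why the rated minimum matters.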
That being said, ALL amplifiers have a minimum impedance they can drive; in most consumer-grade amps it is 4 ohms. Given the THD (total harmonic distortion) rating the specs state, I would venture that the 150-watt and 75-watt figures are there because they only showed you the maximum power it would produce, hence a higher THD. Powering an 8-ohm load from the stereo outputs would simply cut your peak wattage in half, and your THD would drop from 10% to 4% or something like that. If you look at the back of the Power-Block, you will see that next to each output it says 8 ohms MIN or 4 ohms MIN. This simply means that in bridge mode, the MINIMUM impedance you can drive with that amplifier is 8 ohms, so you must have an 8-ohm speaker or higher to power it safely. For stereo operation, you cannot go below a 4-ohm speaker cabinet on either channel.
Tube amplifiers have a different requirement and should only be connected to a speaker that matches the impedance selected on the amplifier. That is a different subject, though.