The 1 megohm input resistance is standard on most oscilloscopes. To attenuate the input, you generally use a probe with built-in attenuation, such as a 10:1 passive probe.
For very high-frequency work, the 1 megohm input becomes unusable: shunt capacitance at the input and along the probe cable loads the signal, and the unterminated cable produces reflections. Instead, the oscilloscope offers a 50 ohm input impedance, which terminates a 50 ohm transmission line in its characteristic impedance and therefore has no inherent frequency limitation from the cable. To attenuate the signal in this case, you insert a 50 ohm attenuator in line.
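To see why the shunt capacitance dominates, compare the capacitive reactance to the 1 megohm resistance. This is a rough sketch with an assumed total shunt capacitance of 15 pF (probe tip plus input, a typical order of magnitude, not a value from any particular scope):

```python
import math

def shunt_reactance(freq_hz, cap_f):
    """Magnitude of capacitive reactance |Xc| = 1 / (2*pi*f*C)."""
    return 1.0 / (2 * math.pi * freq_hz * cap_f)

C = 15e-12  # assumed 15 pF total shunt capacitance
for f in (1e6, 10e6, 100e6):
    print(f"{f/1e6:>5.0f} MHz: |Xc| ~ {shunt_reactance(f, C):,.0f} ohms")
```

Even at 1 MHz the 15 pF presents only about 10 kilohms in parallel with the 1 megohm, and by 100 MHz it is on the order of 100 ohms, so the nominal 1 megohm input resistance is irrelevant to the loading.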
Most decent oscilloscopes these days offer a choice of 1 megohm or 50 ohms. In both cases you must respect the maximum input rating. For 1 megohm, a few hundred volts is typically the highest safe value; it can be raised by using an appropriately rated probe. For 50 ohms the limit is around a few volts, often expressed in dBm or milliwatts, because the termination resistor must dissipate the input power.
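Converting between the dBm ratings on a 50 ohm input and the equivalent RMS voltage is a one-line calculation: P = 10^(dBm/10) milliwatts, and Vrms = sqrt(P * R). A minimal sketch (the specific limits printed are illustrative, not from any particular instrument's datasheet):

```python
import math

def dbm_to_vrms(dbm, r_ohms=50.0):
    """RMS voltage developed across r_ohms at the given power in dBm."""
    p_watts = 10 ** (dbm / 10.0) / 1000.0
    return math.sqrt(p_watts * r_ohms)

def vrms_to_dbm(vrms, r_ohms=50.0):
    """Power in dBm dissipated by vrms across r_ohms."""
    return 10.0 * math.log10(vrms ** 2 / r_ohms * 1000.0)

print(f"+13 dBm into 50 ohms ~ {dbm_to_vrms(13):.2f} Vrms")
print(f"5 Vrms into 50 ohms ~ {vrms_to_dbm(5):.1f} dBm")
```

So +13 dBm (20 mW) corresponds to about 1 Vrms in 50 ohms, and a 5 Vrms limit is about +27 dBm (half a watt dissipated in the termination), which is why these ratings are so much lower than on the 1 megohm input.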