You are very likely exceeding the core's volt-microsecond capability. Perhaps remanence is higher than expected and you are reaching saturation earlier than calculated.
But even then 135Vus seems to me a little on the low side to properly drive an SCR.
If I remember correctly, they are at least 300Vus.
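To put numbers on that: the core's volt-second limit follows from Faraday's law, V·t = N·ΔB·Ae. Here is a rough sanity check; the turns count, flux densities, and core area below are made-up example values, not from any particular part:

```python
def max_pulse_width_us(v_drive, n_turns, b_sat, b_rem, a_e_mm2):
    """Longest pulse (us) before the core saturates: t = N * dB * Ae / V."""
    delta_b = b_sat - b_rem      # usable flux swing (T), reduced by remanence
    a_e = a_e_mm2 * 1e-6         # core cross-section, mm^2 -> m^2
    return n_turns * delta_b * a_e / v_drive * 1e6

# Example (assumed values): 12 V drive, 10 turns, Bsat 0.35 T, Br 0.05 T, Ae 25 mm^2
t_us = max_pulse_width_us(12.0, 10, 0.35, 0.05, 25.0)   # ~6.25 us
vus = 12.0 * t_us                                        # ~75 V*us capability
```

Note that higher remanence eats directly into the available ΔB, which is why saturation can arrive earlier than a calculation that assumed Br = 0.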
How do you determine if a specific volt-second rating can drive a particular SCR? As I said, the SCR datasheet gives Igt as 2 to 15mA.
Depends on application.
Is the SCR ready to fire (V_AK positive)? If so, a few microseconds at, say, 10 * I_gt will do.
Why 10 times? Because Igt is the guaranteed minimum, and driving at just Igt turns the SCR on slowly. With higher drive you get faster switching, lower switching loss, and higher peak dI/dt capability. In short, it's cooler and safer, and it doesn't cost much drive power.
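For the 2-15mA Igt part in the question, that rule of thumb pins the design down: size the drive for 10 times the 15mA worst case. A back-of-envelope sizing sketch; the 10 V secondary and 1 V gate-cathode drop are assumed figures for illustration:

```python
def gate_resistor(v_sec, v_gt, i_gt_max, margin=10.0):
    """Series resistor so the secondary delivers margin * Igt(max) to the gate."""
    i_drive = margin * i_gt_max          # design gate current, A
    return (v_sec - v_gt) / i_drive      # ohms

# Assumed: 10 V secondary, ~1 V gate-cathode drop, Igt(max) = 15 mA
r = gate_resistor(10.0, 1.0, 0.015)     # ~60 ohms for 150 mA of gate drive
```

The volt-second budget then just has to cover that current times the secondary voltage for the few microseconds the pulse lasts.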
If it's not ready to fire, as in an AC application, then the better method is AC drive, using an antiparallel diode to deliver pulses of gate current while clamping the transformer's flyback voltage. In a mains application, the delay between firing pulses won't matter much: if the gate is driven for 10us, then the transformer resets for 10us, then repeat, the SCR can only fire in that 10us window every 20us; but that's fine for mains, which moves on a timescale of tens of ms.
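The timing arithmetic shows why the gaps are harmless at mains frequency. Using the 10us/10us example from above (50 Hz mains assumed for illustration):

```python
drive_us = 10.0                       # gate pulse width
reset_us = 10.0                       # transformer reset (dead) time
period_us = drive_us + reset_us       # 20 us repetition rate
mains_half_cycle_us = 10_000.0        # 50 Hz mains: 10 ms per half cycle

# Worst case, the SCR becomes forward biased just as a reset interval
# starts, so firing is delayed by at most reset_us:
worst_case_delay_deg = (reset_us / mains_half_cycle_us) * 180
# ~0.18 degrees of mains phase -- negligible firing jitter
```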
If it's an inverter application, it's better to use a voltage doubler circuit to deliver continuous gate current, so the SCR turns on as soon as it becomes forward biased. The primary winding should be driven full-wave (any topology will do: half bridge, full bridge, or push-pull).
The trigger circuit is then mostly the same, but instead of the enable signal producing a single pulse, an oscillator is gated by the enable signal.
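As a rough sketch of why the doubler gives continuous drive: each half cycle of the carrier tops up one capacitor, so the gate sees a DC level of roughly twice the secondary peak minus diode drops. The component values below are illustrative assumptions, not a worked design:

```python
def doubler_output(v_pk, v_diode=0.7):
    """Approximate no-load DC output of a voltage doubler rectifier."""
    return 2.0 * (v_pk - v_diode)

def continuous_gate_current(v_pk, v_gt, r_series, v_diode=0.7):
    """DC gate current the doubler sustains through a series resistor."""
    return (doubler_output(v_pk, v_diode) - v_gt) / r_series

# Assumed: 5 V peak on the secondary, ~1 V gate drop, 47 ohm series R
i_g = continuous_gate_current(5.0, 1.0, 47.0)   # ~160 mA, well above 10x a 15 mA Igt
```

Since the current is continuous rather than pulsed, the carrier frequency only needs to be high enough that the doubler's ripple stays above the required gate current.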
Tim