The decoder first has to delimit a frame: it counts n rising (or falling) clock edges after the frame start.
With a CS signal the start of frame is easy: it's the first rising/falling clock edge after CS goes active.
With a timeout, the decoder sits idle, starts counting at the first edge, and the frame ends either at the nth edge or when the timeout expires.
In your configuration you have set the timeout to 4.23 us. That's too short: your packet is about 100 us long. What is happening is that the decoder acquires the first bit (zero), then the timeout expires, so it shows a zero byte. At the following edge it starts again, and again shows zero. If you increased the timeout you would see the complete byte.
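To make the effect concrete, here is a minimal sketch of timeout-based frame delimiting. This is a hypothetical model, not the instrument's actual firmware: it assumes edge timestamps in microseconds and a frame that closes either at the nth edge or when the gap to the next edge exceeds the timeout.

```python
def split_frames(edges, bits, timeout_us):
    """Group clock-edge timestamps (in us) into frames.

    A frame ends when it has collected `bits` edges, or when the gap
    to the next edge exceeds `timeout_us` (hypothetical model of the
    decoder's timeout-based framing).
    """
    frames, current = [], []
    for i, t in enumerate(edges):
        current.append(t)
        last = i == len(edges) - 1
        # Gap from this edge to the next one (None on the last edge).
        gap = None if last else edges[i + 1] - t
        if len(current) == bits or last or gap > timeout_us:
            frames.append(current)
            current = []
    return frames

# Eight clock edges spread over ~100 us, roughly your packet:
edges = [i * 12.5 for i in range(8)]

# With a 4.23 us timeout every 12.5 us gap expires the frame,
# so each edge becomes its own one-bit "byte" (decoded as zero).
print(len(split_frames(edges, bits=8, timeout_us=4.23)))   # 8 frames

# With a timeout longer than the inter-edge gap, all 8 edges
# land in one frame and the full byte is decoded.
print(len(split_frames(edges, bits=8, timeout_us=50.0)))   # 1 frame
```

With the short timeout the decoder emits eight one-edge frames, which is exactly why you see a string of zero bytes instead of one complete byte.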
If you zoom out, the event labels become too tight to read; that's when the event list comes in handy.
If you zoom out too much you will see garbage, because the decoding is done from the screen memory (even if the trace is 1M samples long, the screen buffer is not), so it doesn't decode properly.