(EDIT: a big portion of the original question has been answered, so it is out of date. For an up-to-date version of what I am looking for now, please skip to EDIT 1 and use the following just as context.)
I am trying to decode a software UART from a microcontroller on my oscilloscope. To do this, I am probing the Tx line with reference to ground and using the decode feature on my oscilloscope (Siglent SDS 1104X-E). To configure the UART decode on the oscilloscope side, I have to choose a custom baud rate, because the baud rate of my software UART isn't precisely tuned. To find it, I measure the period of one bit in the UART data, then divide 1 bit by that period to get a value in bps.
For example, in some UART data that I had sent, I measured the length of 1 bit to be 0.2208ms, so
(1 bit) / (0.2208 * 10^(-3) s) ~= 4529 bps
I then set the custom baud rate on my oscilloscope to this value and press the decode button, expecting to see a nicely decoded signal; however, the decoder outputs garbage, as seen below:
https://imgur.com/a/xMyAuT4
What confuses me is that if I dial the baud rate down to around 4200 baud, the signal is decoded perfectly, as seen below:
https://imgur.com/a/NihQgJ9
This value is far off from the baud rate I calculated from the signal, so what is going on here? The data is accurate and clean, yet the baud rate required to decode it is nowhere near the measured one.
I would also like to note that I don't believe that this is an issue with the oscilloscope, because when I send the data to my PC through an FTDI breakout, and monitor it with
screen /dev/ttyUSB0 4529
I receive garbage data: the characters that come through are unrecognized and render as the unknown-character symbol. What's different in that scenario is that no matter what baud rate I give `screen`, it is still unable to decode the data. What's even weirder is that I receive the correct number of characters for whatever I send; it's just that the characters themselves are for some reason unrecognizable, even though the oscilloscope can decode them fine. So I am very perplexed as to what is going on here.
EDIT 1:
I now know that the reason the UART receiver was unable to decode at the measured baud rate (taken from a single bit) is that the bit lengths were not equal. The bit timing varied enough across each transmission that it was unreadable at the receiver end. (I am guessing the receiver can read it at a slightly lower baud rate because, just by chance, the bit timings happen to line up well enough at that rate.) So now the real question is: what is causing these variations in the bit timings?
Aside:
Here is the code for the transmit function and the timing/delay function in the firmware for the microcontroller (ATtiny84A):
void timer_delay(void) // Delay that gives each bit its nominal length
{
    TCNT0 = 0;                       // Reset the count
    TCCR0B |= (1 << CS01);           // Start the timer with a /8 prescaler
    while (!(TIFR0 & (1 << OCF0A))); // Wait until the compare match flag is set
    TIFR0 |= (1 << OCF0A);           // Clear the compare flag (write 1 to clear)
    TCCR0B &= ~(1 << CS01);          // Stop the timer
}
void uart_tx(unsigned char *transmit_data)
{
    uint8_t string_length = strlen((const char *)transmit_data); // cast avoids the signedness warning from strlen()
    for (unsigned char character = 0; character < string_length; character++)
    {
        // Set TX low for the start bit
        PORTB &= ~(1 << PORTB1);
        timer_delay();
        for (unsigned char character_bit = 0; character_bit < 8; character_bit++) // Loop through each bit in the character, LSB first
        {
            if ((1 << character_bit) & transmit_data[character]) // Check if the bit is a 1
            {
                PORTB |= (1 << PORTB1); // transmit a 1
                timer_delay();
            }
            else // the bit is a 0
            {
                PORTB &= ~(1 << PORTB1); // transmit a 0
                timer_delay();
            }
        }
        PORTB |= (1 << PORTB1); // transmit the stop bit
        timer_delay();
    }
}
P.S. If someone could tell me how to insert inline pictures in these posts, I would be very grateful.