Author Topic: Bit Timing Issues With a Software Bit-Banged UART  (Read 4290 times)


Offline KalciferTopic starter

  • Regular Contributor
  • *
  • Posts: 82
  • Country: ca
Bit Timing Issues With a Software Bit-Banged UART
« on: May 06, 2021, 05:22:08 am »
(EDIT: a large portion of the original question has since been answered and is now out of date. For the current state of the problem, skip to EDIT 1 and use the following only as context.)

I am trying to decode a software UART from a microcontroller on my oscilloscope. To do this, I am probing the Tx line with reference to ground and using the decode feature on my oscilloscope (Siglent SDS 1104X-E). On the oscilloscope side I have to choose a custom baud rate, as the baud rate of my software UART isn't fine-tuned to a standard value. To get it, I measure the period of one bit in the UART data and divide 1 bit by that period to get a value in bps.

For example, in some UART data that I had sent, I measured the length of 1 bit to be 0.2208ms, so

(1 bit) / (0.2208 * 10^(-3) s) ~= 4529 bps

I then set the custom baud rate on my oscilloscope to this value and press the decode button, expecting to see a nicely decoded signal; however, the decoder outputs garbage, as seen below:

https://imgur.com/a/xMyAuT4

What confuses me is that if I dial down the baud rate to around 4200 baud, the signal is decoded perfectly, as seen below:

https://imgur.com/a/NihQgJ9

This value is far from the baud rate I calculated from the signal, so what is going on here? The data itself is clean, yet the baud rate required to decode it is well off the measured value.

I would also like to note that I don't believe that this is an issue with the oscilloscope, because when I send the data to my PC through an FTDI breakout, and monitor it with

Code: [Select]
screen /dev/ttyUSB0 4529
I receive garbage data: the received characters are unrecognized and show up as the unknown-character symbol. What's different in that scenario is that even if I change the baud rate for `screen`, it is still unable to decode the data, no matter what I change it to. What's even weirder is that I receive the correct number of characters for whatever I send; it's just that the characters themselves are unrecognizable, even though the oscilloscope can decode them fine. So I am very perplexed as to what is going on here.

EDIT 1:

I now know that the reason the UART receiver was unable to decode at the measured baud rate (measured from one bit) is that the bit lengths were not equal. The baud rate varied enough across the transmitted bits that it was unreadable at the receiver end. (I am guessing that the receiver can read it at a lower baud rate because, just by chance, the bit timings space out enough that they line up at a slightly lower baud rate.) So now the real question is: what is causing these variances in the bit timings?

Aside:
Here is the code for the transmit function and the timing/delay function in the firmware for the ATtiny84A microcontroller:

Code: [Select]
#include <avr/io.h>
#include <string.h>

void timer_delay(void) // Timer which sets the delay to give the bits their specific lengths.
{
    TCNT0 = 0; // Reset the timer
    TCCR0B |= (1 << CS01); // Start the timer with a /8 prescaler
    while (!(TIFR0 & (1 << OCF0A))); // Wait until the compare flag is set
    TIFR0 |= (1 << OCF0A); // Reset the compare flag
    TCCR0B &= ~(1 << CS01); // Stop the timer
}

void uart_tx(unsigned char *transmit_data)
{
    uint8_t string_length = strlen((const char *)transmit_data);
    for (unsigned char character = 0; character < string_length; character++)
    {
        // Set TX low for the start bit
        PORTB &= ~(1 << PORTB1);
        timer_delay();

        for (unsigned char character_bit = 0; character_bit < 8; character_bit++) // Loop through each bit in the character, LSB first
        {
            if ((1 << character_bit) & transmit_data[character]) // Check if the bit is a 1
            {
                PORTB |= (1 << PORTB1); // Transmit a 1
                timer_delay();
            } else { // else the bit is a 0
                PORTB &= ~(1 << PORTB1); // Transmit a 0
                timer_delay();
            }
        }

        PORTB |= (1 << PORTB1); // Transmit the stop bit
        timer_delay();
    }
}


P.S. If someone could tell me how to insert inline pictures in these posts, I would be very grateful.
« Last Edit: May 07, 2021, 07:17:51 am by Kalcifer »
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11236
  • Country: us
    • Personal site
First of all, have you tried to zoom in on the bit to measure the bit duration accurately? At this time scale there is a lot of variation in 1 pixel.

And then, don't expect real hardware receivers to be able to accept random non-standard baudrates like this.

For the receiver, first of all, install a real terminal program (minicom or picocom), then do your baudrate setting and transmit some bytes to see what baudrate was actually set.
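For example, a minimal invocation (assuming the adapter still shows up as /dev/ttyUSB0, as in the screen command above) would be:
Code: [Select]
picocom -b 4800 /dev/ttyUSB0
Exit picocom with Ctrl-A followed by Ctrl-X.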
Alex
 

Offline KalciferTopic starter

  • Regular Contributor
  • *
  • Posts: 82
  • Country: ca
First of all, have you tried to zoom in on the bit to measure the bit duration accurately? At this time scale there is a lot of variation in 1 pixel.

Yes. I have zoomed in as far as feasible into one bit to measure its duration, the value of which was stated in the question.

And then, don't expect real hardware receivers to be able to accept random non-standard baudrates like this.
I adjusted the baud rate to be about 4735 baud (measured on the oscilloscope) and then set the oscilloscope to decode at a standard 4800 baud, and yet it was still unable to decode it without manually dropping the baud rate to about 4400 baud.

For the receiver, first of all, install a real terminal program (minicom or picocom), then do your baudrate setting and transmit some bytes to see what baudrate was actually set.
Okay, here's where things get interesting. I installed picocom and connected it to the serial port where I am receiving the UART data. I set it to 4800 baud and received the same garbage data as before, BUT when I set it to 4400 baud, all of a sudden I was able to receive the data properly, EVEN THOUGH the measured baud was 4735... what the heck is up with that?
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11236
  • Country: us
    • Personal site
Can you post pictures of those zoomed in bits?

Measure the baud rate on the output of the FTDI chip and see what value you get. This will indicate the true baud rate of the receiver, since the transmitter uses the same baud rate generator.
Alex
 

Offline coppercone2

  • Super Contributor
  • ***
  • Posts: 9420
  • Country: us
  • $
Your compiler might be doing it. I had this problem; I had to write fake numbers in the compiler to get the correct baud rate, with bare code. Drove me nuts.
 

Online 2N3055

  • Super Contributor
  • ***
  • Posts: 6599
  • Country: hr
What microcontroller is it, and how are you clocking it? It had better not be the internal RC oscillator; that is too inaccurate and unstable for a UART to work reliably all the time.
On AVR, for instance, you can enable clock output to a uC pin so you can measure the actual clock frequency.
Verify and fix things one at a time.

First, check the clock.
Second, as Ataradov said, measure the interval better. Use more than 1.4 MPts on the scope to gain better accuracy, and zoom in.
Better yet, use automated measurements.

As he said, use standard baud rates. Makes life easier.
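As a concrete sketch of the clock-out idea (the programmer name below is only a placeholder, and the fuse bits should be double-checked against the datasheet or a fuse calculator before writing anything): on the ATtiny84A the CKOUT bit lives in the low fuse byte, and avrdude can read that byte back so you can see what is currently programmed:
Code: [Select]
avrdude -c usbasp -p t84 -U lfuse:r:-:h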
 

Offline JohnnyMalaria

  • Super Contributor
  • ***
  • Posts: 1154
  • Country: us
    • Enlighten Scientific LLC
Measure a broader range on the scope. I can see that the duration of single bits looks variable, so just picking the first one isn't a good idea.

e.g., there's a portion of high-low-high-low-high-low that you could use to get a much better idea of the duration of 6 bits.
 

Offline KalciferTopic starter

  • Regular Contributor
  • *
  • Posts: 82
  • Country: ca
Alright, quick update on where I'm at:

First of all, have you tried to zoom in on the bit to measure the bit duration accurately?

I came back to this to really verify the bit lengths. Before, I had only been measuring one bit length, not all the bits individually; I had been assuming that all bits were the same length as the one I was measuring. Big mistake (as assumptions almost always are!). After measuring the bits, I found some pretty substantial variance in the bit lengths: I measured a ~16% difference in length between the start and stop bits, which is not inconsequential by any stretch. So there is some strange timing behaviour behind the scenes causing this.

I made some changes based on what @2N3055 said:
What microcontroller is it, and how are you clocking it? It had better not be the internal RC oscillator; that is too inaccurate and unstable for a UART to work reliably all the time.
On AVR, for instance, you can enable clock output to a uC pin so you can measure the actual clock frequency.

I originally was using the internal clock of the microcontroller (originally ~8MHz, with a fuse bit prescaling it to ~1MHz). I changed a fuse bit to route the clock out to an external pin so I could measure it. Lo and behold, the clock signal was garbage: I was seeing some pretty substantial, definitely not insignificant, jitter from the internal clock. So I needed a better clock source. I currently do not have any 8MHz crystal oscillators (only 16MHz ones, but that would require raising the supply voltage higher than I would like), so I changed another fuse bit to allow an external clock source, which I am providing with my function generator (Siglent SDG 2042X). The clock signal is now a pretty darn accurate and stable 8MHz. I also removed the prescaler fuse bit, so the internal clock is now 8MHz instead of 1MHz.

Using this new stable clock I tried sending another character, but unfortunately the timing issue persisted.

Looking at my code, I think what might be happening is that the microcontroller is spending a variable amount of time on computation inside the UART transmit function, which shows up as varying delays in the output signal. If this is the case, I'm not entirely sure how I should go about fixing it.

I will add the code to the original question for anyone's reference.

 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11236
  • Country: us
    • Personal site
Re: Bit Timing Issues With a Software UART
« Reply #8 on: May 07, 2021, 05:23:44 am »
Wait, I just looked at your code. You are bit banging the UART with manual delays. This is not great. At the very least I would not restart the timer, but keep the timer running and use the current value for the delay. Interaction between the prescaler and the timer clock may introduce high variance in the delay. So will other interrupts.
Alex
 

Offline KalciferTopic starter

  • Regular Contributor
  • *
  • Posts: 82
  • Country: ca
Re: Bit Timing Issues With a Software UART
« Reply #9 on: May 07, 2021, 06:10:41 am »
Wait, I just looked at your code. You are bit banging the UART with manual delays. This is not great. At the very least I would not restart the timer, but keep the timer running and use the current value for the delay. Interaction between the prescaler and the timer clock may introduce high variance in the delay. So will other interrupts.

If I understood you correctly, this is what I came up with based on what you said
Code: [Select]
TCCR0A |= (1 << WGM01); // Waveform generation mode 2 (CTC)
TIMSK0 |= (1 << OCIE0A); // Enable compare interrupt

void timer_delay(void)
{
    TIFR0 |= (1 << OCF0A); // Reset the compare flag
    if (TCNT0 > (255 - timer_value))
    {
        OCR0A = timer_value - 255 - TCNT0;
    }
    else
    {
        OCR0A = TCNT0 + 8;
    }
    while (!(TIFR0 & (1 << OCF0A))); // Wait until the compare flag is set
}

and then I start the timer at the beginning of the UART transmit function and stop it at the end of the function. Unfortunately, the problem was not solved using this method. Would it be better to use an interrupt service routine to transmit the data? So, transmit a bit during each interrupt service routine?
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11236
  • Country: us
    • Personal site
Re: Bit Timing Issues With a Software UART
« Reply #10 on: May 07, 2021, 07:01:26 am »
You read the current timer value, add the necessary interval, and wait until the timer reaches the calculated value. No need to rely on overflow flags or anything like that.

Also, I'm sure there are a ton of soft UART implementations for AVR.

No matter what you do, you need to make sure that your delay routine is capable of generating consistent delays.
Alex
 

Online 2N3055

  • Super Contributor
  • ***
  • Posts: 6599
  • Country: hr
Re: Bit Timing Issues With a Software UART
« Reply #11 on: May 07, 2021, 07:12:46 am »
Like Alex said, I would find an implementation that works. Maybe one that uses the USI, if you're not using it for something else.
Or, if there is no hard need for Attiny84, use an AVR with UART...
 

Offline KalciferTopic starter

  • Regular Contributor
  • *
  • Posts: 82
  • Country: ca
Re: Bit Timing Issues With a Software UART
« Reply #12 on: May 07, 2021, 07:17:31 am »
You read the current timer value, add the necessary interval, and wait until the timer reaches the calculated value.

Okay, from that, here is what I have
Code: [Select]
void timer_delay(void)
{
    TIFR0 |= (1 << OCF0A); // Reset the compare flag
    OCR0A = TCNT0 + timer_value;
    while (!(TIFR0 & (1 << OCF0A))); // Wait until the compare flag is set
}

No need to rely on overflow flags or anything like that.

If I may ask, how come? If the timer is at, say, 250 when its max count is 255 and I want to add 40, wouldn't that overflow the timer, so the compare condition would never be met?

Also, I'm sure there are a ton of soft UART implementations for AVR.

There are a few; however, from what I have found thus far, they all rely on using parts of the USI for the UART. The ATtiny84A only has a single USI, which I am already using for something else, so that method will not work for me. So I must get a bit-banged method going.
 

Offline KalciferTopic starter

  • Regular Contributor
  • *
  • Posts: 82
  • Country: ca
Re: Bit Timing Issues With a Software UART
« Reply #13 on: May 07, 2021, 07:19:57 am »
if there is no hard need for Attiny84, use an AVR with UART...
I could; I have other microcontrollers that have a UART, but I honestly want to figure out how to get this working. I'm in no rush to get the communication working, I just want to learn as much as I can about what's going on with this.
 
The following users thanked this post: 2N3055

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11236
  • Country: us
    • Personal site
Re: Bit Timing Issues With a Software UART
« Reply #14 on: May 07, 2021, 07:23:19 am »
You obviously need to handle overflows in a case like that. But also, if you are waiting in a loop, why not just do a software loop with a fixed number of cycles? At least as a starting point.
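One way to handle that wrap, sketched below under my own assumptions (Timer0 left free-running, and a hypothetical BIT_TICKS constant holding the number of timer ticks per bit, e.g. about 208 for ~4800 baud with a 1 MHz timer clock), is to measure elapsed ticks with 8-bit unsigned subtraction, which stays correct across the 255-to-0 rollover:
Code: [Select]
#include <avr/io.h>
#include <stdint.h>

#define BIT_TICKS 208  // hypothetical: timer ticks per UART bit (must be < 256 here)

void timer_delay(void)
{
    uint8_t start = TCNT0;                         // capture the free-running count
    while ((uint8_t)(TCNT0 - start) < BIT_TICKS);  // unsigned wrap makes this safe
}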

But generally you need to figure out what is wrong with the delay generation function. I don't really know; I abandoned AVRs years ago and my life has been better for it.

Here is my soft UART (RX only) implementation in assembly https://github.com/ataradov/logic/blob/master/firmware/soft_uart.S . Don't know if that helps at all though.
Alex
 

Offline KalciferTopic starter

  • Regular Contributor
  • *
  • Posts: 82
  • Country: ca
Re: Bit Timing Issues With a Software Bit-Banged UART
« Reply #15 on: May 07, 2021, 08:18:50 am »
But also, if you are waiting in a loop, why not just do a software loop with a fixed number of cycles? At least as a starting point.

Alright, I used a simple for loop to generate the delay in place of the timer
Code: [Select]
for (uint8_t i = 0; i < 100; i++);
and it works perfectly. There is no perceptible variance in the bit lengths. Decoding works perfectly as well. Now this raises the question: why don't the timers work for this? Is it an ill use of a timer to simply generate a delay when I could just use a software loop?
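One caveat I should flag (an assumption about build settings, not something verified here): with optimization enabled, avr-gcc may delete an empty loop whose counter is a plain local variable, so the delay can silently disappear. Declaring the counter volatile, or using the cycle-counted helpers from AVR-libc, avoids that:
Code: [Select]
#include <util/delay_basic.h>

for (volatile uint8_t i = 0; i < 100; i++);  // volatile keeps the loop under -O1/-Os
_delay_loop_2(417);  // alternative: 4 cycles per iteration, ~1667 cycles,
                     // roughly one bit at 8 MHz / ~4800 baud (illustrative only)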
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11236
  • Country: us
    • Personal site
Re: Bit Timing Issues With a Software Bit-Banged UART
« Reply #16 on: May 07, 2021, 08:27:28 am »
Timers should work well for that. There is something you are doing wrong there. And I can't help you here, I have not used AVRs in years.
Alex
 

Offline MIS42N

  • Frequent Contributor
  • **
  • Posts: 511
  • Country: au
Re: Bit Timing Issues With a Software UART
« Reply #17 on: May 07, 2021, 11:17:45 am »
Unfortunately, the problem was not solved using this method. Would it be better to use an interrupt service routine to transmit the data? So, transmit a bit during each interrupt service routine?
I have written a few software UARTs. They all use timers and interrupts and are written in assembler. I had a go at writing some AVR code a while back; IIRC you can mix assembler and C, and define an interrupt routine such that it doesn't have all the state saving and restoring overhead (NAKED?) and just saves and restores the registers etc. that you use. To get it working, set up a timer to run continuously and reset every bit time (use the timer compare mode), but don't enable the compare interrupt, so the interrupt routine is not invoked. When there is a byte to transmit, the C code tests a busy flag (initially clear) and loops until it is clear. When it is clear, set it, place the byte where the assembler can get to it, clear the timer's output compare flag (which could create an immediate interrupt if not cleared), and enable timer interrupts. On the first entry to the interrupt routine, set the output to the start bit value. On subsequent interrupts, shift the byte and set the output as required. When all the bits are shifted, the next interrupt sets the stop bit, clears the busy flag, and disables the interrupt. Meanwhile the C code may have created another byte and is spinning; as soon as the busy flag is clear it will put the next byte in place and the process repeats.

This may work with an interrupt routine coded in C, I have not tried it. If you are working with a low baud rate like 1200, it is worth a try. An assembler routine should work quite happily at 9600 or higher.
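Here is a rough C sketch of the scheme I just described (illustrative only, not tested; it assumes Timer0 has already been set up in CTC mode with OCR0A equal to one bit time, global interrupts are enabled, and TX is on PORTB1 as in the code posted earlier):
Code: [Select]
#include <avr/io.h>
#include <avr/interrupt.h>
#include <stdint.h>

static volatile uint16_t tx_frame;  // start bit (0), 8 data bits LSB first, stop bit (1)
static volatile uint8_t  tx_bits;   // bits remaining to send; 0 means idle

void soft_uart_putc(uint8_t c)
{
    while (tx_bits);                                      // busy flag: wait for previous byte
    tx_frame = (uint16_t)(1u << 9) | ((uint16_t)c << 1);  // stop | data | start
    tx_bits  = 10;
    TIFR0   |= (1 << OCF0A);                              // clear any stale compare flag
    TIMSK0  |= (1 << OCIE0A);                             // enable the compare interrupt
}

ISR(TIM0_COMPA_vect)                 // Timer0 compare A vector name on the ATtiny84A
{
    if (tx_frame & 1)
        PORTB |= (1 << PORTB1);      // output the next bit of the frame
    else
        PORTB &= ~(1 << PORTB1);
    tx_frame >>= 1;
    if (--tx_bits == 0)
        TIMSK0 &= ~(1 << OCIE0A);    // stop bit sent: disable the interrupt again
}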

I didn't say how to receive, it is a bit trickier.

Don't be spooked by people who say various clocks are not good enough. Serial transmission should be tolerant of clock mismatch, 2 or 3% should be fine.

 

Online 2N3055

  • Super Contributor
  • ***
  • Posts: 6599
  • Country: hr
Re: Bit Timing Issues With a Software UART
« Reply #18 on: May 07, 2021, 11:31:14 am »
Unfortunately, the problem was not solved using this method. Would it be better to use an interrupt service routine to transmit the data? So, transmit a bit during each interrupt service routine?
I have written a few software UARTs. They all use timers and interrupts and are written in assembler. I had a go at writing some AVR code a while back; IIRC you can mix assembler and C, and define an interrupt routine such that it doesn't have all the state saving and restoring overhead (NAKED?) and just saves and restores the registers etc. that you use. To get it working, set up a timer to run continuously and reset every bit time (use the timer compare mode), but don't enable the compare interrupt, so the interrupt routine is not invoked. When there is a byte to transmit, the C code tests a busy flag (initially clear) and loops until it is clear. When it is clear, set it, place the byte where the assembler can get to it, clear the timer's output compare flag (which could create an immediate interrupt if not cleared), and enable timer interrupts. On the first entry to the interrupt routine, set the output to the start bit value. On subsequent interrupts, shift the byte and set the output as required. When all the bits are shifted, the next interrupt sets the stop bit, clears the busy flag, and disables the interrupt. Meanwhile the C code may have created another byte and is spinning; as soon as the busy flag is clear it will put the next byte in place and the process repeats.

This may work with an interrupt routine coded in C, I have not tried it. If you are working with a low baud rate like 1200, it is worth a try. An assembler routine should work quite happily at 9600 or higher.

I didn't say how to receive, it is a bit trickier.

Don't be spooked by people who say various clocks are not good enough. Serial transmission should be tolerant of clock mismatch, 2 or 3% should be fine.

Well, check the datasheet then.

Accuracy at a given voltage & temperature: ±10% at 25°C for the factory calibration.

±1% for user calibration at the voltage and temperature as calibrated... Tempco is pretty much 1% per 1°C... if the temperature of the device is more than 5°C different from the calibration temperature, you're out of spec...
It can jump around as it pleases. And you make it work on the desk, install it in a box, and then it will fail intermittently.

The internal RC on AVRs is good for blinking lights and non-time-critical stuff. It works as an SPI or I2C master because it clocks the devices. But for an asynchronous UART, it works just well enough to get you into trouble.
To be avoided...
 

Offline MIS42N

  • Frequent Contributor
  • **
  • Posts: 511
  • Country: au
Re: Bit Timing Issues With a Software Bit-Banged UART
« Reply #19 on: May 07, 2021, 09:59:18 pm »
I checked the datasheet. It says less than 2% variation from 0 to 40°C. ATtiny24A-44A-84A-DataSheet-DS40002269A.pdf, p. 244.
 

Online 2N3055

  • Super Contributor
  • ***
  • Posts: 6599
  • Country: hr
Re: Bit Timing Issues With a Software Bit-Banged UART
« Reply #20 on: May 07, 2021, 10:36:28 pm »
I checked the datasheet. It says less than 2% variation from 0 to 40°C. ATtiny24A-44A-84A-DataSheet-DS40002269A.pdf, p. 244.

Sorry, I misread the graph, by a factor of 8 apparently.
But the fact is that an uncalibrated one is a gamble and might be way off, and if you have a 20-30°C difference from the calibration temperature it will drift right to the edge of the spec and be marginal. I have experienced this myself; that's why I said it. It is not unusual for electronics to reach 40-65°C inside a closed case. Put the box in the sun and it'll go crazy.
Also, if you're making a few of them (not that the OP needs that), it is much easier to put in a crystal and get both good initial frequency accuracy and stability, instead of having to calibrate each piece.
 

Offline Renate

  • Super Contributor
  • ***
  • Posts: 1460
  • Country: us
Re: Bit Timing Issues With a Software Bit-Banged UART
« Reply #21 on: May 09, 2021, 11:27:07 am »
Don't measure a single UART bit, just measure a whole start/char/stop.
It's more accurate.

If you're just sending on an ATTiny you can implement it using SPI.
A UART char is 10 bits (with start and stop), you just have to divide a sequence into 8 bit SPI bytes.

I have a pushbutton box with an ATTiny85.
It spits out UART char sequences.
For preset sequences it's a piece of cake to just put the SPI data in as constants.
For dynamic sequences, yes, it is a bit more pain to slice/dice up your UART data into SPI data.
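As a small illustration of that slicing (my own sketch, not the code from the pushbutton box): one 8N1 character expands to a 10-bit pattern, and how those bits are then packed into 8-bit USI/SPI transfers depends on the shift direction and idle level of the hardware used.
Code: [Select]
#include <stdint.h>

// Build the 10-bit frame for one character: start bit (0) in bit 0,
// data bits LSB first in bits 1-8, stop bit (1) in bit 9.
static uint16_t uart_frame(uint8_t c)
{
    return (uint16_t)(1u << 9) | ((uint16_t)c << 1);
}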
 

Offline MIS42N

  • Frequent Contributor
  • **
  • Posts: 511
  • Country: au
Re: Bit Timing Issues With a Software Bit-Banged UART
« Reply #22 on: May 09, 2021, 12:51:39 pm »
Don't measure a single UART bit, just measure a whole start/char/stop.
It's more accurate.
What does this mean?
If you're just sending on an ATTiny you can implement it using SPI.
A UART char is 10 bits (with start and stop), you just have to divide a sequence into 8 bit SPI bytes.
It can be done using the USI; the documentation is at http://ww1.microchip.com/downloads/en/Appnotes/doc4300.pdf. I don't see an advantage, though: it still uses up a counter. Using the USI will reduce bit jitter, which can be an issue at fast baud rates but is insignificant at 1200 baud. Otherwise it looks more complex to me: it needs two interrupt routines rather than one and doesn't take advantage of the counter compare mode.
 

Offline Renate

  • Super Contributor
  • ***
  • Posts: 1460
  • Country: us
Re: Bit Timing Issues With a Software Bit-Banged UART
« Reply #23 on: May 09, 2021, 02:17:26 pm »
What does this mean?
Measure ten bit intervals, not one. Divide result by ten.
It can be done using the USI, the documentation is...
That's basically saying what I'm saying, except that they use two USI transfers per UART byte.
I'm using the USI as 8 bits so I'm doing 1.25 USI transfers per UART byte.
I'm only using one interrupt, to load the USI every 0.8 UART chars.
I'm feeding one of the timers directly into the USI clock.
 

Offline MIS42N

  • Frequent Contributor
  • **
  • Posts: 511
  • Country: au
Re: Bit Timing Issues With a Software Bit-Banged UART
« Reply #24 on: May 09, 2021, 09:14:10 pm »
What does this mean?
Measure ten bit intervals, not one. Divide result by ten.
Maybe I'm being dumb, but that still doesn't make sense to me. Transmission rate is in baud, which for a UART means bits per second. 1200 baud is one bit every 833.33 µs; that's what you set the timer to generate. If you are trying to work out an unknown input, you time the interval between two transitions and test bit times of time/1, time/2, ... time/9, because one and nine bits are the shortest and longest possible intervals between transitions. Ten doesn't come into the equation. The stop bit is at least one bit long but can be any length, so it can't be relied on.
 

