Hello everyone,
my name is Andrea Zambon, I'm from Italy and I've been curious about electronics for a long time. I've never really built anything even modestly complex until now, as I don't have any formal training in electronics (apart from a digital electronics course at university, but nothing at all about analog stuff). So I'm here to ask for some constructive criticism about my project.
Constructive criticism about my English is welcome too.
So, this comes from another hobby of mine, RC model airplanes. As I quite like glow plug engines, I decided to build my own on-board glow plug driver: a circuit whose purpose is to heat up the engine's glow plug(s) when the engine is at idle.
For those who don't know much about glow plug engines: these engines do not use a regular spark plug, but instead have a glow plug that must keep glowing red-hot as long as the engine is running. Starting the engine is accomplished by connecting an external voltage source (either DC or PWM) to this plug to heat it, then the engine is turned over and (sometimes) it starts. Once running, the heat inside the engine keeps the plug glowing, so the external voltage can be removed. However, if the engine is capable of idling very slowly and/or if it is a 4-stroke engine (which fires only once every 2 complete revolutions of the crankshaft), the heat from the combustion events may not be enough to keep the glow plug hot, and the engine can stop.
From an electrical point of view, a glow plug is basically a resistor (with a fairly high positive temperature coefficient, as far as I can see). If you apply about 1.2V to it, it starts glowing orange, and that temperature, combined with the pressure inside the engine, allows the fuel mix to ignite. In this situation a glow plug draws about 3A. If it is soaked in fuel and oil, the same glow plug can draw even more than 4.5A, and that's with the burden voltage of the meter in series with it.
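Just to put some numbers on those measurements (the figures are the ones I quoted above; real plugs vary, and the resistance rises as the plug heats up):

```python
# Rough glow-plug-as-resistor numbers from my measurements above.
v_plug = 1.2   # applied voltage (V)
i_hot = 3.0    # current when glowing (A)
i_wet = 4.5    # current when soaked in fuel/oil (A)

r_hot = v_plug / i_hot   # ~0.40 ohm when glowing
p_hot = v_plug * i_hot   # ~3.6 W dissipated in the plug
r_wet = v_plug / i_wet   # ~0.27 ohm when cold and wet

print(f"hot: R = {r_hot:.2f} ohm, P = {p_hot:.1f} W")
print(f"wet: R = {r_wet:.2f} ohm")
```

That sub-half-ohm resistance is why everything in the plug's current path (wiring, connectors, the MOSFET) needs to be low-resistance too.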
So, back to the project: I wanted to build a circuit that checks the position of the engine throttle (by checking the PWM signal that the radio receiver sends to the throttle servo) and, if it is under (or over, depending on how the servo rotates) a given limit, turns on glow plug heating.
I already have a small circuit I built a few years ago that does this ( http://www.ef-uk.net/data/rc-switch.htm ), but it has some drawbacks:
1) it seems to be strongly temperature-sensitive, so I often have to adjust its trigger point;
2) it was built on veroboard, so it's larger than it needs to be;
3) Partly due to the previous point, its connectors are placed in inconvenient positions.
Besides this desire to improve the old circuit, I currently have a small bargraph voltmeter mounted in my plane, just to keep an eye on the receiver battery voltage. Having a single circuit doing both of these tasks would be preferable, so I'd also like to incorporate the voltmeter functionality.
So I started with the voltmeter. I wanted to replicate the behavior of the voltmeter currently installed in the airplane, that is, the crossover point between the two lowest LEDs must be at 4.65V and the crossover point between the two top LEDs must be at 5.25V (the receiver battery is a 4-cell NiMH pack). Also, if the voltage is outside this range, the bottom or top LED of the bar should remain lit.
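As a sanity check on the scale I'm after (this assumes all 10 LM3914 outputs drive the bargraph and that the scale is linear, which matches how the chip's internal divider works):

```python
# Target thresholds for the bargraph: crossover between LED1/LED2 at
# 4.65 V and between LED9/LED10 at 5.25 V, 10 LEDs total.
v_low, v_high = 4.65, 5.25
steps = 8  # number of crossover intervals between LED1/2 and LED9/10

step = (v_high - v_low) / steps  # voltage per LED
crossovers = [round(v_low + k * step, 3) for k in range(steps + 1)]

print(f"step = {step * 1000:.0f} mV per LED")
print(crossovers)
```

So each LED covers 75mV of battery voltage, which gives a feel for how much accuracy and stability the front end actually needs.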
I followed the design procedure illustrated in EEVblog #204 to use an LM3914 (changing the values according to my needs) and sure enough... it didn't work.
Turns out I had missed the point in the datasheet where it says that all inputs must be at least 1.5V below the positive supply voltage. Dave didn't have this problem with his design, but I stumbled right into it.
Ok, lesson learned. I changed the input divider ratio to try to avoid that problem and it started working, but then another problem popped up: the LM3914 didn't behave as expected. There was a large overlap between the segments with the chip set for dot mode, much more than the 1mV value reported in the datasheet. Even accounting for the attenuation of the input resistor divider, the overlap was still much more than 1mV.
This one had me banging my head for some time. I thought I had some kind of oscillation problem, so I tried decoupling everything that could even remotely benefit from decoupling... until I noticed that the datasheet says dot mode is only usable when the voltage across the whole divider is at least 500mV, and I was quite a bit below that figure.
The head banging turned into head scratching about how I could increase the voltage across the divider. Then it occurred to me that I could drop the input voltage by a fixed amount rather than divide it, which would keep the voltage across the divider quite a bit higher. A zener diode seemed like an easy way to drop the input voltage by a fixed amount.
But more head banging followed: my 3.3V zeners seemed to drop much less than that. I tried other zeners I had bought and all of them dropped significantly less than their advertised value. A look at their datasheet revealed that their zener voltage is specified at 20mA, while I was trying to run them at about 1mA...
No way I'm going to piss away 20mA just for a zener, so back to the drawing board. By chance I discovered the TL431 voltage reference and, after reading its datasheet, decided to give it a try. Basically I used a TL1431A to shift the input voltage down by 2.5V, with the load provided by an LM334 set for 1mA. The shifted voltage then goes through an RC filter to smooth it slightly and on into the LM3914.
Now we're talking! Crisp segment transitions and all. But it's not "winner winner chicken dinner" time yet (did I spell that correctly?). I wanted a bargraph display that keeps the bottom or top LED on when the voltage is outside the measurement range. The LM3914 already does that for the highest LED, but not for the lowest.
More head scratching, a couple of PNP transistors and a few resistors produced a working solution.
Satisfied with the voltmeter, I moved on to the glow plug driver. The idea here is to generate a fixed-length pulse whenever a pulse from the receiver is detected and then check which of the two pulses ends sooner. I didn't like the idea of using a 4013 as a monostable as in the old circuit; I wanted a proper timer chip, so it's 555 time.
I decided to use a TLC555 (the CMOS version, as I didn't need the high-power output of the bipolar one) and configured it as a monostable. But then I had to find a way to trigger it.
The 555 is low-level triggered, while I basically needed a positive-edge-triggered timer for this circuit. I solved this with a CMOS inverter (a totem pole made of a VP3203 and a BS170) followed by a differentiator, whose output goes to the 555's trigger input.
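For the curious, here's roughly how long the differentiated edge holds the trigger low. The R and C values below are illustrative, not the ones in my schematic: after the edge, the trigger pin recovers toward Vcc through the pull-up, and the 555 sees an active trigger while the pin is below Vcc/3.

```python
import math

# Illustrative differentiator values (not my actual schematic values).
R = 10e3  # pull-up from the trigger pin to Vcc (ohm)
C = 1e-9  # series coupling cap (F)

# The pin jumps down on the edge, then recovers as
#   v(t) = Vcc * (1 - exp(-t / (R*C)))
# and the trigger is active while v(t) < Vcc/3, i.e. until
#   t = R*C * ln(3/2)
t_trig = R * C * math.log(3 / 2)
print(f"trigger pulse ~ {t_trig * 1e6:.1f} us")
```

With these values the trigger spike lasts about 4µs, comfortably shorter than the roughly millisecond-long monostable output, which is what a 555 monostable needs to retrigger cleanly.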
Then I had to find a way to sample the two square waves (the one coming from the receiver and the one from the 555) to see which one falls sooner. I went back to the 4013 for this: the output of the inverter mentioned earlier clocks the 4013, and the output of the 555 serves as the data input. This way, when the receiver pulse goes back low, the output of the inverter goes high, which clocks the 4013 and samples the output of the 555.
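The decision logic above boils down to a one-line comparison. A minimal model (the reference pulse length here is an assumed value, set by the 555's RC in the real circuit):

```python
# Model of the pulse-comparison scheme: the 555 fires a fixed-length
# pulse on the receiver pulse's rising edge, and the 4013 samples the
# 555 output on the receiver pulse's falling edge.
T_REF_US = 1300  # assumed 555 monostable length (us); the trip point

def glow_on(throttle_pulse_us: float) -> bool:
    # At the receiver pulse's falling edge the 555 output is still
    # high iff its pulse (started on the same rising edge) has not yet
    # ended, i.e. the throttle pulse is the shorter of the two.
    return throttle_pulse_us < T_REF_US

print(glow_on(1000))  # idle, ~1.0 ms pulse -> True, heating on
print(glow_on(1800))  # full throttle, ~1.8 ms -> False, heating off
```

The external on-off-on switch picking Q or -Q simply inverts this decision, which covers servos that rotate the other way.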
Ok, so I breadboarded this, connected it to the receiver mounted in one of my airplanes and... it didn't work.
Or actually, sometimes it worked and sometimes it didn't, which is probably even worse.
Turns out I had assumed that the signal coming out of the receiver would swing up to the battery voltage, but that's not the case. It looks like this signal is capped at 3.3V, even when the battery is at over 5V. My little totem pole inverter couldn't do much with that...
So I replaced the totem pole inverter with a single BS170 inverter/level shifter to bring the signal up to 5V, and that seemed to fix the problem. I had to use separate inverters for the 555 trigger and for the 4013 clock, because the differentiator slows down the rise time of the inverter output, but that's not a big deal.
I only tested this part of the circuit up to the 4013 outputs, as after that there's not much going on. The IRF3711 is the same MOSFET type I'm currently using in my veroboard-built circuit, and it seems to work fine. The circuit only provides an open-drain output to control the power being sent to the glow plug(s); the power itself comes from an external D-size NiMH cell.
Finally, I'm attaching the schematics I came up with (vm is the voltmeter and gd is the glow plug driver). Some elements are missing a value, so here they are:
D1 is a 1N5819 Schottky diode
Q2 and Q3 are MPS2907
All caps are ceramic, except for the polarized ones (which are aluminium electrolytics) and C8, which is a polypropylene cap (for temperature stability)
All resistors are metal film resistors (1% tolerance)
The potentiometer is a single turn trimmer
The "on off on" switch next to the 4013 chip is mounted externally to the circuit and connected by wires. It connects either the Q or the -Q output of the 4013 to the rest of the circuit; the center position disables the circuit.
And here is the question: can anyone see anything that could use some improvement?