
why are new problems solved with the technology of 30 years ago?


Warning, this is a mini rant :-)

So I was reading the question-and-answer section of a recent electronics
magazine.  The question was about building a timer that would provide
a 2-hour pulse.  The questioner thought (correctly) that this would be
hard to do with a 555 timer alone: the timing capacitor and/or resistor
would have to be huge, with nasty problems of reliability and accuracy.

The proposed answer was to make the timing period much shorter,
add a 4000-series CMOS 12-bit counter, some logic gates to decode
the terminal count, etc.

OK, this makes sense and will work.  But this is a solution that would
have made sense in the 1970s!

Today, I'd say you should use an 8-pin PIC or AVR micro.  It can monitor
one of its ports for the trigger, set another port active for 2 hours, count
down the timeout, then reset its output.  Simple.  It takes up no more
space than the original 555.  It sounds like a perfect application for a
PIC10F200.  If you pick the 6-pin SMD package, it takes up no more
space than a through-hole resistor.  Just don't drop it in the carpet :-)

Now I'm not one of those CPU nuts who thinks everything can be replaced
with software.  There's definitely a time and a place for real hardware or
FPGAs, especially in timing critical or safety critical applications.  But
a 2 hour timer is clearly within the capabilities of a simple micro.

So then the question is: why was this not the proposed solution?  Let's look
at a few possibilities.

1)  High barrier to entry

Well, you do have to buy or build a programmer, but the chips themselves
are cheap.  And once you have a programmer, it opens up lots of
other applications.

2)  Reader not expected to be able to do simple firmware development

I guess this depends on what you consider the audience for your magazine.
I find it hard to believe that anyone getting into electronics these days
wouldn't get into at least simple programming of micros.   Am I wrong about
this?  Is there still such a big divide between hardware and software?
How can you be into electronics these days and ignore the rise of the
microcontroller?

3)  Timer needs to work on > 5V (or 3.3V)

This isn't stated in the question or answer, but you potentially have more
voltage choices if you stick with the 555 and 4000-series CMOS logic.  Some
micros today only run at 3.3 V or less.

4)  Answerer is from pre-micro days

Depending on how old the person who wrote the answer is, he might
be from the generation of jellybean logic, and those solutions might come
to him more naturally.  OK, but should you then be writing for a modern magazine?

OK, mini-rant done for this morning, thanks for listening!


I think you are right, the solution offered is not the most modern.
But: some people have difficulty programming micros.  Not only do they not have the equipment, they also lack the knowledge of how to write the actual code.
I myself would probably go the same way: just cobble something together from the gazillion 74xx and 40xx parts that are lying around.  I'm one of the people who have trouble writing code; I do have a programmer for most PICs and AVRs, but I still suck at software.  So seen from that point of view, it is easier and quicker to wire up a counter in hardware.

Another but: I'm a hobbyist.  I don't do this professionally.  I imagine someone who does electronics for a living and has to make money (money = time) out of it would go the other way.  That's the professional who's absolutely familiar with his micros and programming languages.

I don't know, but I imagine the answer given in the magazine was aimed at a hobbyist. A professional or somewhat advanced hobbyist wouldn't have to ask in the first place.

In summary I think the answer given was OK. It doesn't hurt if you understand your basic logic gates and stuff.


An issue with microcontrollers is that they have more failure modes: you might have a software bug, or a silicon bug involving the timers.  It's much less likely that a simple CMOS divider has any silicon bugs.  The power usage could also be lower for discrete logic (although I doubt that the old 4000 series is).  In a commercial setting, writing the software, programming the micro in production, and testing all cost money.  But I agree that using a bunch of logic ICs is old-fashioned; in my opinion, anything requiring more than a few gates should be replaced by something like a micro or some sort of programmable logic.

The person writing the answer probably wanted to stay in the same area as the original question. Going from a 555 to CMOS logic is a much smaller step than from a 555 to some tiny micro.

I guess then I would have liked to see a statement to the
effect that, as your expertise grows, you should consider redoing
the timer with just a tiny micro.

Maybe I'm expecting too much for a range of ideas to be presented.
But how are people going to grow if they aren't exposed to ideas
beyond their current technical level?


I would've chosen the CD4060 because it needs no programming and both the oscillator and timer are on the same IC.

I'm pretty sure that old CMOS logic uses as much power as, or less than, an MCU, which is just built from a newer form of CMOS.

I need to get into PICs.  I built a programmer a couple of years ago, but I've not managed to get it working for various reasons.  I'm tempted to buy a USB programmer, but I think I'll give it one more go.

