Author Topic: Using a temperature sensor to light an LED - is this the right direction?  (Read 10959 times)


Online vk6zgo

  • Super Contributor
  • ***
  • Posts: 7589
  • Country: au
As drawn, I think the schematic needs some work. The LED is connected to the emitter of the NPN, but in parallel it is shorted to ground, so you won't really have any current flowing through the LED, ever.

 

Even if he sorts out the LED circuit, there is no voltage on T1's collector, as it is connected to the wrong end of R2.

Reading the answers on this thread reminds me of the story about a Technician, an Engineer, & an Accountant.

They all had a cat & a dog, & wanted to allow the animals to go in & out of the house when the door was closed.

The Tech bought & fitted a "doggy door" which was light enough for the cat to also use.
Both animals are happy.

The Engineer couldn't stand the idea of a big door being wasted on the cat, so he cut two doors, large & small, with interlocks to prevent the cat using the dog's door, or the dog trying to use the cat's door & getting stuck (he is up to version 3.2 now).
Both animals are confused.

The Accountant cut out a cat-sized door & tried to push the dog through it! ;D
 

Offline casper.bang

  • Frequent Contributor
  • **
  • Posts: 311
  • Country: dk
  • Pro SE, amateur EE.
    • BangBits
Reading the answers on this thread reminds me of the story about a Technician, an Engineer, & an Accountant.

They all had a cat & a dog, & wanted to allow the animals to go in & out of the house when the door was closed.

The Tech bought & fitted a "doggy door" which was light enough for the cat to also use.
Both animals are happy.

The Engineer couldn't stand the idea of a big door being wasted on the cat, so he cut two doors, large & small, with interlocks to prevent the cat using the dog's door, or the dog trying to use the cat's door & getting stuck (he is up to version 3.2 now).
Both animals are confused.

The Accountant cut out a cat-sized door & tried to push the dog through it! ;D

Point taken. Can we agree, though, that a temperature alarm is not worth much if it runs out of battery after 14 days and fails silently?!

6 months ago I built a mailbox notifier using various combinations of low-power CMOS flip-flops and 555 timers, but it would run out of power far too quickly and lie dormant until I realized the mailbox was full and the batteries needed changing. I then rebuilt it with a cheap MCU, and now it should run on a battery for a couple of years, with fewer components and more features. The extra features are kind of a free lunch; it would be silly not to take advantage of them.
 

Offline madires

  • Super Contributor
  • ***
  • Posts: 7765
  • Country: de
  • A qualified hobbyist ;)
Very nicely put! :-) Do CS students learn how a CPU works in detail nowadays?

Not in my experience. They all learn Java and have no idea about things like memory management, let alone what the CPU is actually doing.
CS students doing programming should all have to do a module in assembly language, IMO.

It's worse than I thought. More bloatware ahead :-( I second your suggestion to have them do assembler for at least one CPU/MCU. Otherwise they'll learn how to write code against the CPU rather than for it. Unfortunately, throwing more GBytes and GHz at the problem is inexpensive these days.
 

Offline madires

  • Super Contributor
  • ***
  • Posts: 7765
  • Country: de
  • A qualified hobbyist ;)
The Accountant cut out a cat sized door & tried to push the dog through it! ;D

That would be the hungry-for-power circuit powered by a battery  ;D
 

Offline cthree

  • Frequent Contributor
  • **
  • Posts: 258
  • Country: ca
As drawn, I think the schematic needs some work. The LED is connected to the emitter of the NPN, but in parallel it is shorted to ground, so you won't really have any current flowing through the LED, ever.

 

Even if he sorts out the LED circuit, there is no voltage on T1's collector, as it is connected to the wrong end of R2.

Reading the answers on this thread reminds me of the story about a Technician, an Engineer, & an Accountant.

They all had a cat & a dog, & wanted to allow the animals to go in & out of the house when the door was closed.

The Tech bought & fitted a "doggy door" which was light enough for the cat to also use.
Both animals are happy.

The Engineer couldn't stand the idea of a big door being wasted on the cat, so he cut two doors, large & small, with interlocks to prevent the cat using the dog's door, or the dog trying to use the cat's door & getting stuck (he is up to version 3.2 now).
Both animals are confused.

The Accountant cut out a cat-sized door & tried to push the dog through it! ;D

Love it!  :-DD
 

Offline cthree

  • Frequent Contributor
  • **
  • Posts: 258
  • Country: ca
Very nicely put! :-) Do CS students learn how a CPU works in detail nowadays?

Not in my experience. They all learn Java and have no idea about things like memory management, let alone what the CPU is actually doing.
CS students doing programming should all have to do a module in assembly language, IMO.

It's worse than I thought. More bloatware ahead :-( I second your suggestion to have them do assembler for at least one CPU/MCU. Otherwise they'll learn how to write code against the CPU rather than for it. Unfortunately, throwing more GBytes and GHz at the problem is inexpensive these days.

IMHO if you can't write C code then you can't program. Everything else is just compiler semantics.
 

Offline casper.bang

  • Frequent Contributor
  • **
  • Posts: 311
  • Country: dk
  • Pro SE, amateur EE.
    • BangBits
IMHO if you can't write C code then you can't program. Everything else is just compiler semantics.

That's a black-and-white, sweeping statement of the kind that's very common among engineers, who tend to think in absolutes. The truth is far more nuanced, though, since it depends on what you need to do; don't tear down your garage with a box-cutter, and don't open your packages with a circular saw. Pick the right tool for the job!

Some of the worst code I've read over the years was written by embedded engineers who are so stuck in their old bit-twiddling C world that they lack all the other elements of what I would consider professional software construction (i.e. no logging, no unit tests, homemade data structures used rather than tried-and-tested library versions, an inability to make the code easy to read, silly optimizations which the compiler can do better, etc.). When we write software today, we're writing as much for our colleagues as we are for the compiler, and C is just not very good at expressing intent (i.e. weak type safety, const can be cast away, etc.).

I understand what you mean: C, as a (very lightweight) abstraction, fits well in the embedded and systems world, with strong ties to fundamental CS topics such as Turing machines. However, as program complexity and size increase, you quickly run out of abstractions. Sure, you can roll your own VTABLE to achieve polymorphism and write a mark-and-sweep collector to get garbage collection in parts of your system, but that makes you more of a fool than anything else; people smarter than you already came up with these abstractions in a cohesive language.

The latest iteration of polyglot language hybrids like Scala, Go, Clojure and even C# attacks the concurrency behemoth head-on, something that will always remain a nightmare in C. At the end of the day, you want to write software that solves your problem rather than spend your time hunting down memory leaks and deadlocks.
« Last Edit: June 30, 2013, 07:24:04 am by casper.bang »
 

Offline cthree

  • Frequent Contributor
  • **
  • Posts: 258
  • Country: ca
I'm aware of the nature of the assertion, but I didn't say you should always use C, or that abstraction to a higher level isn't useful or even necessary. What I'm saying is that if all you know is PHP or Java then you aren't a complete programmer, just as someone who can use a nail gun but not drive a nail with a hammer isn't a real carpenter.

The Java VM is written in C. The operating system which runs the Java VM is written in C. C is where the rubber meets the road, and a pro must know it. C is the software equivalent of Ohm's law. If CS graduates come out of school not knowing it backwards and forwards, then they are not graduating ready to be professional programmers. That is a huge problem.

I've been a professional programmer and software engineer for over 25 years. I became a journeyman when I learned to program in C. I became a master when I mastered it. I hardly ever use it now; I do all my real work in Objective-C, Ruby and Python, but without C I couldn't call myself a pro.
 

Offline WhiteowlTopic starter

  • Newbie
  • Posts: 4
I guess I'm attracted to the no-MCU route because I'm fairly confident about my programming abilities, but I don't know much about electronics, being a software programmer.

Just for the record (having a similar background as you); writing low-level C/assembly for a 1K device, interpreting datasheets, setting up interrupts etc. is a very different programming model than writing high-level Java/C# for 32-core, 64GB servers revolving around MVC pattern, ORM framework, CAP theorem, SaaS etc.

Originally I thought I'd find my way back into electronics using mostly discrete parts and TTL/CMOS logic gates, but quickly realized the amount of freedom and possibilities a low-cost MCU provides in this day and age - especially to those not scared off by a compiler tool-chain (software devs).

Anyway, have fun, and let's hear what you come up with. :)

Well, I'm no spring chicken, and I've been programming since I was a kid, so I've done lots of assembly, C and Pascal along the way. I guess the way I see it is that if I were to use, say, an AVR chip (I only have an Arduino Uno, but as I understand it, the programming is the same across the line), then it's simply reading an analog input pin, then deciding about writing an output pin, so not much of a challenge, and not that much electronics apart from a resistor on the LED. In that scenario, the real challenge would be to figure out how to program the ATtiny or whatever MCU, set it up with its power supply, and basically replicate the parts of the Arduino that I need. Compared to the non-MCU solution, it just seems less satisfying. Anyway, once I get the components and build something, I'll post it here so I can get some feedback on the solution.
 

Offline casper.bang

  • Frequent Contributor
  • **
  • Posts: 311
  • Country: dk
  • Pro SE, amateur EE.
    • BangBits
...it's simply reading an analog input pin, then deciding about writing an output pin, so not much of a challenge, and not that much electronics apart from a resistor on the LED.

Oh well, I see it differently - just another skill-set. Btw. why the resistor? It just transforms useful electric power into rather useless heat.
 

Offline madires

  • Super Contributor
  • ***
  • Posts: 7765
  • Country: de
  • A qualified hobbyist ;)
Well, I'm no spring chicken, and I've been programming since I was a kid, so I've done lots of assembly, C and Pascal along the way. I guess the way I see it is that if I were to use, say, an AVR chip (I only have an Arduino Uno, but as I understand it, the programming is the same across the line), then it's simply reading an analog input pin, then deciding about writing an output pin, so not much of a challenge, and not that much electronics apart from a resistor on the LED. In that scenario, the real challenge would be to figure out how to program the ATtiny or whatever MCU, set it up with its power supply, and basically replicate the parts of the Arduino that I need. Compared to the non-MCU solution, it just seems less satisfying. Anyway, once I get the components and build something, I'll post it here so I can get some feedback on the solution.

The Arduino hides a lot of low-level stuff inside libs already. If you start with a bare AVR, you'll need to set up a few registers and deal with interrupts or register flags to read a value from the ADC.
 

Offline lgbeno

  • Frequent Contributor
  • **
  • Posts: 349
  • Country: 00
Using a temperature sensor to light an LED - is this the right direction?
« Reply #36 on: July 01, 2013, 01:42:45 pm »
This discussion took an interesting turn :) ... 

Kudos to whiteowl for trying the non-MCU route first, because it is very much the same argument in circuits. MCUs are great devices and very often the best solution. However, they can do an injustice to learning circuit fundamentals, just as Java or C# can to C.

I am only a programmer as a means to an end. I have, however, worked with a lot of career programmers on embedded projects, some of whom have also written huge PC applications. I find it irritating when they dwell so long on how to abstract every single thing into a neat little C++ object when the desired function can be implemented in an hour with 15 lines of straight C code that simply modifies some register value. If I want some test code to run in the lab (not even production code), they usually have to modify the abstraction layers, which takes weeks! That doesn't make either of us very happy.

In embedded work, it also drives simple solutions onto overpowered processors with megabytes of memory.

I think it all ends with my realization that I'm a hardware guy through and through... I will say that on the PC side I really enjoy Python; it's a great tool.
 

Offline lgbeno

  • Frequent Contributor
  • **
  • Posts: 349
  • Country: 00
Using a temperature sensor to light an LED - is this the right direction?
« Reply #37 on: August 11, 2013, 04:29:46 pm »
How is the project going?


Sent from my iPhone using Tapatalk 2
 

Online mariush

  • Super Contributor
  • ***
  • Posts: 5029
  • Country: ro
  • .
I would have probably used a simple IC like the MCP9700: http://uk.farnell.com/microchip/mcp9700t-e-lt/thermistor-linear-active-smd-9700/dp/1084621

It gives you temperature from -40 °C to +125 °C with about ±2 °C accuracy, works from about 2.5 V so you can run it from two AA batteries, and outputs a simple voltage which you can read with a comparator or the ADC of a microcontroller.
 

