General question for the engineers out there. How often do you directly use advanced math (differential equations, fourier analysis, etc.) in your job?
...
I concur with the consensus: the daily grind of a workaday EE involves scant reference to the higher-order math so beloved by academics.
But, say you find yourself putting CPUs into your circuits, and writing software to control interprocessor communications. Then the situation changes. Personally, I found the switch from circuits to networked processors jarring. Even after pursuing (and getting) an MSCS, I still felt an intellectual void when weighing design alternatives. It took years to dispel my worries and reach a conclusion.
I had to crack open the old discrete structures book. After some reflection, I realized that relational algebra, finite state automata, regular expressions, parsers--all are merely different aspects of algebraic semigroups. Following such an insight led to high-confidence forays into areas that working programmers tend to avoid: formal specification, automated testing, even metaprogramming (very dramatic when it's used judiciously).
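To make the semigroup connection concrete, here is a toy sketch of my own (the parity automaton and names are illustrative, not from any text): the per-symbol transition functions of a finite automaton form a semigroup under composition, so a whole input word collapses to a single state-to-state function.

```python
# Sketch: transition functions of a finite automaton form a
# semigroup under composition. Toy machine: track whether the
# number of 'a's seen so far is even or odd.

EVEN, ODD = 0, 1

# One transition function per input symbol: state -> state.
delta = {
    "a": {EVEN: ODD, ODD: EVEN},   # 'a' flips the parity
    "b": {EVEN: EVEN, ODD: ODD},   # 'b' leaves it alone
}

def compose(f, g):
    """Semigroup operation: apply f, then g (function composition)."""
    return {s: g[f[s]] for s in f}

def action(word):
    """Collapse a whole input word to one state->state function."""
    result = {EVEN: EVEN, ODD: ODD}  # identity (the empty word)
    for symbol in word:
        result = compose(result, delta[symbol])
    return result

# "abab" contains two a's, so starting from EVEN we end at EVEN.
print(action("abab")[EVEN])  # prints 0 (EVEN)
```

The payoff is that composition is associative, so long inputs can be pre-reduced, split, or memoized freely; that algebraic fact is exactly what regular-expression engines and parser generators lean on.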
Think that algebraic semigroups will never affect your daily work? Think again. Last week, I happened upon
www.systemsafetylist.org. One topic: the Toyota firmware lawsuit. The expert witness for the class-action plaintiff had posted his direct examination. It went something like this:
Q: How many global variables did the Toyota firmware exhibit?
A: Over 10,000.
Q: How many would good software engineering practice dictate?
A: Zero.
Etc., etc. Went on for days.
Toyota had to settle; it cost them millions.
Thing is, control logic is modelled completely by a Reactor/Proactor pattern. And the sparse decision trees typical of real-time control programs are straightforward to generate with logic programs. (After all, EEs don't really use the Karnaugh mapping techniques shown in their circuit texts; it's all automated.) As for the thousands of global variables, who knows? I am skeptical. But I did not see any online cross-examination of the expert.
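For anyone who hasn't met the Reactor pattern, here is a minimal sketch (the event names and handlers are hypothetical, nothing to do with Toyota's actual code): one loop demultiplexes events to handlers registered in a table, so control state lives in that table rather than in scattered globals.

```python
# Minimal Reactor-pattern sketch: a dispatch table maps events to
# handlers; the loop's only job is to look up and invoke.

class Reactor:
    def __init__(self):
        self.handlers = {}  # event name -> handler function

    def register(self, event, handler):
        self.handlers[event] = handler

    def dispatch(self, event, payload):
        # A real-time loop logs-and-continues on unknown events
        # rather than crashing; here we just return a marker.
        handler = self.handlers.get(event, lambda p: "ignored")
        return handler(payload)

reactor = Reactor()
reactor.register("throttle", lambda angle: f"set throttle {angle}")
reactor.register("brake", lambda force: f"apply brake {force}")

print(reactor.dispatch("throttle", 12))  # prints: set throttle 12
print(reactor.dispatch("unknown", 0))    # prints: ignored
```

Note that the handler table is the only shared structure; everything else is local to a call, which is the opposite of the ten-thousand-globals style the expert described.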
So, if you are an EE, and find yourself writing software that describes occasionally communicating finite state machines (sound familiar?), it would behoove you to get smart on your relational algebra, regular expressions, first-order predicate calculus, and parser/generator logic. You will then be well prepared when the boss asks for formal specifications--you will have been there long before. As for metaprogramming, you can put it off for a while. But it is absolutely shocking to see an expert write a program that writes your program, I can tell you.
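If "a program that writes your program" sounds exotic, a toy illustration (my own, not the expert's tool) fits in a few lines: emit the source of a state-machine step function from a declarative transition table, then execute the generated code.

```python
# Toy metaprogramming sketch: generate a state-machine step()
# function from a transition table, then exec the generated source.
# The states and events here are made up for illustration.

table = {
    ("idle", "start"): "running",
    ("running", "stop"): "idle",
    ("running", "fault"): "halted",
}

def generate_fsm(table):
    """Return Python source for a step(state, event) function."""
    lines = ["def step(state, event):"]
    for (state, event), nxt in table.items():
        lines.append(f"    if state == {state!r} and event == {event!r}:")
        lines.append(f"        return {nxt!r}")
    lines.append("    return state  # no matching transition")
    return "\n".join(lines)

namespace = {}
exec(generate_fsm(table), namespace)  # compile the generated program
step = namespace["step"]

print(step("idle", "start"))     # prints: running
print(step("running", "fault"))  # prints: halted
```

The point is not the exec trick itself but the shape of the workflow: the transition table is the specification, and the shipped code is derived from it mechanically, the same way Karnaugh-style minimization is automated for circuits.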
Hope this helps.