I would rather see the students spend more time on applications and less time on theory. Two reasons: First, calculus is taught for math majors, and that filters out a lot of engineering types who will NEVER need the underlying theory. Who has ever used the limit definition of a derivative? Better to spend more time on related-rates problems and less on pedantic nonsense. Second, calculus is just a tool. In many ways it is like a sophisticated slide rule: pump in functions and grind out derivatives. Machines do that very well. What machines can't do is think! How do I solve this problem? How do I describe the problem for the machine? Calculus may, or may not, be involved.
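For what it's worth, even that "pedantic" limit definition is how a machine gets a derivative numerically. A minimal sketch (function and values are hypothetical, just for illustration): the difference quotient (f(x+h) - f(x))/h walks toward the exact derivative as h shrinks.

```python
# Limit definition of the derivative, evaluated numerically for a
# hypothetical f(x) = x**2 at x = 3. Exact answer is 2*x = 6.
def f(x):
    return x * x

x = 3.0
for h in (1e-1, 1e-3, 1e-5):
    slope = (f(x + h) - f(x)) / h      # difference quotient
    print(f"h = {h:g}: difference quotient = {slope:.6f}")
# As h -> 0 the quotient approaches 6, the exact derivative.
```

So the theory isn't useless; it's exactly what the "machine" is grinding through under the hood.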
I have a Great Courses calculus video series presented by a professor at the University of Florida. Apparently, they have two math tracks: One for math majors and another for engineers. Kudos!
Consider the problem of AC circuit analysis. Just the usual Rs, Ls, and Cs where the impedance is given. Now write a system of equations to solve by either mesh or nodal analysis. Let's say it is, oh, 6x6. We had one of those the other day! OK, let's back off to a more reasonable 4x4...
Solving this by hand with complex numbers will inevitably wind up with simple algebra errors. What is learned? Well, you learn to avoid this analysis at ALL COSTS. You learn to HATE complex numbers! You begin to question why you picked this major! Instead of dumping the equations into a number cruncher, you have to spend an inordinate amount of time to get the wrong answer! But you could have learned to more effectively use a number cruncher - something that might actually be useful out in the world. Wrong answers in the real world have consequences.
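Using a number cruncher for this really is a few lines. A minimal sketch with a hypothetical two-mesh circuit (the component values and topology are made up for illustration): build the complex mesh-impedance matrix and let the solver do the algebra.

```python
import numpy as np

# Hypothetical two-mesh AC circuit, all impedances in ohms at the
# operating frequency. Mesh 1: source + R1 + shared capacitor.
# Mesh 2: shared capacitor + R2 + L.
V    = 10 + 0j    # source: 10 V at 0 degrees
Z_R1 = 100        # resistor, mesh 1
Z_C  = -200j      # shared capacitor, -j/(w*C)
Z_R2 = 150        # resistor, mesh 2
Z_L  = 50j        # inductor, j*w*L

# Standard mesh-analysis form: Z @ I = E
Z = np.array([[Z_R1 + Z_C, -Z_C],
              [-Z_C, Z_R2 + Z_L + Z_C]], dtype=complex)
E = np.array([V, 0], dtype=complex)

I = np.linalg.solve(Z, E)   # mesh currents as complex phasors
for k, i in enumerate(I, 1):
    print(f"I{k} = {abs(i) * 1000:.2f} mA at {np.degrees(np.angle(i)):.1f} deg")
```

The same three lines of setup scale to the 4x4 or 6x6 case: you just fill in a bigger matrix, and there are no sign errors from hand-expanding determinants.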
I think this is a pretty common joke already:
en-gi-neer
[en-juh-neer]
1. A person who solves problems that you did not know you had, using methods you do not understand.
2. A person who has forgotten more math than you ever knew.
But to be honest, I somewhat disagree with your post. Mathematics and engineering are married, that's just the way it is.
How much you're exposed to mathematics is a matter of choice, and also of the level you're solving problems at. In my opinion, and probably that of the majority of the folks on this forum, building electronics on PCBs and writing firmware typically does not require lots of math. But there are also plenty of engineers hard at work creating new components, quantitatively testing and verifying systems, and writing "middleware" libraries that one often takes for granted.
How are you going to be sure a cryptographic algorithm is implemented correctly?
How is an analog IC designer going to determine the bias for his circuit, know for certain his circuit is not going to oscillate given a set of load impedances, and know which stages of his amplifier he should modify in order to do so?
Who actually invented some of these fancy analog circuits in the first place? I mean some of the ones that are counterintuitive, like an amplifier circuit that cancels out its own noise.
How is a digital designer going to decide on the best binary adder design in an ASIC?
Or perhaps the reverse question: why do we care about lookahead adders when they're slower on most FPGAs anyway?
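To make the adder question concrete, here is a minimal sketch of the textbook 4-bit carry-lookahead formulation (bit lists are LSB-first; the example inputs are arbitrary). The point of lookahead is that each carry is a function of only the generate/propagate terms and the carry-in, so hardware can flatten the recurrence below into two-level logic instead of rippling bit by bit.

```python
# 4-bit carry-lookahead adder sketch. a, b are LSB-first lists of bits.
def cla_add4(a, b, cin=0):
    g = [ai & bi for ai, bi in zip(a, b)]   # generate: this bit creates a carry
    p = [ai ^ bi for ai, bi in zip(a, b)]   # propagate: this bit passes a carry on
    c = [cin]
    for i in range(4):
        # c[i+1] = g[i] | (p[i] & c[i]); in hardware this recurrence is
        # expanded so every carry depends only on g, p, and cin.
        c.append(g[i] | (p[i] & c[i]))
    s = [p[i] ^ c[i] for i in range(4)]     # sum bits
    return s, c[4]                          # sum, carry-out

# 5 + 6 = 11: 0b0101 + 0b0110 = 0b1011
s, cout = cla_add4([1, 0, 1, 0], [0, 1, 1, 0])
```

Whether that flattening wins depends on the fabric: an ASIC pays for ripple delay gate by gate, while most FPGAs have dedicated fast-carry chains that make the ripple structure hard to beat, which is exactly the trade-off the question is poking at.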
I think any college program is trying to accomplish two goals. One path leads onto the job market (which wants highly practically trained students, so they are valuable straight away), and the other leads onto further education programmes (up to bachelor, master, and PhD). The latter requires students with a more formal training in maths, and continuity between programmes is preferred.
Additionally, I think one should learn skills where they are taught best. In academia the courses are taught by professors whom you expect to have a very thorough understanding of the material and to teach it most rigorously and efficiently. Practical skills can be obtained through hobby projects (showing you're an autodidact), internships, and above all, every day on the job itself.
But I think it will rarely happen that an employer pays for a degree because you need to know more about the theory of microwave engineering, when you have never seen or worked with a Maxwell equation before.