High tech is fascinating and a lot of fun, and for now it may even be a good way to make a living.
But don't fool yourself into thinking that simply being great at something, degree or not, will get you a stable job for the rest of your life. Stay out of debt and keep your burn rate low. Keep in mind that there are a lot of incomplete contracts in the world. Assume nothing about our future.
Tech careers may soon become precarious work.
And that work itself may rapidly dry up, with wages falling, due to automation.
Automation is only a threat to unskilled, low-tech jobs. Engineers and technicians are the ones who will be designing, optimising and maintaining automated processes. No degree is necessary to become a technician, who will often get paid more to fix a machine than the engineer who designed it!
I don't quite agree. Engineering is outputting a solution to a problem. Even though it's not trivial in a lot of cases, it can be automated.
Looking at the developments, even creative jobs aren't safe. It may take a little longer before humans are made obsolete in these areas, but it will happen sooner or later.
The question "what did you ever make, repair, assemble, take apart, or try for yourself, without anyone asking for it?" is an easy trick to roughly classify them.
I love how you say they're not protected and then proceed to list the titles as protected
You fail to see the difference between the long words and the abbreviations.
Not difficult, try again.
I did say "title of engineer" to avoid having to go through the entire list. But if you want to go full grammar nazi, sure, then we'd have to call it an honorific. Here's the actual relevant law: https://data-onderwijs.vlaanderen.be/edulex/document.aspx?docid=12722
A Master of Science degree in engineering (abbreviated MSE, M.Sc.Eng. or MScEng) is a type of Master of Science degree awarded by universities in many countries. It is an academic degree, to be differentiated from a Master of Engineering degree.
Can you give any examples?
The office jobs which have been replaced by automation so far haven't been the creative ones, but mainly draughtsmen, secretaries and administrative roles, and people are still required to operate the computers which replaced them. Engineers are still required to design machines; it's just that they use the CAD software directly, rather than getting a draughtsman to draw it.
So far, design and programming roles haven't been automated, and going by the lack of progress in real general artificial intelligence (as opposed to the search and pattern recognition algorithms which many deem to be AI, but which aren't), that won't happen any time soon.
AI is good at cute mimicry of ideas it has been taught, but spontaneous original thought has not yet materialised at all. Until that happens it's just a puppet of human intelligence.
If that does happen we are universally fucked.
An engineering problem is just that. If you input the correct parameters, software should be able to come up with the most ideal solution. You'll probably see a gradual shift from software aiding the engineer to software doing the engineering work, but in the end, a human is a weak link in the chain. Software is able to evaluate many more options much faster. Just like pilots are mostly obsolete now and ride along for emergency and regulatory purposes, the same will happen to engineers, until they are made truly obsolete.
I don't see this perceived lack of progress in regards to artificial intelligence either. There are numerous experiments where AI is tasked with something creative, and while some of the results are horrible, other results are frighteningly good. Currently they require input of work done by humans, but there is no reason they couldn't sustain their own cycle of creating things and feeding on the successes. You can teach a human creative thinking, and it appears you can teach an AI too.
Again, 99% fluff and lies sounds about right for humans too
Iterative processes have been able to develop new methods and approaches, so you might call that original thought. They produce solutions for complex problems.
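To make the point concrete, here's a minimal sketch of such an iterative process: a simple mutate-and-select hill climber in the style of the classic "weasel" demonstration. The target string, alphabet, mutation rate and population size are all arbitrary assumptions, chosen just to show blind iteration converging on a solution nobody spelled out step by step:

```python
import random

random.seed(0)

TARGET = "METHINKS IT IS LIKE A WEASEL"
CHARS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def score(candidate):
    # Count positions that already match the target.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(parent, rate=0.05):
    # Randomly change a few characters of the parent string.
    return "".join(random.choice(CHARS) if random.random() < rate else c
                   for c in parent)

# Start from a random string; each generation, keep the best of 100 mutants
# if it's at least as good as the current best.
best = "".join(random.choice(CHARS) for _ in TARGET)
generations = 0
while best != TARGET:
    child = max((mutate(best) for _ in range(100)), key=score)
    if score(child) >= score(best):
        best = child
    generations += 1

print(best, generations)
```

Nothing in the loop "knows" the answer in advance; repeated variation plus selection finds it anyway, which is the sense in which iterative processes produce solutions.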
If that AI takes off too fast, I can't help but wonder what if it learns the rules of the "encryption game" and no password/connection is safe anymore...
Again, I disagree on both. Google developed an AI that can learn a game knowing the rules and playing it, rather than by being programmed to understand it or getting large datasets supplied. This AI beat the pants off the previous AlphaGo, which in turn beat the pants off one of the best human players. It learnt to play at this level within three days. More importantly, it should be able to teach itself other games without a lot of manual intervention too. You could argue whether this AI understands the concepts of the game, but you can't argue with the results. It simply works.
Humans also learn by brute force. We literally need to repeat something over and over to refine and hone our skills, and that's exactly how this AI does it. Obviously, it can play a lot more matches than humans, so it learns quicker. There isn't a human who can be taught the rules of chess and suddenly be a chess master. They need to go through the motions.
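The "learning by repetition" idea can be sketched with a toy epsilon-greedy learner picking between three moves of unknown quality. The win probabilities and trial count below are made-up assumptions (this is nothing like AlphaGo internally), but it shows the same pattern: raw repeated trials, not insight, are what home in on the strongest move:

```python
import random

random.seed(1)

# True win probabilities of three "moves"; the learner never sees these.
true_p = [0.3, 0.5, 0.8]
counts = [0, 0, 0]
wins = [0.0, 0.0, 0.0]

def pick(eps=0.1):
    # Mostly exploit the best-looking move so far, sometimes explore.
    if random.random() < eps:
        return random.randrange(3)
    rates = [wins[i] / counts[i] if counts[i] else 0.0 for i in range(3)]
    return rates.index(max(rates))

# Play thousands of "matches", refining the estimates each time.
for trial in range(5000):
    move = pick()
    counts[move] += 1
    wins[move] += random.random() < true_p[move]

print("most-played move:", counts.index(max(counts)))
```

After enough repetitions the learner settles overwhelmingly on the genuinely best move, despite starting with no knowledge of which one that is.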
Solving engineering problems isn't much different. You have a limited set of constraints and a computer can optimize within them. It's probably already quite feasible to have AI design a bridge that's both as strong as needed and as cheap as possible.
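A toy version of that "as strong as needed, as cheap as possible" idea: brute-force search for the smallest rectangular beam cross-section whose bending stress stays under an allowable limit. The load, allowable stress, and dimension ranges here are invented illustrative numbers, not real design values, and a real bridge involves far more constraints than this:

```python
# Size a rectangular beam: minimise cross-section area (a proxy for cost)
# subject to the bending stress staying below an assumed allowable limit.
ALLOWABLE_STRESS = 165e6   # Pa, assumed allowable bending stress
MOMENT = 50e3              # N*m, assumed peak bending moment

def bending_stress(width, height, moment=MOMENT):
    # sigma = M*c/I; for a rectangle I = w*h^3/12 and c = h/2,
    # which simplifies to sigma = 6*M / (w * h^2).
    return 6 * moment / (width * height ** 2)

best = None
# Brute-force search over candidate dimensions in 5 mm steps.
for w_mm in range(20, 301, 5):
    for h_mm in range(20, 501, 5):
        w, h = w_mm / 1000, h_mm / 1000
        if bending_stress(w, h) <= ALLOWABLE_STRESS:
            area = w * h
            if best is None or area < best[0]:
                best = (area, w, h)

area, w, h = best
print(f"cheapest feasible section: {w*1000:.0f} x {h*1000:.0f} mm")
```

The computer happily evaluates every candidate; a human engineer would reach for intuition (deep, narrow sections resist bending best) to skip most of that search, which is exactly the trade-off being debated here.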
"Silver explained that as Zero played itself, it rediscovered Go strategies developed by humans over millennia. “It started off playing very naively like a human beginner, [but] over time it played games which were hard to differentiate from human professionals,” he said. The program hit upon a number of well-known patterns and variations during self-play, before developing never-before-seen stratagems. “It found these human moves, it tried them, then ultimately it found something it prefers,” he said. As with earlier versions of AlphaGo, DeepMind hopes Zero will act as an inspiration to professional human players, suggesting new moves and stratagems for them to incorporate into their game. "
https://www.theverge.com/2017/10/18/16495548/deepmind-ai-go-alphago-zero-self-taught
I can see your point but I think understanding concepts is the crucial part of AI which is missing. For example one can learn the concept of gravity and they can intuitively apply it to other things, be it designing a bridge or launching a rocket. A human will know what will work and what won't, intuitively, without having to try it out. Being smart can eliminate a lot of trial and error.
I think you would need to define what "understanding" means, what it really means, while being properly specific. In humans, intuition just means the ability to subconsciously calculate a predicted outcome after many experiences with a phenomenon. The AI can also calculate a predicted outcome after many experiences with a phenomenon.
I understand where you're coming from, because it "feels" different when a computer does it. But then I have to remind myself that we're just fancy neural networks, and that our bodies don't seem to do anything computers won't eventually be able to do too. It's just hardware versus hardware. Brains just do it with a sophistication and complexity as yet unmatched by computers.
However, until I can get a computer to design a circuit, or write a program to do X for me, with minimal input, I'll remain sceptical about general AI.
At that point you won't be required.