haha I have no idea why I wrote this incredibly difficult word, which means TAKE OVER or whatever you are thinking.
The problem is not with the word, obviously, but with the concept. What does "taking over" mean? What would that lead to? That's the whole point.
We have no freaking clue what "machines taking over" could possibly mean in reality. Some book and film authors have offered their renditions of it, in dramatic but not very realistic ways.
I mean, buddy, I write content. What if someone prefers ChatGPT over me? That would be sad, but I've written wonderful articles about ChatGPT. I mean, I don't despise ChatGPT; it's just that I care more about people hehe
Now that's less vague. AI taking over means AI making your current job obsolete? That's a pretty narrow and self-centered definition of taking over. But it's a very common source of concern, when people are actually concerned at all.
By that criterion, plenty of technological novelties have "taken over": most of our current tools have made some jobs obsolete. Does that mean they have taken over the human species? Not really.
We just have to adapt and do other things.
Making humans "obsolete" is a whole step beyond that, if that's what one means by "taking over". That's the singularity: the point where machines are not just tools anymore, and where "life" itself has no meaning anymore.
Whether that's something we want for our future is kinda up to us. Or is it? Those who think everything is predetermined will probably say it's not up to us anyway.
But "actively" working at possibly making us in our entirety (not just some jobs or some other aspects that will make us evolve and do other things) *unwanted* is a suicidal behavior.
Which is as interesting from an anthropological POV as it is disturbing.