ROBUST.. AI for what matters

golden_labels:

--- Quote from: SiliconWizard on June 23, 2022, 06:40:52 pm ---What an odd statement. But you need to define "technological progress" first. Good luck.
--- End quote ---
I do not need any luck. The concept of technology is understood well enough. The concept of new developments occurring is the definition itself, assuming one understands the three words involved: new, development, to occur. You may split hairs over the exact bounds if you wish, but that will have zero relevance to the matter discussed.


--- Quote from: SiliconWizard on June 23, 2022, 06:40:52 pm ---Oh yeah. This is exactly Elon Musk's reasoning, as far as I have understood him. He's kept defending his projects with this very argument.
--- End quote ---
Because Elon Musk used something as an argument to defend his poor ideas, I am wrong for referring to it in an unrelated statement? Aside from the attempt to argue that something is wrong merely because a particular person said it, how do those two things relate?


--- Quote from: SiliconWizard on June 23, 2022, 06:40:52 pm ---Of course, unless you make some additional statements along with this, it's completely... flawed. One of those additional considerations might be something like, uh, ethics? (If that still means anything to anyone?) Or common sense. (Ditto?)
--- End quote ---
And what do ethics, which are normative, have to do with a descriptive statement?


--- Quote from: SiliconWizard on June 23, 2022, 06:40:52 pm ---Indeed, without any ethical consideration, anything would go. Since developing a deadly poison that could kill the whole humanity, or even any living being on Earth, is something definitely doable, that means I would be perfectly entitled to do it, just because otherwise someone else will? Really?
--- End quote ---
You have invoked an extreme case to discuss something of a very different scale, treating the two as if they were comparable situations. But, ignoring that: yes, progress in that particular, arbitrarily chosen technology may be slow or non-existent. You can’t, however, pick some specific items and then make statements about different items. There is very little incentive to develop a chemical weapon that would destroy all life on Earth. That is not true for machine learning technologies, including weapons.


--- Quote from: SiliconWizard on June 23, 2022, 06:40:52 pm ---Good thing you added that the whole thing "is going to be a horrible philosophical mess". I'm not convinced that this is really philosophical at all.
--- End quote ---
It’s already a subject of philosophy, which contradicts your opinion.


--- Quote from: SiliconWizard on June 23, 2022, 06:40:52 pm ---Some common sense should definitely help. If not, we're doomed and no philosophy is going to save us.
--- End quote ---
Philosophy is not going to save us, but “following common sense”, which is itself philosophy, will? Huh?

And that is one of the examples of why it will be a “philosophical mess”. “Common sense solves it” and similar arguments.

The concept is well known in philosophy, psychology, and sociology as something humans perceive. But the idea that it is actually “common”, or that it is a viable method of solving anything, is no longer taken seriously. Yes, of course it is natural to believe that one’s own common sense is right and will help. The problem is that the common sense of 7 billion other people differs from yours, and even for a single person it is fuzzy and very fluid. You expected me to define “technological progress”, yet brought up something that is just a nebulous blob, half of which is nothing more than an effect of cognitive biases.


Finally, to lighten up the mood: .
