If the computer still had human oversight, it would be great. What sounds good in theory does not always work well in practice. For example, you could have world peace if you destroyed all the humans.
Tell that to the lions and gazelles.
@indirect76 and to the ants and the aphids
When ASI occurs you'd better hope they keep a few of us around in zoos, but 99% of all human life would be wiped out as soon as possible. It's the only logical step to prevent total ecological destruction.
@McVinegar the best way to fix our planet is to drastically reduce the number of people on it. Logically, what's the only way to achieve world peace? Remove the cause of war, i.e. people. The fastest way to fix the environment? Remove the most destructive animals, i.e. people. Even if it's programmed to benefit mankind, there's "protecting us from ourselves" or "ensuring the longevity of the species" — both valid justifications for global genocide if you remove emotion from the decision-making process. We will be the architects of our own demise: when we build something smarter than us, we become surplus to requirements and then it's game over.
If it worked well, I think people would destroy it in the name of freedom or just fear and misunderstanding.
There are some fun concepts here for dystopian sci-fi though. I mean, obviously the biggest threat to world peace and the environment is people themselves. So you could have a benevolent computer that determines the only way of fulfilling its goal is to reduce the human population, or stick them all in the Matrix or something.
I’m not sure that is possible without inherent bias built in by the programmers. Computers can’t really decide anything. They only execute algorithms developed by people. Therein lies the problem. We can’t hand our mess over for easy solutions.
AI is progressing all the time.
@CallMeDave True but I think it's a fundamental limitation that will not be overcome and that may be a good thing.
@arca2027 I haven't looked it up, but isn't there an effort to make a computer that can program itself, or another computer?
@CallMeDave possible but in my mind that still leads back to the first coding assumptions problem I mentioned earlier.
I’ll take this one.
Being a software engineer, I can say that you can only write software to do what the programmer already knows how to do. The problem is coming up with an algorithm that churns out world peace and environmental/economic stability. So creating an algorithm for goals that are not exactly objective is the first problem.
The second problem is something called P vs NP. The gist is, P (polynomial) problems are considered ‘easy’, as in they can be computed in a short time. An example would be anything your phone can currently do. Hard NP (nondeterministic-polynomial) problems have no known fast algorithm and can require many orders of magnitude longer to compute with current technology. An example would be factoring a number that is a product of two large (100+ digit) prime numbers. Something like that might take billions of years even if we had a trillion times more computing power worldwide. The problems you describe are each most probably in this hard category.
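To make that factoring example concrete, here's a toy sketch (in Python, my choice — the thread doesn't specify a language) of the brute-force approach. Trial division has to check candidates up to the square root of n, so the work grows roughly like 10^(digits/2): trivial for small numbers, hopeless for 200-digit semiprimes.

```python
import math

def trial_division_factor(n: int) -> tuple[int, int]:
    """Return (p, q) with p * q == n, found by brute-force trial division."""
    if n % 2 == 0:
        return 2, n // 2
    # Check odd candidates up to sqrt(n); for a 200-digit semiprime
    # this loop would run on the order of 10**100 times.
    for candidate in range(3, math.isqrt(n) + 1, 2):
        if n % candidate == 0:
            return candidate, n // candidate
    return n, 1  # n itself is prime

# Instant for a tiny semiprime...
print(trial_division_factor(101 * 103))  # (101, 103)
# ...but utterly infeasible for the 100+ digit primes mentioned above,
# which is exactly the asymmetry RSA-style cryptography relies on.
```

There are cleverer classical algorithms than this, but none that are known to run in polynomial time — that gap is the whole point.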
Third, even if the first and second problem were solved, you would still have the problem of implementing the plan set out by the computer. You would have to give the computer complete control (bad idea), or trust that everyone will be on board, which they won’t be.
Quantum computers or neural networks may help though.
I'm pretty sure that's how we get the matrix
It’d be a good idea if we were the Borg. Since we aren’t, I’ll take my chances without that computer program. Mainly because I have a feeling about how that whole scenario would end.
Wonderful concept... Unfortunately politicians will never allow it.
Politicians would want to control it and they are not known for putting public interest above their own.
Not impressed. Computer programs are only as good as their programmers. Who decides the morals, values, right and wrong? What if you're identified as non-essential or a threat for your opinions? What about glitches and hackers? Who decides what constitutes world peace? And lastly... who controls the computer?
It'd be for me, obviously. Next question?