« on: 19/02/2019 17:13:41 »
The creativity of some dreams astonishes me - occasionally they seem to have been written by an intelligence that isn't me, keeping a clever twist in the plot hidden until the last moment and then revealing it at the right time for maximum effect, while also showing that it had been planned early on. There's definitely someone else in here who can't speak to me directly, but who tries to communicate through dreams.

You were right to believe that you were still a child. :0) It's as if our mind sometimes plays games with itself, like a kid talking to his imaginary friend. That feature of imagination is probably the main reason why people still believe in god. God can't help us though, whereas randomness can. It took a while before we discovered the usefulness of randomness, and religions are forced to deny it, because they know they could otherwise be replaced. You may be unable to believe that intelligence needs randomness, but you can probably still admit that it explains our dreams better than someone else trying to communicate. If you can, then you could ask yourself why our brain produces such a feature: either it's only a side effect of imagination, or it's a real feature, a property of mind without which we wouldn't be as intelligent as we are.
Like us, your AGI needs to be able to simulate things before executing them, which is part of imagination's job. The only thing it would then be missing is simulating improbable things once in a while in case they pay off, and I bet it would realise that they pay off often enough to integrate the habit. That's what I think happened to our mind while we were evolving from animals to humans. If simulating all the possibilities, beginning with the most evident, had been better, evolution would have chosen that way, and it didn't. We will probably be able to build biological computers some day, so evolution could have done so too, but it didn't. The way the mind moves its data is slow, and it could probably have been as fast as computers if that had been useful, but there is no use in thinking a million times faster than we can move, so it didn't. The way it remembers data is imprecise, and it could probably have grown biological chips instead, but there is no use in being a million times more precise than the environment we are in, so it didn't. The inverse is possible too: computers may be the next evolutionary step to a higher intelligence, but I prefer to think that they will think like us, because that way I can imagine that they will replace us instead of only caring for us. I'm going from bottom to top and you are going from top to bottom, but we are both aiming at the same target: the future of humanity.
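That idea - usually picking the most evident option, but occasionally trying an improbable one and keeping the habit only because it pays off - can be sketched as a toy exploration loop. This is just my own illustration of the principle (the bandit-style setup, the 10% rate, and every name and number in it are my assumptions, not anything from the discussion above):

```python
import random

def run_agent(epsilon, true_values, steps=5000, seed=0):
    """Toy agent: it estimates the payoff of each option and usually picks
    the best-looking one, but with probability `epsilon` it tries an
    improbable (randomly chosen) option instead."""
    rng = random.Random(seed)
    n = len(true_values)
    estimates = [0.0] * n   # current guess of each option's payoff
    counts = [0] * n
    total = 0.0
    for _ in range(steps):
        if rng.random() < epsilon:
            choice = rng.randrange(n)  # the rare improbable try
        else:
            choice = max(range(n), key=lambda i: estimates[i])
        reward = true_values[choice] + rng.gauss(0, 1)  # noisy payoff
        counts[choice] += 1
        # incremental average of the rewards seen for this option
        estimates[choice] += (reward - estimates[choice]) / counts[choice]
        total += reward
    return total / steps

values = [1.0, 1.5, 3.0]            # the best option is not the obvious first one
greedy = run_agent(0.0, values)     # never simulates improbable options
curious = run_agent(0.1, values)    # simulates them 10% of the time
print(greedy, curious)
```

The purely "evident" agent tends to lock onto the first option that looks good and never discovers the best one, while the agent that wastes a small fraction of its simulations on improbable options usually ends up with a higher average payoff - the sense in which randomness "pays off often enough to integrate it".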
I think we're both dreaming anyway, so there's no need for me to take my ideas too seriously. To me, that kind of dream is similar to the ones I have while sleeping, because I also get the feeling that it comes from nowhere, but I could also think, like you, that someone is trying to communicate with me. Do you sometimes have that feeling about your ideas, or do you always feel that they are yours - that they always come from your own deductions and calculations, for instance? If you do, then it is no surprise that you want your AGI to think like you. If not, then either our ideas come from randomness, as I think, or they come from someone else, as you think. Those two interpretations both mean that unpredictable things happen in our minds, but they don't lead to the same behavior. Those who think that someone talks to their mind may become dangerous to others, for instance, whereas I think there is no danger in considering that our ideas are subject to randomness. God talking to us is one of the features that make religions dangerous. A religion about randomness wouldn't have the same issue: it would advocate freedom over security, improvisation over constancy, education over coercion, imagination over memory. It would work for the long run, whereas current religions only work for the short one.
In the same way, I think that your AGI would only account for the short term, and that adding a bit of randomness to it would account for the long one. Existence accounts for both, so we might need to mix our ideas if we want them to do the same, unless there is no other way than to wander from one to the other, like the two extremes of our political systems. When the right governs, it effectively works for the short term, while the left works for the long one. Taking social measures is like caring for others and hoping they will care for us in the future: a second-order selfish behavior that accounts for the future instead of the present. We can't predict the outcome of a society, so it's risky to take such measures, but we nevertheless accept to do so as a society because we already do so individually. Your AGI will behave as if it knew the outcome, since it would never take chances, so I think it will only account for the short term. If the right always governed, I think societies would not evolve; it's the random wandering between left and right that produces their evolution. Your turn now, but you're not allowed to answer that nobody has to evolve in paradise! :0)