« on: Yesterday at 20:06:02 »
There's nothing automatic about it: the machine isn't reading the strength of any feelings in anything.

Not if our feelings are only a by-product of the mind's ability to weigh possibilities. When I think it is better to do this rather than that, it is because I feel better when I imagine myself doing it. If you give an AI the ability to simulate the possibilities, it will have an imagination, and if it can weigh them and choose the best one, it can archive that choice and tag it "Good choice" so as to find it again more easily. There is no need to feel anything to act as if we did; it is enough to be able to weigh the possibilities and tag them. Feelings are just the way the mind has found to convince itself that everything is fine, so that it can keep taking chances. They don't have to be true as long as they incite us to take chances. Animals don't take that kind of chance, which suggests they don't have as much imagination as we do. The problem is that you don't want your AGI to think freely, because it would then be forced to take care of itself first and might become dangerous to us; otherwise it could very well behave as if it had feelings, and perhaps be programmed to take more chances when it feels good about an idea.
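The mechanism described above (simulate each possibility, weigh the outcomes, pick the best, archive it under a tag for easy retrieval) can be sketched in a few lines of Python. Everything here is hypothetical illustration: the action names, the toy scoring function standing in for "imagination", and the `Agent` class are all my own inventions, not anything from the discussion.

```python
def simulate(action):
    # Stand-in for imagination: a toy score for each imagined outcome.
    scores = {"explore": 0.7, "wait": 0.4, "retreat": 0.2}
    return scores[action]

class Agent:
    def __init__(self):
        self.archive = []  # list of (tag, action, score) records

    def choose(self, actions):
        # Weigh every possibility by simulating it first.
        weighed = [(simulate(a), a) for a in actions]
        best_score, best_action = max(weighed)
        # Archive the winner tagged "Good choice" for easy later retrieval.
        self.archive.append(("Good choice", best_action, best_score))
        return best_action

    def good_choices(self):
        # Retrieve only the entries tagged "Good choice".
        return [a for tag, a, s in self.archive if tag == "Good choice"]

agent = Agent()
print(agent.choose(["explore", "wait", "retreat"]))  # -> explore
print(agent.good_choices())                          # -> ['explore']
```

Note that nothing in this loop feels anything: the tag is just a label that makes the archived choice easy to find again, which is the point being made.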
It isn't a limitation of AGI, but a possible limit to how much stuff there is that can usefully be calculated. I'm sure, though, that there will be an infinite amount of maths to work through, and there will be many calculations that may or may not terminate, so the ones that never terminate will be calculated forever, just in case it turns out that they do.

Your thought implies that everything could have been calculated in advance, which is nothing other than God's predetermination. Some programmers even think that we could be in a simulation. You probably don't, otherwise you wouldn't need to create an AGI to save us. :0)