Let me ask my last question differently: why aren't we already biological AIs if it is a better way to evolve?
Setting aside our inability to produce randomness consciously, and given that randomness depends on complexity, do you think our brain is simply not complex enough to produce some unconsciously?
Imagining that mass is massless is close to imagining that the speed of light doesn't depend on the speed of the observer.
If things could change in no time, time would simply not exist.
The resistance of my small steps is also due to a compound effect, but at the scale of particles smaller than molecules. The energy/information that bonds them also travels at c, but it is confined between two or more particles, whereas yours is not.
That's acceleration without resistance to acceleration, and we find it nowhere.
That's resistance to acceleration, and it maps to the physics very tightly since we observe it everywhere.
We have to put pressure on people, but blaming them is like asking them to move without us having to put pressure on them; it amounts to thinking that things can accelerate instantly.
If it needed help and that help was urgent, then it would have to show it; otherwise it could die, just like us.
I suspect there is no situation in which an AI designed to survive like us would behave differently from us. If that is so, the only way for it to explain its behavior would be to tell us that it evaluates the information it receives from its sensors, which amounts to feeling something.
That is how you can't use a computer to test your theory.