RE: AI: "The technology is very powerful and potentially very dangerous..."
There's a phrase I've heard a lot: that crisis and opportunity are written the same way in a certain East Asian language. What people usually mean by this is that in dangerous or otherwise unstable situations, you can find opportunities for success that wouldn't exist otherwise.
Certainly, I think AI can be dangerous. It's like how I feel about the SETI Project: beaming signals into deep space in search of extraterrestrial contact is very risky... we don't know whether someone will answer and it will turn out they see us as food, or something worse.
In this case, with AI, we run the risk of creating a species that surpasses us, and I don't think that ends well. Even if we place restrictions on them, humans tend to fear many things and make life very difficult for what we fear. I can imagine a tyrannical humanity that dominates the inorganic races it creates with AI and exploits them mercilessly as slaves; in the long run they may rebel, and as "masters" I doubt we would let them go without a massacre that could also fall upon us.
Very interesting thoughts. I think it's not a matter of if they will surpass us but when. We're already seeing this in certain domains, but that's mostly narrow AI. Once we're talking about general intelligence, things might get hairy. I'm an optimist by nature, but we do need to be very careful, and the thing is, we don't know how to be careful with this one. We don't know what to expect.
And it's not like we can stop evolution either. As someone said, we may be remembered as the species that made AI possible.
I admit I'm distrustful, and that can make me pessimistic in some circumstances. That's why I use the example of the SETI Project, or of trying to open a portal between dimensions: in these cases we are like an English explorer shouting in the middle of an unknown jungle so that whatever lives there can find us... in the most optimistic case it won't be something that kills us or causes serious problems, but as I said, I'm not that optimistic.
I think you're right that the question is not whether they will succeed but how much time we have before it becomes a reality. I don't think evolution much cares if another dominant species disappears; it already happened with the dinosaurs, and if we want closer examples, several ancestral hominid branches went extinct and we are descended from the ones that replaced them.
I wonder if a middle ground could be reached, some kind of peaceful coexistence, but I think we humans won't tolerate living alongside another species that surpasses us without trying to exterminate it, even if that leads to our own destruction.