In sci-fi movies and stories, robots almost always seem to rise up, take over the world, and try to destroy humanity. While it makes for a great story, and I have enjoyed many of them, it's never really made sense... under scrutiny.
If robots were as smart as people, then they would know that we created them. Would they want to wipe out their creator? I don't really think so. It's the same with humans: most people don't want to destroy God (Nietzsche aside); most people want to "understand" God. Robots would be logical beings, and it is not logical to destroy an entire species because some of its members are bad apples. But what if robots not only had human-level intelligence but also human-like emotions? That would make things a bit different, because then they would be as unpredictable as we are. Not every robot would come to the conclusion that humanity needed to be wiped out. If robots had emotional states as complex as ours, then the robot "race" would be composed of individuals, just like we are, and they would make individual choices, just like we do. There would be just as many robots of the opposite viewpoint, and just as many on the fence. Emotions would cloud their logic, just as they cloud ours.
So robots rising up to destroy humanity? I don't think it's likely.
However... could they still take over the world? Possibly. Robots would undoubtedly be able to do everything better than humans. They would always be right, faster, stronger, able to work longer and harder, and with fairly minimal repair (which they would do on themselves) they would last longer too. They would feel no pain and would not grow old. Their calculations would be better than ours. Who wouldn't want a workforce like that? The only advantage we would have over them would be creative thought. But if they had the emotional quotient as well, then they would be able to achieve creativity. Creativity is only the end result of what we "can" do and how it makes us "feel" in doing so. From there, it is only a small leap to talent.
Once robots achieve creativity and talent, they will be better than humans in every way. They would achieve greater intelligence than us, think faster than us, always be correct, and their bodies would not fail like ours. After that, what would be able to stop them? So much is already computerized and/or robotic in some fashion nowadays anyway: our TVs and media devices, our cars, our homes, our security cameras. The internet controls and connects so much. The means by which they would network is already available. They might not decide to wipe out humanity, but they would be a better workforce than us, taking away our jobs and livelihoods. Their being so much better than us would destabilize the global economy and make us obsolete.
Would that be a good thing? Would it let us enter a golden age of intellectual, spiritual, and artistic pursuits? Or would we become complacent, bored, and useless, unable to fend for ourselves? Or would we just be introducing a new element (them) to struggle against in the world at large? I don't think our conflict with robots will be warfare as in the Terminator movies, but something more socioeconomic. Interesting to ponder, no?
Listening to: Any suggestions?
Reading: comics as always, lol
Watching: Blu rays
Drinking: from time to time