Will Google respect the Three Laws of Robotics?

… It will then be a matter of humans making critical ethical and technical choices, and with Google at the center of the equation, only the very clever could predict whether the right cards will be played.

“Two things fill the mind with ever new and increasing admiration and awe, the more often and steadily we reflect on them: the starry heavens above me and the moral law within me.” This long sentence of Immanuel Kant’s offers a way to approach the future challenges of science, tinged as they are with futurology, in the two main areas of technical development: space exploration and robotics. The former places on the horizon an all-too-human desire for exploration, one that will eventually drive the species out of its habitat, whether or not that departure proves essential to its survival. And exploration is unique in that it carries the unknown and the unsettling as much as the desirable: we will go, but where we will go we do not know, nor how humanity will adapt, nor whether our laws, customs, and habits will survive the journey.

Robots, on the other hand, are more embodied, more grounded, more tangibly modern to the enlightened observer of their day. The improvement of machines has become so commonplace that it is hard to marvel when the escalators at the Mairie de Montrouge station speed up as soon as you step on them, or when the new glass doors of Navigo stations open with an elegant silence that science fiction struggles to imagine, often preferring the deafening bustle of machinery. Street furniture is not the only marvel overlooked daily: communication is instantaneous and omnipresent, and kilometers have lost their power of emotional rupture. Voice commands can be given to a smartphone, which carries them out more or less efficiently. We have also long since forgotten how automated the kitchen has become, whether through the microwave or the various food processors that carry out programmed recipes without difficulty.

Three Basic Laws

These technological advances, which would have astonished humans a century ago and which have arrived, almost without our noticing, by 2014, are so familiar as to be nearly invisible: we see them every day without asking ourselves any questions about how they work. But that same eye lingers far longer on imagined robots: those that assemble electronic devices on a production line, those that move autonomously over rough terrain, those that kill from the sky, those that take human form to help man where he reaches his limits; finally, the one that dares to take the name of artificial intelligence.

“The first of these laws states that the robot ‘shall not harm a man, nor permit him, by remaining passive, to be endangered.’”

Yet the primitive robots of everyday life, the pressure cooker, the iPhone, the escalator, and the robot vacuum, are no less robotic than the specimens to come. It is essentially for this reason that ethical questions are being urgently raised by researchers and philosophers who have understood that the robot is not a future but a present. This is also why it might be wise to understand how the future of this science is playing out in light of takeover bids and the financial markets.

When robopsychologist Susan Calvin, the heroine of many of Isaac Asimov’s short stories, is asked about the dangers of robots, her answer is always the same: a robot cannot be dangerous. It cannot, because the base layer of its brain, the core of its thinking, is programmed to obey the Three Laws of Robotics, necessary to establish a healthy robot-human relationship. The first of these laws states that the robot “shall not harm a man, nor permit him, by remaining passive, to be endangered.” The second law states that robots “must obey orders given to them by man, unless these orders conflict with the First Law.” Finally, the third law requires that a robot “protect its own existence as long as that protection does not conflict with the First or Second Laws.”

In the short story Evidence, Calvin joins Kant in pushing the definition of the Three Laws even further, or at least in giving these principles a human basis. “The Three Laws form the basic guiding principles of a large part of our moral systems,” she says. “Every human being has, in principle, an instinct of self-preservation. […] Likewise, every good man, possessing a social conscience and a sense of responsibility, must defer to established authority, and listen to his physician, his superior, his government, his psychiatrist, his fellow man, even when this disturbs his comfort or his safety. These are the first two laws, covered by the simplest moral principles.” Dr. Calvin concludes: “every good man must also love his neighbor as himself, and risk his life to save another’s.” If someone behaves in this way, “he may be a robot, but he may also simply be a very good man.”

This very short passage in Asimov’s work takes on special significance when we consider how accurately the author outlined the real challenges of current and future robotics. Susan Calvin’s speech not only assumes that robots must be founded on moral principles; it also implies that it is good humans who will decide how robots fit into the future of humanity. One need only step back and watch a saga like Star Wars, a series like Doctor Who, or a franchise like Terminator to realize that a robot is not automatically a being driven by the universal values of goodness.

Skynet has outgrown its creator and seeks to destroy it; the Trade Federation’s robots are under orders to kill the Jedi and anything in their path. And let us not forget the Cybermen, whose only philosophy is to assimilate humanity fully into their mighty robotic empire.

In the military sphere, the question of drone responsibility is already emerging: ethics committees are fighting to keep the decision to fire in human hands, while the automation of war sometimes tempts staff representatives and arms manufacturers. But this borderline case, an extreme in which a robot’s morality can be corrupted by a kill order, arises only in war, where, by tragic definition, the norms of civil society are flouted. The danger of this narrow framing is that it downplays the importance of robotics in the development of civilian technology, and assumes that the questions raised by military drones cannot extend to a wider field, one that concerns all of humanity.

“Two things fill the mind with ever new and increasing admiration and awe, the more often and steadily we reflect on them: the starry heavens above me and the moral law within me.”

Critique of Practical Reason, Immanuel Kant
