Elon Musk, Stephen Hawking & Steve Wozniak lead AI experts who want killer robots banned - Physics-Astronomy.org


Hundreds of the world’s top robotics and artificial intelligence experts have issued an open letter urging the United Nations to ban dangerous autonomous weapons systems. The letter is signed by nearly 700 researchers and more than 600 other experts, including famed physicist Stephen Hawking, Elon Musk, Apple co-founder Steve Wozniak, Skype co-founder Jaan Tallinn and activist philosopher Noam Chomsky. Its release coincides with the first day of the world’s most prominent AI meeting, the 2015 International Joint Conference on Artificial Intelligence. The letter says that the development of autonomous weapons is feasible within years and would play a dangerous role in driving a third revolution in warfare, after gunpowder and nuclear arms. The researchers say that replacing human soldiers with machines is good in that it reduces casualties, but bad because it lowers the threshold for going to battle.
Autonomous Weapons: An Open Letter from AI & Robotics Researchers
“Autonomous weapons select and engage targets without human intervention. They might include, for example, armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions. Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is practically, if not legally, feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.
Many arguments have been made for and against autonomous weapons, for example that replacing human soldiers with machines is good because it reduces casualties for the owner but bad because it thereby lowers the threshold for going to battle. The key question for humanity today is whether to start a global AI arms race or to prevent it from starting. If any major military power pushes ahead with AI weapons development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow.
Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce. It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc. Autonomous weapons are ideal for tasks such as assassinations, destabilising nations, subduing populations and selectively killing a particular ethnic group.
We therefore believe that a military AI arms race would not be beneficial for humanity. There are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people.
Just as most chemists and biologists have no interest in building chemical or biological weapons, most AI researchers have no interest in building AI weapons, and do not want others to tarnish their field by doing so, potentially creating a major public backlash against AI that curtails its future societal benefits. Indeed, chemists and biologists have broadly supported international agreements that have successfully prohibited chemical and biological weapons, just as most physicists support the treaties banning space-based nuclear weapons and blinding laser weapons.
In summary, we believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so. Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control.”
