
[Image: Elon Musk, CEO of Tesla and SpaceX. DPA/PA Images]

Elon Musk backs call for killer robots to be banned

“We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.”

ELON MUSK IS leading demands for a global ban on killer robots, warning that technological advances could revolutionise warfare and create new “weapons of terror” that target innocent people.

The CEO of Tesla and SpaceX joined more than 100 robotics and artificial intelligence entrepreneurs in signing a letter to the United Nations (UN) calling for action to prevent the development of autonomous weapons.

“Lethal autonomous weapons threaten to become the third revolution in warfare,” warned the statement signed by 116 tech luminaries, also including Mustafa Suleyman, cofounder of Google’s DeepMind.

“Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend,” the letter reads.

The innovators also highlighted that the technology could fall into the wrong hands.

“These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways,” the letter reads.

“We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.”

Both Musk and British astrophysicist Stephen Hawking have regularly warned of the dangers of artificial intelligence (AI).

Melbourne conference 

The renewed plea on autonomous weapons was released as the International Joint Conference on Artificial Intelligence got underway today in Melbourne, with a record 2,000 of the world’s top AI and robotics experts taking part, organisers said.

One expert said autonomous weapons could make war more likely.

“Today the potential loss of human life is a deterrent for conflict initiation and escalation, but when the main casualties are robots, the disincentives change dramatically and the likelihood of conflict increases,” Professor Mary-Anne Williams of the University of Technology Sydney said.

She warned a killer robot ban may be disregarded by some nations but would stop “countries such as Australia from developing defensive killer robots, thereby being vulnerable to other countries and groups that ignore the ban”.

Another expert said decisions made today would help shape the “futures we want”.

“Nearly every technology can be used for good and bad, and artificial intelligence is no different,” Professor Toby Walsh of the University of New South Wales said.

Organisers said the conference, which concludes on Friday, has attracted a record number of participants from China, reflecting a push by Beijing to become a leading player in the field.

A key focus of the event will be looking at the challenges of developing fully autonomous AI systems, programme chair Carles Sierra of the Spanish National Research Council said.

A UN group on autonomous weapons had been due to meet today but the gathering was postponed until November, according to the group’s website. In 2015, thousands of researchers and personalities launched an appeal to ban autonomous weapons.

© AFP 2017

