29 July 2015

Stephen Hawking fears global AI arms race that will threaten humanity

Stephen Hawking, Steve Wozniak, and many other eminent researchers and scientists have published an open letter in which they warn against the military use of artificial intelligence.

Autonomous weapons: Third revolution in warfare

Autonomous weapons go one step beyond the already controversial use of remotely piloted drones, for which humans still make all targeting decisions. They might include, for example, armed quadcopters that can search for and eliminate people meeting certain predefined criteria.

The authors expect that once a major military power begins developing AI weapons, a global arms race will follow. They point out that these weapons will be relatively easy to produce, which will lead both to mass production and to exploitation by terrorist groups.

Advocates of AI weapons argue that they would reduce the risk to soldiers and the loss of life in military conflicts. The authors counter this argument, fearing that such weapons would lower the threshold for going to war, and that their nature makes them ideal for offensive and malicious tasks such as assassinations, destabilizing nations, subduing populations, and selectively killing a particular ethnic group.

The authors conclude their letter by stating that there are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people. A military AI arms race is a bad idea, they argue, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control.
