by Mark R. Waser (originally appeared April 24, 2013 at Transhumanity.Net)
George Dvorsky posted You’ll Probably Never Upload Your Mind Into A Computer in the “DEBUNKERY” section of io9, stating “Here are eight reasons why your brain may never be digitized.” The first problem, of course, is that mind and brain are two different things – as different as a software program and the computer it is running on. The second problem is that he then compounds the confusion by talking of “consciousness uploads” and of how we might “transfer our conscious thoughts to a computer”.
Ben Goertzel then weighed in with Goertzel Contra Dvorsky on Mind Uploading. Ben states that he prefers Randal Koene’s phrase “substrate-independent minds” as less sexy but less misleading – but then goes on to state that “the point is the liberation of the mind from any particular substrate.”
Here’s *my* take.
by Mark R. Waser (originally appeared April 3, 2013 at Transhumanity.Net)
Over at Facing the Singularity, Luke Muehlhauser (LukeProg) continues Eliezer Yudkowsky’s theme that Value is Fragile with Value is Complex and Fragile. I completely agree with his last three paragraphs.
Since we’ve never decoded an entire human value system, we don’t know what values to give an AI. We don’t know what wish to make. If we create superhuman AI tomorrow, we can only give it a disastrously incomplete value system, and then it will go on to do things we don’t want, because it will be doing what we wished for instead of what we wanted.
by Mark R. Waser (originally appeared March 27, 2013 at Transhumanity.Net)
The last six months have seen a rising flood of publicity about “killer robots” and autonomy in weapons systems. On November 19, 2012, Human Rights Watch (HRW) issued a 50-page report, “Losing Humanity: The Case against Killer Robots”, outlining concerns about “fully autonomous weapons that could select and engage targets without human intervention” and claiming that a “preemptive prohibition on their development and use is needed”. Two days later, the United States Department of Defense released Directive 3000.09, which “assigns responsibility for the development and use of autonomous and semi-autonomous functions in weapon systems”. Now, social media is abuzz because the International Committee for Robot Arms Control (ICRAC) has issued a Scientists’ Call to Ban Autonomous Lethal Robots – those “in which the decision to apply violent force is made autonomously”.
Arms control is an immediate critical issue. Weapons have already been fielded that are disasters just waiting to happen. The most egregious example described by the HRW report is the Israeli Harpy – a fire-and-forget “loitering attack weapon” designed to autonomously fly to and patrol an assigned area and attack any hostile radar signatures with a high explosive warhead. Indeed, HRW invokes our worst fears by quoting Noel Sharkey’s critique that “it cannot distinguish between an anti-aircraft defense system and a radar placed on a school by an enemy force”. Yet, the Harpy has been sold to Chile, India, South Korea, The People’s Republic of China and Turkey.