The optimal strategy does not guarantee that you get to the next round — but it *does* give you a greater than 95% chance of doing so.
by Mark R. Waser (originally appeared Dec. 10, 2012 at Transhumanity.Net)
You’re appearing on the “hottest new game show,” Money to Burn. You’ll be playing two rounds of a game-theory classic against the host, with a typical Money to Burn twist. If you can earn more than $100, you get to move on to the next round.
One of the current problems in (and arguably with) game theory is the so-called “centipede game” (Rosenthal 1981) – where the supposedly logical/rational/correct/“optimal” strategy produces virtually the worst possible result. While there are many variants of the centipede game, they all take the same form: players alternately choose between increasing the pot and continuing the game – or “cashing out” by taking the majority of the pot and ending the game. The “problem” posed by the game is… Continue reading
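The centipede game’s structure is easy to sketch in a few lines of Python. The numbers below – a pot that starts at $4 and doubles each round, with the “cashing out” player taking three quarters of it – are illustrative assumptions of my own (variants of the game differ), not the rules of any particular version:

```python
def centipede_payoffs(take_round, pot=4.0, rounds=6):
    """Payoffs (player 1, player 2) if someone cashes out at `take_round`
    (1-indexed).  If neither player ever takes, the final pot is split evenly.
    Pot size, growth rate, and the 75/25 split are illustrative assumptions."""
    for r in range(1, rounds + 1):
        if r == take_round:
            taker, other = 0.75 * pot, 0.25 * pot
            # odd rounds: player 1 moves; even rounds: player 2 moves
            return (taker, other) if r % 2 else (other, taker)
        pot *= 2  # both players "push": the pot grows and the game continues
    return (pot / 2, pot / 2)  # cooperation to the very end

# Backward induction says player 1 should cash out immediately...
print(centipede_payoffs(take_round=1))  # (3.0, 1.0)
# ...yet if both players cooperate to the end, each does far better:
print(centipede_payoffs(take_round=7))  # (128.0, 128.0)
```

This is the “problem” in miniature: at every round the mover does slightly better by taking now than by pushing and being taken from next round, so induction unravels all the way back to round one – producing nearly the worst outcome for both players.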
by Mark R. Waser (originally appeared Dec. 12, 2012 at Transhumanity.Net)
Some problems are so complex that you have to be highly intelligent and well informed just to be undecided about them. – Laurence J. Peter
Numerous stories were in the news last week about the proposed Centre for the Study of Existential Risk (CSER), set to open at Cambridge University in 2013.
After decades of movies about computers and robots going awry, who wouldn’t celebrate this as a good thing? As a researcher in artificial general intelligence (AGI) and ethics who agrees that artificial intelligences (AIs) *are* an existential risk, let me raise my hand. Continue reading