Suppose a random person is living on a desert island without hope of rescue. Call him the Initial Inhabitant, or I.I. Another random person unexpectedly washes up on shore, coughing up water. Call him the New Arrival, or N.A. While N.A. is helplessly gasping for air, what does I.I. do? Just to make the story interesting, let’s suppose that N.A. is much bigger than I.I.

Thomas Hobbes’ prediction, on my reading, is that I.I. will immediately pick up a rock and murder N.A.:

And therefore if any two men desire the same thing, which nevertheless they cannot both enjoy, they become enemies; and in the way to their end (which is principally their own conservation, and sometimes their delectation only) endeavour to destroy or subdue one another. And from hence it comes to pass that where an invader hath no more to fear than another man’s single power, if one plant, sow, build, or possess a convenient seat, others may probably be expected to come prepared with forces united to dispossess and deprive him, not only of the fruit of his labour, but also of his life or liberty. And the invader again is in the like danger of another.

And from this diffidence of one another, there is no way for any man to secure himself so reasonable as anticipation; that is, by force, or wiles, to master the persons of all men he can so long till he see no other power great enough to endanger him: and this is no more than his own conservation requireth, and is generally allowed. (emphasis mine)

Preemptive murder may seem paranoid. But here's the Hobbesian logic: if I.I. waits for N.A. to catch his breath, N.A. will soon be strong enough to overpower I.I. whenever he chooses. I.I.'s only window of safety is now, while N.A. is still helpless, so it's in I.I.'s interest to kill him before he becomes a threat.

In my view, the Hobbesian prediction is crazy. Virtually no one alone on a desert island would choose the route of preemptive murder. Yes, it’s possible that N.A. will catch his breath and then attack. But it’s far more likely that N.A. will catch his breath and say, “Boy, am I glad to see you. At least I’m not alone.” And I.I. will say the same thing back. Two normal humans in a Hobbesian scenario become fast friends, not mortal enemies.

Dogmatic Hobbesians won’t accept my prediction, but I think reasonable people will. Here’s a bigger challenge, though: What is the minimum revision you would have to make to my thought experiment to get a Hobbesian outcome?