• Re: St Petersburg Paradox

    From richard@richard@cogsci.ed.ac.uk (Richard Tobin) to rec.puzzles on Thu Dec 11 17:16:41 2025
    From Newsgroup: rec.puzzles

    In article <10heml0$29ci5$1@dont-email.me>,
    David Entwistle <qnivq.ragjvfgyr@ogvagrearg.pbz> wrote:

    I've read the Wikipedia page:

    https://en.wikipedia.org/wiki/St._Petersburg_paradox

    but the answer provided, that the expected value (E) is given by:

    E = 1/2 x 2 + 1/4 x 4 + 1/8 x 8 + ... 1/n x 2^(n-1)

    [that should be 1/2^n x 2^n - each term is equal to 1]

    doesn't seem right. As I understand it, there's only one payout per game,
    so you shouldn't add the series of probability-of-payout x payout, for all
    rounds.

    A single game consists of tossing a coin until it comes up heads.
    If that happens on the kth toss, you win 2^k.

    The Wikipedia article is confusing because it refers to the "stake"
    doubling as if you had to pay for each toss. You pay once at the
    start of the game. It's the prize that doubles with each toss.

    There's no real paradox here. Paying an amount based on your expected
    gain is only sensible under conditions where your actual gain is
    likely to be similar to your expected gain, which is not the case here.

    -- Richard
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From David Entwistle@qnivq.ragjvfgyr@ogvagrearg.pbz to rec.puzzles on Fri Dec 12 14:45:08 2025
    From Newsgroup: rec.puzzles

    On Thu, 11 Dec 2025 17:16:41 -0000 (UTC), Richard Tobin wrote:


    [that should be 1/2^n x 2^n - each term is equal to 1]

    Thanks.

    A single game consists of tossing a coin until it comes up heads.
    If that happens on the kth toss, you win 2^k.

    Yes. Gardner starts with an initial prize of $1, whereas the wikipedia
    version starts with an initial prize of $2. It's just a detail.


    The Wikipedia article is confusing because it refers to the "stake"
    doubling as if you had to pay for each toss. You pay once at the start
    of the game. It's the prize that doubles with each toss.

    Yes.

    Running a simulation of 1000 games behaves as expected:

    [Coin tosses before payout : count]

    0 : 0
    1 : 499
    2 : 246
    3 : 119
    4 : 67
    5 : 38
    6 : 13
    7 : 9
    8 : 6
    9 : 1
    10 : 0
    11 : 0
    12 : 1
    13 : 1
    14 : 0
    15 : 0
    ... nothing more up to a count of 30 and beyond.

    The counts roughly halve on each increase in the number of required
    tosses per game, with a few outliers.
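
    In Python, a minimal sketch of such a simulation (the helper name
    first_heads is made up) might be:

    ```python
    import random
    from collections import Counter

    def first_heads(rng):
        """Toss a fair coin; return the number of the toss on which
        the first head appears."""
        k = 1
        while rng.random() < 0.5:  # tails: toss again
            k += 1
        return k

    rng = random.Random(1)
    counts = Counter(first_heads(rng) for _ in range(1000))
    for k in range(1, max(counts) + 1):
        print(k, ":", counts[k])
    ```

    With a fair coin, each successive count should come out at roughly half
    the previous one, just as in the table above.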

    With respect to the amount a player should pay to join a game, Gardner
    says: "The unbelievable answer is you could pay me any amount, say a
    million dollars, for each game and still expect to come out ahead". He
    goes on to give that some context.


    There's no real paradox here. Paying an amount based on your expected
    gain is only sensible under conditions where your actual gain is likely
    to be similar to your expected gain, which is not the case here.

    That does feel a bit weird. I'll just avoid playing any similar games.
    Can I invoke a sort of 'quantization' argument - we're not likely to see
    any events with a very small probability, and consequently very large
    gains, in a real-world situation with a limited number of games, so sweep
    the mathematical argument under the carpet? It seems weak.
    --
    David Entwistle
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From richard@richard@cogsci.ed.ac.uk (Richard Tobin) to rec.puzzles on Fri Dec 12 16:03:09 2025
    From Newsgroup: rec.puzzles

    In article <10hh9pk$2v506$1@dont-email.me>,
    David Entwistle <qnivq.ragjvfgyr@ogvagrearg.pbz> wrote:

    There's no real paradox here. Paying an amount based on your expected
    gain is only sensible under conditions where your actual gain is likely
    to be similar to your expected gain, which is not the case here.

    That does feel a bit weird. I'll just avoid playing any similar games.
    Can I invoke a sort of 'quantization' argument - we're not likely to see
    any events with a very small probability, and consequently very large
    gains, in a real-world situation with a limited number of games, so sweep
    the mathematical argument under the carpet? It seems weak.

    What mathematical argument is that? The expected gain is infinite,
    but "expected gain" is just a mathematical term, not an argument for
    behaving in a certain way.

    People often assume that the "rational" thing to do is to maximize
    your expected gain. But why? If I lose, it's no consolation that
    my expected gain was high. What I want is to maximize my *actual*
    gain.

    So the question is, when is your expected gain a good proxy for your
    actual gain? The weak and strong laws of large numbers provide some
    help - they say that in certain circumstances your gain converges -
    either in probability or almost surely - to the expected gain.
    However, these laws do not apply in this case because the mean and
    variance are infinite.

    Even when the laws of large numbers apply, the convergence is not
    necessarily something you can make use of in real life, since you
    cannot play an arbitrarily large number of games.
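
    A minimal sketch makes the non-convergence visible (Wikipedia's version,
    prize starting at 2): the running average never settles, because every so
    often a long run of tails lands a payout that dwarfs everything before it.

    ```python
    import random

    def play(rng):
        """One game, Wikipedia's version: the prize starts at 2 and
        doubles for each tail before the first head."""
        payout = 2
        while rng.random() < 0.5:
            payout *= 2
        return payout

    rng = random.Random(42)
    total = 0
    for n in range(1, 100_001):
        total += play(rng)
        if n in (10, 100, 1000, 10_000, 100_000):
            print(f"average after {n} games: {total / n:.2f}")
    ```

    Unlike a finite-mean game, where the printed averages would home in on
    one value, here they keep drifting upward as the sample grows.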

    -- Richard
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Phil Carmody@pc+usenet@asdf.org to rec.puzzles on Fri Dec 12 19:16:44 2025
    From Newsgroup: rec.puzzles

    richard@cogsci.ed.ac.uk (Richard Tobin) writes:
    In article <10heml0$29ci5$1@dont-email.me>,
    David Entwistle <qnivq.ragjvfgyr@ogvagrearg.pbz> wrote:

    I've read the Wikipedia page:

    https://en.wikipedia.org/wiki/St._Petersburg_paradox

    but the answer provided, that the expected value (E) is given by:

    E = 1/2 x 2 + 1/4 x 4 + 1/8 x 8 + ... 1/n x 2^(n-1)

    [that should be 1/2^n x 2^n - each term is equal to 1]

    doesn't seem right. As I understand it, there's only one payout per game,
    so you shouldn't add the series of probability-of-payout x payout, for all
    rounds.

    A single game consists of tossing a coin until it comes up heads.
    If that happens on the kth toss, you win 2^k.

    The Wikipedia article is confusing because it refers to the "stake"
    doubling as if you had to pay for each toss. You pay once at the
    start of the game. It's the prize that doubles with each toss.

    There's no real paradox here. Paying an amount based on your expected
    gain is only sensible under conditions where your actual gain is
    likely to be similar to your expected gain, which is not the case here.

    It's a disconnect between the modal (1) or median (1.5) gain and the
    mean gain (oo). I think it's worthy of the title "paradox" though. There
    are two equally valid seeming thought processes that lead to
    contradictory conclusions.

    If I could play the game as many times as I wanted, I'd definitely pay
    the $25 that is mooted in the wikipedia article to play. However, if it
    was a single shot, and I had no collaborators to share risk with, the mathematician in my head has to be gagged, and I don't think I'd
    stretch to $10. I'd rather have a couple of beers.

    Assuming you have savings of $100 you're prepared to lose, a quick sim
    with some non-bizarro heuristics (that remove the infinity) does imply
    that a stake of 6 or below leaves you with a better than evens chance of
    ending up in profit:

    stake,stash: results from 10000 trials (which can consist of many games)
    10,100: 7685 losses, 2315 wins averaging 49.1487, 3 broke the bank
    8,100: 6743 losses, 3257 wins averaging 57.1332, 6 broke the bank
    8,100: 6710 losses, 3290 wins averaging 916.9998, 7 broke the bank
    7,100: 5700 losses, 4300 wins averaging 585.0954, 30 broke the bank
    7,100: 5723 losses, 4277 wins averaging 457.4633, 26 broke the bank
    7,100: 5700 losses, 4300 wins averaging 2211.0101, 28 broke the bank
    6,100: 4895 losses, 5105 wins averaging 2197.5716, 86 broke the bank
    6,100: 4837 losses, 5163 wins averaging 2023.9142, 91 broke the bank
    6,100: 4837 losses, 5163 wins averaging 2490.1608, 97 broke the bank

    Here, "broke the bank" means your winnings exceeded 1000 times your
    initial stash.
    The heuristics are that once you've reached a pot over your initial
    stash, you stop playing once you've burnt into half of that nett gain.
    (So if you play and reach 250, you'd be prepared to play until 175, and
    then you'd just bank the 75 nett winnings.)
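
    A sketch of one way to code those heuristics in Python - the helper names
    and the exact stop bookkeeping are my guesses at what's described above,
    so the counts won't match the table exactly:

    ```python
    import random

    def play(rng):
        """One game: the prize starts at 2 and doubles for each tail
        before the first head."""
        payout = 2
        while rng.random() < 0.5:
            payout *= 2
        return payout

    def trial(rng, stake, stash, bank_limit=1000):
        """Keep buying games until broke, until winnings exceed
        bank_limit times the stash ("broke the bank"), or until the
        trailing stop triggers: once above the initial stash, quit
        after giving back half of the net gain from the peak."""
        money = peak = stash
        while money >= stake:
            money += play(rng) - stake
            peak = max(peak, money)
            if money > bank_limit * stash:
                return money                  # broke the bank
            if peak > stash and money <= stash + (peak - stash) / 2:
                return money                  # bank the remaining gain
        return money                          # busted

    rng = random.Random(7)
    results = [trial(rng, 6, 100) for _ in range(10000)]
    wins = sum(r > 100 for r in results)
    print(wins, "of 10000 trials ended in profit")
    ```

    The reach-250-play-to-175 example above corresponds to the stop
    threshold stash + (peak - stash) / 2 = 100 + 150/2 = 175.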

    If you've got no stash, only enough for the initial stake, then no
    stake is low enough to give you a decent chance of winning, despite
    the fact that you can still on average win decently when you win:

    6,6: 8793 losses, 1207 wins averaging 0.6217, 3 broke the bank
    5,5: 8762 losses, 1238 wins averaging 19.5844, 20 broke the bank
    4,4: 8628 losses, 1372 wins averaging 164.3171, 75 broke the bank
    3,3: 7482 losses, 2518 wins averaging 114.5152, 224 broke the bank
    2,2: 6588 losses, 3412 wins averaging 757.5208, 1137 broke the bank

    Phil
    --
    We are no longer hunters and nomads. No longer awed and frightened, as we have gained some understanding of the world in which we live. As such, we can cast aside childish remnants from the dawn of our civilization.
    -- NotSanguine on SoylentNews, after Eugen Weber in /The Western Tradition/
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From David Entwistle@qnivq.ragjvfgyr@ogvagrearg.pbz to rec.puzzles on Sun Dec 14 11:33:47 2025
    From Newsgroup: rec.puzzles

    On Fri, 12 Dec 2025 19:16:44 +0200, Phil Carmody wrote:

    It's a disconnect between the modal (1) or median (1.5) gain and the
    mean gain (oo). I think it's worthy of the title "paradox" though. There
    are two equally valid seeming thought processes that lead to
    contradictory conclusions.

    If I could play the game as many times as I wanted, I'd definitely pay
    the $25 that is mooted in the wikipedia article to play. However, if it
    was a single shot, and I had no collaborators to share risk with, the mathematician in my head has to be gagged, and I don't think I'd stretch
    to $10. I'd rather have a couple of beers.

    Assuming you have savings of $100 you're prepared to lose, a quick sim
    with some non-bizarro heuristics (that remove the infinity) does imply
    that a stake of 6 or below leaves you with a better than evens chance of ending up in profit:



    Thanks, I get something very similar, when I run a simulation.

    I can be 'a bit thick' sometimes, but I'd look at it as follows. You
    will always win at least the casino's initial prize, so you can very
    happily pay that, or anything less.

    If you pay twice the initial prize to play, for a single game, you have a
    0.75 probability of not making a profit - either just getting your money
    back, or making a loss, and 0.25 probability of making a profit. You could
    go for that, if you were that way inclined.

    The more you pay up front the less the chance of getting your money back.
    If you were to pay 2^10 times the initial prize, your probability of not
    getting your money back is 0.999 - it's almost certain you won't get your
    money back.

    I still suspect I'm not understanding the game, as I don't see any
    justification for Gardner's statement "The unbelievable answer is that
    you could pay me any amount, say a million dollars, for each game and
    still expect to come out ahead." The draw of a large win may be there,
    but it isn't at all obvious to me. To be fair I've never had any interest
    in gambling.
    --
    David Entwistle
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Mike Terry@news.dead.person.stones@darjeeling.plus.com to rec.puzzles on Sun Dec 14 19:45:34 2025
    From Newsgroup: rec.puzzles

    On 14/12/2025 11:33, David Entwistle wrote:
    On Fri, 12 Dec 2025 19:16:44 +0200, Phil Carmody wrote:

    It's a disconnect between the modal (1) or median (1.5) gain and the
    mean gain (oo). I think it's worthy of the title "paradox" though. There
    are two equally valid seeming thought processes that lead to
    contradictory conclusions.

    If I could play the game as many times as I wanted, I'd definitely pay
    the $25 that is mooted in the wikipedia article to play. However, if it
    was a single shot, and I had no collaborators to share risk with, the
    mathematician in my head has to be gagged, and I don't think I'd stretch
    to $10. I'd rather have a couple of beers.

    Assuming you have savings of $100 you're prepared to lose, a quick sim
    with some non-bizarro heuristics (that remove the infinity) does imply
    that a stake of 6 or below leaves you with a better than evens chance of
    ending up in profit:



    Thanks, I get something very similar, when I run a simulation.

    I can be a 'a bit thick' sometimes, but I'd look at it as follows. You
    will always win at least the casino's initial prize, so you can very
    happily pay that, or anything less.

    If you pay twice the initial prize to play, for a single game, you have a
    0.75 probability of not making a profit - either just getting your money
    back, or making a loss, and 0.25 probability of making a profit. You could
    go for that, if you were that way inclined.

    You're talking about the /probability/ of "winning" i.e. of making a profit. That's not the same as
    your "expected" winnings, which is a mathematical way of averaging out all possible outcomes: big
    wins/little wins/losses etc. into a single "expected return" figure. The mathematics is not
    "advising" anyone to do anything. What the maths /does/ say is that over a long enough run of games
    we will see the average of all your returns converging to the single "expected return" value.

    So if your "expected return" on a game is more than the stake, then in a "long enough" run of games
    you are "certain" to come out ahead. The law of large numbers, and all that.

    So you're considering paying twice the initial stake to play, and point out that you are more likely
    to lose. Correct, but look at it another way: if you played 1000 times like that, would you on
    average be ahead or behind? That's something you can simulate, and the high likelihood is you will
    be ahead, even though on most games you lost money. It's simply that on the fewer games where you
    made a profit, you won big. (That in itself isn't complicated, right? The St. Petersburg game just
    takes that to an extreme.)
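
    That claim is easy to put to the test. A sketch (assuming the Wikipedia
    version, where the prize starts at $2, so "twice the initial prize" is a
    stake of 4):

    ```python
    import random

    def play(rng):
        """One game, Wikipedia's version: prize starts at 2 and doubles
        for each tail before the first head."""
        payout = 2
        while rng.random() < 0.5:
            payout *= 2
        return payout

    def run(rng, stake=4, games=1000):
        """Net result of one run: `games` games at `stake` apiece."""
        return sum(play(rng) - stake for _ in range(games))

    rng = random.Random(0)
    ahead = sum(run(rng) > 0 for _ in range(200))
    print(ahead, "of 200 runs of 1000 games ended ahead")
    ```

    Within each run most individual games lose their stake; the occasional
    long run of tails pays for all of them, which is exactly the point.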


    The more you pay up front the less the chance of getting your money back.
    If you were to pay 2^10 times the initial prize, your probability of not
    getting your money back is 0.999 - it's almost certain you won't get your
    money back.

    Right, but that's not what Gardner means by "expect to come out ahead" - he's considering the
    mathematical "expectation value" for your winnings. It is infinite, implying that in a "long
    enough" run of games you are going to come out ahead. (Law of large numbers again...) Is it this
    mathematical claim that you doubt? (Hopefully not, as the maths is the easy bit. Actually, the law
    of large numbers is definitely NOT "easy" to prove, but its application in this example is intuitive
    enough.)

    It's the same as with your "paying twice the stake to play" example, except that it will take MUCH
    MUCH MUCH MUCH...MUCH longer to demonstrate this in a simulation. OK to put this in perspective,
    simulations would need to be long enough that runs of 1024 tails were commonplace! Let's say as a
    ballpark figure that we'll need runs of 2^1024 games - you won't even start to fit one of these runs
    into the age of the universe so far... :)

    Nevertheless mathematically you would "in the long run" come out ahead paying 2^10 times the initial
    stake. (The "long run" being in this case (unfortunately) well beyond the age of the universe...)

    Other posters have explained why in a real life situation you would probably not take such a bet:
    - most obviously, you can't play enough games for the mathematical "expectancy"
    value to become meaningful. In the absence of that, the high probability of simply
    losing all your money becomes the dominant concern. In practice it is not the
    amount of money spent/won that is important - it's the difference that will
    imply for how you live your life. (Balancing with how likely those outcomes are.)
    - reasons you can't play enough games:
    you will die when you've barely got started!
    you will run out of money to buy more games
    the house will at some point refuse to pay up on a big win
    etc.


    Mike.

    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From James Dow Allen@user4353@newsgrouper.org.invalid to rec.puzzles on Mon Dec 22 05:21:14 2025
    From Newsgroup: rec.puzzles


    David Entwistle <qnivq.ragjvfgyr@ogvagrearg.pbz> posted:


    https://en.wikipedia.org/wiki/St._Petersburg_paradox

    but the answer provided, that the expected value (E) is given by:

    E = 1/2 x 2 + 1/4 x 4 + 1/8 x 8 + ... 1/n x 2^(n-1)

    A good way to proceed should be well known. Kelly's Criterion
    is discussed in any good book on blackjack ( :-) ); and is known
    to hedge fund managers, etc.

    The paradox was proposed by Nicolaus Bernoulli. His cousin Daniel
    Bernoulli proposed a solution which I think at least hints at the following.

    To start with, there are two difficulties to be disposed of:
    (1) Will winning $2 trillion really make you twice as happy as
    winning just $1 trillion?
    (2) If you DO win $2 trillion, will the counter-party
    even be able to pay you?

    Let's ignore (2) and work around (1) by adopting a non-linear
    utility function for our money. A good choice which has strong theoretical justification
    (see https://www.princeton.edu/~wbialek/rome/refs/kelly_56.pdf )
    is to set your utility to the logarithm of your total bankroll.
    Thus, supposing you have $115 total and must pay $15 to play the game,
    you should play provided that
    log(115) < log(102)/2 + log(104)/4 + log(108)/8 + log(116)/16 + ...
    Instead of seeking to maximize your expected resultant bankroll,
    you maximize the expected logarithm of the resultant bankroll,
    or equivalently, maximize the geometric mean of the resultant bankroll.

    The infinite series shown above can be made finite via the approximation
    log(B + 2^k)/2^k + log(B + 2^(k+1))/2^(k+1) + ... = 4*log(B + 2^k)/2^k
    which is valid when 2^k has grown much much larger than B.
    I think.
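
    The decision rule itself is easy to check numerically. A sketch (the
    function name is made up; the series is truncated at 200 terms, which is
    harmless since the terms shrink geometrically):

    ```python
    import math

    def expected_log_if_play(bankroll, price, terms=200):
        """Expected log of the resultant bankroll after paying `price`
        to play once, winning 2^k with probability 1/2^k."""
        left = bankroll - price
        return sum(math.log(left + 2**k) / 2**k for k in range(1, terms + 1))

    stay = math.log(115)                  # utility of not playing
    play = expected_log_if_play(115, 15)  # $115 bankroll, $15 ticket
    print(f"don't play: {stay:.4f}   play: {play:.4f}")
    ```

    For these particular numbers the expected log comes out slightly below
    log(115), so the log-utility rule declines the $15 game at a $115
    bankroll; a sufficiently cheaper ticket flips the verdict.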

    Cheers,
    James
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From James Dow Allen@user4353@newsgrouper.org.invalid to rec.puzzles on Mon Dec 22 09:18:41 2025
    From Newsgroup: rec.puzzles


    James Dow Allen <user4353@newsgrouper.org.invalid> posted:

    or equivalently, maximize the geometric mean of the resultant bankroll.

    I should have written
    Maximize the WEIGHTED geometric mean of the resultant bankroll.
    --- Synchronet 3.21a-Linux NewsLink 1.2