I was reading the first pages of Casino Royale: not the Daniel Craig film, not the David Niven film, but the Ian Fleming novel. In the Daniel Craig movie, James Bond plays against Le Chiffre in a high-stakes poker game. The film is perhaps the most loyal of any Bond film, rivaled only by On Her Majesty's Secret Service, in its adherence to Ian Fleming's books, and not too surprisingly, these are two of the best Bond films made. Craig is certainly a great Bond, but neither of the films that followed, Quantum of Solace or Skyfall, was in the class of Casino Royale. Meanwhile On Her Majesty's Secret Service starred George Lazenby, the only one-timer (unless you count Niven). While he was young and energetic, a striking contrast to Roger Moore, there was still a reason Lazenby never returned. So the conclusion is clear: Fleming wrote some solid books. The earlier Casino Royale, the 1967 film starring David Niven (who was actually a serious candidate to play Bond in Dr. No), was a comedy. I really don't recommend it; I got halfway through and had to stop. But it had one bit of loyalty to the novel that Daniel Craig's didn't have. The game with Le Chiffre wasn't poker. It was baccarat: Chemin de Fer.
I'd never played baccarat, so I decided I wanted to try. A description is here. There are several variants: in Chemin de Fer, players choose whether to take a third card or not; in Punto Banco, the drawing of a third card is automated. I found an on-line game that played Punto Banco, so I tried it.
Basically, under the standard form (as opposed to the "EZ" form), baccarat is a high-stakes game where you can bet on the "player" or the "bank". If you bet on the player, the expectation value is that you'll lose 1.24% of your bet (a 49.38% chance to win your bet); if you bet on the banker, you've got a 50.62% chance to win 95% of your bet (there's a commission), resulting in what Wikipedia says is an expected loss of 1.06% of your bet, though I calculate 1.29% assuming the player-side odds are correct. So while you can attempt to count cards to improve your odds, any effect from this is truly marginal, since typically six decks are used and, unlike poker, you're not looking for specific cards. So it's basically pure luck.
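A quick back-of-envelope check of those two numbers (a sketch using the win probabilities quoted above, ignoring ties, which is the same simplification the text makes):

```python
# Win probabilities as quoted above, treating each bet as win-or-lose (no ties).
p_player_win = 0.4938   # "player" bet: pays even money
p_banker_win = 0.5062   # "banker" bet: pays 95% after the commission

ev_player = p_player_win * 1.00 - (1 - p_player_win) * 1.00
ev_banker = p_banker_win * 0.95 - (1 - p_banker_win) * 1.00

print(f"player bet: {ev_player:+.4f} per dollar")   # about -0.0124, i.e. -1.24%
print(f"banker bet: {ev_banker:+.4f} per dollar")   # about -0.0129, i.e. -1.29%
```

The -1.29% on the banker side is the figure I calculate from these assumed odds, versus Wikipedia's -1.06%.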
There's a classic betting strategy, known as the martingale: gamble a fixed amount (for example $25) on the first hand. If you lose, double the bet. If you win your second hand, you're now up the amount of the first hand; if not, you're down 3 times that first hand. Whenever you win, the next hand you go back to your original ($25) bet. If you lose again, double again. If you win this one you're again up the amount of your first bet, but if you lose you're now down 1 + 2 + 4 = 7 times the value of your first bet. You repeat the process until, assuming you eventually win, you're up the value of the first bet. So your total keeps creeping upward: each time you win, you're up the original bet from the preceding time you won. It seems like a sure win.
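The bookkeeping above can be sketched in a few lines: a run of n losses, doubling each time, followed by one win always nets exactly one base bet, no matter how long the losing run was.

```python
def cycle_net(base, losses_before_win):
    """Net result of losing `losses_before_win` hands in a row,
    doubling the bet each time, then winning the next hand."""
    lost = sum(base * 2**k for k in range(losses_before_win))  # (1 + 2 + 4 + ...) * base
    final_bet = base * 2**losses_before_win
    return final_bet - lost

for n in range(5):
    print(n, cycle_net(25, n))   # always +25, however long the losing run
```

This is exactly why the scheme feels like a sure win: every completed cycle is worth +$25.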
The problem is that if you start losing, your total plummets quickly. If you run out of money, you're done: no more bets. So it's not a sure win after all. The longer you play, the more chance you have of hitting a cold streak, and because your bets increase exponentially with each losing hand, a cold streak can quickly burn through your winnings, cut into your original money, and leave you at rock bottom.
So you might try to finesse it. "I'm going to start with a small bet, and set a modest goal; if I hit it, I'm out," you smugly conclude. The small bet is necessary because it gives you more chances to win, and go up by that bet amount, before you hit the number of consecutive losses needed to bankrupt you. The firm end-goal is necessary because eventually you will hit a cold streak, and the firm end-point keeps the chance of doing so finite. Intuitively one might expect a sweet spot where you can expect to win.
Orson Welles as Le Chiffre, Peter Sellers imitating James Bond, Casino Royale (1967)
But there is no sweet spot. The specific details of the game are complicated, but the expectation value of playing is what it is. When you play the player side, the expectation is that you will end up with 98.76% of the total money you bet during the game. The longer you play, the more you bet, and the more you're expected to lose.
I played an on-line version where I used this strategy. I started with $5000 in virtual money (this wasn't on-line betting, just a simulator). My goal was to get to $6000. I started with $25 bets. The maximum bet was $200. So if I lost the $25 bet I doubled to $50. If I lost that I doubled to $100. If I lost that I doubled to $200. If I lost that I stuck with $200 until my wins at $200 exceeded my losses, then I went back to $25. Eventually I did reach $6000 and I quit, happier by $1000 in fake money.
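A rough Monte Carlo sketch of that session, under simplifying assumptions of my own: every hand is an even-money player bet won with probability 49.38% (no ties, no banker option), and the bet resets to $25 after any win, whereas my actual rule at the $200 cap was slightly different.

```python
import random

def play_session(bankroll=5000, goal=6000, base=25, cap=200, p_win=0.4938):
    """Play martingale-with-a-cap until hitting the goal or going broke.
    Returns True if the goal was reached."""
    bet = base
    while 0 < bankroll < goal:
        bet = min(bet, bankroll)          # can't bet more than we have left
        if random.random() < p_win:
            bankroll += bet
            bet = base                    # back to the base bet after a win
        else:
            bankroll -= bet
            bet = min(bet * 2, cap)       # double, up to the table maximum
    return bankroll >= goal

random.seed(1)
trials = 2000
wins = sum(play_session() for _ in range(trials))
print(f"reached $6000 in {wins / trials:.1%} of sessions")
```

Under these assumptions most sessions do reach the goal, which matches the "it seemed too good to be true" experience, while a minority lose the whole $5000.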
So what were my chances of hitting my goal? Again, the details are unimportant. I can estimate my chances by considering the possible end points and the expectation value of playing. The possible end-points were +$1000 or -$5000. But I know the expectation value of each hand, and thus of the whole session, was no better than 0. So this means I had no better than a 5/6 chance to win the $1000, and at least a 1/6 chance to lose all $5000. That totals an expectation value of 0.
On average, I'd win $25 every other hand, not counting the complexity introduced by the $200 maximum. So hitting my goal would require about $1000 × 2 / $25 = 80 hands. Of these, approximately 40 would be $25 bets (they come after a win, which happens about half the time), approximately 20 would be $50 bets (they come after a win then a loss, which is 25% likely), approximately 10 would be $100 bets (coming after a win then two losses), and the remaining 10 would be at the $200 maximum.
If I wanted to be simple and naive about this calculation, again neglecting the maximum bet, the expected loss in a given round if I were to play infinitely long would be the sum from n = 0 to infinity of $25 × 2^n × 2^−(n + 1) × 1.24%. Every term of this sum works out to the same constant, so the series is unfortunately divergent: an expectation of infinite losses. I need to truncate the series.
Hey! That is not baccarat!
Using instead the estimated number of each type of bet, I get $1000 worth of $25 bets, $1000 worth of $50 bets, $1000 worth of $100 bets, and $2000 worth of $200 bets. This is a total of $5000 worth of bets, which costs me approximately 1.24% of $5000, or $62. So the expectation value of my result shouldn't be $0; it should instead be around -$62. I can back out my probability of winning p from this expectation value: $1000 p - $5000 (1 - p) = -$62, which simplifies to $6000 p = $4938. My chance of winning is reduced from 83.33% to 82.3%, with the chance of losing now 17.7%. This is a slight underestimate of the total number of bets, since the $200 cap means I might need a few more $200 bets.
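The same estimate, spelled out as arithmetic: total the wagers, apply the 1.24% disadvantage, and back out the win probability from the resulting expectation value.

```python
# Estimated bet mix from the 80-hand argument above.
total_wagered = 40 * 25 + 20 * 50 + 10 * 100 + 10 * 200   # $5000 in total wagers
expected_loss = total_wagered * 0.0124                    # about $62

# Solve 1000*p - 5000*(1 - p) = -expected_loss for the win probability p.
p_win = (5000 - expected_loss) / 6000

print(f"total wagered: ${total_wagered}, expected loss: ${expected_loss:.0f}")
print(f"chance of reaching the goal: {p_win:.1%}")        # about 82.3%
```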
So the result is fairly simple, especially if you neglect the house advantage. The factor-of-two betting strategy seems to be a sure thing, and indeed in my simulated game, as I methodically worked my way from $5000 to $6000, it seemed too good to be true; but it doesn't get around the fundamental limitation of expectation values. I have a high probability of winning a small amount versus a small probability of losing a large amount. The more I'm willing to lose, the greater my probability of winning and the smaller my probability of losing, but the greater the loss if I do lose.