Cube Handling

Volatility

From:   Chuck Bower
Address:   bower@bigbang.astro.indiana.edu
Date:   11 October 1998
Subject:   Volatility
Forum:   rec.games.backgammon
Google:   6vp3ae$tje$1@jetsam.uits.indiana.edu

(Note to readers:  this long post contains some basic material,
probably known to many, as well as some technical details which are
possibly over the head of the typical reader.  Both kinds of info
are intermingled throughout, so readers interested in only one approach
should just skim past the uninteresting part but still try to make it
through to the end, one way or another.)

    My college chemistry book (CHEMISTRY:  Reactions, Structure, and
Properties by Clyde R. Dillard and David E. Goldberg) defines volatility
as:  "tendency toward evaporation".  The word is also used in reference
to securities (i.e. in the stock and commodities markets).  A person
could be referred to as 'volatile', which I would take to mean "very
easily excitable".

    In general, HIGH volatility means "likely to change considerably"
while LOW volatility is the opposite ("very little change expected").
Here is a common type of backgammon position where volatility is high:

  +24-23-22-21-20-19-+---+18-17-16-15-14-13-+
  |          O  O  O |   | O  O  O          |2
  |          O  O  O |   | O  O  O          |
  |                  |   |                  |
  |                  |   |                  |
  |                  |   |                  |
  |                  |   |                  |
  |    X             |   |                  |
  |    X  X          |   |                  |
  |    X  X          |   |                  |
 X|    X  X          |   |                  |
 X| O  X  X          |   |                  |
 X| O  X  X  X       |   |       O          |
  +-1--2--3--4--5--6-+---+-7--8--9-10-11-12-+

Money game.  O on roll.


11/36 of the time, O will hit the blot (and cash).  25/36 O misses,
and probably gets gammoned.  This position is HIGHLY volatile since the
possible immediate outcomes have widely varying expectations.
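Numbers like 11/36 come straight from enumerating the 36 (ordered) dice rolls. A minimal sketch, assuming the blot can be hit only directly by one particular number (combination shots ignored, an assumption for this illustration):

```python
# Count the 36 ordered dice rolls that contain a particular face.
# This is where "11/36" comes from for a single direct shot
# (combination shots are ignored in this sketch).
def rolls_containing(face):
    return sum(1 for d1 in range(1, 7) for d2 in range(1, 7)
               if face in (d1, d2))

print(rolls_containing(5))   # 11 -- a lone direct shot hits 11/36
```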

     Now look at another common type of position:


  +24-23-22-21-20-19-+---+18-17-16-15-14-13-+
  | O  O  O  O  O  O | X |             O    |2
  | O  O  O  O  O  O |   |                  |
  |                O |   |                  |
  |                  |   |                  |
  |                  |   |                  |
  |                  |   |                  |
 X|                  |   |                  |
 X|                  |   |                  |
 X|    X  X          |   |                  |
 X|    X  X          |   |                  |
 X|    X  X          |   |                  |
 X|    X  X          |   |       O          |
  +-1--2--3--4--5--6-+---+-7--8--9-10-11-12-+

Money game.  O on roll.


     Here not much is likely to happen for a couple of rolls.  O will
move checkers around while X rides the pine.  O's winning chances will
be about the same after his/her roll as they are before the roll.  This
is a position with LOW volatility.

     So far we have been talking QUALITATIVELY.  There is also a way
to measure volatility (or, more specifically, to compute it).  The
definitions of volatility used by the two common bots, Jellyfish and
Snowie, are not the standard statistical ones, but they are closely
related to the standard deviation, as we will see.  Let's start
with Snowie's definition for 1-ply volatility.  (BTW, thanks to Andri
Nicoulin of Oasya for giving me the details on Snowie's calculation.)

Consider the 36 dice rolls and the resulting cubeless equity after the best
play for each of those rolls.  Define these quantities as e1, e2, ..., e36.
 ('e' is individual equity or expectation).  Then if we average these 36
outcomes (or take the 'arithmetic mean') we get the 1-ply lookahead equity:

            E = sum(e1,e2,...,e36) / 36.         ('E' is mean equity)

To find volatility, add up the squares of the deviations (i.e. differences)
between the various outcome expectations and the mean equity:

            Vs = sum[ (e1-E)^2 , (e2-E)^2 , ... , (e36-E)^2 ]

Vs is Snowie's definition of volatility.

      Jellyfish goes a couple steps further.  (Note that chronologically
I should first have said what JF does and then Snowie, since Fredrik
Dahl started this four or so years ago.  However, it is easier to
explain by beginning with Snowie's definition.  And I HOPE no one comes
away from this thinking that JF uses SW's numbers to calculate its own
value of volatility!!!!)

            Vj = sqrt(Vs/35)

Vj is Jellyfish's definition of volatility.

      For those who have studied statistics (and still remember it!) you
will recognize Jellyfish's definition of volatility as just the standard
deviation of the 36 possible outcomes, in cubeless equity units.
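Both definitions fall out of the same short loop. A minimal sketch (the 36 equities below are made-up placeholders, not output from a real evaluator):

```python
import math

def volatility(equities):
    """From the 36 cubeless equities e1..e36 (one per dice roll), return
    (E, Vs, Vj): the mean equity, Snowie's sum of squared deviations,
    and Jellyfish's sqrt(Vs/35) -- the sample standard deviation."""
    assert len(equities) == 36
    E = sum(equities) / 36
    Vs = sum((e - E) ** 2 for e in equities)
    Vj = math.sqrt(Vs / 35)
    return E, Vs, Vj

# Made-up volatile position: 11 rolls hit and cash (+1.0),
# 25 rolls miss and mostly lose a gammon (-1.5).
E, Vs, Vj = volatility([1.0] * 11 + [-1.5] * 25)
print(round(E, 4), round(Vj, 4))   # -0.7361 1.1679
```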

      So, how does this help us play better backgammon?  Volatility really
comes into importance in backgammon when a player is deciding whether or
not to offer the cube.  This is where a common BG term "missing the market"
is often introduced.  Hypothetically, consider two different positions
which have the same equity, but one has HIGH volatility and one has LOW
volatility. Let's say the equity is such that both positions are correct
takes.  Should the game leader cube?  It depends on the volatility.  If the
volatility is low, then one roll from now the position will likely still be
a take, so the leader (who is considering cubing) can, and probably should,
wait.  If the volatility is high, then next time this person is on roll,
the game will likely be VERY different.  Sometimes s/he will be worse off
(and be glad not to have doubled) and sometimes BETTER off (and wish s/he
HAD DOUBLED).  The "BETTER off" position is so good that now the opponent
has a very clear (i.e. not even close) pass.  Here, the player who thought
about doubling, but didn't, "lost his/her market".

      Although on the surface it looks like it doesn't much matter whether
the player turned the cube or not, in fact when you look carefully (i.e.
"quantitatively") at such problems you will find that waiting to double
in volatile positions actually costs equity in the long run.  In simple
terms, doubling AFTER losing your market is worth less than doubling
before, because in the cases where you gain ground, you get TWICE as
much equity with the cube turned, but you only get a fixed value (the
number on the cube prior to its being turned) if you wait because your
double gets passed.

      Let's look at a SIMPLE (but potentially REAL) example of losing
one's market:

  +24-23-22-21-20-19-+---+18-17-16-15-14-13-+
13|       O     O    |   |                  |
 O|                  |   |                  |
 O|                  |   |                  |
 O|                  |   |                  |
 O|                  |   |                  |
 O|                  |   |                  |64
 X|                  |   |                  |
 X|                  |   |                  |
 X|                  |   |                  |
 X| X                |   |                  |
 X| X                |   |                  |
12| X                |   |                  |
  +-1--2--3--4--5--6-+---+-7--8--9-10-11-12-+

Money game.  X on roll.  Cube decisions??


     If the game continues, O can win only if both of the following occur:

  a) X doesn't roll a doublet, AND
  b) O bears off both checkers in a single roll.

Condition a) happens 30/36 of the time.  For condition b), O fails with
any 1 (that's 11 rolls), any 2 (that's 9 more), and 43 (2 more).  That
only leaves 14 rolls where O gets both checkers off.  The chance of
BOTH a) and b) occurring is the product of these two probabilities:

           30/36  *  14/36  =  420/1296  =  32.41%.

The seasoned BG player recognizes this as a CLEAR TAKE for X.  But is
it a double?

     Certainly if X doesn't double, s/he loses 1 point 32.41% of the time
and wins 1 point the remainder (67.59%), and the net is 0.6759 - 0.3241 =
0.352 times the value of the stakes.  (It IS a "money" game.)

     If X doubles and O correctly takes, X wins twice this much:

        0.6759*(+2) + 0.3241*(-2) = 0.704 times the value of the stakes.

X lost his/her market by not doubling.  In fact s/he lost more than 1/3
of a point.  BIG MISTAKE.
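The counting argument can be double-checked by brute force. A sketch that enumerates all 36 rolls for O's two checkers (on O's 3-point and 5-point, per the diagram), with `max_off` a helper invented for this illustration using simplified bear-off rules:

```python
def max_off(points, dice):
    """Most checkers that can be borne off.  points: pip counts of the
    remaining checkers; dice: dice still to play.  Simplified rules: a
    die bears off from its exact point, or from a lower point when no
    checker sits higher; otherwise it may move a checker down."""
    if not points or not dice:
        return 0
    best = 0
    for j in range(len(dice)):              # try the dice in any order
        d, rest = dice[j], dice[:j] + dice[j + 1:]
        best = max(best, max_off(points, rest))   # die wasted
        highest = max(points)
        for i, p in enumerate(points):
            others = points[:i] + points[i + 1:]
            if p == d or (d > p and p == highest):
                best = max(best, 1 + max_off(others, rest))
            if p - d >= 1:
                best = max(best, max_off(others + (p - d,), rest))
    return best

wins = 0
for d1 in range(1, 7):
    for d2 in range(1, 7):
        dice = (d1,) * 4 if d1 == d2 else (d1, d2)
        if max_off((3, 5), dice) == 2:
            wins += 1

p_win_O = (30 / 36) * (wins / 36)        # X non-doublet AND O off in one
e_nodouble = (1 - p_win_O) - p_win_O     # +/-1 stakes
e_double = 2 * e_nodouble                # +/-2 with a correct take
print(wins, round(p_win_O, 4), round(e_nodouble, 4), round(e_double, 4))
# 14 0.3241 0.3519 0.7037
```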

     In general conditions aren't this simple.  Maybe there are gammons,
and/or maybe you are playing the Jacoby Rule, or maybe it's a match.  And
if there are a lot of rolls left, you might have lots of chances to turn
the cube, as might your opponent.  It sometimes matters whether the cube is
centered or in the possession of the game leader.  (Hey, if backgammon were
simple it wouldn't be nearly as much fun!)

      The bots also perform 2-ply lookahead and can calculate volatility
over the associated 1296 outcomes, which is really what you want to look
at in a double/no double decision.  The motivated reader should be able
to generalize the above equations for the 2-ply case.
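One way that generalization might be sketched (the engine functions `best_play` and `equity` are hypothetical stand-ins here, demonstrated with toy stubs, not a real evaluator):

```python
import math

def two_ply_volatility(position, best_play, equity):
    """Average the cubeless equity over all 36 x 36 = 1296 two-roll
    sequences (best play assumed after each roll) to get E, then take
    the JF-style sample standard deviation sqrt(Vs/1295)."""
    rolls = [(a, b) for a in range(1, 7) for b in range(1, 7)]
    es = [equity(best_play(best_play(position, r1), r2))
          for r1 in rolls for r2 in rolls]
    E = sum(es) / 1296
    Vs = sum((e - E) ** 2 for e in es)
    return E, math.sqrt(Vs / 1295)

# Toy stubs just to exercise the function: a "position" is a pip count,
# each "play" adds the pips rolled, equity saturates at +/-1.
toy_play = lambda pos, roll: pos + roll[0] + roll[1]
toy_equity = lambda pos: max(-1.0, min(1.0, pos / 100.0))
E, Vj2 = two_ply_volatility(0, toy_play, toy_equity)
print(round(E, 4), round(Vj2, 4))   # 0.14 0.0342
```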

      At 2-ply lookahead, JF calculates a whopping 1.017 for the volatility
of the just illustrated position.  But is there a way to actually use the
current equity and the volatility to decide whether a position is a double?
Sort of.  Here is one way, using JF's volatility.  Consider the quantity:


          E + f*Vj

where E and Vj are defined above, and f is a multiplicative "factor".
You want to compare this quantity with the drop/take point (the value of
equity where the game leader doubles and the game trailer has a borderline
decision whether to take or pass).  One question is:  "what value of
f should be used in order for the following condition to hold?"

            E + f*Vj > T,    then double, otherwise wait.

(Here, T is the drop/take point equity as seen from the leader's point of
view.)

       If you assume that the 1296 outcomes are Gaussian distributed
(even though they are not), then a value of 1 for f means that the leader
will lose his/her market 16% of the time, and a value of 0.5 results in 31%
market losers.  I believe Kleinman has studied market losers and concluded
that somewhere in the 20-30% range of market losers is typically where a
double should be offered.  Based on this it looks like the value of f
should be somewhere between 0.5 and 1.  (Clearly this simple 'rule' ignores
how much market is lost.  Kit has emphasized that a lot of small market
losers may still be a hold but even a few LARGE market losers is often a
double.)
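Those Gaussian figures are just normal tail probabilities: the chance of landing more than f standard deviations above the mean is 1 - Phi(f). A sketch using Python's standard library:

```python
import math

def market_loser_fraction(f):
    """Fraction of outcomes above E + f standard deviations, assuming
    (as the text warns, incorrectly) a Gaussian distribution: 1 - Phi(f)."""
    return 0.5 * (1.0 - math.erf(f / math.sqrt(2.0)))

for f in (0.5, 1.0):
    print(f, round(100 * market_loser_fraction(f)), "%")   # 31 %, then 16 %
```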

      I believe that the bots don't actually use this method, however.
They can look at all 1296 possible outcomes and calculate a semi-cubeFUL
equity based on which of those outcomes will be a cash next time.  Then
they can compare that number with the current equity to decide which is
larger and which cube decision (double or hold) is optimal.  But most of us
humans can't do that within the time constraints of a game (especially
since our opponents don't allow external aids, like pencils, paper,
calculators,...).  So, chalk up one more advantage for the sandbrains.

      Chuck
      bower@bigbang.astro.indiana.edu
      c_ray on FIBS
 