If the ER (Expected Return) of a game is 101%, and I gamble $10,000,
then the expected profit is $100 (.01 x $10,000 = $100). This is a
small amount, but only because the "profit" in the calculation is
based on the amount gambled rather than my bankroll. Suppose my
bankroll is $1000, and I use it to generate $10,000 in total wagers
by recycling my winnings as I play. In this example I'd expect, on
average, to cash out with $1100, so my expected profit is 10% of my
bankroll. That's a pretty good return on my investment, and I can
keep doing it over and over. Keep in mind I used small numbers to
keep the example simple.
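Just to make the arithmetic concrete, here's the calculation as a tiny sketch; the variable names are mine, and it simply assumes the whole $10,000 handle actually gets wagered, as in the example above:

```python
# Expected profit is figured on the total amount wagered (the handle),
# while the return is measured against the bankroll that funded the play.
er = 1.01           # expected return of the game (101%)
handle = 10_000     # total amount wagered over the session
bankroll = 1_000    # cash actually brought to the table

expected_profit = (er - 1.0) * handle            # 0.01 * 10,000 = $100
return_on_bankroll = expected_profit / bankroll  # 100 / 1,000 = 10%

print(f"Expected profit:    ${expected_profit:,.0f}")
print(f"Return on bankroll: {return_on_bankroll:.0%}")
```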
So the question becomes whether it's reasonable to assume I can
gamble $10,000 with a $1000 bankroll. The answer must depend on the
ER and variance of the game I'm playing, so I'm wondering if Risk of
Ruin (RofR) formulas can be used to decide. That is, a RofR formula
could have two uses: (1) to estimate the odds I'll lose my $1000, and
(2) to estimate the "multiple" I can expect between my bankroll and
the total amount I gamble, and thus the percentage return of the game
relative to my bankroll. So, might this be a new way to compare games
using RofR formulas?
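I don't know the closed-form formulas, but here's a rough Monte Carlo sketch of what I mean by the two uses. The game model is entirely my own assumption for illustration: flat $10 even-money bets with a 50.5% win probability, which gives an ER of 101%. The simulation estimates (1) the chance of busting the $1000 bankroll before reaching a $10,000 handle, and (2) the handle that bankroll typically supports.

```python
# Rough Monte Carlo sketch (not an actual RofR formula) for a made-up game:
# flat $10 even-money bets with a 50.5% win probability, so ER = 2 * 0.505 = 101%.
import random

def play_session(bankroll=1_000, bet=10, p_win=0.505, target_handle=10_000):
    """Bet flat until broke or the target handle is reached."""
    handle = 0
    while bankroll >= bet and handle < target_handle:
        handle += bet
        bankroll += bet if random.random() < p_win else -bet
    return bankroll < bet, handle   # (busted?, total amount wagered)

trials = 100_000
busts = 0
total_handle = 0
for _ in range(trials):
    busted, handle = play_session()
    busts += busted
    total_handle += handle

print(f"Estimated risk of ruin before the $10,000 handle: {busts / trials:.2%}")
print(f"Average handle per session:                       ${total_handle / trials:,.0f}")
print(f"Average handle / bankroll multiple:               {total_handle / trials / 1_000:.1f}x")
```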
I'm not familiar with the details of RofR formulas or their possible
uses. It may be that this use is already known to people who work
with them. Personally, I'm only familiar with RofR in the sense of
(1) above.