vpFREE2 Forums

Several questions

Now that we've cleared up the question of whether one can
mathematically compute an alternate definition of some variable let's
go back and examine the one I used in my example about doubling down.

I stated that by changing the definition of ER slightly to include
the effects of doubling that an interesting feature appeared. First
let me expand on my definition to add a little preciseness. What is
being added is the expected return per unit doubled to the expected
return per unit gambled. That is, if you assume you would return .995
units per unit gambled in the main bet, and you gamble 1
additional "double unit" per base unit gambled then the newly defined
ER1 becomes (base ER+1)/(1+1) or .9975 for ER=.995.

The overall equation is (base ER+DUW)/(1+DUG) because the amount any
person might bet via doubling is different for each individual and
this requires that part of the equation to be variable. First, let's
examine an individual who never doubles. DUW and DUG are both zero
and lo and behold the number comes out to be exactly the same as the
standard definition of ER. This appears to mean that the standard
definition is a SUBSET of the new definition. Of course, it may be
uncomfortable for some of us to view ER as a non-fixed value based on
how much a person doubles. However, is this new definition
better? ... or, worse? ... or just different? Something to think
about.
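As a quick sanity check, the arithmetic above can be sketched in a few lines of Python (a sketch of my own; the function and variable names are mine, following the (base ER+DUW)/(1+DUG) notation):

```python
# Sketch of the blended return (base ER + DUW) / (1 + DUG), where
# DUG is double-up units gambled and DUW is double-up units won,
# both per base unit wagered.  (Function and variable names are mine.)
def blended_er(base_er, dug, duw):
    return (base_er + duw) / (1 + dug)

# One even-odds "double unit" per base unit (on average DUW == DUG):
print(round(blended_er(0.995, 1, 1), 4))   # 0.9975

# An individual who never doubles: DUW and DUG are both zero and the
# formula collapses to the standard definition of ER.
print(blended_er(0.995, 0, 0))             # 0.995
```

With DUW = DUG = 0 the standard ER falls out unchanged, which is the "subset" observation above.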

So, my original point that someone can ask a question differently and
get different answers has led to another way of thinking about a
situation. In my opinion this allows us more insight into the
situation (in this case doubling) and possibly a new way of thinking
about an old idea. If someone doesn't like this alternate definition
that is fine, it doesn't change the old definition.

Dick

> > Hmmmmmm. How about "pressing the double down button"?
>
> That would lump together strings of bets between those hands
> when you chose to double down. You can do that, but you'll
> never convince me that it is mathematically equivalent to "EV".
> It is measuring "average payoff for strings of wagers between
> doubling". That ain't EV.

Thank you Steve for agreeing with my position. All I've ever claimed
was "you can do that". I originally stated "This is one of those
problems where the answer is based on how you ask the question". In
other words, ER "can" have different meanings. As I stated
previously, you may not like one approach but that does not mean it
can't exist (Harry's position).

I don't believe I was agreeing. When I said "you can do that" I meant
that you can compute some sort of metric that way. I firmly believe
that it is incorrect to call that metric by the name "ER" because I
don't think it represents the same concept that most people
associate with "ER".

> Part of my distaste for what you're doing comes from not being
> able to say in advance how many wagers you'll make during
> the "next play". For all games of chance that I've studied,
> including blackjack where doubling and splitting are allowed,
> EV is computed per play, and doubling/splitting is not counted
> as a separate wager.

OTOH, it is possible for a person to have an algorithm where you can
compute the number of wagers. This algorithm can be part of the input.

I'll agree with that in principle, but I'm not convinced that this case
is a good example. The problem here is that you're taking what I
would call a single outcome and treating it as if it was two independent
results. The key word there is "independent". The payoff from the
original bet and from the doubling (or redoubling) are mixed together
and completely correlated, in sharp contrast to any other two outcomes
from the game. This messes up computation of variance.

Let me ask you this. Would you feel it was OK to arbitrarily take the
payoffs from royal flushes and count each of those as if they were
ten independent wagers of 1/10 the size that were paid separately?
That would effectively dilute the EV and greatly reduce the variance
of the game, but it isn't kosher because it pretends
that the outcomes are not correlated. That's effectively what you
are doing when you try to split up the outcomes from the original
bet and the double -- treating them as if they are not correlated
at all when in fact they are completely correlated. I believe it
is mathematically incorrect to handle payoffs that way. If you
can tell me why these outcomes aren't correlated, or how to
justify (in a mathematical sense) ignoring that correlation, then
I might be persuaded to agree with you.
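For what it's worth, the variance half of that objection is easy to demonstrate numerically; this is my own illustration with a made-up probability, not a figure from the thread:

```python
# A single wager paying 10 units with probability q versus ten
# INDEPENDENT wagers each paying 1 unit with probability q:
# the expected total payout is identical, but the variance of the
# split version is ten times smaller (variances of independent
# wagers add, while a single lump payoff gets no such benefit).
q = 0.00002  # made-up, roughly royal-flush-like probability

ev_single = 10 * q                      # one 10-unit payoff
var_single = (10 ** 2) * q * (1 - q)

ev_split = 10 * (1 * q)                 # ten independent 1-unit payoffs
var_split = 10 * (1 ** 2) * q * (1 - q)

print(ev_single == ev_split)            # True: EV of the total is unchanged
print(round(var_single / var_split))    # 10: variance shrank tenfold
```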

> > But you DO get different results. The $/hr is different in the two
> > cases. If you want to understand the $/hr then one can argue that an
> > analysis that treats doubling as a separate bet is better. But, of
> > course, this already shows that the situation "can" be analyzed in
> > two different ways ... which is the exact point I have been trying to
> > make.
> >
> > > When different
> > > methods are both valid, and the results are different, it means
> > > that the two methods are actually measuring slightly different
> > > things and the comparison is apples to oranges.
> >
> > No, it does not mean the comparison is "apples and oranges". It
> > simply means you may want to emphasize a different aspect of the
> > issue.
>
> If you get two different answers, then isn't it obvious that you're
> computing two DIFFERENT things? You say so yourself when you talk
> about different "aspects" of the issue.

No, the two results may simply be two different ways of stating the
results just like my example for octal vs decimal.

Are you claiming that this doubling situation falls into the same
category? If not, then perhaps we could dispense with that (and
other issues that don't apply to this situation) and focus on the
things that are relevant to the issue at hand.

They are different
because you choose to ask the question differently (back to my
original position) and, most importantly, they may lead to insights
not available by another approach. BTW, I've never stated that one
could use a different result outside the context where it originated.

Perhaps we're getting somewhere. I believe that choosing to
"ask the question differently" is usually equivalent to "asking
a DIFFERENT question" in the sense that you are fundamentally
changing what you are measuring when you do this. That is related
to my point about not trying to keep calling it an apple when
you've really started talking about an orange.

> EV is a single concept. One issue, one result. If you compute it
> two different ways and get different numbers, then they simply cannot
> be the SAME concept. Apples to oranges. It really is that simple.

Nope. You just made an assumption when you used the term EV. This
assumption is related to your problem definition. If I change the way
the problem is stated then I may get different answers. If you're now
thinking "well of course, that's obvious", then you can now
understand how I felt when you agreed with Harry's position that
you "must" state the problem one way.

How is that statement any different than saying "apples and oranges
are the same thing because they are both fruit". That would be a
true statement, but all it really amounts to is ignoring the very real
differences between apples and oranges.

> Of course I'm not saying there is only one way to analyze games or
> to produce strategies that are optimal in different ways. What I'm
> saying is that only ONE of those methods of analysis is truly from
> an EV perspective. Other methods look at RoR or "cost" or some
> other aspect. You can group bets together in different ways as
> well, but doing things in different ways and trying to attach the
> same name to it only adds confusion.

I've never stated that the term EV should be attached to anything.
All I've ever claimed is that by stating the problem differently you
can get different answers and with those answers different insights.
Now I think you're beginning to understand why I was amazed that you
would take the position this was impossible.

Based on the summary you give below, it still appears to me that
you are claiming exactly that very thing -- that what you're computing
is still ER but it is the ER obtained by viewing the game in a different
way.

> If you're looking at the same game, played with the same strategy,
> and you compute two different "aspects" and end up with different
> numbers (when expressed in the same base, to dispense with
> the octal/decimal nonsense), then you are comparing apples to
> oranges.

I believe you are saying that if you ask the same question then you
should get the same answer ... Of course. This has nothing to do with
my claims, which I now hope you better understand. Let me go over the
complete history:

This whole issue got started when ckonwin stated "The double feature
does change the expected return of the game". mklpryy24
responded "Not quite, Lets get into the math, 100% (the amount bet) X
99.5% ( the return of JOB) X 100% ( the even odds of the double up) =
99.5%". Next, I stated "This is one of those problems where the
answer is based on how you ask the question". I next provided an
example where one might want to examine the effect of doubling on
their win/loss rate. Finally, mklpryy24 responded "don't matter".

Understanding this win/loss rate in my example was somewhat simpler
if you use another way of defining the expected return. IMO, using
this alternative approach provides more insight into how the game
will play than simply trying to apply commonly used ER.

This still sounds exactly like you are trying to stretch the concept of
"expected return" to cover two things that are fundamentally different.
That is my only objection to your position.

Dick, I think I understand what you are saying a lot better than you
give me credit for. I used to believe the very thing that you are
talking about, and I used to put forth arguments very similar to
some that you are using here, but in the back of my mind there
was something that bothered me about it. Thinking about this
very issue contributed to clarifying my thinking about alternate
strategies and different objectives and what it really means to
measure something that is fundamentally different than EV (or
ER, which is really just another name for the same concept).

The most basic realization that came from thinking about this
is that sometimes very subtle changes in how you ask the
question lead to measurements that seem like EV but are
not really quite the same thing. I believe what you're doing
here amounts to ignoring a very real correlation between
the dollars that come out from the original bet and the dollars
that come out from doubling. I also believe that when you
compute a number that ignores correlation between
outcomes, then whatever it is you're computing it isn't
something that should be described as ER. My own
belief is that ER/EV should be reserved for describing
an average applied to a probability distribution that
represents independent/uncorrelated outcomes.
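A quick simulation (my own toy model, with made-up odds) shows how far from uncorrelated those two payoff streams are, since the double-up can only wager what the base bet just won:

```python
import random

random.seed(1)
base, dbl = [], []
for _ in range(100_000):
    win = random.random() < 0.45        # toy base bet with made-up odds
    b = 2.0 if win else 0.0             # even-money toy payout
    # The double-up can only wager what the base bet just won,
    # and it pays even odds:
    d = (b * 2 if random.random() < 0.5 else 0.0) if win else 0.0
    base.append(b)
    dbl.append(d)

def corr(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / (vx * vy) ** 0.5

# A strongly positive correlation -- these are not independent wagers.
print(corr(base, dbl) > 0.5)            # True
```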

I certainly agree with you that there are insights to be
gained by viewing games in different ways and asking
different questions in order to understand new ways
to measure our results. That is the basis of my alternate
strategies, but I've tried very hard to be careful not to
claim that these alternate perspectives are really the
same thing as ER, because they are in fact different.

···

On Tuesday 06 November 2007 1:37 am, mroejacks wrote:

--- In vpFREE@yahoogroups.com, Steve Jacobs <jacobs@...> wrote:

> Now that we've cleared up the question of whether one can
> mathematically compute an alternate definition of some variable let's
> go back and examine the one I used in my example about doubling down.
>
> I stated that by changing the definition of ER slightly to include
> the effects of doubling that an interesting feature appeared. First
> let me expand on my definition to add a little preciseness. What is
> being added is the expected return per unit doubled to the expected
> return per unit gambled. That is, if you assume you would return .995
> units per unit gambled in the main bet, and you gamble 1
> additional "double unit" per base unit gambled then the newly defined
> ER1 becomes (base ER+1)/(1+1) or .9975 for ER=.995.

A person who always doubles once on a .995 game actually bets 1.995
units per average deal not 2. So the total return on amount bet in
this scenario is very slightly lower at .9974937
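That figure checks out under the same (base ER + DUW)/(1 + DUG) bookkeeping, assuming the double-up wagers the expected winnings and returns them at even odds (this is my sketch of the arithmetic, not a figure from the poster):

```python
base_er = 0.995           # expected return of the main bet, per unit
avg_double_wager = 0.995  # on average, the winnings available to double
# The double-up is an even-odds bet, so on average it returns exactly
# what it wagers.
total_bet = 1 + avg_double_wager             # 1.995 units per deal
total_return = base_er + avg_double_wager    # 1.99 units returned
print(round(total_return / total_bet, 7))    # 0.9974937
```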

> The overall equation is (base ER+DUW)/(1+DUG) because the amount any
> person might bet via doubling is different for each individual and
> this requires that part of the equation to be variable. First, let's
> examine an individual who never doubles. DUW and DUG are both zero
> and lo and behold the number comes out to be exactly the same as the
> standard definition of ER. This appears to mean that the standard
> definition is a SUBSET of the new definition. Of course, it may be
> uncomfortable for some of us to view ER as a non-fixed value based on
> how much a person doubles. However, is this new definition
> better? ... or, worse? ... or just different? Something to think
> about.
>
> So, my original point that someone can ask a question differently and
> get different answers has led to another way of thinking about a
> situation. In my opinion this allows us more insight into the
> situation (in this case doubling) and possibly a new way of thinking
> about an old idea. If someone doesn't like this alternate definition

···

--- In vpFREE@yahoogroups.com, "mroejacks" <rgmustain@...> wrote:

> that is fine, it doesn't change the old definition.
>
> Dick

--- In vpFREE@yahoogroups.com, "howardwstern" <howard.w.stern@...>
wrote:


A person who always doubles once on a .995 game actually bets 1.995
units per average deal not 2. So the total return on amount bet in
this scenario is very slightly lower at .9974937

I didn't say they doubled each time. I stated they "gamble 1
additional "double unit" per base unit". This could be done by
doubling multiple times for awhile and then not at all. However, I
can see how you made the jump that you did. I did this to keep the
units easy to work with. In reality they could double much more or
much less than this.

Dick

···

--- In vpFREE@yahoogroups.com, "mroejacks" <rgmustain@> wrote:

> > > Hmmmmmm. How about "pressing the double down button"?
> >
> > That would lump together strings of bets between those hands
> > when you chose to double down. You can do that, but you'll
> > never convince me that it is mathematically equivalent to "EV".
> > It is measuring "average payoff for strings of wagers between
> > doubling". That ain't EV.
>
> Thank you Steve for agreeing with my position. All I've ever claimed
> was "you can do that". I originally stated "This is one of those
> problems where the answer is based on how you ask the question". In
> other words, ER "can" have different meanings. As I stated
> previously, you may not like one approach but that does not mean it
> can't exist (Harry's position).

I don't believe I was agreeing. When I said "you can do that" I meant
that you can compute some sort of metric that way. I firmly believe
that it is incorrect to call that metric by the name "ER" because I
don't think it represents the same concept that most people
associate with "ER".

Just because MOST people have gotten used to a method of operation does
not mean a better approach does not exist. If you think of ER in
terms of a relationship between money bet and money won then there
exists a family of these relationships. While each member of the
family may be different there is also a commonality that should not
be ignored.

> > Part of my distaste for what you're doing comes from not being
> > able to say in advance how many wagers you'll make during
> > the "next play". For all games of chance that I've studied,
> > including blackjack where doubling and splitting are allowed,
> > EV is computed per play, and doubling/splitting is not counted
> > as a separate wager.
>
> OTOH, it is possible for a person to have an algorithm where you can
> compute the number of wagers. This algorithm can be part of the input.

I'll agree with that in principle, but I'm not convinced that this case
is a good example. The problem here is that you're taking what I
would call a single outcome and treating it as if it was two independent
results. The key word there is "independent". The payoff from the
original bet and from the doubling (or redoubling) are mixed together
and completely correlated, in sharp contrast to any other two outcomes
from the game. This messes up computation of variance.

Just because something is difficult should not be a reason to ignore
it.

Let me ask you this. Would you feel it was OK to arbitrarily take the
payoffs from royal flushes and count each of those as if they were
ten independent wagers of 1/10 the size that were paid separately?
That would effectively dilute the EV and greatly reduce the variance
of the game, but it isn't kosher because it pretends
that the outcomes are not correlated. That's effectively what you
are doing when you try to split up the outcomes from the original
bet and the double -- treating them as if they are not correlated
at all when in fact they are completely correlated. I believe it
is mathematically incorrect to handle payoffs that way. If you
can tell me why these outcomes aren't correlated, or how to
justify (in a mathematical sense) ignoring that correlation, then
I might be persuaded to agree with you.

I have never said the two were not correlated. In fact, I have said
just the opposite. My position is to extend the definition to
encompass the doubling event to produce "another" relationship that
may be of interest.

> They are different
> because you choose to ask the question differently (back to my
> original position) and, most importantly, they may lead to insights
> not available by another approach. BTW, I've never stated that one
> could use a different result outside the context where it originated.

Perhaps we're getting somewhere. I believe that choosing to
"ask the question differently" is usually equivalent to "asking
a DIFFERENT question" in the sense that you are fundamentally
changing what you are measuring when you do this. That is related
to my point about not trying to keep calling it an apple when
you've really started talking about an orange.

I never said it should be called exactly the same thing. However, I
do think the fact this value is correlating similar things should be
highlighted in some way. In that sense, they are more like a gala
apple and a delicious apple.

> > Of course I'm not saying there is only one way to analyze games or
> > to produce strategies that are optimal in different ways. What I'm
> > saying is that only ONE of those methods of analysis is truly from
> > an EV perspective. Other methods look at RoR or "cost" or some
> > other aspect. You can group bets together in different ways as
> > well, but doing things in different ways and trying to attach the
> > same name to it only adds confusion.
>
> I've never stated that the term EV should be attached to anything.
> All I've ever claimed is that by stating the problem differently you
> can get different answers and with those answers different insights.
> Now I think you're beginning to understand why I was amazed that you
> would take the position this was impossible.

Based on the summary you give below, it still appears to me that
you are claiming exactly that very thing -- that what you're computing
is still ER but it is the ER obtained by viewing the game in a different
way.

It is a member of the same ER family but not the commonly used ER.
However, that is not to say it isn't a "better" definition.

> > If you're looking at the same game, played with the same strategy,
> > and you compute two different "aspects" and end up with different
> > numbers (when expressed in the same base, to dispense with
> > the octal/decimal nonsense), then you are comparing apples to
> > oranges.
>
> I believe you are saying that if you ask the same question then you
> should get the same answer ... Of course. This has nothing to do with
> my claims, which I now hope you better understand. Let me go over the
> complete history:
>
> This whole issue got started when ckonwin stated "The double feature
> does change the expected return of the game". mklpryy24
> responded "Not quite, Lets get into the math, 100% (the amount bet) X
> 99.5% ( the return of JOB) X 100% ( the even odds of the double up) =
> 99.5%". Next, I stated "This is one of those problems where the
> answer is based on how you ask the question". I next provided an
> example where one might want to examine the effect of doubling on
> their win/loss rate. Finally, mklpryy24 responded "don't matter".
>
> Understanding this win/loss rate in my example was somewhat simpler
> if you use another way of defining the expected return. IMO, using
> this alternative approach provides more insight into how the game
> will play than simply trying to apply commonly used ER.

This still sounds exactly like you are trying to stretch the
concept of "expected return" to cover two things that are
fundamentally different. That is my only objection to your position.

First of all, the two things are NOT fundamentally different. They are
only different forms of the same kind of relationship based on
changing the way the question is asked (base assumptions).

Dick, I think I understand what you are saying a lot better than you
give me credit for. I used to believe the very thing that you are
talking about, and I used to put forth arguments very similar to
some that you are using here, but in the back of my mind there
was something that bothered me about it. Thinking about this
very issue contributed to clarifying my thinking about alternate
strategies and different objectives and what it really means to
measure something that is fundamentally different than EV (or
ER, which is really just another name for the same concept).

The most basic realization that came from thinking about this
is that sometimes very subtle changes in how you ask the
question lead to measurements that seem like EV but are
not really quite the same thing. I believe what you're doing
here amounts to ignoring a very real correlation between
the dollars that come out from the original bet and the dollars
that come out from doubling. I also believe that when you
compute a number that ignores correlation between
outcomes, then whatever it is you're computing it isn't
something that should be described as ER. My own
belief is that ER/EV should be reserved for describing
an average applied to a probability distribution that
represents independent/uncorrelated outcomes.

I certainly agree with you that there are insights to be
gained by viewing games in different ways and asking
different questions in order to understand new ways
to measure our results. That is the basis of my alternate
strategies, but I've tried very hard to be careful not to
claim that these alternate perspectives are really the
same thing as ER, because they are in fact different.

I think I have already made my point. I think we concur that the two
ERs are different but disagree on whether their relationships should
be open to further discussion. But, I will digress a little because I
think it is important to understand that extending knowledge often
REQUIRES taking commonly accepted principles and studying them from
different viewpoints.

For example, classical physics reigned for many centuries. Those
knowledgeable in the field assumed they understood nature completely.
However, experimental physics eventually moved down to the atomic
level. Lo and behold, those sacrosanct laws did not hold up. What
appeared is what we now call quantum physics, which explained the
world "better" than did classical physics. The first developers of
quantum theories were initially laughed at and it took a long time
before the physics establishment was converted. I'm not claiming that
viewing ER in different ways is in the same ballpark, but I think
being open to new ways of looking at old paradigms is a good idea.

Dick

···

--- In vpFREE@yahoogroups.com, Steve Jacobs <jacobs@...> wrote:

On Tuesday 06 November 2007 1:37 am, mroejacks wrote:
> --- In vpFREE@yahoogroups.com, Steve Jacobs <jacobs@> wrote:

> > Let me go over the complete history:
> >
> > This whole issue got started when ckonwin stated "The double feature
> > does change the expected return of the game". mklpryy24
> > responded "Not quite, Lets get into the math, 100% (the amount bet) X
> > 99.5% ( the return of JOB) X 100% ( the even odds of the double up) =
> > 99.5%". Next, I stated "This is one of those problems where the
> > answer is based on how you ask the question". I next provided an
> > example where one might want to examine the effect of doubling on
> > their win/loss rate. Finally, mklpryy24 responded "don't matter".

···

*******************************************************************

we are sayin your math is bad , thats all, doubling up does not
increase the return , does not increase the return from 99.5 to
99.75 as you stated, does not matter if you do it once or every time
it is still 99.5 return.
So the ? dont matter as your math is wrong.

If you enjoy it Great! Thats what it is there for.

Maybe using FPDW will help.

1 X 100.7% X 1 = 100.7%
bet X return X double up { even bet so use 1 )

Does the return go up from 100.7? does it go down to 100.35% as your
theory from JOB suggest ? or does it stay the same ??

Now remember the math is the same if you double up once or
everytime .

**********************************************************************

For example, classical physics reigned for many centuries. Those
knowledgeable in the field assumed they understood nature completely.

******************************************************
this aint physics, its basic math, same as what built the pyramids &
stonehenge, & no body would think they understood nature completely.

viewing ER in different ways is in the same ballpark, but I think
being open to new ways of looking at old paradigms is a good idea.

Dick

*******************************************************************
I looked at it a new way & still 2+2=4 : )

1 x return of game X 1 = return of game..

& you dont get credit toward your comp account from the double up
bet.
HMMMM betting more & getting less for it must do somthing to the ER
dont you agree ?

How you phrase the ? dont matter. The math matters. 1 X 99.5 X 1=99.5
not 99.75

M J


*******************************************************************

we are sayin your math is bad , thats all, doubling up does not
increase the return , does not increase the return from 99.5 to
99.75 as you stated, does not matter if you do it once or every time
it is still 99.5 return.
So the ? dont matter as your math is wrong.

Good grief ....

My math was perfect. Your logic, however, is limited at best. You
see, there is no such thing as ONE ER. For example, talk to a Blackjack
person about ER. Are you going to tell them that they are crazy and
the only definition of ER is for VP? From your reply I suspect you
just might do that. However, that is nonsense.

So, I will go over it slowly so that you have a small chance of
understanding. ER is a metric. Look it up if you don't understand. It
is a measurement that is based on the assumptions provided when the
input parameters of a problem are defined. For Blackjack the input
parameters are all of the associated factors in BJ. You know things
like surrender, double down, etc. Once the input parameters are
defined then the ER can be determined as a function of those
parameters and has no meaning outside of them. The same thing holds
for video poker, or anything else that is measuring a return. Define
your parameters and whatever you compute is based on that input. If
you WANT to define the ER as a FUNCTION of the initial bet only that
is fine. However, there is no reason to limit your thinking to only
that if you have a mind of your own. So, you or anyone else can
define their own input parameters and determine another type of ER
based on those parameters. If they wish to determine their ER based
on any betting decision (initial bet + double down) then the math is
perfectly happy letting them do that. That is the wonder of math. It
doesn't care about your reasons, only that you state them up front.

So, the answer could be 99.5 or 99.75 depending on your choice of
input parameters. Which, if you remember, is EXACTLY what I stated
when I said it depends on how you ask the question.

If you enjoy it Great! Thats what it is there for.

Maybe using FPDW will help.

1 X 100.7% X 1 = 100.7%
bet X return X double up { even bet so use 1 )

You clearly had no clue at this point, but I hope you now see this is
ONLY true as a function of the initial bet and not ALL possible input
definitions. Now, it is certainly valid to ask WHY anyone would want
to add doubling into their definition, but mathematically it is 100%
valid and that is the issue and the whole point of my original
comment. You really should try to open your mind just a little bit
and maybe you will learn something.

Does the return go up from 100.7? does it go down to 100.35% as your
theory from JOB suggest ? or does it stay the same ??

If you state your input parameters as 1) that both the initial and
double down bets are counted and, 2) that you will double the exact
same amount of money as your initial bet then the associated ER will
be 100.35. If you state you will double down twice as much money then
that ER will be about 100.23. Are you catching on yet??? However, if
you state ONLY the initial bet is the basis for the ER, then you are
at 100.7. Gee, kinda looks like HOW YOU ASK THE QUESTION, doesn't it?
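The numbers in this exchange all follow from the same (base ER + DUW)/(1 + DUG) formula from earlier in the thread; a quick sketch (the helper name is mine, and I've assumed DUW equals DUG on average since the double-up is an even-odds bet):

```python
# Blended return when the double-up wager is counted as money gambled.
# Because the double-up pays even odds, it returns its wager on
# average, so DUW == DUG in expectation.
def er_with_doubling(base_er, dug):
    return (base_er + dug) / (1 + dug)

print(round(er_with_doubling(0.995, 1), 4))   # 0.9975  (JoB, 1 double unit)
print(round(er_with_doubling(1.007, 1), 4))   # 1.0035  (FPDW, 1 double unit)
print(round(er_with_doubling(1.007, 2), 5))   # 1.00233 (FPDW, 2 double units)
```

Counting only the initial bet leaves the base figure (99.5 or 100.7) untouched, which is the other way of asking the question.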

Now remember the math is the same if you double up once or
everytime .

I hope you have enough smarts to now see just how ridiculous this
kind of statement really is. In one case you would add initial bet+
double down bets and in the other case you would use only initial
bet. That means the math is DIFFERENT. Try to understand that basic
math is dimensionless. We, as users of the math as a tool define the
dimensions, we define the inputs and that determines what the output
means. So, throwing up numbers in a vacuum, as you like to do, is
meaningless.

···

--- In vpFREE@yahoogroups.com, "mklpryy24" <mklpryy24@...> wrote:

*********************************************************************
> For example, classical physics reigned for many centuries. Those
> knowledgeable in the field assumed they understood nature completely.
******************************************************
this aint physics, its basic math, same as what built the pyramids &
stonehenge, & no body would think they understood nature completely.

ROTFLMAO. Maybe you should try reading a little before assuming you
understand what "no body" would think. It took decades before quantum
theories became marginally accepted.

>
> viewing ER in different ways is in the same ballpark, but I think
> being open to new ways of looking at old paradigms is a good idea.
>
> Dick
*******************************************************************
I looked at it a new way & still 2+2=4 : )

If you take 2 oranges and 2 apples and put them in a basket, how many
apples do you have? What, oh my gosh, it DOES MATTER how you ask the
question! The answer is NOT ALWAYS 4.

1 x return of game X 1 = return of game..

And guess what? 10 x return of the game x .1 = return of the game.
So, any other brilliant deductions we should know about?

& you dont get credit toward your comp account from the double up
bet.
HMMMM betting more & getting less for it must do somthing to the ER
dont you agree ?

We're not talking about comp accounts. It has absolutely nothing to
do with the question of whether a value is the product of the input
definitions. What if tomorrow a particular casino decided that they
wanted to attack double down players and changed their system to
count double down bets for points. Oh wait, that would be impossible
under your view of the world ... Now, do you really believe that
would be impossible to program? (I can't wait for this answer)

How you phrase the ? dont matter. The math matters. 1 X 99.5 X 1=99.5
not 99.75

Now that I have shown your logic to be 100% wrong, don't you feel
just a little bit silly?

Dick