I would be particularly interested if you found something in a mathematical style guide that recommended that an expression like
( a / b ) * c
should be rewritten as
a / b * c
Generally speaking, style guides advise rewriting equations for maximum clarity, which usually includes a guideline to remove parentheses when they aren't needed to clarify intent.
I believe (and I'm particularly interested to see if you found evidence that my understanding is incorrect) that the LTR convention used by calculators and programming languages today exists because a deterministic interpretation is a requirement of the hardware, not because any such convention existed before that or has been officially codified one way or the other by any mathematics body.
So like, forget division for a sec…
In a mathematics paper, you usually wouldn’t write:
(a + (b + c)) + d
You’d write:
a + b + c + d
(Except perhaps if in your paper the parentheses made it easier to follow how you got to that equation.)
Because in mathematics, it will never matter which order you do additions in, so you should drop the parentheses to improve clarity.
On a computer or a calculator, though, you might get a different result for those two expressions: for example, if a + b overflows your accumulator and c is a negative number, or when these are floating-point values with significantly different magnitudes.
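To make the floating-point case concrete, here's a minimal sketch (the values 1e16, -1e16, and 1.0 are just illustrative picks) showing that IEEE-754 addition isn't associative when magnitudes differ enough:

```python
# Grouping changes the result when magnitudes differ wildly.
a = 1e16
b = -1e16
c = 1.0

left_to_right = (a + b) + c  # 0.0 + 1.0 -> 1.0
right_grouped = a + (b + c)  # b + c rounds back to -1e16, so -> 0.0

print(left_to_right)  # 1.0
print(right_grouped)  # 0.0
```

In exact arithmetic both groupings equal 1, but because 1e16 exceeds 2^53, adding 1.0 to -1e16 rounds away, so the two groupings disagree.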
I believe English-speaking engineers just adopted LTR as the convention for how to interpret it because they had to do something, and English is an LTR language. I don't believe that convention exists outside the context of computing.
The Wolfram quote and the ISO quote in your post, in particular, imply that an inline division followed by an explicit multiplication is ambiguous as to whether it should be interpreted as a compound fraction.
If that's correct, then it would be the inline division, not the implicit multiplication, that makes it ambiguous.
If there's some source from before computers, or from outside the context of computers, forcing a decision, then your assertion that the implicit multiplication causes the ambiguity is correct.
I'm not trying to prove you're wrong; I'm just genuinely curious which it is, and whether you found evidence one way or the other.
I concur with everything you’ve written here.
I concur that a left-to-right interpretation of consecutive explicit multiplications and divisions is widespread, and that it is how most calculators and computers would interpret:
a / b * c
But the sources you quote in your blog post, and the style guides I've read, state that a fraction bar or parentheses should be used to clarify whether it should be interpreted as:
(a / b) * c
or
a / (b * c)
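For what it's worth, the left-to-right grouping is easy to check in, say, Python (the values 8, 4, and 2 are arbitrary, chosen so the two groupings give different answers). Its / and * have equal precedence and associate left-to-right:

```python
# Python parses a / b * c as (a / b) * c, never as a / (b * c).
a, b, c = 8.0, 4.0, 2.0

print(a / b * c)    # 4.0 -- same as (a / b) * c
print((a / b) * c)  # 4.0
print(a / (b * c))  # 1.0 -- the compound-fraction reading differs
```

Of course this only demonstrates the convention inside a programming language, which is exactly the context in question.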
You make the argument in your post that:
a / bc
is ambiguous (which I agree with)
but
a / b * c
is not ambiguous. That's the part I disagree with, and I think the sources you quoted disagree with you as well. But I'm open to being wrong about that, and I'm interested in whether you have sources that prove otherwise.
If I’m understanding your response correctly, you believe that
a / b * c
is unambiguous, and always treated like
(a / b) * c
because of a widespread convention of left-to-right interpretation (a convention we both agree exists), not because you found a source that states that.
Anyhow… I’m not out to convince you of anything and I appreciate you taking the time to explain your thinking to me.