Comment on Why abc, xyz, etc.?

BananaTrifleViolin@lemmy.world 6 days ago

A lot of it comes down to convention, and convention is often set by those who did it first or whose work dominated a field. The whole mathematical notation system we use today is just one convention; it is not the only one that exists, but it is the one the world has decided to standardise on…

René Descartes is usually regarded as the originator of the current system. He used a, b, c for constants and x, y, z for unknown variables, amongst other conventions.

Sequential letter sets are easy to use because they are easily recognised, and convenient as a result; they are also generally accepted to have non-specific or less specific meaning. For example:

a^2^ + b^2^ = c^2^

That formula is a much simpler concept to get your head round using sequential letters than:

V^2^ + G^2^ = z^2^

When you don’t use sequential letters it also implies much more specific meaning to the individual letters, and that can introduce ambiguity and confusion.

When writing a proof there can be many, many statements made, and you’d quickly run out of letters if you didn’t have a convention for accepting that a, b, c are variables and can be reused.

We also do use symbols from other alphabets, and alpha/beta/gamma is a commonly used trio. But in mathematical notation there is now a huge range of constants and symbols, many of which have been ascribed specific uses. Pi, for example. So you risk bringing in ambiguity of meaning by moving away from the accepted conventions of current maths and using other sets.

Even e has a specific meaning and can be ambiguous if you need to stretch to 5 variables. When working with e, it’s not uncommon to use a different string of letters in the Latin alphabet to avoid confusion if you need to use variables.
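For example, instead of writing a^2^ + b^2^ + c^2^ + d^2^ = e^2^, where e collides with Euler’s number, you might write p^2^ + q^2^ + r^2^ + s^2^ = t^2^ (the particular letters p through t are just an illustrative choice, not a fixed rule).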

And we don’t stop at 3; a, b, c, d, etc. are used.
