In the history of mathematics, the generality of algebra was a phrase used by Augustin-Louis Cauchy to describe a method of argument that was used in the 18th century by mathematicians such as Leonhard Euler and Joseph-Louis Lagrange,[1] particularly in manipulating infinite series. According to Koetsier,[2] the generality of algebra principle assumed, roughly, that the algebraic rules that hold for a certain class of expressions can be extended to hold more generally on a larger class of objects, even if the rules are no longer obviously valid. As a consequence, 18th-century mathematicians believed that they could derive meaningful results by applying the usual rules of algebra and calculus that hold for finite expansions even when manipulating infinite expansions.
In works such as Cours d'Analyse, Cauchy rejected the use of "generality of algebra" methods and sought a more rigorous foundation for mathematical analysis.
An example[2] is Euler's derivation of the series

$$\frac{\pi - x}{2} = \sum_{k=1}^{\infty} \frac{\sin(kx)}{k} \qquad (1)$$

for $0 < x < 2\pi$. He first evaluated the identity
$$\frac{1 - r^2}{1 - 2r\cos x + r^2} = 1 + 2\sum_{k=1}^{\infty} r^k \cos(kx) \qquad (2)$$

at $r = 1$ to obtain
$$0 = 1 + 2\sum_{k=1}^{\infty} \cos(kx). \qquad (3)$$
The infinite series on the right-hand side of (3) diverges for all real $x$. Nevertheless, integrating it term by term gives (1), an identity which is known to be true by Fourier analysis.
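Both halves of the argument can be checked numerically. The sketch below is a modern illustration (the function names are ours, not part of the historical material): it confirms that identity (2) holds in the ordinary sense when |r| < 1, and that the series (1) converges to (π − x)/2 on 0 < x < 2π, even though the intermediate series (3) diverges.

```python
import math

def euler_series_partial_sum(x, n_terms):
    # Partial sum of the series in (1): sum_{k=1}^{N} sin(kx)/k.
    return sum(math.sin(k * x) / k for k in range(1, n_terms + 1))

def identity_lhs(r, x):
    # Left-hand side of identity (2): (1 - r^2) / (1 - 2 r cos x + r^2).
    return (1 - r**2) / (1 - 2 * r * math.cos(x) + r**2)

def identity_rhs(r, x, n_terms):
    # Partial sum of the right-hand side of (2): 1 + 2 sum_{k>=1} r^k cos(kx).
    return 1 + 2 * sum(r**k * math.cos(k * x) for k in range(1, n_terms + 1))

# Identity (2) is an ordinary convergent identity for |r| < 1;
# the terms r^k cos(kx) decay geometrically, so 500 terms suffice here.
print(identity_lhs(0.9, 1.0), identity_rhs(0.9, 1.0, 500))

# The series (1) converges (slowly) to (pi - x)/2 for 0 < x < 2*pi.
x = math.pi / 2
print(euler_series_partial_sum(x, 200_000), (math.pi - x) / 2)
```

At r = 1, by contrast, the right-hand side of (2) becomes the divergent series (3); the "generality of algebra" step consists precisely in treating that divergent boundary case as if the identity still held.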