| Cover of book published in 1736 | |
|---|---|
| Author | Isaac Newton |
| Language | English |
| Genre | Mathematics |
| Publisher | Henry Woodfall |
| Publication date | 1736 |
| Pages | 339 |
Method of Fluxions (Latin: De Methodis Serierum et Fluxionum)[1] is a mathematical treatise by Sir Isaac Newton which served as the earliest written formulation of modern calculus. The book was completed in 1671 and posthumously published in 1736.[2]
Fluxion is Newton's term for a derivative. He originally developed the method at Woolsthorpe Manor during the closure of Cambridge due to the Great Plague of London from 1665 to 1667. Newton chose not to make his findings known (similarly, the findings that eventually became the Philosophiae Naturalis Principia Mathematica were developed at this time and hidden from the world in Newton's notes for many years). Gottfried Leibniz developed his form of calculus independently around 1673, seven years after Newton had developed the basis for differential calculus, as seen in surviving documents like "the method of fluxions and fluents..." from 1666. Leibniz, however, published his discovery of differential calculus in 1684, nine years before Newton formally published his fluxion notation form of calculus in part during 1693.[3]
The calculus notation in use today is mostly that of Leibniz, although Newton's dot notation for differentiation is frequently used to denote derivatives with respect to time.
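As a brief illustration of the two conventions, the derivative of a quantity x with respect to time t can be written in either form (using x here only as a generic example):

```latex
% Leibniz notation for the first and second derivatives of x with respect to t:
\frac{dx}{dt}, \qquad \frac{d^2x}{dt^2}
% Newton's dot notation: the "fluxion" of the fluent x, and its second fluxion:
\dot{x}, \qquad \ddot{x}
```

Dot notation remains common in mechanics precisely because the independent variable is understood to be time, whereas Leibniz's notation makes the variable of differentiation explicit.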
Newton's Method of Fluxions was formally published posthumously, but following Leibniz's publication of the calculus a bitter rivalry erupted between the two mathematicians over who had developed the calculus first, provoking Newton to reveal his work on fluxions.
For a period of time encompassing Newton's working life, the discipline of analysis was a subject of controversy in the mathematical community. Although analytic techniques provided solutions to long-standing problems, including problems of quadrature and the finding of tangents, the proofs of these solutions were not known to be reducible to the synthetic rules of Euclidean geometry. Instead, analysts were often forced to invoke infinitesimal, or "infinitely small", quantities to justify their algebraic manipulations. Some of Newton's mathematical contemporaries, such as Isaac Barrow, were highly skeptical of such techniques, which had no clear geometric interpretation. Although in his early work Newton also used infinitesimals in his derivations without justifying them, he later developed something akin to the modern definition of limits in order to justify his work.[4]
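The modern limit definition alluded to above, which eventually replaced appeals to infinitesimals, defines the derivative of a function f at a point x as:

```latex
f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}
```

On this definition the quotient is formed with an ordinary nonzero h, and rigor comes from the limit process rather than from an "infinitely small" quantity, which is what made the algebraic manipulations of early analysis defensible to geometrically minded critics.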