quote:
In mathematics, the infinitely small - the infinitesimal - is exceptionally common and is the basis of calculus.
This isn't quite right. Although it is true that Newton and subsequent users of calculus did think and argue in terms of infinitesimals, their ideas were not mathematically rigorous. It was not until Weierstrass and other analysts developed the theory of limits that calculus was finally given a firm, rigorous backing. But limits do not involve infinitesimals.
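To make the contrast concrete, here is the limit-based definition of the derivative that Weierstrass-era analysis substitutes for infinitesimal talk (this is the standard textbook definition, not anything specific to the quoted post):

```latex
f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}
```

The limit itself is defined by the epsilon-delta condition: for every $\epsilon > 0$ there is a $\delta > 0$ such that $0 < |h| < \delta$ implies $\left| \frac{f(x+h)-f(x)}{h} - f'(x) \right| < \epsilon$. Nothing here is "infinitely small": $h$, $\epsilon$, and $\delta$ are ordinary real numbers.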
That said, although I don't use the term, I do use the idea of infinitesimals when I teach calculus: just as they did for Newton, they give an intuitive sense of how to use calculus.
Also, there is a branch of mathematics called nonstandard analysis that develops the notion of infinitesimals rigorously and uses them to give an alternative development of analysis. I don't know much about it myself, but it has never really caught on in most of the mathematical world, since it isn't clear that it gives anything more than standard limit-based analysis does.
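For readers curious about how this works (a rough sketch of the standard presentation, due to Robinson): nonstandard analysis extends the reals to the hyperreals, which contain genuine infinitesimals, i.e. nonzero numbers $\varepsilon$ with $|\varepsilon|$ smaller than every positive real. The derivative is then recovered by taking the "standard part" (nearest real number) of a hyperreal difference quotient:

```latex
f'(x) = \operatorname{st}\!\left( \frac{f(x + \varepsilon) - f(x)}{\varepsilon} \right)
```

for any nonzero infinitesimal $\varepsilon$. This is provably equivalent to the limit definition, which is part of why it is debated whether the approach buys anything new.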
Of course, it is possible that Robinson (the originator of nonstandard analysis) will have the last laugh if and when it proves to be useful in developing mathematics appropriate for Planck scales.
I have always preferred, as guides to human action, messy hypothetical imperatives like the Golden Rule, based on negotiation, compromise and general respect, to the Kantian categorical imperatives of absolute righteousness, in whose name we so often murder and maim until we decide that we had followed the wrong instantiation of the right generality. -- Stephen Jay Gould