Accepting that numbers can do strange, new things is one of the toughest parts of math:
- There are numbers between the numbers we count with? (Yes — decimals)
- There’s a number for nothing at all? (Sure — zero)
- The number line is two-dimensional? (You bet — imaginary numbers)
Calculus is a beautiful subject, but it challenges some long-held assumptions:
- Numbers don’t have to be perfectly accurate?
- Numbers aren’t all the same size (i.e., not everything is 1 times some ordinary number)?
Today’s post introduces a new way to think about accuracy and infinitely small numbers. This isn’t a rigorous course on analysis — it’s my way of grappling with the ideas behind Calculus.