Zero has at least two purposes.

At its simplest, it is a placemarker for the absence of anything else, and it is the identity for addition.

If you use a positional numbering system (as the Babylonians did, and as we do, but unlike the Romans or Hebrews), then you need a symbol for a position where there is nothing.

In our decimal numbering system, we have the numerals 1 to 9. If we wish to express the number 12, we place a 1 in the tens column and a 2 in the units column. But if we wish to express the number ten, we have a 1 in the tens column and need a symbol in the units column to denote that there are no units beyond the ten. Similarly, to write one hundred and one, we write a 1 in the hundreds column and a 1 in the units column, but we now need a symbol in the tens column to say there are no tens, and for that we use the symbol '0'.

The Romans did things differently. They had different symbols for their units, their tens, and their hundreds, so it was clear when one group was missing without having to fill the space explicitly with a symbol (the Romans did not use a zero). In Roman numbers, one was 'I' but ten was 'X', so to write 12 you would write 'XII'; yet if you saw just an 'X', you did not need a further symbol to tell you there were no units over the ten, because the value of 'X' does not depend on its position (in the decimal system, a '1' with nothing following it reads as one, but an 'X' with nothing following it is still known to be ten). Similarly, for the Romans the number one hundred was denoted by the letter 'C', so they could unambiguously write one hundred and one as 'CI' and one hundred and eleven as 'CXI', and nowhere did they need a placemarker for a missing digit; whereas we must write '101', since if we saw just '11' we would read it as eleven rather than one hundred and one.
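The contrast can be sketched in a few lines of code. This is an illustrative modern sketch, not a historical algorithm: the `to_roman` helper below is a standard greedy conversion using the later subtractive forms ('IV', 'IX', etc.), which the Romans themselves used inconsistently.

```python
def to_roman(n: int) -> str:
    """Roman numerals: distinct symbols per magnitude, so an empty
    column simply produces no symbol at all -- no zero needed."""
    values = [
        (1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
        (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
        (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I"),
    ]
    out = []
    for value, symbol in values:
        while n >= value:
            out.append(symbol)
            n -= value
    return "".join(out)

# Positional decimal must fill every column, so ten is '10' and one
# hundred and one is '101'; the Roman forms need no placemarker.
for n in (10, 12, 101, 111):
    print(n, "->", to_roman(n))
# 10 -> X, 12 -> XII, 101 -> CI, 111 -> CXI
```

Note how '101' needs the '0' to keep the two 1s in the right columns, while 'CI' carries the same information in the symbols themselves.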

The other function of zero is as an identity for addition. The concept must be as old as trade: if you had a debt of $100 and paid $100, the remaining debt would be zero. But much of early abstract mathematics dealt more with geometry than with algebra, so it had little use for negative numbers and thus little need for zero. It was only with the adoption of algebra (from the Arabic al-jabr; it was from the Arabs that Europeans learnt the modern numbering system, including the zero, though the Arabs themselves learnt it from the Indians) that algebraic addition and subtraction were taken up, and with them the number zero as the value obtained when you subtract a number from itself (or, equivalently, add a number to its own negative).
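The identity property described above can be checked mechanically. This is a minimal illustration (the sample values are arbitrary), showing that for any number x, x − x = 0, x + (−x) = 0, and adding zero leaves x unchanged:

```python
from fractions import Fraction

# Zero as the additive identity, checked over a few sample numbers,
# including the $100 debt example from the text.
for x in (100, -7, Fraction(3, 4)):
    assert x - x == 0      # subtracting a number from itself
    assert x + (-x) == 0   # equivalently, adding its own negative
    assert x + 0 == x      # zero leaves any number unchanged

print("zero acts as the additive identity for these samples")
```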

Infinity is another number that has long been part of human philosophy: the idea that something goes on forever without end. The word itself is from the Latin 'infinitas', meaning 'unboundedness'. The concept is relevant both to geometry (the meeting point of two parallel lines) and to algebra, and is key to calculus; it remains an area of mathematics that continues to develop, particularly in set theory and the hyperreal numbers. The use of the symbol '∞' to denote infinity is usually credited to John Wallis in the mid-17th century.