Where the simplest numbers, the natural numbers, are concerned, a really important part of /number/ as such is the notion of a bijection.
There are two cows in a field, a Holstein and a Guernsey. There are two fruits on the picnic table near their grazing area - an apple and a kiwi.
Why is it that each of these collections of things can have the label /two/ attached to it? What is it that they have in /common/, that they (the pairs) are each "/two/"? This is where you start getting at what number actually is, and thinking about it with some depth.
In each case, you can take each thing in one collection and "assign" it to something in the other collection, so that everything in one collection gets associated with exactly one thing in the other, and nothing is left over in either collection. These two conditions, taken together, define what a /bijection/ is, which is just a special type of mapping, or function, or more simply: a rule for assigning things in one collection to things in another collection.
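To make that concrete, here's a minimal sketch - my own illustration, not part of the argument above - that checks those two conditions for finite collections, assuming the collections are represented as sets and a proposed assignment as a dictionary:

#+begin_src python
# A minimal sketch (my own illustration, with made-up helper names): the two
# collections are Python sets, and a proposed assignment is a dict mapping
# items of the first collection to items of the second.
def is_bijection(assignment, collection_a, collection_b):
    # Condition 1: every item in collection_a is assigned to exactly one item
    # (a dict already gives at most one target per key, so we just check that
    # the keys cover all of collection_a).
    covers_a = set(assignment) == collection_a
    targets = list(assignment.values())
    # Condition 2: nothing is left over in either collection - every item of
    # collection_b gets used, and no two items of collection_a share a target.
    nothing_left_over = set(targets) == collection_b
    no_shared_targets = len(set(targets)) == len(targets)
    return covers_a and nothing_left_over and no_shared_targets

cows = {"Holstein", "Guernsey"}
fruits = {"apple", "kiwi"}

print(is_bijection({"Holstein": "apple", "Guernsey": "kiwi"}, cows, fruits))   # True
print(is_bijection({"Holstein": "apple", "Guernsey": "apple"}, cows, fruits))  # False: kiwi left over
#+end_src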
The number, then, in the natural-number case anyway, is exactly this possibility of mapping between sets: for each number, what all the collections of that size have in common is that they can be put in bijection with one another.
This concept of bijection underpins our concept of number itself, at least where counting is concerned (as opposed to measuring, comparing lengths, etc.). It's so fundamental that it can be extended to infinite sets in order to compare them (this is why mathematicians say that certain infinite sets are "bigger" than others, though to a neophyte this seems like a senseless statement). If no bijection exists between two infinite sets - if, no matter how you try to pair them off, one set always has stuff left over - then the set with stuff left over is said to be some version of "bigger".
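To see how that comparison actually works, here's one more small sketch - again my own illustration, not something claimed above - of the classic pairing rule between the natural numbers and the even numbers. A bijection exists, so by this way of counting the two infinite sets come out the "same size", even though one sits strictly inside the other; the "bigger" cases are exactly the ones where no such rule can be found at all.

#+begin_src python
# A small sketch (my own illustration): the pairing rule n -> 2n matches every
# natural number with exactly one even number, and every even number with
# exactly one natural (just halve it), with nothing left over on either side.
from itertools import count, islice

def natural_to_even(n):
    return 2 * n

def even_to_natural(m):
    return m // 2

# Print the first few pairs of the (infinite) pairing; the round trip shows
# that the rule is reversible, which is what makes it a bijection.
for n in islice(count(0), 5):
    m = natural_to_even(n)
    assert even_to_natural(m) == n
    print(f"{n} <-> {m}")
#+end_src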