Hashing: computing a hash value from the original input using a selected algorithm. The same hash value will always be generated for the same input.
Tokenising: generating a random, unique value (token) and associating it with the original value. A new token will be generated each time, even for the same input.
(these are not formal/academic definitions)
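A minimal sketch in Python to illustrate the difference (the function and table names are just illustrative, not from any particular library): hashing is deterministic, while tokenising hands back a fresh random token each time and records the mapping in a de-tokenisation table.

```python
import hashlib
import secrets

def hash_value(value: str) -> str:
    # Deterministic: the same input always produces the same digest.
    return hashlib.sha256(value.encode()).hexdigest()

# De-tokenisation table: token -> original value.
detokenisation_table: dict[str, str] = {}

def tokenise(value: str) -> str:
    # Random: a fresh token is generated on every call, even for the same input.
    token = secrets.token_hex(16)
    detokenisation_table[token] = value
    return token

print(hash_value("alice@example.com") == hash_value("alice@example.com"))  # True
print(tokenise("alice@example.com") == tokenise("alice@example.com"))      # False
```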
In a nutshell - the association (aka the de-tokenisation table) is the piece that glues the token and the value together, so if the association table is destroyed the value is truly anonymous. You could still use various context-based de-anonymisation techniques to guess the original value, but there is no longer any strong association.
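Continuing the sketch above, destroying the de-tokenisation table is what severs the link; the token itself carries no information about the original value.

```python
token = tokenise("alice@example.com")
detokenisation_table.clear()             # destroy the association table
print(detokenisation_table.get(token))   # None - the token no longer maps back to anything
```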