tokenization [‚tō·kən·ə′zā·shən] (computer science) The conversion of keywords of a programming language to tokens in order to conserve storage space.

tokenization (payments) In an EMV-based credit or debit card system, the creation of a unique number for a transaction (the token) that is used in place of the customer's primary account number (PAN). See EMV.
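A minimal sketch of the first sense, assuming a hypothetical keyword table: each language keyword is mapped to a compact one-byte code, so stored source text holds the code rather than the full keyword string. The table and function names here are illustrative, not from any particular language implementation.

```python
# Hypothetical keyword-to-token table: each keyword maps to a one-byte code,
# so a stored program holds the code instead of the multi-character keyword.
KEYWORD_TOKENS = {"if": 0x80, "then": 0x81, "else": 0x82, "for": 0x83}

def tokenize(source_words):
    """Replace known keywords with their token codes; pass other words through."""
    return [KEYWORD_TOKENS.get(word, word) for word in source_words]

print(tokenize(["if", "x", "then", "y", "else", "z"]))
```

Early BASIC interpreters used exactly this kind of scheme, storing each keyword as a single byte to conserve memory.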