The Encrypted Token Pattern is a common defense against CSRF attacks. However, if it is not implemented properly, with the right configuration, the approach becomes ineffective.
The Encrypted Token Pattern uses an encryption-based validation method to authenticate requests rather than relying on multiple tokens.
After the user authenticates successfully, the server creates a unique token composed of the user's ID, a timestamp, and a secret key available only on the server. This token is returned to the client and embedded in a hidden form field.
When the client submits a subsequent request, the server reads and decrypts the token value with the same key used to create it. Failure to decrypt the token correctly indicates an intrusion attempt. For instance, a developer might use HMAC-SHA1, keyed with the server-side secret, to sign the user-supplied values and generate the token:
CSRF Token = HMAC-SHA1(secret key, user ID + timestamp)
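A minimal sketch of this scheme in Python follows. The helper names are hypothetical, and HMAC-SHA256 is substituted for SHA-1, which is no longer recommended for new designs:

```python
import hashlib
import hmac
import time

SECRET_KEY = b"server-side-secret-key"  # assumed: loaded from secure config

def generate_csrf_token(user_id: str) -> str:
    """Sign (user ID, timestamp) with the server-held key."""
    timestamp = str(int(time.time()))
    message = f"{user_id}:{timestamp}".encode()
    mac = hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()
    # The token carries its inputs so the server can recompute the MAC.
    return f"{user_id}:{timestamp}:{mac}"

def validate_csrf_token(token: str, expected_user_id: str,
                        max_age: int = 3600) -> bool:
    """Recompute the MAC and reject forged, tampered, or stale tokens."""
    try:
        user_id, timestamp, mac = token.rsplit(":", 2)
        issued_at = int(timestamp)
    except ValueError:
        return False
    message = f"{user_id}:{timestamp}".encode()
    expected_mac = hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()
    # Constant-time comparison avoids timing side channels on the check.
    return (
        hmac.compare_digest(mac, expected_mac)
        and user_id == expected_user_id
        and time.time() - issued_at <= max_age
    )
```

Because the server can always recompute the MAC from the token's own fields plus its secret key, no per-session token storage is needed on the server side.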
However, the security of this method relies heavily on several factors:
The cryptographic function used to create the token must be of high quality. We have often observed developers using simple Base64 encoding, which is a reversible encoding rather than a cryptographic function, to generate what they believe to be "high-entropy" tokens.
This can be devastating, as attackers can then easily compromise the authentication system with forged tokens. If hash functions are used, it is highly recommended to use SHA-2 or stronger, combined with salts.
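The difference is easy to demonstrate: Base64 is reversible by anyone, while a salted SHA-256 digest is one-way. A small illustrative sketch (the input values are made up):

```python
import base64
import hashlib
import secrets

# Base64 only encodes: anyone can reverse it to recover the input.
naive_token = base64.b64encode(b"user42:1700000000").decode()
recovered = base64.b64decode(naive_token)  # original bytes, no secret needed

# A salted SHA-256 digest is one-way, and the random salt defeats
# precomputed (rainbow-table) attacks against common inputs.
salt = secrets.token_bytes(16)
digest = hashlib.sha256(salt + b"user42:1700000000").hexdigest()
```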
If encryption or an HMAC is used to generate the token, ensure that the key is a strong, pseudo-random value that is not easily guessable. Once an attacker identifies the key used to generate tokens, forging tokens becomes trivial.
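Python's `secrets` module, for example, draws from the operating system's CSPRNG and is a reasonable way to generate such a key (a sketch; secure key storage is out of scope here):

```python
import secrets

# Never derive the key from predictable sources (timestamps, PIDs,
# hard-coded strings in the repository); use the OS CSPRNG instead.
hmac_key = secrets.token_bytes(32)          # 256 bits of entropy
printable_key = secrets.token_urlsafe(32)   # same entropy, printable form
```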
In the case of tokenization, these data vaults are also referred to as token vaults, and they can be either on-premise or cloud-based. "Vault-based" means that both the sensitive data (such as a credit card number) and its token are mapped and stored together in a lookup table.
Managing these tables becomes a significant challenge as transaction volume grows: every tokenization operation stores a record pairing the card data with its token, so a card used multiple times can accumulate multiple tokens.
The challenge, then, is that the table size keeps increasing. This project presents a solution in which an on-premise or cloud-based tokenization service protects sensitive and confidential data using vault tokens with high performance.
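The mapping and the growth problem can be sketched as follows; reusing the existing token for a previously seen card is one simple way to keep the table from growing with every transaction. This is an in-memory toy with invented names, not a production vault, which would use an encrypted, access-controlled database:

```python
import secrets

class TokenVault:
    """Toy in-memory token vault: maps card numbers to surrogate tokens."""

    def __init__(self) -> None:
        self._by_pan: dict[str, str] = {}    # card number -> token
        self._by_token: dict[str, str] = {}  # token -> card number

    def tokenize(self, pan: str) -> str:
        # Reuse the existing token for a known card so the table stores
        # one row per card, not one row per transaction.
        if pan in self._by_pan:
            return self._by_pan[pan]
        token = secrets.token_hex(8)
        self._by_pan[pan] = token
        self._by_token[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        return self._by_token[token]
```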
Leading commercial tokenization vendors include Voltage and Protegrity.