One Approach to Solving the Tokenization Problem for the Analysis of Large-Scale Collections of User-Defined Passwords

This paper analyzes the password tokenization algorithm introduced by R. Veras et al. We show the main limitations of that approach and propose a new tokenization algorithm, RGramToken, based on frequency dictionaries of English words, bigrams, and trigrams. Our approach allows better...
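The abstract only names the building blocks of the approach, so below is a minimal, hypothetical sketch of dictionary-based password segmentation in Python. It is not the authors' RGramToken: the toy frequency dictionary, the scoring function, and the out-of-vocabulary length penalty are all illustrative assumptions.

from math import log

# Hypothetical word frequencies; a real run would load large English
# word, bigram, and trigram frequency dictionaries instead.
WORD_FREQ = {
    "password": 50000, "pass": 30000, "word": 80000,
    "love": 120000, "you": 900000, "dragon": 15000,
}

def score(segment: str) -> float:
    # Log-frequency score for known words; unknown chunks get a length penalty.
    if segment in WORD_FREQ:
        return log(WORD_FREQ[segment])
    return -5.0 * len(segment)  # assumed penalty, not taken from the paper

def tokenize(password: str) -> list[str]:
    # Dynamic-programming segmentation that maximizes the total score.
    n = len(password)
    best = [float("-inf")] * (n + 1)
    best[0] = 0.0
    back = [0] * (n + 1)
    for end in range(1, n + 1):
        for start in range(max(0, end - 12), end):
            cand = best[start] + score(password[start:end])
            if cand > best[end]:
                best[end] = cand
                back[end] = start
    # Walk back through the recorded split points to recover the tokens.
    tokens, pos = [], n
    while pos > 0:
        tokens.append(password[back[pos]:pos])
        pos = back[pos]
    return list(reversed(tokens))

if __name__ == "__main__":
    print(tokenize("iloveyoudragon"))  # e.g. ['i', 'love', 'you', 'dragon']

A full implementation in the spirit of the paper would replace the toy dictionary with large word, bigram, and trigram frequency tables and tune how out-of-vocabulary segments are scored.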


Bibliographic Details
Main Authors: Andrey N. Kuznetsov, Dmitry A. Vyshemirsky
Format: Article
Language: English
Published: Moscow Engineering Physics Institute, 2017-06-01
Series: Bezopasnostʹ Informacionnyh Tehnologij
Online Access: https://bit.mephi.ru/index.php/bit/article/view/105