High-Efficiency Min-Entropy Estimation Based on Neural Network for Random Number Generators

A random number generator (RNG) is a fundamental cryptographic primitive that underpins the network and communication security of cryptographic applications in the Internet age. If the random numbers an application uses do not provide the expected randomness (unpredictability), the application becomes vulnerable to security threats and may even suffer system crashes. Min-entropy is a measure commonly employed to quantify this unpredictability. NIST Special Publication 800-90B adopts min-entropy in the design of its statistical entropy estimation methods, and the prediction-based estimators added in the second draft of the standard effectively improve the overall capability of the test suite. However, these predictors suffer from a limited scope of application and high computational complexity: their high-order polynomial time complexity makes them ill-suited to evaluating random numbers with long-range dependence or multivariate (large sample-space) outputs. Meanwhile, neural networks have attracted increasing attention for modeling and forecasting time series, and random number sequences are themselves a type of time series. In this work, we propose several new and efficient approaches to min-entropy estimation based on neural network techniques, and we design a novel execution strategy that makes the proposed estimation applicable to the validation of both stationary and nonstationary sources. Compared with the 90B predictors officially published in 2018, experimental results on various simulated and real-world data sources demonstrate that our predictors achieve better accuracy, a wider scope of applicability, and higher execution efficiency. On average, our predictors run up to 10 times faster than the 90B's for a sample size of 10^6 across different sample spaces. Furthermore, when the sample space exceeds 2^2 and the sample size exceeds 10^8, the 90B predictors cannot produce estimates at all, whereas ours still provide accurate results. Copyright © 2019 John Wiley & Sons, Ltd.
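The min-entropy measure discussed in the abstract is defined as H_min = -log2(max_i p_i), i.e., it is governed by the single most likely output symbol. As a minimal illustrative sketch (not the paper's code), the "most common value" style of estimator from SP 800-90B can be approximated like this, using a 99% upper confidence bound on the observed frequency of the likeliest symbol:

```python
import math
from collections import Counter

def mcv_min_entropy(samples):
    """Most-common-value min-entropy estimate in the spirit of
    NIST SP 800-90B: bound the probability of the likeliest symbol
    from above with a 99% confidence interval, then take -log2."""
    n = len(samples)
    p_hat = Counter(samples).most_common(1)[0][1] / n
    # Upper confidence bound on p_hat (z = 2.576 for 99% confidence);
    # a larger bound gives a smaller, more conservative entropy estimate.
    p_u = min(1.0, p_hat + 2.576 * math.sqrt(p_hat * (1 - p_hat) / (n - 1)))
    return -math.log2(p_u)
```

For a heavily biased binary source (90% zeros) this yields well under 1 bit of min-entropy per sample, which is the conservative behavior an entropy validation suite needs.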

Bibliographic Details
Main Authors: Na Lv, Tianyu Chen, Shuangyi Zhu, Jing Yang, Yuan Ma, Jiwu Jing, Jingqiang Lin
Format: Article
Language: English
Published: Hindawi-Wiley, 2020-01-01
Series: Security and Communication Networks
Online Access: http://dx.doi.org/10.1155/2020/4241713
Record ID: doaj-29c886a14d404d31a46ef05c9397f2db
Affiliations:
Na Lv, Tianyu Chen, Shuangyi Zhu, Yuan Ma, Jingqiang Lin: State Key Laboratory of Information Security, Institute of Information Engineering, Chinese Academy of Sciences, Beijing 100093, China
Jing Yang: School of Computer Science and Technology, University of Chinese Academy of Sciences, Beijing 100093, China
Jiwu Jing: China Information Technology Security Evaluation Center, Beijing 100085, China
Collection: DOAJ
ISSN: 1939-0114, 1939-0122
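The prediction-based estimation strategy summarized in the abstract converts a predictor's observed accuracy into min-entropy: if each sample can be guessed from its predecessors with probability p, the source offers at most -log2(p) bits per sample. The sketch below illustrates that conversion with a simple order-1 frequency model standing in for the paper's neural-network predictor; all names and the choice of model are illustrative, not the authors' implementation or the 90B reference code:

```python
import math
from collections import Counter, defaultdict

def predictor_min_entropy(samples, order=1):
    """Sketch of a prediction-based min-entropy estimate: predict each
    sample from its preceding context, then convert the observed
    prediction accuracy into min-entropy via -log2(accuracy)."""
    counts = defaultdict(Counter)   # context -> symbol frequencies
    correct, total = 0, 0
    k = len(set(samples))           # sample-space size
    for i in range(order, len(samples)):
        ctx = tuple(samples[i - order:i])
        if counts[ctx]:
            # Predict the symbol seen most often after this context so far.
            guess = counts[ctx].most_common(1)[0][0]
            correct += (guess == samples[i])
            total += 1
        counts[ctx][samples[i]] += 1
    # Accuracy can never meaningfully drop below random guessing (1/k).
    p = max(correct / total if total else 0.0, 1.0 / k)
    return -math.log2(p)
```

A fully periodic sequence such as 0,1,0,1,... is predicted perfectly after a brief warm-up, so its estimated min-entropy collapses toward 0 bits, which is exactly the failure a predictor-based test is meant to expose.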