Modeling the processing of a large amount of data
Main Authors: | G. T. Balakayeva, D. K. Darkenbayev |
---|---|
Format: | Article |
Language: | English |
Published: | Al-Farabi Kazakh National University, 2018-08-01 |
Series: | Вестник КазНУ. Серия математика, механика, информатика |
Subjects: | large amounts of data; data processing; analysis; modeling; methods |
Online Access: | https://bm.kaznu.kz/index.php/kaznu/article/view/490/392 |
ISSN: | 1563-0277, 2617-4871 |
DOI: | https://doi.org/10.26577/jmmcs-2018-1-490 |
description |
The term Big Data refers to technologies for storing and analyzing very large volumes of data whose
processing demands high speed and real-time decision-making. Typically, when serious analysis is
discussed, especially when the term Data Mining is used, it is implied that a huge amount of data is
involved. There are no universal analysis methods or algorithms suitable for every case and every
volume of information: data analysis methods differ significantly in performance, quality of results,
usability, and data requirements. Optimization can be carried out at various levels: hardware,
databases, the analytical platform, preparation of the source data, and specialized algorithms.
Big Data is a set of technologies designed to perform three operations. First, to process volumes
of data that are large compared to "standard" scenarios. Second, to work with rapidly arriving data
in very large volumes; that is, there is not simply a lot of data, but its amount is constantly
growing. Third, to work with structured and poorly structured data in parallel and in different
aspects. Big Data implies that the algorithms receive a stream of not-always-structured information
as input, and that more can be extracted from it than any single insight. The results of this study
are used by the authors in modeling large data sets and in developing a web application. |
topic |
large amounts of data; data processing; analysis; modeling; methods |
url |
https://bm.kaznu.kz/index.php/kaznu/article/view/490/392 |
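The abstract names three operations that Big Data technologies must perform: processing unusually large volumes, consuming a constantly growing stream of incoming data, and handling structured and poorly structured records in parallel. The following is a minimal, hypothetical sketch of that idea on a toy in-memory stream; it is not taken from the article, and the record shapes, field names, and the `user=` log format are assumptions for illustration only.

```python
# Illustrative sketch (not from the article): normalize a stream of mixed
# structured and poorly structured records, then aggregate incrementally
# so the stream can keep growing without holding it all in memory.
import json
import re
from collections import Counter

def normalize(record):
    """Coerce one incoming record into a common dict shape.

    Structured input: a dict, or a JSON object string, with a "user" key.
    Poorly structured input: a free-text line such as "user=alice ...".
    Unparseable records are tagged rather than dropped.
    """
    if isinstance(record, dict):                    # already structured
        return {"user": record.get("user", "unknown")}
    try:                                            # structured as JSON text
        return {"user": json.loads(record).get("user", "unknown")}
    except (ValueError, AttributeError):
        m = re.search(r"user=(\w+)", str(record))   # semi-structured text
        return {"user": m.group(1) if m else "unparsed"}

def process_stream(stream):
    """Consume the stream one record at a time (constant memory),
    counting events per user as new data keeps arriving."""
    counts = Counter()
    for record in stream:
        counts[normalize(record)["user"]] += 1
    return counts

events = [
    {"user": "alice", "action": "login"},   # structured dict
    '{"user": "bob", "action": "view"}',    # structured JSON string
    "2018-08-01 user=alice action=click",   # semi-structured log line
    "corrupted ####",                       # poorly structured
]
print(process_stream(events))  # Counter({'alice': 2, 'bob': 1, 'unparsed': 1})
```

Because `process_stream` only iterates, the same code accepts a generator over a file or socket instead of a list, which is the streaming scenario the abstract describes.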