Realizing Rough Set Theory with Spark for Large Scale Information Systems
Master's thesis === Yuan Ze University (元智大學) === Department of Electrical Engineering === 104 === Apache Spark, an alternative to Hadoop MapReduce, is currently one of the most active open source projects in the big data world. The major characteristic that distinguishes Spark from Hadoop is that it is a cluster computing framework that lets users perform in-memory computations...
| Main Authors: | Kuo-Min Huang, 黃國閔 |
|---|---|
| Other Authors: | Kan-Lin Hsing |
| Format: | Others |
| Language: | en_US |
| Published: | 2016 |
| Online Access: | http://ndltd.ncl.edu.tw/handle/14274642376799824073 |
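The abstract describes realizing rough set theory on Spark's in-memory cluster computing model. The following is a minimal, hypothetical Scala sketch, not the thesis author's implementation, showing how the two core rough-set operators (lower and upper approximation of a decision class) can be expressed as Spark RDD transformations over an information system; the toy data, the target decision label, and the `local[*]` master setting are illustrative assumptions.

```scala
// Hypothetical sketch: rough-set lower/upper approximations with Spark RDDs.
// The sample information system and target class are illustrative only.
import org.apache.spark.sql.SparkSession

object RoughSetApprox {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("RoughSetApprox")
      .master("local[*]")   // assumption: local mode for demonstration
      .getOrCreate()
    val sc = spark.sparkContext

    // Toy information system: (objectId, condition attribute values, decision)
    val table = sc.parallelize(Seq(
      (1, Seq("high", "yes"), "flu"),
      (2, Seq("high", "yes"), "flu"),
      (3, Seq("high", "no"),  "flu"),
      (4, Seq("low",  "no"),  "healthy"),
      (5, Seq("high", "no"),  "healthy")
    ))

    // Equivalence classes of the indiscernibility relation IND(B):
    // group objects by their condition-attribute values.
    val classes = table
      .map { case (id, conds, dec) => (conds, (Set(id), Set(dec))) }
      .reduceByKey { case ((ids1, decs1), (ids2, decs2)) =>
        (ids1 ++ ids2, decs1 ++ decs2)
      }

    val target = "flu"  // decision class X whose approximations we compute

    // Lower approximation: union of classes fully contained in X.
    val lower = classes
      .filter { case (_, (_, decs)) => decs == Set(target) }
      .flatMap { case (_, (ids, _)) => ids }
      .collect().toSet

    // Upper approximation: union of classes that intersect X.
    val upper = classes
      .filter { case (_, (_, decs)) => decs.contains(target) }
      .flatMap { case (_, (ids, _)) => ids }
      .collect().toSet

    println(s"Lower approximation of '$target': $lower")  // Set(1, 2)
    println(s"Upper approximation of '$target': $upper")  // Set(1, 2, 3, 5)

    spark.stop()
  }
}
```

Keeping the equivalence-class construction as a single `reduceByKey` lets the grouping happen in memory across the cluster, which is the Spark characteristic the abstract contrasts with Hadoop MapReduce.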
Similar Items
- Rough sets theory for developing systematic indicators of sustainable transportation
  by: Min-Wei Huang, et al.
  Published: (2005)
- Increment Attributes Set Reduction Algorithm of Information System Based on Rough Set Theory
  by: Yu-Shien Ho, et al.
  Published: (2010)
- The Exploration of Data Analysis Using Rough Set Theory
  by: Cheng-Yung Huang, et al.
- Large-scale e-learning recommender system based on Spark and Hadoop
  by: Karim Dahdouh, et al.
  Published: (2019-01-01)
- Realizability interpretations for intuitionistic set theories
  by: Dihoum, Eman Emhemed
  Published: (2016)