Cache-Aware Batch Update for In-Memory Databases

Bibliographic Details
Main Author: Ting-Kang Chang (張庭綱)
Other Authors: 陳銘憲
Format: Others
Language: en_US
Published: 2017
Online Access: http://ndltd.ncl.edu.tw/handle/h5v5gg
Description
Summary: Master's thesis === National Taiwan University === Graduate Institute of Electrical Engineering === 105 === Low-latency, high-throughput in-memory DBMSs have attracted increasing attention in both research and practice in recent years, thanks to advances in hardware. Many works have also been proposed to address durability issues for in-memory storage in the upcoming NVRAM era. However, very few of them focus on cache utilization for low-locality, update-intensive workloads, which typically suffer poor cache utilization and, as a result, poor performance. We design a cache-aware batch update model to improve cache efficiency for such workloads. By buffering update requests in cache, the system can aggregate spatially close requests that arrive far apart in time into a single batched update, avoiding unnecessary re-fetching of data from memory, thereby reducing read latency and improving overall throughput. Experiments show that the proposed cache-aware model eliminates up to 75% of last-level-cache misses and achieves up to a 1.65x speedup over a cache-oblivious reference model.
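
The batching mechanism described in the abstract lends itself to a short illustration. The C++ sketch below stages update requests grouped by the cache line they touch, then applies each group in one pass so a line is fetched from memory at most once per batch. This is a minimal sketch under assumed details, not the thesis's implementation: the names UpdateBuffer, kLineBytes, and kFlushThreshold, the map-based grouping, and the count-based flush policy are all illustrative choices, and a real design would size the staging structure so that it itself stays cache-resident.

// Minimal sketch of cache-aware batch update: stage updates, group them by
// cache line, and apply each group together. Illustrative only; all names
// and the flush policy are assumptions, not the thesis's actual code.
#include <cstdint>
#include <cstddef>
#include <unordered_map>
#include <vector>

constexpr std::size_t kLineBytes = 64;  // typical last-level-cache line size

struct Update {
    std::size_t index;    // element index into the table
    std::int64_t value;   // new value to write
};

class UpdateBuffer {
public:
    explicit UpdateBuffer(std::vector<std::int64_t>& table) : table_(table) {}

    // Stage an update instead of applying it immediately. Updates that hit
    // the same cache line are grouped, even if they arrive far apart in time.
    void Stage(std::size_t index, std::int64_t value) {
        std::size_t line = (index * sizeof(std::int64_t)) / kLineBytes;
        pending_[line].push_back({index, value});
        if (++staged_ >= kFlushThreshold) Flush();
    }

    // Apply all staged updates one cache line at a time, so each touched
    // line is brought in from memory at most once per batch.
    void Flush() {
        for (auto& [line, updates] : pending_) {
            for (const Update& u : updates) table_[u.index] = u.value;
        }
        pending_.clear();
        staged_ = 0;
    }

private:
    // Illustrative threshold; a real system would bound the buffer so it
    // fits in cache rather than counting staged requests.
    static constexpr std::size_t kFlushThreshold = 4096;
    std::vector<std::int64_t>& table_;
    std::unordered_map<std::size_t, std::vector<Update>> pending_;
    std::size_t staged_ = 0;
};

Under a low-locality stream, two requests to neighboring indices may arrive thousands of requests apart; a cache-oblivious system would fetch the same line from memory twice, while the staged batch writes both values during a single residency of that line.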