The volume of sensor and derived data produced by large scientific experiments, such as earth observation programs, radio astronomy sky surveys, and high-energy physics, already exceeds the storage hardware fabricated globally per year. As a result, cold storage data archives are the often overlooked spearheads of modern big data analytics in scientific, data-intensive application domains. While high-performance data analytics has received much attention from the research community, the growing number of problems in designing and deploying cold storage archives has received very little attention. In this paper, we take the first step towards bridging this gap in knowledge by presenting an analysis of four real-world cold storage archives from three different application domains. In doing so, we highlight (i) workload characteristics that differentiate these archives from traditional, performance-sensitive data analytics, (ii) design trade-offs involved in building cold storage systems for these archives, and (iii) deployment trade-offs with respect to migration to the public cloud. Based on our analysis, we discuss several other important research challenges that need to be addressed by the data management community.
Cold storage data archives: More than just a bunch of tapes
DAMON 2019, 15th International Workshop on Data Management on New Hardware, Held with ACM SIGMOD/PODS 2019, July 01, 2019, Amsterdam, Netherlands
© ACM, 2019. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in DAMON 2019, 15th International Workshop on Data Management on New Hardware, Held with ACM SIGMOD/PODS 2019, July 01, 2019, Amsterdam, Netherlands http://dx.doi.org/10.1145/3329785.3329921
PERMALINK : https://www.eurecom.fr/publication/5858