Authors
Amit Jain, Kamaljit I. Lakhtaria, Prateek Srivastava
Publication date
2013/12
Conference paper
Advances in Engineering and Technology
Pages
536-543
Publisher
Elsevier
Description
With the rapidly increasing amount of text data being stored, efficient information retrieval and storage in the compressed domain have become major concerns. Compression is a coding process that effectively reduces the total number of bits needed to represent information, and data compression has been one of the critical enabling technologies for the ongoing digital multimedia revolution. Many data compression algorithms are available for files of different formats. This paper surveys four basic lossless data compression algorithms applied to English text files: LZW, Huffman, fixed-length code (FLC), and Huffman applied after fixed-length code (HFLC). All of these algorithms are evaluated and tested on text files of different sizes. To identify the best algorithm among them, they are compared in terms of compressed size, compression ratio, time (speed), and entropy. The paper concludes by indicating which algorithm performs best on text data.
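As an illustration of the evaluation metrics the abstract mentions (entropy, compressed size, and compression ratio), the following is a minimal Python sketch; it is not the authors' implementation, and the sample text, function names, and the fixed-length-code baseline shown here are my own assumptions for demonstration.

```python
import math
from collections import Counter


def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of the byte sequence, in bits per symbol."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())


def compression_ratio(original_size: int, compressed_size: int) -> float:
    """Compressed size divided by original size (lower is better)."""
    return compressed_size / original_size


if __name__ == "__main__":
    # Hypothetical sample input; the paper evaluates real English text files of various sizes.
    text = b"this is a small sample of english text for illustration only"

    # Entropy lower-bounds the average code length any lossless coder
    # (Huffman, LZW, FLC, HFLC) can achieve for this symbol distribution.
    h = shannon_entropy(text)

    # A fixed-length code over the distinct symbols spends ceil(log2(alphabet size)) bits per symbol.
    flc_bits = math.ceil(math.log2(len(set(text)))) * len(text)
    ascii_bits = 8 * len(text)

    print(f"entropy: {h:.3f} bits/symbol")
    print(f"fixed-length code: {flc_bits} bits "
          f"(ratio {compression_ratio(ascii_bits, flc_bits):.3f} vs. 8-bit ASCII)")
```

A variable-length coder such as Huffman can approach the entropy bound printed above, which is why the survey reports entropy alongside size, ratio, and speed.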
Total citations
Citations per year, 2014–2021 (histogram)
Scholar articles
A Jain, KI Lakhtaria, P Srivastava - Proc. Int. Conf. Adv. Comput. Sci. (AETACS), 2013