Institution: Louisiana State University
Department: Electrical & Computer Engineering
Keywords: markov models; information theory; arithmetic coding; huffman coding; data compression
Full text PDF: http://etd.lsu.edu/docs/available/etd-07032007-100117/
Commonly used data compression techniques do not necessarily achieve maximal compression, nor do they define the most efficient framework for data transmission. In this thesis we investigate variants of the standard compression algorithms that use the strategy of partitioning the data to be compressed. Doing so not only increases the compression ratio in many instances, it also reduces the maximum data block size for transmission. The partitioning is guided by a Markov model that predicts whether a given split would increase the compression ratio. Experiments have been performed on text files comparing the new scheme to adaptive Huffman and arithmetic coding methods. The adaptive Huffman method has been implemented in a new way by combining the FGK method with Vitter's implicit ordering of nodes.
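The abstract does not specify the exact partitioning criterion, but the underlying information-theoretic idea can be sketched as follows: a split pays off when the segments, each coded with its own statistical model, need fewer estimated bits than the whole. The sketch below uses order-0 empirical entropy rather than the thesis's Markov model; the function names and the entropy-comparison heuristic are illustrative assumptions, not the author's algorithm.

```python
from collections import Counter
import math

def entropy_bits(data: bytes) -> float:
    """Total order-0 empirical entropy of `data`, in bits.

    This is a lower bound on the output size of a symbol-wise
    coder (e.g. Huffman or arithmetic) driven by the segment's
    own symbol frequencies.
    """
    if not data:
        return 0.0
    n = len(data)
    return sum(-c * math.log2(c / n) for c in Counter(data).values())

def should_partition(data: bytes, split: int) -> bool:
    """Illustrative heuristic (not the thesis's Markov-model test):
    split at `split` if coding the two parts with separate models
    needs strictly fewer estimated bits than coding the whole."""
    whole = entropy_bits(data)
    parts = entropy_bits(data[:split]) + entropy_bits(data[split:])
    return parts < whole
```

For example, a file whose first half is all `a` and second half all `b` benefits from splitting (each half is coded at near-zero cost), while a uniformly mixed file does not. A real implementation, as the abstract suggests, would replace the order-0 estimate with a Markov-model prediction so that context across symbols is captured.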