The term data compression refers to reducing the number of bits of information that needs to be saved or transmitted. This can be done with or without losing information, so what is removed in the course of the compression is either redundant data or unnecessary data. When the data is subsequently uncompressed, in the first case the content and its quality will be identical to the original, while in the second case the quality will be lower. Different compression algorithms suit different types of information. Compressing and uncompressing data often takes a lot of processing time, so the server carrying out the operation needs ample resources to process the information quickly enough. A simple example of how information can be compressed is to store how many consecutive positions in the binary code should have 1 and how many should have 0, rather than storing the individual 1s and 0s.
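
The example above is essentially run-length encoding. The following is a minimal sketch in Python (the function names are purely illustrative and not part of any particular platform) that stores runs of identical bits as (bit, count) pairs instead of the raw sequence:

    # Minimal run-length encoding sketch: store (bit, count) pairs
    # instead of every individual 1 and 0.

    def rle_encode(bits: str) -> list[tuple[str, int]]:
        """Turn a string such as '1110000011' into [('1', 3), ('0', 5), ('1', 2)]."""
        runs = []
        for bit in bits:
            if runs and runs[-1][0] == bit:
                runs[-1] = (bit, runs[-1][1] + 1)   # extend the current run
            else:
                runs.append((bit, 1))               # start a new run
        return runs

    def rle_decode(runs: list[tuple[str, int]]) -> str:
        """Rebuild the original bit string from the (bit, count) pairs."""
        return "".join(bit * count for bit, count in runs)

    if __name__ == "__main__":
        original = "111000001111111100"
        encoded = rle_encode(original)
        assert rle_decode(encoded) == original      # lossless: nothing is lost
        print(encoded)   # [('1', 3), ('0', 5), ('1', 8), ('0', 2)]

Because nothing is discarded, this is an example of lossless compression; the decoded output is bit-for-bit identical to the input.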

Data Compression in Shared Hosting

The compression algorithm used on the cloud hosting platform where your new shared hosting account will be created is called LZ4, and it is employed by the advanced ZFS file system that powers the platform. The algorithm is superior to the ones other file systems use because its compression ratio is much higher and it processes data considerably faster. The speed is most noticeable when content is being uncompressed, since this happens faster than data can be read from a hard drive, so LZ4 improves the performance of every website hosted on a server that uses the algorithm. We take advantage of LZ4 in one more way: its speed and compression ratio allow us to generate several daily backups of the entire content of all accounts and keep them for a month. Not only do these backups take up less space, but generating them does not slow the servers down, as often happens with other file systems.
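
As a rough illustration of what the file system does transparently, the sketch below uses the third-party lz4 Python package (an assumption for demonstration purposes; it is not part of the hosting platform itself) to compress a block of data with LZ4, restore it, and report the compression ratio:

    # Illustrative only: compress a block of data with LZ4 and verify
    # that decompression restores it exactly (lossless compression).
    # Requires the third-party "lz4" package: pip install lz4
    import lz4.frame

    data = b"All work and no play makes Jack a dull boy. " * 1000

    compressed = lz4.frame.compress(data)
    restored = lz4.frame.decompress(compressed)

    assert restored == data                       # nothing was lost
    ratio = len(data) / len(compressed)
    print(f"original: {len(data)} bytes, compressed: {len(compressed)} bytes")
    print(f"compression ratio: {ratio:.1f}x")

On a ZFS-based server none of this is done by the website code; the file system compresses and uncompresses data automatically as it is written to and read from disk.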