The term data compression refers to reducing the number of bits needed to store or transmit information. Compression can be performed with or without loss of information: in the lossless case only redundant data is removed, so the original content and quality are restored exactly when the data is decompressed, while in the lossy case some information is discarded permanently and the quality of the restored data is lower. Different compression algorithms are more efficient for different kinds of data. Compressing and decompressing data usually requires a lot of processing time, so the server performing the operation needs sufficient resources to handle it quickly. A simple example of how information can be compressed is to store how many consecutive positions in the binary code hold a 1 and how many hold a 0, rather than storing each individual 1 and 0.
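To make that run-length idea concrete, here is a minimal sketch in Python; the function names are only illustrative and this is not any particular product's implementation. It encodes a string of 1s and 0s as (bit, count) pairs and restores the original exactly, which is what makes this kind of compression lossless.

    def rle_encode(bits):
        """Encode a string of '0'/'1' characters as (bit, run_length) pairs."""
        if not bits:
            return []
        pairs = []
        current, count = bits[0], 1
        for bit in bits[1:]:
            if bit == current:
                count += 1
            else:
                pairs.append((current, count))
                current, count = bit, 1
        pairs.append((current, count))
        return pairs

    def rle_decode(pairs):
        """Rebuild the original bit string from (bit, run_length) pairs."""
        return "".join(bit * count for bit, count in pairs)

    original = "0000001111111100011"
    encoded = rle_encode(original)      # [('0', 6), ('1', 8), ('0', 3), ('1', 2)]
    assert rle_decode(encoded) == original

Storing the four pairs takes less space than storing all nineteen digits, and the decoded output is bit-for-bit identical to the input.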

Data Compression in Hosting

The compression algorithm we use on the cloud hosting platform where your new hosting account will be created is called LZ4, and it is employed by the advanced ZFS file system that powers the platform. LZ4 outperforms the algorithms other file systems rely on, as it offers a higher compression ratio and processes data much faster. The speed advantage is most noticeable during decompression, which happens faster than data can be read from a hard drive, so LZ4 improves the performance of every website stored on a server that uses it. We take advantage of LZ4 in one more way: its speed and compression ratio allow us to generate several daily backups of the entire content of all accounts and keep them for 30 days. Not only do these backups take up less space, but creating them does not slow the servers down the way it often does with other file systems.
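As a rough illustration of the lossless, fast compression described above, the sketch below uses the third-party Python lz4 package (installed with pip install lz4); it only demonstrates the general behaviour of the LZ4 algorithm on repetitive data and is not the hosting platform's internal ZFS setup.

    import lz4.frame

    # Highly repetitive data compresses very well with LZ4.
    original = b"The quick brown fox jumps over the lazy dog. " * 1000

    compressed = lz4.frame.compress(original)
    restored = lz4.frame.decompress(compressed)

    # LZ4 is lossless: the restored data is identical to the original.
    assert restored == original
    print(f"original: {len(original)} bytes, compressed: {len(compressed)} bytes")

Because decompression like this is typically faster than reading the same amount of uncompressed data from a hard drive, keeping content compressed on disk can speed up delivery rather than slow it down.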