Data compression is the process of reducing the number of bits needed to store or transmit data. The compressed information takes up substantially less disk space than the original, so additional content can be stored in the same amount of space. There are various compression algorithms that work in different ways. With many of them, only redundant bits are removed, so when the information is uncompressed there is no loss of quality; others discard less important bits, and uncompressing the data afterwards results in reduced quality compared to the original. Compressing and uncompressing content consumes a significant amount of system resources, particularly CPU processing time, so any hosting platform that employs real-time compression must have sufficient power to support the feature. A simple example of how data can be compressed is run-length encoding, which replaces a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the entire sequence.
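The run-length idea mentioned above can be sketched in a few lines of Python. This is an illustrative toy, not the algorithm any real file system uses; the `6x1` notation simply mirrors the example in the text:

```python
def rle_encode(bits: str) -> str:
    """Run-length encode a string of bits, e.g. '111111' -> '6x1'."""
    out = []
    i = 0
    while i < len(bits):
        j = i
        # Advance j to the end of the current run of identical bits.
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        out.append(f"{j - i}x{bits[i]}")
        i = j
    return ",".join(out)

def rle_decode(encoded: str) -> str:
    """Reverse the encoding, e.g. '3x1,3x0' -> '111000'."""
    return "".join(int(count) * bit
                   for count, bit in (run.split("x")
                                      for run in encoded.split(",")))

print(rle_encode("111111"))            # -> 6x1
print(rle_decode("3x1,3x0,1x1"))       # -> 1110001
```

Note that run-length encoding is lossless: decoding always reproduces the original sequence exactly, which is why it only pays off when the data actually contains long runs.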

Data Compression in Shared Website Hosting

The compression algorithm that we use on the cloud hosting platform where your new shared website hosting account will be created is called LZ4, and it is applied by the advanced ZFS file system that powers the platform. The algorithm outperforms the ones other file systems use, since its compression ratio is much higher and it processes data considerably faster. The speed advantage is most noticeable when content is being uncompressed, as this happens faster than data can be read from a hard drive, so LZ4 improves the performance of any website hosted on a server that uses the algorithm. We take advantage of LZ4 in one more way: its speed and compression ratio allow us to generate several daily backup copies of the entire content of all accounts and keep them for one month. Not only do the backups take up less space, but their generation does not slow the servers down, as often happens with other file systems.
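For readers curious how LZ4 compression is enabled at the file-system level, the sketch below shows the standard ZFS commands. The pool/dataset name `tank/hosting` is hypothetical, and this is a general illustration of ZFS administration, not a description of our internal setup:

```shell
# Enable LZ4 compression on a ZFS dataset (dataset name is hypothetical).
zfs set compression=lz4 tank/hosting

# Inspect the compression ratio achieved on data already written.
zfs get compressratio tank/hosting
```

ZFS applies the setting transparently: applications read and write files as usual, while compression and decompression happen inside the file system.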