Data compression is the process of encoding information with fewer bits than the original representation. Consequently, the compressed information takes up less disk space than the initial data, so extra content can be stored in the same amount of space. Different compression algorithms work in different ways: lossless algorithms remove only redundant bits, so when the information is uncompressed it is identical to the original with no decrease in quality, while lossy algorithms discard bits deemed unnecessary, so uncompressing the data afterwards results in reduced quality compared with the original. Compressing and uncompressing content requires considerable system resources, especially CPU processing time, so any web hosting platform that uses compression in real time needs adequate power to support the feature. A simple example of how information can be compressed is run-length encoding, which substitutes a sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the whole sequence, as the sketch below shows.
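
To make the run-length idea concrete, here is a minimal sketch in Python; the function names rle_encode and rle_decode are illustrative only and not part of any particular library.

    def rle_encode(data: str) -> list[tuple[int, str]]:
        """Collapse runs of repeated characters into (count, char) pairs."""
        encoded = []
        i = 0
        while i < len(data):
            run_start = i
            # Advance to the end of the current run of identical characters.
            while i < len(data) and data[i] == data[run_start]:
                i += 1
            encoded.append((i - run_start, data[run_start]))
        return encoded

    def rle_decode(encoded: list[tuple[int, str]]) -> str:
        """Expand (count, char) pairs back into the original string."""
        return "".join(char * count for count, char in encoded)

    # The example from the text: six consecutive 1s become one (6, '1') pair.
    pairs = rle_encode("111111")
    print(pairs)              # [(6, '1')]
    print(rle_decode(pairs))  # 111111

Since decoding restores the input exactly, this is a lossless method, and a long run of identical bits shrinks to a single small pair, which is why the technique works so well on repetitive data.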

Data Compression in Hosting

The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. It is considerably faster than alternative algorithms, particularly when compressing and uncompressing text-based data such as web content. LZ4 can even uncompress data faster than it can be read from a hard drive, which improves the performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data both effectively and very quickly, we are able to generate several backups of all the content kept in the hosting accounts on our servers every day. Both your content and its backups take up less space, and since ZFS and LZ4 work very fast, generating the backups does not affect the performance of the hosting servers where your content is stored.
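
As an illustration of what LZ4 compression looks like in application code, here is a small sketch using the third-party lz4 Python package (an assumption: it must be installed separately, e.g. with pip install lz4). On a ZFS-based platform the file system applies the same algorithm transparently at the block level, so no code like this is needed to benefit from it.

    import lz4.frame  # third-party package: pip install lz4

    # Text-based web content compresses well because it is highly repetitive.
    original = b"<html><body>" + b"<p>Hello, world!</p>" * 500 + b"</body></html>"

    compressed = lz4.frame.compress(original)
    restored = lz4.frame.decompress(compressed)

    assert restored == original  # LZ4 is lossless: the round trip is exact
    print(f"original:   {len(original)} bytes")
    print(f"compressed: {len(compressed)} bytes")

Running the sketch shows the compressed payload is a small fraction of the original size, while the assert confirms that no information is lost in the round trip.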