Data compression is the process of reducing the number of bits needed to store or transmit data. Compressed data takes up considerably less disk space than the original, so much more content can fit in the same amount of storage. There are many compression algorithms that work in different ways. With many of them, only redundant bits are removed, which means that once the data is uncompressed there is no loss of quality; this is known as lossless compression. Others discard less important bits, so uncompressing the data later results in lower quality than the original; this is lossy compression. Compressing and uncompressing content requires a significant amount of system resources, particularly CPU processing time, so any hosting platform that uses compression in real time must have enough power to support that feature. One example of how data can be compressed is to replace a sequence of bits such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the entire sequence; this technique is called run-length encoding.
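
To make the 6x1 example concrete, here is a minimal sketch of run-length encoding in Python. The function names are illustrative only and do not belong to any particular platform or library:

    from itertools import groupby

    def rle_compress(bits: str) -> list[tuple[str, int]]:
        """Collapse runs of identical characters into (char, count) pairs."""
        return [(char, len(list(run))) for char, run in groupby(bits)]

    def rle_decompress(pairs: list[tuple[str, int]]) -> str:
        """Expand (char, count) pairs back into the original string."""
        return "".join(char * count for char, count in pairs)

    original = "111111000011"
    compressed = rle_compress(original)    # [('1', 6), ('0', 4), ('1', 2)]
    restored = rle_decompress(compressed)  # "111111000011"
    assert restored == original            # lossless: nothing is lost

Because only the run lengths are recorded and nothing is discarded, decompression reproduces the input exactly, which is what makes this kind of scheme lossless.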

Data Compression in Shared Web Hosting

The compression algorithm used by the ZFS file system, which runs on our cloud hosting platform, is called LZ4. It can enhance the performance of any site hosted in a shared web hosting account with us, since not only does it compress data more effectively than the algorithms employed by other file systems, but it also uncompresses data faster than a hard drive can read it. This comes at the cost of a great deal of CPU processing time, which is not a problem for our platform because it uses clusters of powerful servers working together. A further advantage of LZ4 is that it allows us to generate backups faster and store them in less disk space, so we can keep several daily backups of your files and databases without affecting the performance of the servers. This way, we can always restore any content that you may have deleted by mistake.
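
As an illustration of LZ4's lossless behavior (a sketch of the algorithm in general, not of our platform's internals), here is a short Python example. It assumes the third-party lz4 package is installed (pip install lz4); on ZFS itself, the administrator enables the algorithm per dataset with zfs set compression=lz4:

    import lz4.frame

    data = b"111111" * 10_000           # highly repetitive data compresses well
    compressed = lz4.frame.compress(data)
    restored = lz4.frame.decompress(compressed)

    assert restored == data             # lossless: the original is recovered exactly
    print(len(data), "->", len(compressed), "bytes")

Repetitive content such as the example above shrinks dramatically, while decompression remains fast enough that reading compressed data can outpace reading the uncompressed original from disk.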