Data compression is the encoding of information using fewer bits than the original representation, so the compressed data occupies considerably less disk space and more content can be stored in the same amount of space. Compression algorithms work in different ways: lossless algorithms remove only redundant bits, so decompressing the data restores it exactly, with no loss of quality, while lossy algorithms discard bits considered expendable, so the decompressed data is of lower quality than the original. Compressing and decompressing content consumes system resources, especially CPU time, so any web hosting platform that compresses data in real time needs enough processing power to support the feature. A simple example of compression is replacing a bit sequence such as 111111 with 6x1, i.e. recording how many consecutive 1s or 0s there are instead of storing the entire sequence.
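The 111111 → 6x1 substitution described above is known as run-length encoding. A minimal sketch in Python follows; the function names and the "6x1" output format are illustrative choices, not part of any particular compression product:

```python
def rle_encode(bits: str) -> str:
    """Run-length encode a bit string, e.g. '111111' -> '6x1'."""
    if not bits:
        return ""
    out = []
    count = 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            out.append(f"{count}x{prev}")  # close the finished run
            count = 1
    out.append(f"{count}x{bits[-1]}")  # close the final run
    return ",".join(out)


def rle_decode(encoded: str) -> str:
    """Reverse the encoding: '6x1,2x0' -> '11111100'."""
    if not encoded:
        return ""
    return "".join(
        bit * int(count)
        for count, bit in (run.split("x") for run in encoded.split(","))
    )
```

Because only run lengths are recorded, decoding reproduces the input exactly; this is what makes such a scheme lossless.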

Data Compression in Web Hosting

The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. LZ4 is exceptionally fast, particularly at compressing and decompressing non-binary data such as web content, and it can decompress data faster than the same data can be read from a hard drive, which improves the overall performance of websites hosted on ZFS-based platforms. Because LZ4 achieves a good compression ratio at high speed, we can generate several backups of all the content in the web hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 operate very quickly, creating the backups does not affect the performance of the servers where your content is stored.
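For reference, LZ4 is exposed in ZFS as an ordinary dataset property. A sketch of how an administrator would enable it and check the result (the pool and dataset names here are hypothetical):

```shell
# Enable LZ4 compression on a ZFS dataset (name is illustrative).
zfs set compression=lz4 tank/webdata

# Data written from now on is compressed transparently;
# check the ratio ZFS is achieving on the dataset:
zfs get compressratio tank/webdata
```

Note that only data written after the property is set gets compressed; existing files keep their on-disk form until they are rewritten.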

Data Compression in Semi-dedicated Servers

The ZFS file system that runs on the cloud platform where your semi-dedicated server account will be created uses a powerful compression algorithm called LZ4. It is among the best algorithms available for compressing and decompressing website content: its compression ratio is high, and it decompresses data faster than the same data could be read from a hard drive if it were stored uncompressed. As a result, LZ4 speeds up any website running on a platform where it is enabled. This performance requires considerable CPU time, which is supplied by the numerous clusters that work together as part of our platform. What's more, LZ4 allows us to generate several backups of your content every day and keep them for a month, as they take up less space than standard backups and are created much faster, without loading the servers.
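The lossless round trip and the space savings described above can be demonstrated with a short script. Python's standard library does not ship LZ4 bindings, so this sketch uses the built-in zlib module as a stand-in codec; the sample HTML content is invented for illustration:

```python
import zlib

# Web content (HTML/CSS/text) compresses well because it is full of
# recurring byte sequences.
original = b"<div class='post'><p>Hello, world!</p></div>\n" * 200

compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

# Lossless: the round trip reproduces the data bit for bit.
assert restored == original

ratio = len(original) / len(compressed)
print(f"{len(original)} bytes -> {len(compressed)} bytes "
      f"(ratio {ratio:.1f}x)")
```

The same principle applies to backups: because each copy is stored compressed, several daily backups together can occupy less space than a single uncompressed copy of the content.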