Data compression is the process of encoding information using fewer bits than the original representation. Compressed data takes up considerably less disk space than the original, so additional content can be stored in the same amount of space. There are various compression algorithms that work in different ways. With many of them, only redundant bits are removed, so once the data is uncompressed there is no loss of quality; such algorithms are called lossless. Others discard less important bits, so uncompressing the data later yields lower quality than the original; these are lossy. Compressing and uncompressing content consumes significant system resources, in particular CPU processing time, so any hosting platform that employs real-time compression needs adequate power to support the feature. A simple example of how data can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. storing the number of consecutive 1s or 0s rather than the sequence itself, a technique known as run-length encoding.
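The run-length idea described above can be sketched in a few lines of Python. This is an illustrative toy, not any particular production codec: it records each run of identical bits as a (count, bit) pair, exactly as in the 111111 to 6x1 example.

```python
def rle_encode(bits: str) -> list[tuple[int, str]]:
    """Encode a bit string as a list of (run length, bit) pairs."""
    runs: list[tuple[int, str]] = []
    for bit in bits:
        if runs and runs[-1][1] == bit:
            # Same bit as the previous one: extend the current run.
            runs[-1] = (runs[-1][0] + 1, bit)
        else:
            # A new bit value starts a new run of length 1.
            runs.append((1, bit))
    return runs

def rle_decode(runs: list[tuple[int, str]]) -> str:
    """Expand the (run length, bit) pairs back into the original string."""
    return "".join(bit * count for count, bit in runs)

# The example from the text: 111111 becomes a single pair (6, "1").
print(rle_encode("111111"))  # [(6, '1')]
```

Because only the counts are stored and nothing is discarded, decoding always restores the exact original string, which is what makes this kind of compression lossless.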
Data Compression in Semi-dedicated Servers
The ZFS file system that runs on the cloud platform where your semi-dedicated server account will be created uses a powerful compression algorithm called LZ4. It is among the best algorithms available and particularly well suited to compressing and uncompressing website content: its compression ratio is very high, and it uncompresses data faster than the same data could be read from a hard drive in uncompressed form. As a result, LZ4 speeds up every website hosted on a platform where the algorithm is enabled. This high performance requires a lot of CPU processing time, which is provided by the multitude of clusters working together as part of our platform. Furthermore, LZ4 allows us to generate several backup copies of your content every day and keep them for a month, as they take much less space than regular backups and are created much faster without loading the servers.
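The compress-on-write, decompress-on-read round trip that ZFS performs transparently can be illustrated with a short sketch. Python's standard library does not ship LZ4, so this example uses the stdlib zlib module purely as a stand-in codec; the point is the lossless round trip and the size savings on redundant content such as HTML, not the specific algorithm.

```python
import zlib

# Web content tends to be highly repetitive, which compresses well.
original = b"<html>" + b"<p>Hello, world!</p>" * 500 + b"</html>"

# "Write": store the compressed form instead of the raw bytes.
compressed = zlib.compress(original)

# "Read": decompress on the fly before serving the content.
restored = zlib.decompress(compressed)

assert restored == original  # lossless: the data comes back bit-for-bit
print(f"{len(original)} bytes -> {len(compressed)} bytes")
```

On repetitive input like this the compressed form is a small fraction of the original size, which is why a filesystem-level codec can both save space and speed up reads when decompression is faster than the extra disk I/O it avoids.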