Data compression is the compacting of information by reducing the number of bits that are stored or transmitted. The compressed data takes up considerably less disk space than the original, so more content can be stored in the same amount of space. There are various compression algorithms that work in different ways. With many of them, only redundant bits are removed, so when the data is uncompressed there is no loss of quality; this is known as lossless compression. Others discard bits that are considered less important, so uncompressing the data later results in lower quality than the original; this is lossy compression. Compressing and uncompressing content consumes significant system resources, in particular CPU processing time, so any hosting platform that employs real-time compression needs adequate power to support this feature. A simple example of how data can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the actual sequence.
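The 111111 → 6x1 trick described above is known as run-length encoding. A minimal sketch in Python might look like the following; the function names and the "count x bit" output format are illustrative choices, not part of any particular compression standard:

```python
def rle_encode(bits: str) -> str:
    """Run-length encode a string of 0s and 1s as comma-separated "<count>x<bit>" pairs."""
    if not bits:
        return ""
    out = []
    count = 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1          # same bit as before: extend the current run
        else:
            out.append(f"{count}x{prev}")  # run ended: emit its length and bit
            count = 1
    out.append(f"{count}x{bits[-1]}")      # emit the final run
    return ",".join(out)

def rle_decode(encoded: str) -> str:
    """Reverse the encoding: expand each "<count>x<bit>" pair back into a run."""
    if not encoded:
        return ""
    return "".join(bit * int(count)
                   for count, bit in (pair.split("x") for pair in encoded.split(",")))

print(rle_encode("111111"))   # 6x1
print(rle_encode("1110001"))  # 3x1,3x0,1x1
```

Note that this scheme, like any lossless method, only pays off on data with long runs; on a sequence that alternates bits, the "compressed" form is actually longer than the original.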