Data compression is the process of reducing the number of bits needed to store or transmit data. As a result, compressed data takes up considerably less disk space than the original, so much more content can be kept in the same amount of space. Different compression algorithms work in different ways: with many of them, only redundant bits are removed, so there is no loss of quality once the data is decompressed (lossless compression). Others discard excessive bits, so decompressing the data afterwards results in lower quality compared with the original (lossy compression). Compressing and decompressing content requires a significant amount of system resources, in particular CPU processing time, so any web hosting platform that employs compression in real time needs to be powerful enough to support this feature. A simple example of how data can be compressed is to substitute a binary sequence such as 111111 with 6x1, i.e. to "remember" how many consecutive 1s or 0s there are instead of storing the whole sequence.
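
The 111111 → 6x1 substitution described above is the idea behind run-length encoding. A minimal sketch of it might look like this (the function names `rle_encode` and `rle_decode` are illustrative, not from any particular library):

```python
# Run-length encoding sketch: each run of identical bits is stored
# as a (count, bit) pair instead of the full repeated sequence.
from itertools import groupby

def rle_encode(bits: str) -> list[tuple[int, str]]:
    """Compress a bit string into (run length, bit) pairs."""
    return [(len(list(group)), bit) for bit, group in groupby(bits)]

def rle_decode(pairs: list[tuple[int, str]]) -> str:
    """Expand (run length, bit) pairs back into the original bit string."""
    return "".join(bit * count for count, bit in pairs)

encoded = rle_encode("111111")
print(encoded)               # [(6, '1')] -- i.e. "6x1"
print(rle_decode(encoded))   # 111111
```

Note that this only pays off when the input contains long runs; for data with frequent bit changes, the list of pairs can end up larger than the original, which is why real compression algorithms combine several techniques.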