The term data compression refers to reducing the number of bits of information that has to be stored or transmitted. This can be done with or without loss of information, which means that what is removed during the compression may be either redundant data or unneeded data. In the first case, the information and its quality are identical once the data is uncompressed later, while in the second case the quality is lower. There are various compression algorithms, and each of them is more effective for a particular kind of information. Compressing and uncompressing data usually takes a lot of processing time, so the server performing the operation must have sufficient resources to process your information quickly enough. One simple example of how data can be compressed is to store how many consecutive positions in the binary code hold a 1 and how many hold a 0, instead of storing every single 1 and 0.
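As a rough illustration of the counting idea described above, the short Python sketch below applies a simple run-length scheme to a bit string. The function names and the sample input are made up for the example and are not part of any particular compression standard.

    def compress_bits(bits: str) -> list[tuple[str, int]]:
        """Replace runs of identical bits with (bit, run length) pairs."""
        runs = []
        for bit in bits:
            if runs and runs[-1][0] == bit:
                runs[-1] = (bit, runs[-1][1] + 1)  # extend the current run
            else:
                runs.append((bit, 1))              # start a new run
        return runs

    def decompress_bits(runs: list[tuple[str, int]]) -> str:
        """Rebuild the original bit string from the (bit, run length) pairs."""
        return "".join(bit * count for bit, count in runs)

    original = "0000001111111100011"
    packed = compress_bits(original)
    print(packed)                               # [('0', 6), ('1', 8), ('0', 3), ('1', 2)]
    assert decompress_bits(packed) == original  # lossless: the exact bit string comes back

Because no information is discarded, this is a lossless scheme; the original bit string is restored exactly, which matches the first of the two cases mentioned above.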
Data Compression in Web Hosting
The compression algorithm used on the cloud hosting platform where your new web hosting account will be created is called LZ4, and it is employed by the advanced ZFS file system that powers the platform. Compared with the algorithms other file systems use, LZ4 offers a higher compression ratio and processes data considerably faster. The speed advantage is most noticeable during decompression, which happens quicker than the same information can be read from a hard disk drive, so LZ4 improves the performance of every site stored on a server that uses it. We take advantage of LZ4 in one more way: its speed and compression ratio allow us to generate a couple of daily backups of the entire content of all accounts and keep them for 30 days. Not only do these backups take up less space, but generating them does not slow the servers down, as often happens with other file systems.
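To give a rough feel for how LZ4 trades space for speed, the Python sketch below compresses and decompresses a block of text with the third-party lz4 package (installed with pip install lz4). This is only an illustration of the algorithm itself; on the platform, ZFS applies LZ4 transparently at the file system level, and the numbers printed here depend entirely on the sample data chosen for the example.

    import lz4.frame  # third-party package: pip install lz4

    # Repetitive text compresses well; real-world ratios vary with the content.
    original = b"web hosting account " * 1000

    compressed = lz4.frame.compress(original)
    restored = lz4.frame.decompress(compressed)

    print(f"original:   {len(original)} bytes")
    print(f"compressed: {len(compressed)} bytes")
    print(f"ratio:      {len(original) / len(compressed):.1f}x")
    assert restored == original  # LZ4 is lossless, so the content is unchanged

The same lossless behavior is what makes LZ4 safe for live site content and backups alike: decompression always returns the data exactly as it was stored.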