The term data compression refers to reducing the number of bits needed to store or transmit data. Compression can be lossless or lossy: a lossless algorithm removes only redundant data, so decompressing restores the original information exactly, while a lossy algorithm discards data deemed unneeded, so some quality is lost when the data is decompressed. There are various compression algorithms, each better suited to different kinds of content. Because compressing and decompressing data generally takes considerable processing time, the server performing the operation needs ample resources to handle your data quickly enough. A simple example of compression is to store how many consecutive positions in the binary code hold a 1 and how many hold a 0, rather than storing the actual 1s and 0s.
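The counting idea described above is known as run-length encoding. As a minimal sketch (the function names here are illustrative, not from any particular library), it might look like this in Python:

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Run-length encode a string of '0'/'1' characters.

    Instead of keeping every bit, store each symbol once together
    with the number of consecutive positions it occupies.
    """
    runs: list[tuple[str, int]] = []
    for ch in bits:
        if runs and runs[-1][0] == ch:
            # Same symbol as the previous position: extend the run.
            runs[-1] = (ch, runs[-1][1] + 1)
        else:
            # Symbol changed: start a new run of length 1.
            runs.append((ch, 1))
    return runs


def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Reverse the encoding: expand each (symbol, count) pair."""
    return "".join(ch * n for ch, n in runs)


encoded = rle_encode("1111100000000111")
print(encoded)              # → [('1', 5), ('0', 8), ('1', 3)]
print(rle_decode(encoded))  # → 1111100000000111
```

Because decoding reproduces the input exactly, this is a lossless scheme: it pays off whenever the data contains long runs of repeated symbols.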
Data Compression in Shared Hosting
The compression algorithm we employ on the cloud hosting platform where your new shared hosting account will be created is called LZ4, and it is used by the advanced ZFS file system that powers the platform. LZ4 outperforms the algorithms other file systems use: its compression ratio is considerably higher and it processes data much faster. The speed advantage is most noticeable during decompression, which happens faster than data can be read from a hard drive, so LZ4 improves the performance of every website hosted on a server that uses it. We take advantage of LZ4 in one more way: its speed and compression ratio allow us to generate multiple daily backups of the full content of all accounts and keep them for a month. Not only do these backups take up less space, but generating them does not slow the servers down the way it often does with other file systems.
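The two properties the paragraph above relies on, a lossless round-trip and a measurable size reduction, are easy to demonstrate. LZ4 itself is not in Python's standard library, so this sketch uses the stdlib zlib module (DEFLATE) purely as a stand-in; the sample data is made up for illustration, and real-world ratios depend on the content:

```python
import zlib

# Hypothetical sample: repetitive content, which compresses well.
data = b"The quick brown fox jumps over the lazy dog. " * 500

compressed = zlib.compress(data, level=6)
restored = zlib.decompress(compressed)

# Lossless: the original bytes come back exactly.
assert restored == data

print(f"original:   {len(data)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"ratio:      {len(data) / len(compressed):.1f}x")
```

The same round-trip guarantee is what makes compressed backups safe: restoring an account from a backup yields byte-for-byte the same files that were backed up.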