Latency is a networking term used to describe the time taken by data to move from one node to the next along its journey. A good way to think about latency is to imagine a train travelling from one station to the next: the latency is the time the train takes to get from one station to the following one. Several elements can introduce latency as data is transmitted:
- Storage Delay - When you request data, a processor needs to search for and retrieve the data you are looking for. This takes some amount of time, which in turn produces latency, as the data must first be found and then retrieved.
- The Device - As data passes through a device, the device reads the data being transmitted, and may also make additions or slight changes to it. Each time a device does this, some time is taken in the process: this is latency. Going back to the train example, imagine the train entering a station. It is more than likely that there will be a transfer of passengers at each station along the journey. As passengers get on and off the train, some time must be spent at the station until the transfer is complete. In this case, the latency would be the time taken to complete the transfer of passengers.
- Transmission - The method of data transmission also affects the time taken for data to travel from one place to the next. Staying with the train analogy, transmission can be compared to a train heading along the tracks towards its destination. When it is quiet, the train carries on unimpeded and arrives relatively quickly. At busier times, more trains converge on the destination at the same time, and not all of them can arrive at once. Some trains have to wait on the track until the trains ahead have had time to arrive, exchange passengers, and move out of the station onto the next leg of their journey. The busier the network, the longer each train takes to arrive at the station: this is latency.
- Propagation - Propagation refers to the unavoidable time taken for the data to travel from one node to the next, even when unimpeded.
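The components above simply add up: the overall latency of a hop is the sum of the individual delays. As a rough sketch (the figures below are purely illustrative, not real measurements):

```python
# Total latency is the sum of the four delays described above.
# All values here are hypothetical, for illustration only.
storage_delay_ms = 2.0       # time to find and retrieve the data
device_delay_ms = 0.5        # processing at a device ("passenger transfer")
transmission_delay_ms = 4.0  # time to send the data onto a busy link
propagation_delay_ms = 20.0  # travel time across the distance itself

total_latency_ms = (storage_delay_ms + device_delay_ms
                    + transmission_delay_ms + propagation_delay_ms)
print(f"Total latency: {total_latency_ms} ms")
```

Reducing any one of these terms reduces the total, which is the idea behind the techniques discussed next.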
So, is it possible to reduce latency on your website? The answer is yes! There are ways to reduce latency within a network. Essentially, it's a case of having the source of the required information closer to you. At Peters Web we use a CDN (Cloudflare), which stores the static files we require on a network. As the CDN has locations close to us, we can retrieve files and information much more quickly. In terms of our dynamic content, Cloudflare also helps us retrieve information much more quickly, as it has many more potential avenues for traffic than a standard service provider. Back to the trains! You can picture this as several tracks running in each direction at once rather than a single track.

Another method for retrieving information more quickly is to fetch the required files from a 'cache' (What is a caching). This speeds up the transfer of data to the user, as it does not have to be fetched from the server every time you need it. Finally, using an internet protocol such as HTTP/2 speeds up the transfer of data considerably, because fewer 'trips' back and forth between the sender and the receiver are required. A lack of the right protocol could be illustrated as having a phone conversation in which you hang up and re-dial after every sentence! These are just a few of the ways in which latency can be reduced; of course, there are many more should you wish to find them. Essentially, it's about cutting down the distance, or the obstacles, through which the data has to travel to and from its destination.
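The caching idea can be sketched in a few lines. This is a minimal illustration, not how Cloudflare or any particular CDN is implemented; `fetch_from_server` here is a hypothetical stand-in for a slow round trip to the origin server:

```python
import time

cache = {}  # previously fetched responses, keyed by path

def fetch_from_server(path):
    """Hypothetical slow fetch, standing in for a network round trip."""
    time.sleep(0.1)  # simulated latency to the origin server
    return f"contents of {path}"

def fetch(path):
    """Serve from the local cache when possible, avoiding the round trip."""
    if path not in cache:
        cache[path] = fetch_from_server(path)
    return cache[path]

first = fetch("/index.html")   # slow: goes to the server
second = fetch("/index.html")  # fast: answered from the cache
```

The first request pays the full latency; every repeat request for the same path is answered locally, which is exactly the saving a browser cache or CDN edge node provides.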
If you require more information on this subject, please contact us via a support ticket to our hosting department (open a ticket) and we will get back in touch as soon as possible.