Latency is a networking term for the time data takes to move from one node to the next along its journey. A good way to think about latency is to imagine a train travelling between stations: latency is the time the train takes to get from one station to the next. Several elements contribute to latency as data is transmitted:
- Storage Delay - When data is requested, a processor must first locate it and then retrieve it. Both steps take some amount of time, and that time is latency.
- The Device - As data passes through a device, the device reads it (and may add to or slightly alter it). Each time this happens, some time is taken during the process - that time is latency. Returning to the train example: when the train enters a station, it's more than likely that passengers will get on and off, and the train must wait at the platform until the transfer is complete. Latency is the time taken to complete that exchange.
- Transmission - The method of transmission also affects how long data takes to travel from one place to the next. Staying with the train analogy: on a quiet line the train runs along its track unimpeded and arrives at its destination relatively quickly. At busier times more trains converge on that destination at once; not all of them can arrive at the same time, so some must wait on the track while the trains ahead arrive, exchange passengers and move out of the station onto the next leg of their journey. The busier the network, the longer each train takes to arrive - that extra waiting is latency.
- Propagation - Propagation is the unavoidable time the signal takes to travel from one node to the next, even when nothing impedes it.
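To make the components above concrete, here is a rough sketch that adds them together for a single packet. All of the numbers (packet size, link speed, distance, processing and queuing times) are hypothetical, chosen only for illustration:

```python
# Illustrative one-way latency model: processing (storage/device delay),
# queuing (waiting behind "other trains"), transmission (pushing the bits
# onto the wire) and propagation (the signal travelling down the cable).

PACKET_SIZE_BITS = 1500 * 8      # a typical Ethernet frame, in bits
BANDWIDTH_BPS = 100e6            # hypothetical 100 Mbit/s link
DISTANCE_M = 1_000_000           # hypothetical 1,000 km of cable
PROPAGATION_MPS = 2e8            # signal speed in fibre, roughly 2/3 of c
PROCESSING_S = 0.0005            # hypothetical storage/device delay
QUEUING_S = 0.001                # hypothetical wait on a busy link

transmission_s = PACKET_SIZE_BITS / BANDWIDTH_BPS   # time to put bits on the wire
propagation_s = DISTANCE_M / PROPAGATION_MPS        # time for the signal to travel

total_ms = (PROCESSING_S + QUEUING_S + transmission_s + propagation_s) * 1000
print(f"One-way latency: {total_ms:.2f} ms")
```

With these example numbers the propagation delay (5 ms) dominates, which is why moving the data source closer to you - as discussed below - is such an effective fix.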
So....is it possible to reduce latency on your website? The answer is yes - there are several ways to reduce latency within a network, and essentially they all come down to having the source of the required information closer to you. At Peters Web we use a CDN (Cloudflare) that stores the static files we require at locations on its network close to us, so we can retrieve files and information much more quickly. Cloudflare also helps us retrieve our dynamic content faster, as it has many more potential routes for traffic than a standard service provider (back to the trains......several tracks moving in each direction at once rather than a single track!).

Another way to retrieve information more quickly is to serve files from a 'cache' (what is caching?) rather than fetching them from the server every time they are needed. Finally, using a protocol such as HTTP/2 speeds up the transfer of data considerably, as fewer 'trips' back and forth between the sender and the receiver are required (imagine having a conversation over the phone and hanging up and re-dialling after every sentence.....!). These are just a few of the ways in which latency can be reduced - of course there are many more should you wish to find them! Essentially it's about cutting down the distance, and the number of obstacles, that data must travel through on its way to and from its destination.
If you require more information about this subject, please contact us via a support ticket to our hosting department (open a ticket) and we will get back to you as soon as possible.