
A lot of people and companies tell us that the internet's made people impatient. That's not entirely true though; in fact, forward-thinking companies have simply raised our expectations.

If you're old enough, think back to 20 years ago. Accessing the internet meant tying up your phone line so you could use your dial-up modem. When you did, you'd watch pictures and websites appear in chunks as your connection strained under the pressure. At the time, it was normal – now, it would make us furious.

Customer service moved at a similar snail's pace too. Call a company and they'd be waiting for screens to load while you were on the line – and paying with a credit card often meant the person you were talking to hanging up and calling you back, so they could free up their line for their card machine. One modern innovation that helps with customer service is a cloud-based VoIP system – you can explore the options on www.evolvesolutionsinc.com, which has a good overview of what you can do in the cloud with a phone system.

As with most things, a small number of excellent businesses drive things forward – and, if smaller businesses want to keep up, they need to adjust their processes accordingly. But with so many lightning-fast applications at our disposal, speed problems often lie with our internal IT infrastructure.

What is latency?


Generally, latency comes up in the same conversations as network performance and speed. Strictly speaking, it's the delay between an end-user request and the response. If you've measured the latency in your network and you're seeing 'low latency', you've got a quick network with applications that respond equally quickly. On the other hand, a 'high latency' network is one which takes an undue amount of time to respond to an end-user request.
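If you want to put a number on latency yourself, round-trip time is the usual measure. Here's a minimal Python sketch that times a TCP handshake to a server of your choice – the host in the comment is purely illustrative:

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443, timeout: float = 3.0) -> float:
    """Measure round-trip latency as the time to complete a TCP handshake."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; we only wanted the handshake time
    return (time.perf_counter() - start) * 1000  # milliseconds

# Example (hypothetical host):
# print(f"{tcp_rtt_ms('example.com'):.1f} ms")
```

A handshake round trip isn't the whole story (application latency adds processing time on top), but it's a quick way to spot a slow link.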

Now, the difference between high and low latency might only be a matter of fractions of a second – but we're in a world where the speed bar has been set high. Your competitors can deliver excellent service to their end-users – so, if you can't, your customers are likely to move on quickly.

Why does latency occur?

There’s latency involved with any transmission of information. At the very least, the speed of light limits the transfer of data – but there are other things which stand in the way of a data packet trying to get its job done.

To understand why latency happens, it’s useful to understand the basics of how data is sent between programs.

Rather than being sent as one block, information is broken down into tiny data packets. At the destination, these data packets are reassembled into the chunk of information that's required.

So, sending an email will involve a huge number of tiny packets of data being put back together to make the overall message – and a VoIP or video call will be a continual stream of packets that makes up the audio and moving image you hear and see.
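The split-and-reassemble idea can be sketched in a few lines of Python – a toy model, not a real network stack (the 1,400-byte payload size is just an assumption, roughly in line with a typical Ethernet MTU):

```python
def packetize(data: bytes, size: int = 1400):
    """Split a message into numbered packets of at most `size` bytes."""
    return [(seq, data[i:i + size])
            for seq, i in enumerate(range(0, len(data), size))]

def reassemble(packets):
    """Rebuild the original message, tolerating out-of-order arrival."""
    return b"".join(payload for _, payload in sorted(packets))

message = b"Hello, this is one email being sent across the network." * 100
packets = packetize(message)
assert reassemble(packets) == message
```

The sequence numbers are what let the receiving end put packets back in order even when they arrive jumbled – which, on a busy network, they often do.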

When you access information over any connection, a 'test' packet is sent first to make sure the connection will work – and to establish the speed at which information will be sent. It's useful to think of this as a bit of 'route planning' by the application you're using.

The trouble is, once the application starts to send information, anything that slows the transfer down (additional traffic using the connection, for instance) means the speed that was set can no longer be maintained – yet the program you're using will still be sending data at the original rate.

Bandwidth vs Throughput


Data arrives at a rate a connection cannot keep up with when the 'throughput' that's needed exceeds the 'bandwidth' that's available. Going back to our route analogy, it's useful to think of bandwidth as the amount of traffic a road can cope with – and throughput as the amount of traffic that's trying to use the road.
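The arithmetic behind the analogy is simple: a payload's minimum transfer time is its size divided by the link's capacity. The one trap is units – file sizes are usually quoted in bytes, bandwidth in bits per second. A quick Python helper:

```python
def transfer_time_s(size_megabytes: float, bandwidth_mbps: float) -> float:
    """Best-case seconds to move a payload over a link.

    Sizes are in megabytes (8 bits per byte), bandwidth in megabits/s.
    Ignores latency and protocol overhead, so real transfers take longer.
    """
    return (size_megabytes * 8) / bandwidth_mbps

# A 100 MB backup over a 20 Mbps uplink:
print(transfer_time_s(100, 20))  # 40.0 seconds, best case
```

If applications are collectively trying to push more than the link's capacity, the maths no longer works out – and that's when queues build and the bottleneck appears.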

Too much data and not enough bandwidth leads to bottlenecks.

The thing is, these bottlenecks can be resolved by the system – and applications will actively seek to do this. To reduce throughput, your IT system will start to drop data packets in an effort to get things running smoothly again. This works – but it comes at the expense of the data that was being sent – and, while it's happening, your applications are running slowly.
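Dropping packets when a buffer fills up is commonly called 'tail drop'. Here's a toy Python model of a bounded queue doing exactly that – a sketch of the idea, not how any particular router implements it:

```python
from collections import deque

class TailDropQueue:
    """A bounded queue: when it's full, newly arriving packets are dropped."""

    def __init__(self, capacity: int):
        self.buf = deque()
        self.capacity = capacity
        self.dropped = 0

    def enqueue(self, packet) -> bool:
        if len(self.buf) >= self.capacity:
            self.dropped += 1   # the bottleneck: data is dumped
            return False
        self.buf.append(packet)
        return True

    def dequeue(self):
        """Forward the oldest queued packet, or None if the queue is empty."""
        return self.buf.popleft() if self.buf else None

q = TailDropQueue(capacity=3)
for pkt in range(5):        # 5 packets arrive, but only 3 fit
    q.enqueue(pkt)
print(q.dropped)            # 2 packets were dumped
```

Protocols like TCP treat these drops as a signal to slow down, which is why everything feels sluggish while the system recovers.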

What happens when data is dumped?


Fortunately, many applications can continue to run, even if some data has been lost – but this isn’t universally true. In many instances, dropped data packets will degrade the overall data so badly that it becomes unrecognizable to the program receiving it.

Perhaps the most sensitive applications are those streaming data in real time. So, if you use a VoIP phone system, stream video, or rely on cloud-based systems, you'll find that too much latency can completely crash the application your end-user is relying on.

What’s the latency issue for customers?

As interactions between customers and our businesses become more sophisticated, so does the information that props up those interactions. Now, it’s not uncommon for customers to be able to directly access our systems in a protected way – making automatic payments or accessing statements – and so forth. For these people, slow access leads to frustration – or not being able to access the information they need.

It’s not just direct access that causes a problem though. When customers are at the other end of the phone, the last thing they want to hear is that your “systems are down” – especially when your competitors’ systems probably aren’t experiencing any issues. When dealing with your business puts your customers out, you can expect them to vote with their feet.

Dealing with latency

The warning signs of latency are fairly plain to see. Generally, if you’re experiencing failing systems, or systems that are painfully slow – then there’s a good chance latency is to blame.

The good news is, it's almost always an in-house issue – since the internet infrastructure that carries your information once it leaves your network is capable of handling enormous throughput.

So, if latency feels like a problem – especially if you’re using real-time systems or have mission-critical systems that you simply cannot afford to lose, then talking to a specialist IT network support provider is almost certainly going to be the way forward.

Even if they can help you avoid just one instance of losing your systems, it could be the difference between keeping a number of customers and losing them.