Internet research to level the playing field

The Internet as we know it today has been optimised to transmit large amounts of data, or "greedy streams" - the type of transmission involved in downloading large files or watching online TV.

Network lag can lead to discrepancies in the information received by the various players. (Illustration: Simula)

“Up to now, Internet research has primarily focused on speeding up transmission by increasing bandwidth so that more data can be transferred at a given time,” explains Andreas Petlund of Simula Research Laboratory in Oslo.

The most common Internet protocol for transmitting data, TCP, works by apportioning available bandwidth among the users present at any given time. The downside is that this can cause latency, or delay, in data transmissions.
For time-dependent applications such as Internet telephony and online gaming, time lags as short as a few hundred milliseconds can create big problems.

Aiming to reduce latency

“In real-time gaming against other players online, data is transmitted only when an action such as moving around or shooting at someone is performed. The same principle applies for stock market programs when placing orders or requesting share prices, for example, via the trading systems in use by Oslo Børs, the Norwegian Stock Exchange. In such cases it is essential to avoid any delay,” says Dr Petlund.

Applications like these often generate what are called thin data streams. With thin streams, only small amounts of data are transmitted at a time, and there can be extended periods between data packets. (See Facts about data packets and network latency below.)
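
As an aside for technically minded readers, the sketch below shows how a sender might classify a stream as thin. It is loosely modelled on the heuristic the Linux kernel uses, which treats a TCP stream with fewer than four packets in flight as thin; the struct, names and threshold here are illustrative assumptions, not a real API.

```c
/* Sketch: classifying a stream as "thin", loosely modelled on the
 * Linux kernel heuristic (fewer than four packets in flight, and not
 * in initial slow start). Names and threshold are illustrative. */
#include <stdbool.h>
#include <stdio.h>

#define THIN_STREAM_THRESHOLD 4   /* packets in flight */

struct stream_state {
    unsigned packets_in_flight;   /* sent but not yet acknowledged */
    bool     in_initial_slow_start;
};

static bool stream_is_thin(const struct stream_state *s)
{
    return s->packets_in_flight < THIN_STREAM_THRESHOLD &&
           !s->in_initial_slow_start;
}

int main(void)
{
    /* A game sends a few small packets; a download keeps the pipe full. */
    struct stream_state game     = { 1,  false };
    struct stream_state download = { 40, false };

    printf("game stream thin?     %s\n", stream_is_thin(&game) ? "yes" : "no");
    printf("download stream thin? %s\n", stream_is_thin(&download) ? "yes" : "no");
    return 0;
}
```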

According to Andreas Petlund, thin streams cannot compete with greedy traffic for bandwidth: they almost invariably come up short, and users are left to cope with the resulting lag.

A stock exchange's trading system requires rapid response. (Photo: Shutterstock)

As part of a new research project funded under the Research Council of Norway’s large-scale programme on Core Competence and Value Creation in ICT (VERDIKT), researchers are working to reduce latency as much as possible.

“We want a more balanced Internet where thin streams don’t always lose out. This can be achieved by adding speed to the mix, instead of only thinking about maximising throughput,” says Dr Petlund.

New approaches

Network researchers are now planning to use simulation and modelling to learn more about the network behaviour of thin data streams. According to Dr Petlund, neither this nor how thin streams behave in competition with other traffic has been studied in depth before.

The primary obstacle lies in the vast complexity of the systems making up the Internet. “We may thoroughly understand each individual mechanism or sub-protocol under controlled conditions, but in the Internet jungle it is rather like putting something into a black box without knowing what’s going to come out the other end,” he explains.

“This happens because the Internet is a shared resource and we have no control over what everyone else is using it for.”

Dr Andreas Petlund of Simula Research Laboratory in Oslo. (Photo: Norunn K. Torheim)

International cooperation

One of the partners the Norwegian researchers will be working with is Dr Jens Schmitt of the University of Kaiserslautern. Dr Schmitt is developing mathematical models of network behaviour and testing how well the models reflect reality.

“We also have some researchers from the US on the team,” Dr Petlund adds. “In collaboration with the Cooperative Association for Internet Data Analysis (CAIDA) in San Diego, a leader in the field of Internet analysis, we are going to perform measurements and analyses to find out what percentage of all data streams are thin streams. No such data exists anywhere today.”

Pushing for standardisation

Researchers are also employing more traditional research methods in order to study how thin streams behave both in test networks in the laboratory and when they are transmitted via the Internet.

One desired outcome is a standardised mechanism for handling thin data streams through the Internet Engineering Task Force (IETF).

“We won’t be able to establish a standard unless we can prove that one is really needed. That is why we first need to measure the prevalence of thin streams,” says Dr Petlund.

It is also essential to find out whether prioritising thin data streams on the Internet has any negative consequences for other traffic. If it does, the wide variety of transmission technologies currently in use will pose a formidable challenge.

“At one time everyone connected to the Internet by means of a cable. Now we have a wide array of alternatives such as WiFi, 3G, 4G, WiMax, ADSL and fibre-optic connections – all of which behave differently. We must come up with solutions that are optimal for everyone,” Andreas Petlund affirms.

Funcom's game The Secret World uses technology which reduces latency. (Screenshot: Funcom)

Better online computer games

It was an interest in computer games that originally inspired researchers at Simula to study systems supporting time-dependent applications ahead of most of the rest of the field.

Andreas Petlund has previously worked on improvements at the operating system level to decrease latency arising from packet loss. Users of Linux are benefiting from the resulting technology.
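
The Linux kernel does expose thin-stream TCP modifications of this kind as per-socket options, TCP_THIN_LINEAR_TIMEOUTS and TCP_THIN_DUPACK (available since roughly kernel 2.6.34). The sketch below shows how a latency-sensitive application might enable them; linking these options to the specific improvements described in this article is our assumption.

```c
/* Sketch: enabling Linux's thin-stream TCP modifications on a socket.
 * TCP_THIN_LINEAR_TIMEOUTS and TCP_THIN_DUPACK are Linux-specific
 * socket options; the #ifdef guards keep the sketch portable. */
#include <stdio.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <netinet/tcp.h>

int main(void)
{
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0) {
        perror("socket");
        return 1;
    }

    int on = 1;
#ifdef TCP_THIN_LINEAR_TIMEOUTS
    /* Back off linearly instead of exponentially on retransmission
     * timeouts, so a lost packet is retried sooner. */
    if (setsockopt(fd, IPPROTO_TCP, TCP_THIN_LINEAR_TIMEOUTS, &on, sizeof on) < 0)
        perror("TCP_THIN_LINEAR_TIMEOUTS");
#endif
#ifdef TCP_THIN_DUPACK
    /* Trigger fast retransmit after a single duplicate ACK rather
     * than the usual three. */
    if (setsockopt(fd, IPPROTO_TCP, TCP_THIN_DUPACK, &on, sizeof on) < 0)
        perror("TCP_THIN_DUPACK");
#endif

    /* ... connect() and exchange game or trading data as usual ... */
    return 0;
}
```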

The large Norwegian computer game company Funcom has integrated these improvements into a number of its game servers. The technology has been tested on its highest-profile game, Age of Conan, and will be used for The Secret World, soon to be released.

Facts about data packets and network latency

To transmit large amounts of data over the Internet as efficiently as possible, a sender transmits a steadily increasing amount of data until maximum bandwidth capacity is reached. The sending rate then stabilises so that bandwidth usage is optimised.
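
This probing behaviour can be illustrated with a toy model. The doubling rule below corresponds to TCP's slow-start phase; the capacity figure and the abrupt stabilisation are deliberate simplifications, as real congestion control is considerably more involved.

```c
/* Toy model of a greedy TCP sender probing for bandwidth: the amount
 * sent doubles each round trip (slow start) until it hits the path
 * capacity, then stabilises. All numbers are illustrative. */
#include <stdio.h>

int main(void)
{
    const int capacity = 100;   /* hypothetical path capacity, packets per RTT */
    int cwnd = 1;               /* congestion window: packets sent per RTT */

    for (int rtt = 1; rtt <= 10; rtt++) {
        printf("RTT %2d: sending %3d packets\n", rtt, cwnd);
        cwnd *= 2;              /* slow start: double each round trip */
        if (cwnd > capacity)
            cwnd = capacity;    /* stabilise once capacity is reached */
    }
    return 0;
}
```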

The most widely used Internet transport protocol, TCP, works by dividing data into packets. Packets are transmitted through queuing systems: all data streams travelling between given nodes in the Internet share these queues.

If a queue fills up, entire data packets are dropped from it. These packets are then lost.

To determine which packets have actually arrived at the destination, the sender requires delivery confirmation for each packet. If too much time elapses before a confirmation is received, the packet is transmitted anew, resulting in network lag.

 Source: Andreas Petlund
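
A minimal sketch of the retransmission rule described in the facts box above, where send_packet and ack_received_within are hypothetical stubs standing in for real network I/O, and the 200 ms timeout is an arbitrary illustrative value:

```c
/* Sketch of timeout-driven retransmission. The two helper functions
 * are hypothetical stubs, not a real networking API. */
#include <stdbool.h>
#include <stdio.h>

static void send_packet(int seq)
{
    printf("send packet %d\n", seq);
}

static bool ack_received_within(int seq, int timeout_ms)
{
    (void)seq;
    (void)timeout_ms;
    return false;               /* pretend the confirmation never arrives */
}

int main(void)
{
    const int timeout_ms = 200; /* illustrative timeout value */
    int seq = 42;

    send_packet(seq);
    if (!ack_received_within(seq, timeout_ms)) {
        /* No confirmation in time: send the packet anew. The wait plus
         * the retransmission is the lag the user experiences. */
        printf("no confirmation after %d ms, retransmitting\n", timeout_ms);
        send_packet(seq);
    }
    return 0;
}
```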

 

Facts about the project

The project Traffic behaviour of interactive time-dependent thin streams on the modern Internet (TimeIn) has been granted funding under the VERDIKT programme at the Research Council of Norway from January 2012 through December 2015.

Post-doctoral fellow Andreas Petlund of Simula Research Laboratory and the University of Oslo is project manager.

He is working with Information Access Disruptions (iAD), one of the Research Council of Norway’s Centres for Research-driven Innovation (SFI). The new research project TimeIn follows in the footsteps of previous research activities in iAD.

Partners: UNINETT (a state-owned company responsible for Norway's National Research and Education Network), the University of Oslo, Funcom, CAIDA (Cooperative Association for Internet Data Analysis), Cisco Norway, the University of Kaiserslautern and Karlstad University.

 
