The Internet of Things – is it all over the top?

Around 10,000 years ago, humans started to live in settled communities. They became farmers and established an enduring connection between mankind and nature. That connection is at the forefront of the Internet of Things.

Meet Bob Dawson. Bob has driven tractors and combine harvesters for nearly 40 years. For the last three years, he has driven both vehicles at the same time. He sits in his combine harvester while it is steered via GPS. On-board applications measure the crop yield in real time, passing information back for analysis to guide future sowing and spraying. The driverless tractor alongside is controlled by the same application that runs in the harvester cab. Sensors on the trailer monitor when it is full; the harvester then automatically switches off the grain chute and tells the tractor to take the trailer to a waiting truck. Precision agriculture is with us and the Internet of Things (IoT) is at its heart.
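The coordination Bob watches from the cab is, at bottom, event-driven machine-to-machine messaging. The sketch below is entirely hypothetical – the article does not describe the actual software, and every class and method name here is invented – but it shows the shape of the logic: a fill-level event from the trailer closes the chute and dispatches the tractor.

```python
# A toy sketch of the harvester/tractor hand-off described above.
# All names are hypothetical, for illustration only.

class Tractor:
    def dispatch(self, destination):
        print(f"Tractor: driving trailer to {destination}")

class Harvester:
    def __init__(self, tractor):
        self.tractor = tractor
        self.chute_open = True

    def on_trailer_fill_level(self, fill_fraction):
        """Handle a fill-level reading from the trailer's sensors."""
        if fill_fraction >= 1.0 and self.chute_open:
            self.chute_open = False                  # stop the grain flow
            self.tractor.dispatch("waiting truck")   # send the full trailer away

harvester = Harvester(Tractor())
for level in (0.5, 0.8, 1.0):   # successive sensor readings
    harvester.on_trailer_fill_level(level)
```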

What is the relationship between the IoT and Over The Top (OTT) applications, and what are the implications of IoT for customers, for network operators and for society in general?

There are four umbrella elements of the IoT:

  • Devices – A truly interconnected world could have hundreds of billions, even trillions, of devices. For example, every piece of packaging for every prescribed drug and every food wrapper could carry one. Every few metres of every stream in the world could have its own device in the water, enabling analysis of water levels, quality, climate and sustainability.
  • Connectivity – for many IoT applications, the connectivity requirements are ubiquitous coverage, low unit costs and low data rates sustained over many years. Other applications require high speed, low latency and massive bandwidths. Many fragmented alliances and consortia are pursuing these requirements and, if some succeed, it could crack the mobile operators’ defences as the gatekeepers of business-grade mobile connectivity.
  • Applications – Uber has become a well-known OTT taxi-hailing application. But Uber has bigger goals: to remove the need for people to own, or even drive, cars and to remove the need for towns to build any more car parks, where cars sit doing nothing all day while their owners are at work. Applications are often seen as the place to be in the value chain, as they are perceived to be where the value flows. Barriers to entry are low – an application needs network connectivity to run but does not require a negotiated relationship with the network operator.
  • Analysis – the volumes of data produced by IoT devices and applications, combined with unstructured, qualitative data such as social media feeds, mean that “data science” is a critical skill. The automated nature of IoT means that much of the interpretation will itself be done algorithmically, or perhaps by self-learning neural networks, by machines themselves (see the sketch after this list).
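As a concrete, purely illustrative example of that automated interpretation, the sketch below flags anomalous readings from a hypothetical water-level sensor of the kind imagined in the Devices bullet. It uses a simple rolling baseline rather than anything as sophisticated as a neural network; none of this comes from the article itself.

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=20, threshold=3.0):
    """Flag readings that deviate sharply from the recent rolling baseline."""
    history = deque(maxlen=window)
    flagged = []
    for t, value in readings:
        if len(history) >= window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                flagged.append((t, value))   # candidate anomaly: alert or log it
        history.append(value)
    return flagged

# Hypothetical stream from a water-level sensor: (timestamp, metres).
stream = [(t, 1.2 + 0.01 * (t % 5)) for t in range(100)] + [(100, 4.8)]
print(detect_anomalies(stream))   # the sudden 4.8 m reading is flagged
```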

What are the implications for network operators? Consumers are more than willing to purchase applications and services directly from third parties, minimising their dealings with fixed and mobile operators. IoT could extend this separation dramatically. Operators will therefore have to build networks and carry data packets in such a way that unit costs fall more quickly than the prices they can charge. At the same time, operators hold vast amounts of network, customer and device data. They will have to develop their own data analysis skills, both to improve their own business and to sell insight-based services to others.

And the implications for wider society? If an individual driver has a car crash, then that driver might learn for next time. If an autonomous Tesla car has a crash, then all Tesla cars in the world can learn for next time. A world in which high-quality interconnected networks enable new applications and services to launch rapidly to reach and connect consumers, citizens and devices over the top of those networks ought to be a good thing. At the same time an interconnected network is only as secure as its weakest connection. It can be hacked.

IoT has the potential to become embedded in almost every aspect of society and so its adoption raises questions of balance between individual, social, political and economic goals. Solving these is likely to be a series of steps and iterations – rather like a human version of a self-learning network.

This is a summary of a full article which appeared in The Journal, December 2016. To access the article in full visit the ITP website (free for members).


Back in the day…

The good old telephone service has gone through many changes during its lifetime, but perhaps the most significant was the move from analogue to digital, reflects Professor Nigel Linge.

The human voice is inherently analogue, but transmitting it as such makes the resulting electrical signal susceptible to noise and attenuation, leading to a reduction in overall voice quality. However, in 1938 a radical alternative technique was proposed by Alec Reeves, who was working at International Telephone and Telegraph’s laboratory in Paris.

Reeves proposed that the analogue signal should be sampled at regular intervals, with the amplitude of the voice signal at each sample converted into a binary number and transmitted as a series of electrical pulses. So long as these pulses could be detected at the receiver, the original analogue voice could be reproduced without degradation. Known as Pulse Code Modulation (PCM), the technique earned Reeves French Patent No. 852 183 on 3 October 1938 and in effect heralded the dawning of the digital age. Unfortunately, as is often the case with pioneering ideas, the technology of the day was not capable of realising the complexity of PCM.
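To make Reeves’ idea concrete, here is a minimal sketch (not from the article) of linear PCM: sample a tone, quantise each sample to an 8-bit code word for transmission, then map the code words back to amplitudes at the receiver. Telephony PCM actually applies logarithmic companding (A-law or mu-law) before quantising, which this simplified sketch omits.

```python
import math

SAMPLE_RATE = 8000   # samples per second, the rate used in telephony PCM
BITS = 8             # bits per code word

def pcm_encode(signal, bits=BITS):
    """Quantise analogue samples (-1.0..1.0) into binary code words."""
    levels = 2 ** bits
    codes = []
    for s in signal:
        s = max(-1.0, min(1.0, s))                 # clamp to the valid range
        codes.append(int((s + 1.0) / 2.0 * (levels - 1)))
    return codes

def pcm_decode(codes, bits=BITS):
    """Reconstruct approximate analogue samples from the code words."""
    levels = 2 ** bits
    return [(c / (levels - 1)) * 2.0 - 1.0 for c in codes]

# Sample a 1 kHz test tone at 8 kHz, as a telephone channel would.
tone = [math.sin(2 * math.pi * 1000 * n / SAMPLE_RATE) for n in range(16)]
encoded = pcm_encode(tone)
print(encoded[:4])             # [127, 217, 255, 217] - the transmitted words
print(pcm_decode(encoded)[:4]) # close to the originals, within quantisation error
```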

In fact, PCM was not realised until 1968, when the GPO in Britain opened the world’s first PCM exchange: the Empress telephone exchange near Earls Court in London. This was the first exchange of its type that could switch PCM signals from one group of lines to another in digital form, and it laid the foundations for the more widespread digital switching that now sees PCM at the heart of our fixed-line, mobile and IP-based telephony, along with all our digital audio systems.

At the other end of the scale, and seemingly trivial in comparison, BT changed the way domestic telephones were connected to its network on 19 November 1981 with the introduction of the plug and socket interface. Up until this time, the telephone in your home was permanently wired to the BT network, which meant that connecting a computer to the phone line could only be achieved using either an acoustic coupler or a telephone with an integrated modem, such as the Type No13A Datel modem set. The best speeds that could be obtained with such systems were typically 300bit/s. The introduction of the plug and socket interface in 1981 changed all of this: the telephone service provided by BT was now terminated in a ‘master’ socket into which the customer could plug their own phone.

More importantly, this meant that there was now a direct electrical connection to the external phone line, which provided a more efficient mechanism for connecting a computer via a modem. In 1988 the V.22 modem increased speeds to 1.2kbit/s; in 1991 this was extended to 14.4kbit/s with the V.32bis modem; and ultimately in 1998 speeds reached 56kbit/s with the V.90 modem. Thereafter the introduction of Digital Subscriber Line technology led directly to today’s superfast services – all thanks to the introduction of a simple socket.

Today, with 15 per cent of UK households officially declared as mobile-only, there is a slow but growing trend away from traditional fixed-line telephony. An important step on that journey was made on 14 December 2009, when the Scandinavian telecommunications company TeliaSonera became the first operator to launch a commercial, publicly available LTE (4G) mobile network. Back in 1981 Scandinavia had led Europe into the mobile era, and now in 2009 it was leading the world into 4G deployment, with services opening in the central parts of Stockholm and Oslo. The network infrastructure was provided by Ericsson in Stockholm and Huawei in Oslo and was initially targeted at mobile broadband customers using a Samsung-provided LTE-only USB dongle. Proper 4G handsets took a little longer to materialise, but once again Scandinavia led the way in Europe when the Samsung Galaxy S2 LTE became available to customers there in 2012. Later that year the UK witnessed the launch of its first 4G network. Today there are over half a billion 4G subscribers across 151 countries with, interestingly, the UK now cited as offering some of the highest average 4G download speeds in the world.

Mobile consolidation

For most of the past 20 years, competition authorities in mobile markets have focussed on securing the entry of additional competitors. Markets with high entry costs, like mobile telecoms, would not be expected to accommodate a very large number of firms, but some degree of competition between a number of firms – more than one, though not many – delivers better results.

Entry into mobile is restricted by the availability of radio spectrum. Opportunities arise when spectrum is released by broadcasters or the military. But later entrants then face the challenge of competing with established firms. This was manageable while demand was growing, but today later entrants have to compete for the existing customers of established operators.

But by the time of 4G (after 2010), interest in entering mobile markets had largely evaporated. The amount of new spectrum available was more limited, and new entrants to 3G had seen some poor commercial returns. Rather than promoting entry, 4G was driving the market towards consolidation. Firms like Hutchison were reluctant to invest in 4G when they had struggled commercially with 3G. Others, like Telefonica, felt that they could deploy their capital more profitably in emerging markets in Latin America.

Pressures on later entrants increased following the global financial crisis after 2007. Moreover, by 2010, operators were also feeling the effects of competition from over-the-top applications such as WhatsApp, and of tighter regulation of international roaming charges and interconnection rates. But the main driver of consolidation was simply that late entrants found themselves unable to achieve sufficient scale to be profitable within a single technology cycle.

Rather than winning customers from rivals, the other way to achieve both scale and cost savings is to merge with them. These savings mean that a rival operator can invariably offer a higher purchase price for the asset than a buyer that does not already have operations in the market. The other option for sub-scale or unprofitable firms is to exit the market by selling out to a party from outside the mobile market altogether. An example of this was EE, which was acquired by BT.

Consolidation can provide an escape route for sub-scale firms, but it has less obvious benefits for consumers. The European Commission’s concern is that prices will not be as low after a merger as they might otherwise have been. The advocates of mergers claim that, in the longer term, the cost savings from combining assets and operations will offset some of the upward pressure on prices that might otherwise be associated with a reduction in the number of firms. The Commission has generally rejected these arguments, finding that any savings are more likely to bring higher profits for the owners of the merging firms than to be passed on as lower prices.
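The article does not put numbers on this trade-off, but a textbook symmetric Cournot model – a toy sketch of my own, not the Commission’s methodology – shows both effects at once: removing a firm pushes the equilibrium price up, while a sufficiently large marginal-cost saving that is passed through pushes it back down.

```python
def cournot_price(n_firms, a=100.0, c=20.0):
    """Equilibrium price with n symmetric Cournot firms, inverse demand
    P = a - Q and constant marginal cost c: P* = (a + n*c) / (n + 1)."""
    return (a + n_firms * c) / (n_firms + 1)

print(cournot_price(4))          # 36.0 - four firms pre-merger
print(cournot_price(3))          # 40.0 - three firms, same costs: price rises
print(cournot_price(3, c=14.0))  # 35.5 - only if efficiencies are passed through
```

The Commission’s scepticism amounts to doubting the third line: if the cost saving is retained as profit rather than reflected in c, consumers simply face the higher second-line price.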

The other and more interesting claim relates to future investment. Advocates of mergers claim that the merged firm will be better able to invest because of its greater scale and/or higher levels of profitability – claims that are extraordinarily difficult to assess.

Consequently, competition authorities have only been prepared to approve mergers if the parties are also prepared to take various steps to replace the firm that is exiting with another entrant. But why, when one set of investors were seeking to exit, would another set be persuaded to enter? The predictable lack of interest has prompted authorities to promote various models which would allow new firms to enter the market at lower cost and risk by using the merged firm’s existing network as a Mobile Virtual Network Operator for an extended period.

The most interesting and immediate question is what happens to those firms whose merger plans have had to be abandoned. Do they sell to a party from outside the market at a lower price? Do they find another way to grow or to become profitable? In Europe, such firms can pursue the ‘failing firm’ defence, arguing that without a merger the firm will exit the market altogether.

The current debate reveals how little we actually understand about what determines the performance of these markets. We know monopolies are generally to be avoided, but we know very little about what might ensure higher levels of investment, or about how those investments might translate into prices, quality or other outputs that consumers care about.

This is a summary of a full article which first appeared in The Journal, December 2016. To read the article in full please visit the ITP site (free for members).