" /> Telco 2.0: October 2009 Archives


October 30, 2009

GroupM execs at Telco 2.0

Telco 2.0 is delighted to be working with GroupM, the parent company of WPP’s media agencies, with billings of $60 billion.

Rupert Day, their COO, will be a panellist on the ‘Media 2.0’ session at our EMEA event next week. We’ll be running an intimate workshop on ‘what’s wrong with mobile marketing and how to fix it’ with his colleagues from Joule and Mediacom on the morning of day three of the event. And we’re delighted to welcome Peter Tortorici, CEO of GroupM Entertainment Worldwide, to join the keynote speakers at Telco 2.0 AMERICA on 9-10 December in Orlando.

Here’s Peter’s biog:

One of the television industry’s most respected executives and involved with some of television’s biggest hit series, Peter Tortorici has served as president of two broadcast networks, taken one of them to “number one”, developed top-rated shows and created new business ventures for some of Hollywood’s most prolific studios and production companies.

Peter was executive vice-president and then president of CBS Entertainment. During his tenure, he was part of a team that guided the network to achieve “number one” status in prime time, daytime and late night schedules for three consecutive seasons, from 1991 to 1994. As president, he was instrumental in developing such Emmy Award-winning series as “Northern Exposure”, “Picket Fences”, “Murphy Brown” and “Chicago Hope”. In addition, he supervised CBS Productions and turned out several highly successful programs including “Touched By An Angel”, “Dr. Quinn, Medicine Woman” and “Walker, Texas Ranger”.

Later, Peter joined the Carsey-Werner Co. as executive producer. He supervised ongoing production of the company’s network shows such as the Emmy Award-winning “3rd Rock From The Sun” and developed new programs including the launch of “Cosby”. He also created the company’s consumer products division and launched their first online initiatives.

Peter served as president and chief executive officer of Telemundo Network in 1998-99. He crafted and supervised the successful re-launch of the U.S. Hispanic network. He later joined with Sony Pictures Entertainment and Columbia TriStar Television as an independent producer. His Peter Tortorici Productions produced two critically-acclaimed prime time series, “Body & Soul” on Pax TV and “Significant Others” with NBC/Bravo.

Peter joined MindShare and founded MindShare Entertainment in 2003. Currently serving as CEO of GroupM Entertainment Worldwide, he provides global leadership to GroupM’s content initiatives. He has been honoured industry-wide for innovation, leadership and accomplishment in the emerging area of branded entertainment and brand funded content.


The Pirate Plan for Voice

Ed. - To warm us up for the forthcoming Telco 2.0 Exec Brainstorms on new business models (next week in London and 9-10 Dec in Orlando, Florida), Telco 2.0 is blogging from eComm this week. Below are some more highlights…

If there was a Telco 2.0 coup at the world’s regulators, and they were taken over by crazy extremists determined to revolutionise the industry as fast as possible, what might happen? As you might expect, eComm is a good place to find out. It would probably be a good place to plan the coup, too. Most of the usual suspects are right here.

But the first step, I think, would be to act on the LTE voice fiasco. As Dean Bubley points out, we’re now up to six possible options, which is too many. Fortunately, some of them are so awful they can be instantly discarded; an overlay circuit-switched voice network, anyone? The obvious option for the revolutionaries is option six - forget about it, and run VoIP services over the data pipe. The legacy devices can keep using GSM - it’s famously difficult to actually turn a network off.

This would, of course, be a huge opportunity for the independent voice & messaging people. It would also be a huge opportunity for the carriers, to replace their Voice 1.0 interface and their efforts in the field of Worse Voice & Messaging with a line of technically superior and more customer-intimate products. Things like the power-saving radio control channel could be provided as carrier APIs. Even during a revolution, the default option is still very important, and therefore there would be money in partnering with one of the independents to get them access to the base of default-option customers.

Making that happen, though, would require action on the numbering problem. Cullen Jennings of Cisco pointed out that telephone numbers as an institution may well survive the PSTN, just because of the layers of social significance they’ve acquired. People are delighted to get someone’s phone number - not their e-mail address. E-mail is work, twitbookspace is public, telephony is emotional. Further, telephone numbers are necessary to make the interworking with older systems happen - Decommissioning Day is a long way off, and the first jurisdictions to do that would have to keep using numbers for international calls for a very long time.

Numbers are also the only identifiers you can use when the only user interface is a dial pad, and they also retain more identifying robustness than the e-mail style user names in most VoIP systems (but notably not Skype, with its cryptographic certificates), which is important for identity/security/authentication applications.

We’ve got a technical solution; ENUM uses the DNS to map telephone numbers to user names, TELURL does the reverse. But so far, the only available infrastructures are private ones operated by carriers, which keeps the old numbering model in place even as the business model crumbles. Fortunately, there is an example of a solution; some countries, like Finland, coped with mobile number portability by creating an organisation to manage the national numbering plan and map the numbers to their currently-relevant line identifier.
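The ENUM half of that mapping is simple enough to sketch. Per RFC 6116, the digits of an E.164 number are reversed, dot-separated, and suffixed with e164.arpa to form a DNS name, which a resolver then queries for NAPTR records pointing at SIP URIs, mailto: addresses, and so on:

```python
def e164_to_enum_domain(number: str) -> str:
    """Map an E.164 phone number to its ENUM domain under e164.arpa (RFC 6116).

    Digits are reversed, dot-separated, and suffixed with e164.arpa; a
    resolver would then look up NAPTR records at this name to find the
    number's current service URIs.
    """
    digits = [c for c in number if c.isdigit()]  # strip '+', spaces, dashes
    return ".".join(reversed(digits)) + ".e164.arpa"

print(e164_to_enum_domain("+44 20 7946 0123"))
# 3.2.1.0.6.4.9.7.0.2.4.4.e164.arpa
```

The actual resolution step, of course, only works where a public (and ideally DNSSEC-signed) ENUM tree exists - which is exactly the infrastructure gap described above.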

In a public numbering system, we could own our numbers and have anyone we wanted take care of them, with some public agency to sign the root signing certificate (because you don’t want to do national ENUM without also doing DNSSEC, the security extensions for the DNS). With that, and open access to the IP pipe, we could get our telephony from anywhere we liked, cross-network.

Further, as Rudolf van der Berg pointed out, changing the interconnection model for telephony to be more like the peering/transit Internet model would end the situation where the mobile operators get to subsidise themselves from the rest of the industry in general and the disruptors in particular.

The final element in this big bang strategy would be to tackle spectrum. Presenter after presenter pointed to the increasingly uncomfortable truth that vast amounts of spectrum are licenced, but unused, or unlicensed, and locally congested but still broadly underutilised. Permitting the opportunistic reuse of spectrum, beginning to open the higher frequencies, and coming up with what OFCOM’s William Wood called “a better flavour of unlicensed rules” might make it possible for disruptors to get access to spectrum, where they could use the new generation of cheap and open-source equipment to deliver mobility.

This would leave a connectivity- or utility-only sector and an empowered and technically advanced applications-and-services sector, and perhaps a few successful players from the old telco market who partnered successfully with the new voice applications.

It’s possible, here, to imagine that we’re going through the same scenarios as we identified in the Online Video report - and voice is just entering the Pirate World phase.


RebelVox: really mission-critical voice & messaging

Ed. - To warm us up for the forthcoming Telco 2.0 Exec Brainstorms on new business models (this week in London and 9-10 Dec in Orlando, Florida), Telco 2.0 reports from last week’s eComm.

We’ve said before that if you want to do better communications, you’ve got to understand the social conventions of telephony. Why is it that we treat telephone calls as sacred, interrupting any human activity whatsoever to answer the ringing phone, for any one of 4 billion or so possible callers? More to the point, telephony even takes precedence over face-to-face conversations, alone among all forms of communication. Nobody breaks off a conversation to read e-mail.

This social super-status has costs; few things are quite as annoying or as intrusive as the phone that never stops ringing, and there is something especially poignant about one that never rings. These conventions were probably established in the era when telephony was much more expensive, per minute, than labour; but they are so well entrenched in our culture and even in our neurology (try ignoring a ringing telephone - it’s harder than you think, because you were conditioned that way early in life) that they persist.

Another maddeningly annoying experience is conversing on a poor-quality link; during the Falklands War, the difficult relationship between senior British commanders was considerably strained by the effects of the secure satellite voice system’s encryption and latency on their voices. Specifically, Admiral Woodward’s voice was made to sound remarkably like Donald Duck; he wasn’t a personality known for tolerating mockery, and the regular teleconferences tended to aggravate disagreements rather than settle them. This is, if you like, an example of telephony becoming the opposite of communication.

RebelVox, a US Voice 2.0 startup, was partly inspired by military experience; one of its founders recalls carrying no fewer than four radios and their associated, back-aching batteries during his last U.S. Special Forces (Airborne) tour of Afghanistan. None offered an acceptable user experience.

The core technology proposition essentially lies in an algorithm for synchronising audio buffers on multiple devices over the air; this yields the first interesting feature of RebelVox. If the network quality worsens, the sound is buffered until it improves; if it gets worse still, it is saved and delivered as a recorded message. If the called party is unavailable, the same process is used; if they become available, though, they can interrupt the recording and catch up to the live stream. Multiple conversations and group conversations are supported from a single in-box; speech-to-text transcripts are a possibility, and if the network gets really awful, the system eventually degrades down to store-and-forward text messaging. So it essentially bridges the domains of voice and messaging, and gives users control over the mode in which they receive information, instead of the traditional sovereignty of the caller. The planned business model is to licence the technology.

We think this is an interesting and innovative product, but as usual, we think they should forget the supposedly sexy consumer/YouTube/twitbook market, where so many things are free and so many bits contain so little value, and concentrate on enterprises and perhaps also their original market: the military, law enforcement, and safety-critical applications.
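The graceful-degradation behaviour can be sketched as a simple decision ladder. The thresholds and mode names below are our invention, purely for illustration - RebelVox’s actual buffer-synchronisation algorithm is proprietary:

```python
def delivery_mode(latency_ms: float, loss_pct: float, callee_online: bool) -> str:
    """Pick a delivery mode as network conditions degrade.

    Thresholds and mode names are invented for illustration; the real
    RebelVox algorithm is proprietary.
    """
    if not callee_online:
        return "recorded-message"       # store it; the callee can catch up to live later
    if latency_ms < 150 and loss_pct < 2:
        return "live"                   # normal synchronised live audio
    if latency_ms < 1000 and loss_pct < 10:
        return "buffered-live"          # buffer, replay slightly behind real time
    if loss_pct < 40:
        return "recorded-message"       # save and deliver when the link recovers
    return "store-and-forward-text"     # really awful: degrade to messaging

print(delivery_mode(80, 0.5, True))    # live
print(delivery_mode(400, 5.0, True))   # buffered-live
```

The point of the design is that the caller never has to choose a mode; the system slides between them as the link allows, and the receiver decides how to consume the result.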

Quotes of Note

Ed. - To warm us up for the forthcoming Telco 2.0 Exec Brainstorms on new business models (this week in London and 9-10 Dec in Orlando, Florida), Telco 2.0 reports from last week’s eComm.

Some quotes of note:

Michael Jackson of Mangrove Partners (not the recently deceased member of the Jackson Five), credited with introducing MVNOs:

Actually, MySQL started the MVNOs - you could just set it up and start billing without having to buy something incredibly complicated

(Our friends at PlusNet would agree; their billing platform was once thought to be the world’s biggest MySQL database.)

Sean Park of Nauiokas & Park:

Mobile operators, they’ve already built a huge micropayments platform that works! They’re doing this right now - they send me a bill down to the last second. But they aren’t using it because they’re the phone company. The other thing is multi-sided markets. I’m trying to get a much better knowledge, academically and practically, of multi-sided markets

Martin Geddes:

What makes the money in telcoland? Voice and SMS - the interconnected applications, that work the same way anywhere
Today you have missed calls; in the future, your phone will tell you why you should call back

Dean Elwood of Voxygen:

Fortunately there’s this service called Calliflower that makes it all MUCH BETTER

Cullen Jennings of Cisco:

Distributed hash tables are the biggest advance in data structures in 20 years…and they’re the opposite of cloud computing!
In the future, 100% of commercial communications products will incorporate substantial open source

Lee Dryburgh:

Our economy is suffering under this vast waste of human time, and this has got to change…even if 80% of calls go straight to voicemail, telcos get termination fees. The carrier doesn’t care about your time.

Martin Taylor, Metaswitch:

Aside from price, applications are the only way of differentiating ourselves

Colin Pons, KPN:

80% of telco revenue is telephony…but the telephony business model is dead. We forgot that VoIP is a technology - and if you change the technology but keep the business model, you haven’t changed much

October 29, 2009

Voice 2.0 is Coming!

Ed. - To warm us up for the forthcoming Telco 2.0 Exec Brainstorms on new business models (next week in London and 9-10 Dec in Orlando, Florida), Telco 2.0 is blogging from eComm this week. This is probably the best event series on strategic technology developments, and we’re delighted to partner with it. Below are some highlights so far. NB: Google Wave users can follow the conference backchannel by searching “tag:eComm with:public”; presentations will be posted to the individual session waves in due course.

eComm is, of course, a good listening post for monitoring the progress of Voice 2.0. The news is that once you get out here to the cutting edge, it’s already taken as given that voice is an application and that value will be created in future by integrating it with other processes, applications, and services.

So Voxeo demonstrated their new Tropo.com cloud telephony platform; this provides an API for scripting-language developers to use in creating applications that use voice, based on Voxeo’s existing VoiceXML hosting infrastructure and interconnection. It’s impressively simple, and leaves you wondering what the telcos have been doing all these years.

Paul Sweeney of Voicesage called attention to an interesting and possibly valuable adjacent business model for CEBP. Namely, creating the CEBP APIs involves creating a wide variety of new “edges” at which the enterprise interacts with its customers, suppliers, employees, and other stakeholders in general. The concept of these “edges” was first popularised by John Hagel; essentially, he argues that value is created by the interaction of diverse actors at the points where they meet.

Making your business processes discrete, explicit, real-time, and measured, as CEBP demands, also means creating as a by-product lots of data about how your business actually works, rather than how it should work, how it was meant to work, or how people in it want to think it works. This makes it possible to use CEBP as a powerful management-information systems tool, and perhaps also to decentralise the design of front-line services.

Sten Tamkivi, Skype’s Chief Evangelist, provided some interesting data on how a virtual telco whose revenue source is PSTN interconnection can actually make money. Over a year, retail voice prices fell by 7% and wholesale interconnect prices fell by 12%; but Skype’s volumes rose by 13%. Essentially, the volume that free Skype-to-Skype calls pull in lets Skype extract some of the best prices for bulk interconnection on its key routes, and therefore keep its prices low whilst still maintaining a useful margin.
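A toy index calculation shows how those three reported percentages interact (assuming, for illustration only, that interconnect spend scales directly with minutes):

```python
# Index last year's figures to 1.0 and apply the reported year-on-year changes.
# Assumption (ours, for illustration): interconnect costs scale with volume.
volume = 1.13           # minutes up 13%
retail_price = 0.93     # retail voice prices down 7%
wholesale_rate = 0.88   # wholesale interconnect prices down 12%

revenue_index = volume * retail_price    # revenue still grows ~5%
cost_index = volume * wholesale_rate     # total interconnect spend roughly flat
print(round(revenue_index, 3), round(cost_index, 3))
```

Because the wholesale rate falls faster than the retail price, revenue grows while costs stay flat even under a cost-per-minute assumption - the margin widens precisely as prices fall.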

What are those routes? Increasingly, the main traffic routes parallel the main world migration routes; US-Mexico is one of the biggest. Obviously, interconnection is doubly important here as the availability of computers and Internet bandwidth is dramatically unequal. Skype’s markets exhibit a cross-over point in terms of per capita income at which the traffic types suddenly reverse; below the crossover, voice, instant messaging, and interconnect voice dominate. Above it, video calls take over; Skype can be both a premium application and a hard discounter depending on context.

So, if their biggest traffic routes are like that, why isn’t Skype the best alternative banking/money transfer player going? Shouldn’t there be an eFinance conference as well, as someone asked Lee Dryburgh? The answer is simple and slightly depressing; the Skype team have enough trouble avoiding being regulated as a telco, without risking being treated as a bank as well. It’s a familiar problem in m-banking, which is why we advise that you find a bank and brainwash it into joining in.

Rudolf van der Berg was back with a provocative presentation on how (in his view) the termination regime must go in order to make either free voice, or better voice, possible. He argues that an analogous model to the Internet’s tradition of peering and transit is better; voice operators wouldn’t be obliged to interconnect, but they would be required to make all the numbers in the national numbering plan available. Therefore, operators would make their own decisions on whether to peer or to buy wholesale transit, and the prices of voice and data would converge.

It wouldn’t matter whether a bit was a bit of voice or of an HTTPS e-commerce transaction; and therefore, it wouldn’t matter whether a bit was a bit of telco voice or third-party voice, theoretically opening the way to a much more diverse voice market. Interestingly, this inter-relates with Dean Bubley’s presentation on the problems of LTE voice standardisation; one option is just to forget it and do VoIP over the data channel. Van der Berg’s proposal would certainly help with that.


Hands-On with Google Wave

Ed. - To warm us up for the forthcoming Telco 2.0 Exec Brainstorms on new business models (this week in London and 9-10 Dec in Orlando, Florida), Telco 2.0 reports from last week’s eComm.

As the above ought to make obvious, we’ve had an opportunity to use Google Wave during eComm. So what the hell is it? What’s it for?

Well, the notion that it competes with e-mail more than with anything else is roughly true: it’s more like e-mail than it is like voice, video, or the Web, but it’s still not very much like e-mail.

The simplest description of the experience is that it’s a hybrid of instant messaging, webmail, and a collaborative text editor like Google Docs, delivered in a user interface that is made up of elements borrowed from several other Google services. You post text - mostly - to the thematic waves, and others can then reply or edit your text. Changes are distributed in real time. There’s a contacts window, borrowed from GMail, and an inbox of things you’ve been using recently. As well as text, images, files, and YouTube videos will go in there too, as will small applications (known as robots).

The upshot, with a hundred-odd caffeinated geeks frantically typing stuff into it, is a fairly stimulating way to discuss and take notes in real time. However, it’s also quite distracting, and the user experience is…itchy. Your typing is mirrored in real time as well, so as you try to take in information, the user interface focus shifts annoyingly every time someone else presses a key. A lot of screen space is taken up by noisy bits-and-pieces, so you have to peer at the messaging and content you’re meant to be collaborating on. It is very easy to accidentally stop editing before you’re finished, and have to start again. And there are some horrible bugs; waves frequently hang and refuse to open, and the application is crashy.

Of course, Google will no doubt improve these things with time and kaizen. Indeed, it wouldn’t be at all surprising if a lot of Firefox/Greasemonkey developers are already working on user scripts intended to improve the look and feel. At the moment, there’s little chance of full alternative clients emerging, because although Google has a draft standard for communication between Wave servers, they haven’t standardised the client-server element yet.

The good news is that the Wave Federation Protocol seems reasonable - it’s intended to let non-Google Wave servers communicate with each other and with Google, like e-mail servers do. It works on the principle that the originating Wave server keeps the canonical version of data users give it, and that changes are communicated between servers using the well-understood XMPP protocol’s ability to transport chunks of XML as payloads in its instant messages. At eComm, Google provided that rare thing, a live demonstration that actually worked, of a Google Wave user communicating with a non-Google Wave server via server-to-server communication.
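The mechanism is easy to picture: a wave delta is just a chunk of XML riding inside an ordinary XMPP message stanza between servers. A minimal sketch (the payload element names here are illustrative, not the real Wave schema):

```python
import xml.etree.ElementTree as ET

# Build a server-to-server XMPP <message> stanza carrying an XML payload,
# in the spirit of the Wave Federation Protocol. The payload element names
# ("wavelet-delta", "insert-text") are invented for illustration.
msg = ET.Element("message", {"from": "wave.example.org", "to": "wave.googlewave.com"})
delta = ET.SubElement(msg, "wavelet-delta", {"xmlns": "urn:example:wave"})
op = ET.SubElement(delta, "insert-text")
op.text = "Hello from a federated wave server"

print(ET.tostring(msg, encoding="unicode"))
```

Because the transport is plain XMPP, any server that already federates instant messaging has most of the plumbing it needs; the hard part is agreeing on the payload schema and the operational-transform semantics on top of it.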

It’s well worth remembering that Google’s Chat/Talk instant messaging service and its Talk voice service are both XMPP-based; Wave can be seen as an effort, growing out of GMail, to integrate a variety of XMPP-based realtime communication applications.

At the moment, the only voice element is that there are a couple of third-party “robots” that control Web-telephony applications from within Wave. Martin Geddes, reliably too cool for school, turned up with one that implements a Ribbit plugin; Tim Panton of PhoneFromHere demonstrated one that used an Asterisk server’s AMI Web interface and the Skype for Asterisk plugin to interact with Skype.

Similarly, Wave is a long way from even considering a business model; it’s worth pointing out, as Stuart Henshall memorably twittered, that in a sense Wave is the most un-Google product ever. Google’s core product is search and its core business is advertising; Wave doesn’t carry ads at the moment and the search function deeply disappointed most people at eComm. In fact, Google Wave engineers present admitted openly that it left a lot to be desired. As an example, if you searched for “eComm” during the conference, you would get no results; you had to provide the rather odd search string mentioned at the top of this post.

However, we can certainly imagine that advertisers could well be interested in Wave as a conversational marketing tool. It’s been said that for some reason, you can’t charge for access to a Web application, even if you optimise it for mobile, but you can sell an iPhone app that wraps around the same site’s APIs - the app has a sort of “thinginess” that people attach value to. Wave’s “robots” could well play a similar role.


Delivering FTTH with Free.fr

Ed. - To warm us up for the forthcoming Telco 2.0 Exec Brainstorms on new business models (next week in London and 9-10 Dec in Orlando, Florida), Telco 2.0 is blogging from eComm this week. Below are some highlights so far.

Benoit Felten gave a presentation, typically rich in chewy data, on an economic model of FTTH deployment that he and his Yankee Group colleagues have prepared. The take-home message is that their sensitivity analysis shows the primary drivers of return on investment in FTTH deployments are:

  • The take-up rate
  • Cost per home passed
  • Open access

By comparison, initiatives intended to drive up ARPU had a negligible impact on the business case for fibre. Euros spent on increasing take-up, building operational excellence in your fibre deployment teams, or developing a better wholesale model to open the network to other service providers, however, pay back many times over. Just taken in isolation, open access cuts three years off the payback period for an FTTH network.
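A back-of-the-envelope payback model shows why open access moves the needle. All figures below are invented for illustration - this is not Yankee Group’s model:

```python
def payback_years(capex_per_home: float, homes_passed: int, take_up: float,
                  margin_per_sub_year: float,
                  wholesale_per_home_year: float = 0.0) -> float:
    """Simple undiscounted payback period for an FTTH build.

    All parameter values used below are illustrative assumptions, not
    Yankee Group's figures.
    """
    capex = capex_per_home * homes_passed
    subs = homes_passed * take_up
    cash_per_year = subs * margin_per_sub_year + homes_passed * wholesale_per_home_year
    return capex / cash_per_year

# Retail-only build: €1000/home passed, 25% take-up, €150/sub/year margin...
print(round(payback_years(1000, 100_000, 0.25, 150), 1))       # 26.7 years
# ...versus the same build earning €12/year wholesale on every home passed.
print(round(payback_years(1000, 100_000, 0.25, 150, 12), 1))   # 20.2 years
```

The structural point survives the toy numbers: wholesale income accrues on every home passed rather than only on homes won, so even a modest open-access fee attacks the payback period directly.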

Operational excellence was also the theme of Logica’s Rudolf van der Berg’s presentation on Iliad, a company that needs no introduction to Telco 2.0 readers. Van der Berg argues that the vast majority of the effort operators make to segment, slice, and generally fiddle with their customer bases is entirely wasted. Instead, Iliad/Free.fr is winning with one hyper-simple product: everything for €30 a month. Of course, this implies a major effort to achieve the superb engineering required to deliver ‘everything’, and to do it at levels of OPEX low enough that such smash-mouth pricing is sustainable.

The slides are available here:


The Promise and Reality of LTE

Ed. - To warm us up for the forthcoming Telco 2.0 Exec Brainstorms on new business models (next week in London and 9-10 Dec in Orlando, Florida), Telco 2.0 is blogging from eComm this week. Below are some highlights so far.

Spectrum and LTE: promise and reality

Telco 2.0 ally Brough Turner argues that the scarcity of radio spectrum is an illusion brought on by telcos and the limits of current receiver technology. He described experiments in measuring the actual utilisation rate of the US CMRS (Cellular Mobile Radio Systems) bands; the highest utilisation rate they were able to observe was only 13%, during a US political party convention.

Surely, then, there’s plenty of room at the bottom.

The reason why we don’t need to regulate the visible spectrum is that the receiver - the eye - is far more sensitive and selective than any radio device we’ve ever constructed; if we can improve the receivers, software-defined radios could squeeze out much more bandwidth from the existing spectrum allocations, and the mobile operators’ monopoly of spectrum would be subverted.

On the other hand, there’s also plenty of room at the top, in the extremely high frequency bands that are currently largely unused. A major advantage here is that it’s much easier to build a MIMO device for gigahertz frequencies; the distance between the multiple antennas has to be half the wavelength, so a MIMO device for the so-called beachfront spectrum at 700MHz can’t physically be a mobile device, because it’s got to be rather longer than a laptop.
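The arithmetic is easy to check - half-wavelength spacing is the standard rule of thumb for decorrelating MIMO antennas, and the wavelength is just the speed of light divided by the frequency:

```python
C = 299_792_458  # speed of light, m/s

def half_wavelength_cm(freq_hz: float) -> float:
    """Minimum MIMO antenna separation: half the carrier wavelength, in cm."""
    return (C / freq_hz) / 2 * 100

print(round(half_wavelength_cm(700e6), 1))   # ~21.4 cm at 700 MHz: laptop-sized
print(round(half_wavelength_cm(2.6e9), 1))   # ~5.8 cm at 2.6 GHz: fits a handset
print(round(half_wavelength_cm(60e9), 2))    # ~0.25 cm at 60 GHz: fits on a chip
```

At 700MHz the two antennas need to be over 21cm apart, which is why beachfront-spectrum MIMO belongs in laptops and fixed terminals rather than phones; at tens of gigahertz the whole array fits on a chip.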

Beyond MIMO, it’s becoming possible to put multiple radios on a chip as well as multiple antennas on a radio; this opens the possibility of using beamforming.

However, Moray Rumney of test-and-measurement specialist Agilent Technologies had a bucket of cold water to go with all that optimism; he argues, based on data from Agilent’s work on LTE conformance testing, that the next-generation wireless standard has serious problems. The baseline performance of LTE Release 8 radios is turning out to be a disappointment on the test stand; in particular, the OFDMA and MIMO technology is more complicated than originally thought, and rather than doubling capacity, going from one antenna to 2×2 MIMO delivers a boost closer to 20%.

Worse, the LTE vendors have been using the wrong benchmark for comparison; LTE has been trialled against early HSPA deployments, against which it does indeed show a substantial improvement. But very few things in the industry have advanced as quickly as HSPA once it got going; peak download speeds have increased by a factor of 10 since the end of 2005 and uplink by a factor of 15. The latest wave of HSPA kit, in fact, beats the current LTE generation soundly.

As Dean Bubley pointed out in discussions later, the number of proposed solutions to the LTE voice problem is now up to six. And worst of all, among the criteria he cited as necessary for LTE deployment was that IMS would have to be rolled out. We’ve said this before: Long Term Evolution ought to mean just that - evolution - and that means that good HSPA now beats middling LTE tomorrow.


October 27, 2009

O2 and Buongiorno: Winning with Two-Sided Advertising

Not so long ago, we published a research note on content wholesaler Buongiorno and its successful transformation towards a two-sided business model. Specifically, we liked the way their platform allowed rewards - like airtime top-ups - to be based on rules, firing off an event whenever specific conditions are satisfied. Their Top-Up Surprises product with O2 UK, for example, offers a giveaway of extra SMS messages and minutes for prepaid users who top up more often; not only does this encourage the subscribers to use more of the core voice, messaging, and data product, but it also creates an opportunity to sell to upstream customers. Although there was no evidence that O2 had yet considered such a move, we speculated that a useful two-sided move would be to make the surprise associated with somebody’s brand - for a fee, of course.
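A rules-driven reward engine of the kind described can be sketched in a few lines. The conditions and rewards below are invented for illustration - this is not Buongiorno’s actual platform:

```python
# Each rule pairs a condition on the top-up event with a reward; the first
# matching rule fires. Rules and reward names are invented for illustration.
RULES = [
    (lambda e: e["amount_gbp"] >= 20,        "prize-draw entry"),
    (lambda e: e["topups_this_month"] >= 3,  "300 free texts"),
    (lambda e: True,                         "50 free texts"),  # default surprise
]

def reward_for(event: dict) -> str:
    """Fire the first rule whose condition the top-up event satisfies."""
    for condition, reward in RULES:
        if condition(event):
            return reward

print(reward_for({"amount_gbp": 10, "topups_this_month": 4}))  # 300 free texts
```

The two-sided move is then just a matter of data: a brand pays to supply one of the reward entries, and the same event stream that triggers the giveaway tells the upstream customer exactly how often it fired.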

We recently got this press release:

O2 Media will be opening up the hugely successful O2 Top-up Surprises reward scheme to allow UK brands to reach O2’s 10 million Pay & Go customers. Recently ranked by Hitwise as the most visited competition website in the UK*, Top-up Surprises is one of the most popular reward schemes in the mobile industry.

Launched in November 2008, Top-up Surprises rewards Pay & Go customers on O2 every time they top up. Surprises range from extra texts, picture messages and minutes through to prizes such as holidays, TVs and mobile phones. Rather than traditional ‘push’ direct response campaigns, Top-up Surprises gives advertisers access to a channel through which consumers are already actively engaged and proactively visiting.

Blockbuster has been one of the first brands to take advantage of this opportunity with a campaign designed to drive cost effective customer acquisitions. O2 customers visiting Top-up Surprises to claim their reward were offered a 30 day free trial of Blockbuster’s unlimited rental service, a £10 voucher to spend in a Blockbuster store or online and £1 off the monthly fee if they chose to take up the unlimited rental service. 52% of customers chose to take up the Blockbuster offer and of these 11% have already redeemed the offer voucher.

It’s worth noting that this wraps up a number of Telco 2.0 principles - the importance of the core voice & messaging product, the value of upstream customers and wholesale, and the problem of respecting data sovereignty while utilising the customer data pile.

In the light of the recent Hoofnagle et al paper on the acceptability of behavioural advertising, it’s worth noting that this play both offers a direct reward for the customer, and respects the boundaries of their privacy relationship with the operator.

[Ed. - Buongiorno and O2 UK will be talking about this and other related topics at the 7th Telco 2.0 Exec Brainstorm in London next week]


October 26, 2009

Ring! Ring! Hot News, 26th October 2009

Telco 2.0 Top Stories

China Mobile reports a return to growth in Q3, with income up 2.6% year on year after the first quarter of decline since 1999. The carrier signed up another 15 million subscribers in the quarter. Meanwhile, China Telecom saw a 48% plunge in its profits as subscribers rushed away from its core fixed-line voice business towards mobile, and its subscriber acquisition costs rocketed as it makes its own move into the mobile business. It reminds us of a crack of Boris Nemsic’s when he was running Mobilkom Austria; if all your national fixed-line and mobile calls are inclusive, what do you need fixed-mobile convergence for?

Ericsson announced Q3 profits down 74%, on a mixture of weak demand for network equipment and problems at its handset- and chip-making joint ventures, which are absorbing huge quantities of cash. It may well be true that the market for network kit is weak; but it’s also true that most of the big contracts that have recently gone to European vendors have gone to Alcatel or NSN.

On the other hand, they’ve announced an SMS-based, revenue-sharing mobile payments system; it will, however, cost the publishers who want to use it up to €1100 a month, which sounds like it might drown the fish.

Here’s an interesting thought: analysts at Execution Ltd., which sounds far too much like Murder, Inc., reckon the telcos are going to be forced to start rolling out fibre in earnest, if not by the government then by the start of DOCSIS 3.0 upgrades in the cable TV networks. As if to underline the point, German cableco Unitymedia announced 120Mbit/s service in Cologne and Aachen; pity about the uplink, though, which is 5Mbit/s - a ratio of 1:24. How will that work out on a cable network like Comcast that’s gone marginally net-outgoing?
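Taking the quoted figures at face value, the asymmetry is easy to put in concrete terms. A rough sketch - the 1 GB transfer below is just an illustrative workload, not anything Unitymedia has published:

```python
# Headline figures quoted above for Unitymedia's Cologne/Aachen service.
downlink_mbit = 120
uplink_mbit = 5

ratio = downlink_mbit / uplink_mbit
print(f"downlink:uplink = {ratio:.0f}:1")  # 24:1

# Time to move a hypothetical 1 GB file in each direction:
gigabyte_bits = 8 * 1000**3
print(f"download: {gigabyte_bits / (downlink_mbit * 1e6):.0f} s")  # ~67 s
print(f"upload:   {gigabyte_bits / (uplink_mbit * 1e6):.0f} s")    # 1600 s
```

On a net-outgoing network, it’s the second figure that users would live with.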

Meanwhile, in regulatory news, the EU has issued a directive that requires member states to implement GSM spectrum refarming, and the FCC is about to announce draft Net Neutrality rules. The Electronic Frontier Foundation is now worrying about the FCC using this to exceed its authority; they’ve apparently not heard the phrase “rejoice - just rejoice!” Senator John McCain disagrees and wishes to legislate against it; he invented the BlackBerry, remember…

Apple is, predictably, feeling smug after the best quarter in the company’s history and a 7% boost to sales of iPhones. Nokia responded by suing Apple over an alleged patent infringement.

As a consequence of all those iPhones, 60% of the devices using AT&T’s WLAN hotspot business are now smartphones, the first time devices other than laptops were in the majority. Telephony Online points out that this means the WLANs are now really there to offload bulk data from the UMTS cellular network.

Which could be handy; Brough Turner has a tour de force post on the real problems facing AT&T’s mobile network and exactly why it’s suffering from congestion. It’s highly technical, but the problem is essentially that the buffers on their radio-network controllers are too big and therefore anything involving TCP/IP grinds to a halt because it’s unable to sense the congestion.
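The core of the argument can be sketched numerically: an oversized buffer in front of a slow radio link adds queuing delay in proportion to its depth, and loss-based TCP congestion control only backs off when packets are actually dropped, which a huge buffer postpones for seconds. The buffer and link figures below are hypothetical round numbers for illustration, not AT&T’s actual values:

```python
# Hypothetical figures; real RNC buffer sizes and per-user cell
# throughput vary widely.
link_rate_bps = 2_000_000   # ~2 Mbit/s effective share of an HSDPA cell
buffer_bytes  = 1_000_000   # a 1 MB buffer at the radio network controller

# Worst-case queuing delay once the buffer fills: drain time at link rate.
delay_s = buffer_bytes * 8 / link_rate_bps
print(f"queuing delay at full buffer: {delay_s:.1f} s")  # 4.0 s

# TCP senders keep ramping up until a packet is dropped, so round-trip
# times inflate toward this figure before any congestion back-off occurs.
```

A multi-second round-trip time is indistinguishable from an outage for interactive traffic, which matches the congestion symptoms described.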

Of course, few operators are in a better position to backhaul their traffic, whether from femtocells, macrocells, or from WLAN boxes, than Free.fr, with its overbuild fibre network. It looks like parent company Iliad is now a cert for the fourth French mobile licence, after all the other competitors dropped out. France Telecom, by the way, is now the second-least popular large company in France.

Meanwhile, the drum is banged for Google Android as more gadgets await launch. Details of the Motorola Droid are leaked here. Palm, however, can take comfort in a good start in the UK; however, they’ve also succeeded in alienating Netscape legend Jamie Zawinski to the point of giving up and getting an iPhone.

Relatedly, here’s leading political blog TalkingPointsMemo, talking to readers about its mobile device policy.

Back in the UK, 3UK is apparently trying to include a maximum number of buzzwords in one product, by bundling a Spotify subscription with a Google Android device.

Symbian has open-sourced its kernel and, rather more worryingly, lost Psion-era founding father David Wood, who is apparently off to spend more time thinking about the singularity. Also announced this week: the BeagleBoard, a $149 open-source minimal computer platform based on a Texas Instruments OMAP system-on-chip. Someone’s already building a robot around it.

Despite the mobile OS wars, everything is a Web site. This is just as true in mobile as it is on the desktop; consider this data. Between 2007 and 2008, UK mobile subscribers deserted the carriers’ “on-deck” sites in droves in favour of going straight to the content they were after. This is especially telling given that it happened in the context of soaring user adoption; losing market share in a falling market is bad, losing it in a growing market is awful.

In more serious business, ABI Research predicts steady double-digit growth in mobile data revenues from the enterprise sector. Interestingly, the biggest data hogs are Eastern Europe and the Middle East. Rival crystal-ball squad Informa Telecoms & Media expects enterprise customers to reach 24% of the total by 2014, for a headline figure of $92.6bn in data service revenue.

But what about the voice? We’ve said before that there’s no future in cloning the PSTN and calling it a VoIP service; there goes another, as Sasktel gets out of the business. David Burgess’s OpenBTS project, meanwhile, went down predictably well at Astricon, with small rural carriers being especially keen on the possibility of going mobile for cheap; the lesson, however, of the failed carrier VoIP players is that better voice is only optional in the way survival is optional.

GrameenPhone, Bangladesh’s biggest GSM operator, has completed its IPO, the biggest in the history of the Bangladesh stock exchange, which was three times oversubscribed. And there were reasonably good figures at Tele2, on the back of a strong performance in Russia.

The ugly mess around those lost Sidekick profiles is being blamed on third party tech providers; meanwhile in the UK, The Guardian newspaper tends to talk a good game on things Internet, but this week hackers nicked 500,000 CVs from their jobs website.

Clearwire announced more WiMAX roll-outs. A rumour of a Google music service (surely that’s YouTube?) was denied. Twitter has picked Bing as its preferred search engine. Some Time Warner Cable modem/routers are accessible with administrator privileges if you just turn off JavaScript.

And Telco 2.0’s offices are apparently in “Silicon Roundabout”, which is apparently cool enough to be on Bruce Sterling’s blog. Don’t tell the landlord.

To share this article easily, please click:

October 21, 2009

TV preview of 7th Telco 2.0 Exec Brainstorm

Telecom TV preview of upcoming Telco 2.0 Exec Brainstorm:

To share this article easily, please click:

October 20, 2009

Latest Internet traffic stats: Google and CDNs outmuscle Tier 1 Telcos.

These may be the most important charts you see this year.

At this autumn’s NANOG in Dearborn, the twice-yearly get-together for the Internet operations engineering community, Craig Labovitz gave a presentation (download here) on the latest ATLAS Internet traffic study. It deserves to be considered a seminal document. It’s already been hyped as part of the YouTube bandwidth-cost wars, but there’s much more to it than the fact that Google is peering extensively.

Below, we describe the contents of the presentation, their implications for the future of the Internet and its economy, and discuss how these findings relate to Telco 2.0. If you’re involved in Internet service provision, content delivery, or investment in the TMT sector, you need to read this.

We’ll be discussing this new data, in particular during the Cloud Computing 2.0 sessions, at the upcoming Telco 2.0 EMEA (4-5 Nov, London) and Americas (9-10 Dec, Orlando) events.

From Big Transit to Big Platform

From the early days of the Internet, there’s been a clearly identifiable structure that rather contradicts its public image as a seamless mesh of interconnection. Edge networks were served by ISPs, who in turn relied on the major, frequently American, transit carriers for their upstream connectivity. This structure emerged from the development of the NSFNet in the 1980s and 1990s, in which the US and European NRENs (National Research and Education Networks - JANET in the UK, RENATER in France, NSFNet and then Internet2 in the US) first interconnected university and private R&D organisations, and then interconnected with each other using a small group of telcos’ wholesale services. Later, the arrival of commercial Internet backbones from 1994 and the creation of network access points (NAPs) cemented this structure in place.

labovitz.png

A small elite of operators relied entirely on peering with each other, the so-called Tier 1 carriers; if you weren’t Tier 1, you were a customer. Tier 1 domination was based on two scarcities - that of backbone connectivity, and that of interconnection. The first was undermined by the massive investment in dark fibre of the .com boom, and then overwhelmed by the second wave of submarine cable investment in the late 2000s. The second was undermined by the growth of Internet exchanges - in contrast to a NAP, where lower-tier carriers interconnect with Tier 1 carriers, an IX is a facility where networks of any size exchange traffic, usually a membership organisation. Although the first IXen appeared almost as soon as the restriction on commercial interconnection with the NSFNet was lifted in 1994, this structure survived into the mid-2000s.

labovitz1.png

Now, though, it’s gone. In 2007, all of the top 10 networks by ATLAS traffic measurement were global transit carriers - all except Cogent and Telia were formally Tier 1 in the sense of being transit-free. The list looks much as it would have in 2002: AT&T, AboveNet, Sprint and Global Crossing are all there, with Verizon appearing as UUNet and MCI. Today, although Level(3) and GBLX are still on top, Google is number three and Comcast Cable number five. The impact of the change is visible in pricing: transit is becoming a super-dumb-pipe product.

labovitz2.png

Google, meanwhile, now accounts for almost as much Internet traffic as Level(3) did two years ago! The move towards direct peering interconnection between content and eyeballs is well underway. And it’s also notable that YouTube is disappearing as a separate entity for internetworking purposes - subsumed into the Google infrastructure.

labovitz3.png

CDNs - Supertankers of the Internet

But it would be profoundly wrong to conclude that Big Content has won out over Big Telecom, and that the plausible talkers of the ’90s were right about content being king. You will look in vain for content owners in the top 10. Rather, the key actors in the new-look Internet are the Big Platforms. Google is one; the others are the major CDN operators. The top five content delivery networks - CDNs - now account for 10% of global traffic. In fact, because ATLAS only tracks interdomain traffic, the true figure is probably closer to 15% - they estimate that three-quarters of Akamai’s traffic is intradomain, between its edge servers and hosts on the same local network. This is, of course, the point of Akamai’s existence.

labovitz4.png

This has a crucial role in another trend ATLAS identified; in 2007, the top 30,000 ASs (Autonomous Systems - roughly, individual networks) accounted for 50% of global traffic. Today, the majority of global traffic is heading to or from the top 150 networks. But this doesn’t imply a hierarchical structure like that of the old days; rather than going via 111th St NYC, UUNet, and LINX to reach content, users get it from local CDN servers. The average hop count, a measure of directness and routing complexity, has fallen to 3.5.
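As a toy illustration of that concentration - the distribution below is synthetic, not ATLAS data - a heavy-tailed, Zipf-like spread of traffic across 30,000 networks already puts a majority of bytes into a small head of the ranking:

```python
# Synthetic sketch: assign each network a Zipf-like traffic weight
# (weight ~ 1/rank^s) and measure the share carried by the top N.
# The exponent s=1.1 is an arbitrary illustrative choice.
def share_of_top(n_networks, top_n, s=1.1):
    weights = [1 / (rank ** s) for rank in range(1, n_networks + 1)]
    return sum(weights[:top_n]) / sum(weights)

total = 30_000
print(f"top 150 of {total:,} networks carry {share_of_top(total, 150):.0%} of traffic")
```

Under these assumptions the top 150 networks carry well over half the bytes, consistent with the shift ATLAS reports; the real-world driver, of course, is CDN deployment rather than a tidy power law.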

labovitz5.png

Content is king; but distribution is King Kong, and there’s more to distribution than just pipes.

Everything is a Web site: our client/server Internet

Another major trend is the concentration of traffic onto the World Wide Web. Traffic on port 80 - HTTP - is by far the biggest category and the fastest-growing. Rather than fancy P2P systems, the leading distribution mechanisms for Internet video are HTTP and Flash streaming - i.e., just like YouTube.

labovitz6.png

P2P traffic seems to be falling, but it’s not going away; it’s also hardening its defences, using strong encryption and selecting its ports at random, thus defeating classification either by port category or by deep packet inspection.

Comcast: Wholesaler

If P2P is not what it used to be, surely this means that the residential eyeball networks are more downstream-heavy than ever? You might imagine we’re heading for a content is king Internet, where the big media industries (or rather their logistics partners, the CDNs) shovel movies at a passive user population via a downlink-heavy ISP community. But, fascinatingly, this isn’t happening; one of the most TV-minded of ISPs, US cable operator Comcast, has ended up in the global carrier top 10. And it’s turned its uplink/downlink ratio around while doing it - Comcast is now a net sender to the global Internet.

labovitz8.png

In 2007, it was pulling in two or frequently even three bytes of data from the Internet for every byte it sent - the classic pattern of an eyeball network delivering content to read-heavy residential users. Now it’s a marginal net exporter of traffic. The exact figure helps to show what’s happened: Comcast’s outbound share of total traffic is now hovering just over 50%. That puts it in a group with the major transit providers; the big platforms, by contrast, send three or more bytes outbound for every one inbound.
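The rule of thumb at work here can be sketched as a crude classifier. The thresholds below are our rough reading of the ATLAS figures, not Arbor’s own definitions:

```python
# Illustrative classifier based on the traffic ratios discussed above.
def classify(bytes_out, bytes_in):
    share_out = bytes_out / (bytes_out + bytes_in)
    if share_out >= 0.7:     # ~3+ bytes out per byte in
        return "content platform"
    if share_out >= 0.45:    # roughly balanced, 50/50-ish
        return "transit / wholesale"
    return "eyeball network"

print(classify(1, 2))    # 2007-era Comcast -> eyeball network
print(classify(51, 49))  # today's Comcast  -> transit / wholesale
print(classify(3, 1))    # big platform     -> content platform
```

The interesting move is Comcast’s migration from the first bucket to the second.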

labovitz9.png

That, in turn, suggests that their business is being driven by two-way communications applications - specifically, they’re providing mobile backhaul, metro-Ethernet connectivity for businesses, voice over IP transit (the quintessential 50/50 ratio application), and wholesale video delivery to other ISPs, like a CDN. Richer wholesale is driving their business model.

Remain Calm: Buy Data Centres

The overwhelming conclusion from this data is that the platform - for content delivery, for better wholesale, and for cloud computing - is where it’s at. Big platforms, we said, are the load-centre container ports of the digital economy, our future platforms for growth. As the Google engineers are in the habit of stickering on their laptops: My other computer is a data centre.

my other computer is a data centre (thanks to licio)

To share this article easily, please click:

October 19, 2009

Ring! Ring! Hot News, 19th October, 2009

Telco 2.0 Top Stories

The agenda is out now for Telco 2.0’s European and American events; book here while there are still places available.

Nokia announced ugly Q3 results this week, booking the first quarterly loss in 10 years. In Q3, Nokia lost €559 million overall, against a consensus forecast of a €350 million profit. In fact, the analyst consensus wasn’t that far off at the operating level - Nokia took a €908 million write-off against the value of Nokia Siemens Networks, which implies the company made roughly €349 million from operations before the monster accounting charge hammered it. (Remind anyone of Vodafone’s books back in the day?)

They also lost six percentage points of smartphone market share, but the real horrors were deep within NSN; revenues and profits are sliding. Not so long ago, the head of NSN said he aimed to move towards software and services; is another Nokia services strategy really wise? Anyway, in Q3 the services line of business made up 47% of NSN’s sales.

After all, here are some leaked numbers about Comes With Music; there are a total of 32,000 users in the UK and 107,000 worldwide, nine months after launching. On the up side, NSN scored contracts for six of the geographical “circles” out of 22 in Telenor-owned Unitech Wireless’s Indian roll-out. For comparison, ZTE landed three.

There’s a podcast here on using the Ovi Store as a developer. Meanwhile, Telephony Online suggests Google Android might “take down Nokia”. But one of the analysts quoted in support says that, as well as Android:

“Nokia is seeing increased smartphone competition from LG, Samsung, Palm, RIM, Apple and HTC”

That doesn’t leave anybody else… although it’s certainly true that Samsung and RIM ship a lot more phones than Apple or Android.

Google had Q3 results as well this week. Revenues are up 7% and margins up 5% year on year. The key metric of traffic-acquisition cost, TAC, was up somewhat, but was marginally lower as a percentage of ad revenue than a year before. Free cash flow was some $2.54bn - Google is becoming a cash machine. Interestingly, 53% of revenue came from outside the United States; Google is also becoming a major exporter. Rich Karpinski of Telephony Online expects a renewed push for monetisation after Google took an investment pause during the crisis. (Even if Android could be crashed by a specially crafted SMS for a while…)

Meanwhile, the YouTube cost base wars are on again. Wired reports that Arbor Networks will present new research at the next NANOG meeting suggesting that Google is getting practically all its bandwidth from peering relationships, and further that the structure of the Internet is changing. In 2007, the majority of traffic was heading from or to the top 30,000 ASNs; now it’s the top 150. On the other hand, the density of interconnection between edge networks has hugely increased, as more and more content networks peer and more and more content is served from major CDNs, and less traffic has to pass through the major transit operators. Not surprisingly, with all that data whizzing around, Cisco is updating its line of routers.

We’ve done a string of posts - here, here, and here on this issue. From the last, this is what we were saying in June:
A case can easily be made that Google could make its cost of delivery for video - zero. Every global IP transit provider would love to be the exclusive deliverer of such a significant portion of the world’s Internet traffic, and the transit providers could make money by squeezing the downstream ISPs in their cost of delivery. Such an extreme network design would bear a heavy political cost for Google and would obviously be unpalatable, but it illustrates the power that Google has accumulated through the YouTube traffic.

In other content delivery news, remember Joost? The Skype founders’ venture into P2P TV has ended with acrimonious litigation, rather like the buy-back of Skype itself. It’s quite possible that Zennstrom and Co have just spent so much time in smelly developer pools together that they can’t stand the sight of each other any more.

And 70% of the British public opposes plans to cut off P2P filesharers from the Internet.

Sun Microsystems is about to launch its app store for the Java world; Apple has decided to permit free iPhone apps to offer things for sale inside the application. They had almost certainly insisted that such applications be sold for a price for fear of their revenue share being bypassed; now, theoretically, you could build an application that contains a whole app store.

In more interesting app-store/developer community news, we’ve wondered why Telefonica didn’t take Litmus to Brazil. Well, TIM Brasil is launching a multi-platform app store, backed by Qualcomm, using their Plaza Retail app store-in-a-box solution. Qualcomm is desperately keen to get into applications, as a counter to the end of its 3G monopoly and LTE’s victory in the standards wars. This gives them a serious launch customer in a country with a renowned hacker community (hey, they invented Commwarrior).

Meanwhile, Shazam has put on 15 million users since February and tapped some investment from Kleiner Perkins. It’s a nostalgia trip to the early 00s! Shazam was one of the very, very first mobile applications to hit the market and practically the only one to get any traction with consumers before the iPhone. It’s profiting hugely from the app store boom - it’s on all the big four (Apple, RIM, Android, and Ovi), it got 10 million downloads on Apple, and it’s the second most downloaded app on BlackBerry App World. AT&T’s CTO, meanwhile, recently blamed music applications for burning through their data network’s capacity.

Time Warner Cable announced it would begin reselling Sprint/Clearwire WiMAX service, in a move that had been long predicted.

Elsewhere, Saudi Mobily saw surging growth; MobileOne steady progress; and Alcatel-Lucent bagged a contract to build NTT DoCoMo an all-IP network.

Speaking of broadband and IP, the FCC is showing a lot of interest in the submission from the Berkman Centre about its stimulus plan, which argues that open access is vital to the success of fibre projects. As we recently pointed out, the public sector is leading the deployment of NGA worldwide - here’s Italy, with a scheme to ensure minimal broadband access, and here’s the Commonwealth of Massachusetts, with three public sector fibre projects getting their stimulus plan money.

The Ghanaian government and Vodafone are under pressure regarding the terms of Vodafone’s acquisition of Ghana Telecom - an investigation alleges that Vodafone paid significantly less than the officially announced price, and among other things that the state should hang on to the national fibre backbone as a strategic asset.

The US government is still suing to hang on to documents about the illegal surveillance program. The aim of that project was to data-mine the operators’ CDR piles in order to find suspected terrorists; some people do this voluntarily, and we call it “social networking”. Twitter has announced a PageRank-like reputation-digging feature, and Microsoft claims to have recovered some of the lost Sidekick user data. Meanwhile, a scandal is brewing after SchülerVZ, a huge German social network, lost records of over one million users.

You may recall that emerging markets pioneer and Celtel founder Mo Ibrahim decided to use some of the proceeds of selling Celtel to the Kuwaitis to fund a cash prize for African heads of state who behave well and retire peacefully. This year, the prize goes to… no-one at all, as Mo doesn’t think any of the candidates deserve it.

To share this article easily, please click:

October 18, 2009

Facebook developer app stats

We’re looking forward to the sessions on ‘Monetising telco APIs’ at the upcoming EMEA (4-5 Nov, London) and Americas (9-10 Dec, Orlando) Telco 2.0 Exec Brainstorms, building on output from the May event. A big thank-you to one of the panellists, Toby Beresford, a developer specialising in Facebook apps, who shared some of the latest stats with us (below). Surely there’s an opportunity for telcos to create a platform too - it just needs a joined-up commercial strategy, as we described before… Do come along to one of the events to get the inside track with leaders from Vodafone, BT, Orange, Telefonica, Google, Verizon Wireless, etc.

Latest Facebook developer app stats:
* More than one million developers and entrepreneurs from more than 180 countries
* Every month, more than 70% of Facebook users engage with Platform applications
* More than 350,000 active applications currently on Facebook Platform
* More than 250 applications have more than one million monthly active users
* More than 15,000 websites, devices and applications have implemented Facebook Connect since its general availability in December 2008

To share this article easily, please click:

October 12, 2009

Ring! Ring! Hot News, 12th October 2009

Telco 2.0 Top Stories

BT says it’s actually going to do a lot more FTTH than previously planned, and it’s going to overbuild as well as install in new construction. Apparently, this is because they’ve discovered that it doesn’t cost as much as they thought, and (according to various press reports) they can use their existing ducts. Wasn’t this obvious? Or is this a reference to the secret cable-stripping tech they bought into?

Alternatively, the opposition hasn’t said much about telecoms, but they are keen on regulated access to ducts, poles, and trenches. With an election a few months away, perhaps the prospect of competitors putting fibre in their ducts has smoked out BT? Meanwhile, Telstra issued a list of objections to the Australian government’s plan to impose structural separation.

BT also has an interview with Martin Geddes on the corporate Web site, discussing new business models and Voice 2.0 (what else?) Swinging off that particular Web-liana, there’s an interesting blog post from Richard Veyrard here.

Speaking of voice, AT&T has caved in and will now permit iPhone users to make VoIP calls over their cellular network, thus essentially ending the whole charivari about Google Voice and the iPhone’s XMPP capability.

If you’re wondering why Verizon was only half on board with last week’s Vodafone 360 announcements, here’s the story: they’re betting heavily on Google Android devices, even if they’re also keeping an anchor out to windward by investing in LiMo gadgets. This may mean that Motorola is no longer a zombie company - even if the old staple market of Verizon’s CDMA investments is gone forever, they’re looking at a steady stream of Android work for VZW. As if on cue, Moto trimmed its commitment to LiMo, withdrawing software VP Christy Wyatt from the board of the foundation, although they remain on board as an associate member.

As a result of all this Android activity, Gartner re-assessed its forecasts for 2012 smartphone market share; they reckon Android will be marginally ahead of both Apple and RIM, with Symbian still well in the lead. Despite this, does it worry anyone that Eric Schmidt thinks handset subsidy is a great idea?

Whatever he may think about handset subs, though, you can’t odds a total IT outsourcing contract for 35,000 workstations, which Rentokil has just signed with Google.

Etisalat, meanwhile, is planning to launch a handset of its own and is negotiating with the Chinese manufacturers. You can expect a “customised user interface and broadband connectivity” for $80, it appears.

The new version of Amazon’s Kindle is here and it has a GSM radio, at last. The Guardian’s blog has an interesting piece on a rather curious two-sided business model they are using; users in the UK are paying rather less than AT&T’s bulk data roaming rates, and the secret turns out to be that US Kindle users are charged a premium for roaming internationally, which subsidises the service for international users. There’s more, including the interesting point that Sony’s rival device uses a Qualcomm dual mode chipset to handle both civilised and US networks, here. Despite that, traders at the Frankfurt book fair report that nobody buys e-books very much.

If it’s broadband connectivity you’re after, in much of the world outside the OECD, your best chance is WiMAX. And the best option for WiMAX is to see a specialist; Safaricom has just signed a contract with Alvarion for a national broadband wireless network, while Airspan expects to cover the gas-rich Bolivian province of Santa Cruz in broadband within four months. On the other hand, the South Korean government is not pleased with the rate of investment in either WiMAX or IPTV - the latter goes without saying, but the former just helps to make the point that WiMAX is the solution for the emerging markets, rather than UMTS 2.0.

The iconic - almost stereotypical - emerging market application is mobile money. Uganda’s New Vision reports that, six months after launch, MTN and Zain have signed up 250,000 customers between them and that 47% of money transfers in Kenya are now carried out through M-PESA.

Nortel, meanwhile, once a WiMAX pioneer, is being broken up piece by piece. Fibre specialists Ciena have acquired what was once the pride of Canadian industry, the Nortel unit that produced optical networking and carrier Ethernet equipment based on its treasure of patents from STC and BT.

TIM Brasil announced that it’s bringing forward a $4bn investment plan to expand its network and integrate a long-distance fibre operator it bought.

There is much havering about the proposed unlimited music download service from Virgin Media; and the behavioural ad industry mourns Phorm, while EU regulators howl at the gates. They’re already coming for the spammers; did you know the UK has never prosecuted a spammer?

In other regulatory news, it seems that the FCC knows what it wants and it knows how to get it; if the carriers want that juicy 800MHz spectrum, they’ll have to accept net neutrality. Basta! And the Federal Trade Commission wants to make bloggers disclose their freebies. (There are freebies for blogging? Who knew?)

Deutsche Telekom is accused of “saving its network to death” after it failed a much-followed quality test; it doesn’t help that they’ve still not been able to fill in a coverage hole between their HQ in Bonn and the airport, despite this personally embarrassing Kai-Uwe Ricke. Nothing like the embarrassment Didier Lombard subjected himself to - we linked to the rant from earlier this year where he accused workers outside Paris of spending all their time going fishing. Now, after dozens of employees committed suicide, who’s sorry now?

As it happens, Microsoft and T-Mobile USA are pretty sorry, in every sense of the word; all the data stored by Sidekick users in the cloud operated by Microsoft’s Danger division has disappeared. Given that the whole point of the Sidekick was that all your contacts, photos, etc were kept on a remote server and synchronised with a Web page, we’re in epic fail territory here. Users are currently advised not to switch off the gadget under any circumstances or let the battery run down, because it uses the cloud and only the cloud for persistent storage, and until the servers are back up, all your data is gone if you switch the thing off.

Famously, Cisco Systems eventually adapted to the idea that its users would hack the specialised version of Linux that runs on some of their Linksys WLAN routers. Now, they’ve taken it further; there’s a cash reward out for the best app that runs on a Cisco AXP-series router’s embedded Linux distro. The first winner created a building management system. There’s geeky for you.

Telephony Online visits the Nokia lab that tests mobile phones to destruction; and finally, this year’s Nobel Prize for Physics goes to three scientists who pioneered fibre-optic telecommunications. We’d especially like to note Charles Kao, who went from the then Woolwich Poly straight into the old Standard Telephones & Cables (STC) R&D operation, and discovered that the main obstacle to working fibre-optics was chemical rather than physical - too many iron ions in the glass, typically. (STC eventually became Nortel UK, and then, history.)

To share this article easily, please click:

Digital Music - will a ‘public license’ address the fundamental challenges?

Below is first of a new series of ‘Devil’s Advocate’ articles, where we ask people to look at topics from a different point of view.

dadvocate.jpg

This guest post, from Gerd Leonhard of Mediafuturist, takes the form of an impassioned ‘Open Letter to Governments’ arguing for a Digital Music License (DML) as an alternative to the ‘3 Strikes and Disconnection’ legislation proposed in many countries.
 
Readers should note that the ‘official’ Telco 2.0 line runs contrary to Gerd’s views. We believe that there are other, more practical, ways of improving the music industry’s business model problem. But then, that’s the point of this ‘Devil’s Advocate’ thread…to challenge our and our readers’ thinking.

[Ed. - Either way, this should help warm people up to the debates we’ll be having with Feargal Sharkey et al at the 7th Telco 2.0 Exec Brainstorm on 4-5 November in London. And Gerd will be on hand to shake things up a bit.]

The Digital Music License (DML) - why and how a new public license for the legal consumption of music on the Internet would provide a solid alternative to the proposed ‘3 Strikes & Disconnection’ legislation

Dear Policy Makers and Governmental Organizations:

The proposed “3 Strikes” legislation is flawed in many more ways than I could hope to outline in this letter, and many of these issues have already been addressed elsewhere. Therefore I shall provide only a quick summary of some of the key issues, and then move on to describe what a fruitful, realistic and decidedly more pragmatic alternative could look like.

Unauthorized use of music on the Internet is not a technical problem but a business issue. The global ‘free’ sharing of music via the Internet (whether streamed or downloaded) is growing exponentially, and this cannot be overturned by technological means. The digital music ®evolution clearly poses a myriad of business and socio-cultural problems. Rather than hoping for a technical fix, they require us to devise a new social contract that legalizes what people actually do, and permits us to build new business models around it.

Anyone who has attempted to innovate within the music industry will attest that the largest hurdle to the monetization of music on the Internet over the past 15 years has been the astounding absence of new licensing schemes that actually fit the ‘Internet Generation’, i.e. the digital natives, and the new modes of consumption that connected consumers are rapidly adopting. Basically, the problem is not what consumers are doing - the problem is that we have not blessed it with a license yet.

Any attempt to solve these business issues with technological measures - such as the proposed 3-strikes legislation - would, with utter certainty, be very expensive, have serious social and political consequences, and yet fail miserably to deliver tangible monetary results for the content industries or indeed the creators. Digital Rights Management (DRM) has been pushed very hard by the music industry for over a decade, and has now finally been acknowledged as the snake-oil it really always was. The only outcome of the proposed 3 Strikes legislation would be to further criminalize every single fan and every potential customer.

In practice, 3 strikes means no more money for the creators, no new revenues for the industry (but even more rejection by the consumer), and still no satisfaction for the music consumers. In my view, the most pressing objective must be to solve the very real problem of how music (and then, other digital content) can indeed generate new revenues via the Internet. The old revenue streams are the past, beyond a shadow of a doubt - just look at what is happening to newspapers and print publishing! Technology will not, and cannot, solve problems posed by seriously outmoded business practices.

The bottom line: controlling the flow of digital files is ‘Mission Impossible’. The challenging but nevertheless indisputable reality is that the very idea of reliably and consistently controlling the distribution of music files on the Internet is basically a technical impossibility as well as a social, political and cultural minefield. Today, the simple act of listening or streaming, watching or reading anything on a connected computer or a mobile Internet device is the same as copying the content; one cannot be done without the other. The Internet is a giant copy machine, by definition, by design, and now… by culture.

We may not like it, and we may not appreciate it, but just as the railway was hated by the people who made horseshoes and horse carriages, we have no choice but to shift what we do, adapt, and reinvent ourselves. As your own kids or any so-called digital native will tell you, having access to content is now the same as having a copy of the content, i.e. ownership. This is true in technical terms and in terms of user behavior and mindset, but crucially, not yet in terms of the law and prevailing licensing practices. And therein lies the rub.

I would argue that we are in fact trying to build a new business on top of the pre-Internet principle of exclusive copyright - a stark dilemma that has proven to create endless friction but produce very little new revenue. The very idea of being able to control the flow of files in order to extract earlier or possibly higher payments from the users is fundamentally flawed, and we must therefore look for ways to monetize it rather than to prevent it.

The value of music is no longer (just) in the copied file. We urgently need to understand and accept that the value of music is no longer (just) in the mere copies of the digital files. Our attention needs to shift from the old - and dying - business of ‘selling the copy’ to selling everything else, i.e. the many other forms of user value around that copy. We need to start by providing very low-cost or flat-rated, bundled access, and then create many new revenue generators on top of that bundled, legalized access to music.

Once legal and unlimited music distribution is built into Internet access - when Access is Content - a revitalized music industry can focus on talent, curation and marketing; that is to say, on getting attention and converting that attention into actual income. Yes, there is serious commercial value in the music industry once distribution is properly licensed.

The DML: the alternative to the proposed ‘3 Strikes’ legislation.

80 years ago, the answer to the challenge of a then-new and vastly popular technology called ‘Radio’ was to legalize it and provide new licensing schemes to remunerate the content creators. The same thing happened with Cable TV and with the photocopier, and the very same logic needs to be applied to music on the Internet. A public, collective, standardized and open license for music on the Internet needs either to be voluntarily created by the music industry, or mandated, i.e. enforced, by the government - and the sooner the better for everyone.

The DML would - similar to the existing radio & broadcasting licenses that are in effect around the world - make music available on public, standardized terms and conditions, and therefore allow any and all businesses that want to use music to do so without the utterly crippling uncertainties that exist in the current marketplace.

Revenue shares and flat rates, not fixed license fees per song.


The objective of the DML is to create a new, vast, and constantly replenishing ‘pool of money’ for music, i.e. to grow the revenue potential along with the growing number of users, as well as with the many new use cases that will arise from the DML.

In my opinion, the most crucial component of the DML is this: the license fee needs to be calculated on a revenue-sharing basis rather than as a per-unit (i.e. per-song) fee, whether streamed or downloaded. The current practice of a fixed per-track fee - usually about 1 U.S. cent per streamed song and around 70 U.S. cents per download - has proven economically detrimental and utterly unrealistic for market participants (such as Omnifone, Spotify, Rhapsody, Napster and Yahoo), given that the digital music ecosystem is still in its nascent phase, that large-scale advertising revenues for new forms of media always lag 2-3 years behind, and that a very large number of users - potentially all UK consumers - are likely to listen to quite a bit of music this way.
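A rough, illustrative calculation makes the point. All the figures below are invented for the sketch, except the roughly 1-cent-per-stream rate quoted above; it shows how a fixed per-stream fee can dwarf a nascent service’s actual income:

```python
# Illustrative figures (invented) showing how a fixed per-stream fee can
# outrun a nascent streaming service's actual revenue.
users = 1_000_000
streams_per_user_per_month = 500   # assumed heavy-listening average
fee_per_stream_usd = 0.01          # the ~1 cent per stream quoted above

monthly_license_cost = users * streams_per_user_per_month * fee_per_stream_usd

ad_revenue_per_user_usd = 1.50     # assumed early-stage monthly ad revenue
monthly_revenue = users * ad_revenue_per_user_usd

print(monthly_license_cost, monthly_revenue)  # -> 5000000.0 1500000.0
```

On these assumptions the licensing bill is more than three times the revenue, regardless of how the service actually performs - which is the structural problem a revenue-share or flat-rate license avoids.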

This market does not and will not bear license fees that are fixed in this manner and totally unrelated to actual incoming revenue streams. Instead, the DML would need to be calculated on a flat-rate or percentage-of-revenue basis, possibly combined with a minimum ‘floor’ that could prevent unfair and unintended use of ‘free’ music as a loss-leader (if needed). Many different kinds of businesses would benefit from having legalized music available (e.g. telecoms operators, search engines, social networks and communities, blogs, web portals, online magazines, etc.), but their business parameters are so vastly different that I would propose initially making the DML available only to ISPs, mobile network operators and telecommunications providers.

This would have several important advantages. Once they are able, i.e. licensed, to offer music bundles and flat rates, ISPs and telecoms will have every incentive to monitor (i.e. count) which songs are used on their networks; their very large user bases will provide a critical scale of payments immediately (thus significantly lessening the potential threat of revenue loss in the physical music market); they have strong potential for integrating next-generation, user-friendly advertising; and they already have built-in billing and payment mechanisms.

When licensing ISPs, mobile operators and other telecommunications companies, it will be crucial to offer flat-rate licenses rather than to pursue revenue shares, which will not be an acceptable way of generating music revenues from this process, at least initially. Rather, I believe that a fixed, flat-rate license fee per user, per week or month, would be the most suitable approach, provided that suitable 3rd parties (see below) also engage to contribute to the funding of each user’s license fee. It is crucial not simply to declare the license fee payments the ISPs’ problem - because it is not theirs alone, and because the solution lies in the creation of a new ecosystem, a new business logic, not in creating tax-like burdens for individual industries.

Economic experts have done a lot of work on the flat-rate model. Though far from an economist myself, I would venture that, in Europe, a payment of 1 Euro per week per user seems economically feasible; however, the exact price point will of course need to be negotiated with all involved parties, and possibly adapted on a yearly basis until the market is more fully developed and each party’s ultimate value position can be determined. In any case - and this is crucial - the DML must clearly be so utterly affordable that every single ISP, operator, and telecommunications company would immediately apply for a license.

In terms of the actual use of the music and the subsequent accounting for remuneration purposes, I propose that it should not make a difference whether a song is downloaded or streamed (i.e. played on-demand while online), and - similar to Cable TV - it should not make a difference whether a user plays music 24 hours a day, every single day, or just downloads 3 songs every now and then. All music usage would need to be counted, anonymized and reported, and artists would get paid in proportion to the actual use of their music, i.e. according to their popularity (see below for details).

A UK-based calculation example: a pool of 2.6 Billion GBP per year for music. As an example, a DSL provider and mobile network operator with 20 Million UK users would need to generate funds to pay a DML of GBP 80 Million per month, i.e. 960 Million GBP per year. Further, assuming an average of 50 Million eligible UK residents - a large percentage of the entire UK population (~61 Million) - each generating 1 GBP per week, the revenues for the music industry would amount to a very substantial 50 Million GBP per week, i.e. 2.6 Billion GBP per year, almost twice the UK’s recorded music revenues in 2008 (1.36 Billion GBP). Any argument of ‘cannibalization’ of existing revenue streams such as CDs or iTunes would pale against this figure.
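The pool arithmetic above can be checked in a few lines; the figures are those quoted in the text, not official data:

```python
# Back-of-envelope check of the DML pool calculation above.
eligible_users = 50_000_000        # assumed eligible UK residents
fee_per_week_gbp = 1.0             # proposed DML fee per user per week
weeks_per_year = 52

weekly_pool = eligible_users * fee_per_week_gbp    # GBP 50M per week
annual_pool = weekly_pool * weeks_per_year         # GBP 2.6B per year

recorded_music_2008 = 1.36e9       # UK recorded music revenue, 2008 (GBP)
multiple = annual_pool / recorded_music_2008       # ~1.9x

print(f"annual pool: GBP {annual_pool / 1e9:.1f}B ({multiple:.1f}x 2008 revenues)")
```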

How to fund a DML of 1 Euro per week per user.

The key question is, of course, how exactly the ISPs and telecoms would raise the money to pay the quite significant cost of the DML, every week, per user. Under no circumstances should the ISPs, operators or telecoms be made solely responsible for solving this financial problem; it is absolutely crucial to position the DML as a business solution that will unlock strong new revenue opportunities and become more than cost-neutral in a fairly short time.

In my view, the job of building the financial support mechanisms, i.e. the ecosystem that the DML will require, should be handled by a mutually respected, knowledgeable and neutral advisory board, whose mission would be to coordinate this new ecosystem and to get device makers, advertisers, premium-service providers and other interested parties aboard as quickly as possible. Of course, as in television and radio, advertising is one of the key factors that will subsidize the DML fee payments.

The concept of advertising-supported content is not new, but what will be drastically different, going forward, is the type of advertising we will see on digital networks in the very near future. Concepts such as advertising becoming content itself (as in mobile phone applications) and social advertising will blossom once permission for the legal use of music is given, creating much higher advertising revenues than we currently see online.

The global advertising spend currently amounts to roughly $670 billion per year. Returning to the UK example above, UK advertising & marketing spend is forecast at approximately £25 billion in 2010 (eMarketer), with an estimated 25%, i.e. £6.25 billion, going to digital and mobile advertising by 2012. Yet digital and mobile advertising would be only one piece of this new puzzle: handset makers could pay subsidies to get preferred, i.e. ‘presented by’, access to users (basically a network-centric variation of the existing ‘Nokia Comes With Music’ concept); social networks could contribute subsidies to legally integrate ISP-hosted music into their own networks via the DMLs that operators and ISPs would already hold; search engines and portals could do the same.

Imagine if Google could sit on top of this new system of fully legalized, feels-like-free music - similar to how Google has already made legal streaming and downloading of music ‘feel like free’ in China.

After an initial set-up period, it would be crucial that an ISP or operator making use of the DML is able to fully recover the DML costs through a multitude of new revenue streams: next-generation advertising; the sale of mobile applications based on the unlimited availability of music (such as social music and playlist applications); subsidies from CE companies, i.e. handset and device makers; data-mining and cross-selling (with careful consideration of consumers’ data protection and privacy, of course); various forms of up-selling of other products and services (including music-related premiums), as the games industry has done for the past decade; or even the re-packaging of some of the license costs to their users.

The DML is NOT a tax. Any suggestion that the DML essentially amounts to a tax, or is yet another compulsory payment scheme levied on the consumer (like the existing TV & Radio licenses), needs to be avoided, at least in the UK market, where such a proposal would probably be politically unwise. The DML is simply a new license made available to businesses that want to use digital music, with the funding generated by the market participants themselves.

Monitoring of usage and fair payment to content owners.

Every song that is performed, i.e. streamed or downloaded, on the Internet would need to be tracked and accounted for, using already available software solutions such as Gracenote or Shazam. This data would need to be anonymized in a way that protects each user’s private data while still providing accurate tracking of how many times each song has been used on any given day, week or month. Each artist and rights-holder would then receive a monthly payment proportional to the actual use of their music during each tracking period; e.g. if a given artist’s music accounted for 1.3% of use in a given month, he or she or their representatives (record labels and publishers) would receive 1.3% of the total pool of money collected. All participating creators (e.g. writers, lyricists, composers, producers etc) would be paid proportionally from the same pool. I am advocating a 50-50 split between the composer and the performer (i.e. recording and publishing) at this time. Overlaps with existing rights schemes (such as public performance on the Internet, and so-called web-casting and Internet radio) would need to be investigated and addressed as well.
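The pro-rata payout rule described above is simple enough to sketch; the artist names, play counts and pool size below are invented for illustration:

```python
# Hypothetical sketch of the DML pro-rata payout: each rights-holder
# receives a share of the pool proportional to their share of tracked plays.
def dml_payouts(plays_by_artist, pool_gbp):
    """Split pool_gbp across artists in proportion to their play counts."""
    total_plays = sum(plays_by_artist.values())
    return {artist: pool_gbp * plays / total_plays
            for artist, plays in plays_by_artist.items()}

plays = {"artist_a": 1_300_000, "artist_b": 700_000, "artist_c": 98_000_000}
payouts = dml_payouts(plays, pool_gbp=200_000_000)

# artist_a had 1.3% of all plays, so receives 1.3% of the pool
print(payouts["artist_a"])  # -> 2600000.0
```

Splits between composer and performer, or between label and publisher, would simply be a second proportional division applied to each artist’s share.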

Existing examples: models similar to the proposed DML are already in place, or being investigated, in:
  1. China, where Google is providing free and fully legal streams and downloads of music via their Top100.cn property, in return for a share of advertising revenues (and in full collaboration with all major labels)
  2. Denmark, where the ISP and mobile network operator TDC has already made music ‘free’ to all of its subscribers, in return for paying a flat annual fee to the music rights organizations
  3. The U.S., where Warner Music Group, via the new Choruss project, is rolling out a flat rate music license for universities and colleges
  4. Korea, where Korea Telecom’s Melon service provides flat rate music access to over 50% of the population
  5. Canada, where the Canadian songwriters are lobbying the government for a new flat-rate charge for digital music
To share this article easily, please click:

October 8, 2009

China’s Monster ‘Facebook’ QQ: coming to a screen near you

Introduction

The world is full of fast-growing, hyper-fashionable social networking and user-generated content plays. Almost to a man, they lack one thing - profits, or even revenues. An English-speaking technology media and analyst/investor community obsessed by the US West Coast has practically ignored QQ.com, one example of spectacular success, because it’s Chinese.

A Profitable and Valuable Social Network

As of the 30th of June, Tencent (QQ’s owner) had thrown off RMB993 million (US$145 million) in free cash over six months, even after spending RMB1.9bn on CAPEX and a further RMB593 million in financing costs. For comparison, Facebook went marginally cashflow-positive for the first time in August and is not yet profitable.

The bottom line is impressive too; at the last count, Tencent’s gross margin was at 67.3% and net margin was 41.75% - this smashes HP’s investment criterion of “fascinating margins”, i.e. 45% gross, and Iliad’s 70% ROI on new fibre deployment. We previously estimated the gross margin for October 2008 as 63.5%, so it appears that things have consistently been going well for QQ.

The shares (listed in Hong Kong) have gone from HK$60 to 120 since April, showing that this performance is also attracting plenty of demand from investors - albeit at a somewhat toppy price/earnings ratio of over 50.

Nearly a Billion ‘Users’

There were 990 million user identities on QQ as at the 30th of June, 2009. Given the current growth rate, the billionth user will almost certainly be announced in the next quarterly results - but a nontrivial percentage of these are inactive, are multiple aliases, or are spambots. [NB. This is true of all IM communities except, perhaps, for the 17 million users of IBM Lotus Notes Sametime inside their enterprise firewalls, as we pointed out in the Consumer Voice & Messaging 2.0 strategy report.]

As impressive as this is, instant messaging user bases are usually only weakly bound to the service, are usually non-paying, and many people hold multiple usernames. A more useful metric is peak concurrent users - the maximum number of users simultaneously logged in during the period in question. To be counted, a user name has to be actively online, so it is reasonable to deduce that it exists. That doesn’t prove the user is a human being (or, for that matter, a useful application rather than a pest); but whether or not a logged-in user is human, they are consuming system resources.

So, measuring peak concurrent users provides us both with better data on uptake and a more useful indicator of capacity related costs. It’s a standard telecommunications engineering principle to “provision for the peak” - that is to say, it’s useless to build a network with only sufficient capacity for the average traffic, as 50% of the time it will be congested and probably non-functional through overload. To be available, the system must supply enough spare capacity to handle the peaks in demand. Peak load determines scale, and hence cost.
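The standard way to derive peak concurrency from session logs is a simple event sweep; the session times below are invented for illustration, not QQ data:

```python
# Sweep-line count of peak concurrent users from (login, logout) sessions.
def peak_concurrent(sessions):
    """sessions: iterable of (login_time, logout_time) pairs."""
    events = []
    for start, end in sessions:
        events.append((start, 1))    # +1 user at login
        events.append((end, -1))     # -1 user at logout
    events.sort()                    # ties: logout (-1) sorts before login (+1)
    current = peak = 0
    for _, delta in events:
        current += delta
        peak = max(peak, current)
    return peak

sessions = [(0, 10), (2, 6), (3, 12), (11, 15)]
print(peak_concurrent(sessions))  # -> 3 (the first three sessions overlap)
```

It is this peak figure, not the nominal account count, that determines how much rack-space a client-server operation like QQ must provision.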

In 2008, at various times, QQ’s parent company Tencent claimed to have between 355 and 570 million users. At the end of June, 2009, the user count stood at 990 million - so the nominal user base had roughly doubled. In 2008, peak concurrent users were 45.3 million, growing to 65 million in June 2009. According to QQ.com’s live statistics readout (you can watch it grow in real time here), the record at time of writing was 79 million. According to Alexa, 3.26% of global Web users visited one of the various qq.com sites in September 2009.

qq-growth.png

For comparison, Skype’s all-time peak concurrent user count is 15 million, although it has the advantage of using user-provided infrastructure, whereas QQ has a client-server architecture and therefore a constant need for rack-space.

Not just users, but Paying Users

In 2007, out of 12 million peak concurrent users, 7.3 million had spent money with QQ, or to put it another way, 61% of verifiable QQ users were buying value-added services. (How many mobile operators can claim that?)

In March, 2009, we thought it unlikely that this high proportion would continue to pay as the service grew - and that it was quite possible that the 7.3 million earlier payers were dominated by early adopters and power users, so that future recruits would be less committed to the community, less geeky, and lower-income.

However, when Tencent’s Q1 results appeared at the end of March, 36.9 million users had purchased value-added services during the quarter, growing at a monthly rate of 8.4% to reach 40 million by the end of June. This latter figure was against a concurrent user base of 65 million, meaning that 62% of concurrent users were paying users.

We think this is an impressively high proportion at such volumes, and it suggests that revenue may scale reasonably well as QQ grows penetration further. Since one would expect the cost model of such a volume business to scale efficiently, this implies further prospects of profitability. Such thoughts are likely among the influences on the aforementioned growth in QQ’s share valuation.

So, how did they do it?

qq-cpf.png

In our Serving the Digital Generation Strategy Report, we identified a list of key factors that anyone who wants to attract the customer of the future would have to address, which together describe what we call the participation imperative. Specifically, four axes define the customer’s aims:

To read the rest of this article, covering:

  • The Customer Participation Framework
  • The role of digital money
  • Why it beats advertising
  • QQ’s two-sided business model
  • What happens next

 

Members of the Telco 2.0™ Executive Briefing Subscription Service please see the full article here. Non-Members, please see here for how to subscribe, or email contact@telco2.net or call +44 (0) 207 247 5003.

To share this article easily, please click:

Strategy 2.0: The $375 billion growth and efficiency opportunity - update

In preparation for our EMEA and AMERICA brainstorms, the Telco 2.0 team has been reviewing and debating some of its core theories and previous analysis.

Two important reports, which also helped launch the Telco 2.0 Initiative, were “Beyond Bundling: Future Broadband Business Models” and “The ‘Two-Sided’ Telecoms Market Opportunity”. Together they described a $350bn growth opportunity for telcos that was about leveraging their distinctive assets to create interoperable platforms that enabled third party organizations to optimize their everyday business processes and interactions with customers. The focus was mature markets, because that’s where the greatest business model pressures existed, and the time horizon was 10 years.

It wasn’t a forecast; it was (and still is) an opportunity to add greater value to the wider ‘digital economy’ and grow ahead of market projections.

But we all know it’s hard to make predictions, especially about the future. Suggesting fundamental adjustments to how successful industries make money is also not without its pitfalls. As a preview of two new reports, on Fixed and Mobile Broadband End-Games and ‘Two-Sided’ Business Model Use Cases, we thought we’d take the brave step of revisiting our original analysis to see how things have changed and what interim lessons we could draw. This first article focuses on broadband; the next will address the lessons from the ‘Two-Sided’ Business Model report.

A bit of Telco 2.0 Background

Back in the autumn of 2007, just as the great financial crisis began to bite, we had two key messages - the first was that the current broadband business model risked seeing costs escalate without limit while revenues fell behind, and the second was that telcos and ISPs could escape this by mastering a richer suite of digital logistics skills, seeking adjacent as well as end-user revenues, and embracing structural separation.

Winning Telco 2.0 Predictions

From the viewpoint of today, some of the predictions in the report look robust - the coming wave of streaming video, for one, and the impact on ISPs’ OPEX bills, closely foreshadowed what would happen in spring 2008 with the arrival of the BBC iPlayer and the second stage of the YouTube rocket. (see also the report Online Video Market Study: The impact of video on broadband business models) On the other hand, the industry is still lagging with respect to richer value-added services, both retail and wholesale, especially if you compare with the mobile world.

However, the position isn’t as grim as it might have looked from early 2008, either; despite the video surge, the ISPs are hanging on. Partly this is because of the steady progress of Moore’s and Gilder’s laws. Upgrading has been getting cheaper, cushioning the blow - and the rapid growth of the CDN industry (something we certainly did predict) has also been a factor in coping with the crisis. In fact, the CDN business has been a primary form of two-sidedness in the telecoms industry - several backbone operators have been developing their own CDNs, and their revenues from upstream customers obviously flow directly into the industry, but even the third-party ones like Akamai help in that they reduce costs to the eyeball networks (that is, primarily residential, rather than transit, hosting, or enterprise ones).

Slice’n’Dicey (we never said it was easy)

As far as the various “slice’n’dice” options go, these have turned out to be just as difficult and unpopular as expected, and have in fact been undermined by the industry’s success in coping without them. Nobody takes the M6 toll road if there’s no traffic jam on the M6; one major UK operator that spent heavily on a deep packet inspection infrastructure is mostly using it to enforce their original usage cap, and another that invested in Ellacoyas is using them to monitor botnet (i.e. machines controlled by hackers) activity.

This coping strategy, however, has limits; Moore’s law only affects things like routers, and Gilder’s law (that bandwidth grows three times faster than Moore’s law states for processing power) only affects the network cards, rather than the wires in the ground. At some point, either the technology (traffic growth beats ADSL2+) or the economics (it becomes uneconomical in terms of OPEX to keep going with relatively maintenance intensive copper) will force the jump to fibre. Then, the industry will be faced with a massive bill for civil works as the trenches get dug up, and the question of whether it can fund the write-off of the copper infrastructure and the capital expenditure for fibre deployment from its current business model will be back.

Public Sector Rolls Fibre, OK

As far as fibre goes, a trend we didn’t see coming was that deployment would be dominated by the public sector to quite the extent it has been. This takes different forms - in Australia, Amsterdam, South Yorkshire, and Singapore, the role of the state has been direct, in actually building publicly owned dark fibre or wholesale networks. In France, by comparison, it has been indirect, using regulatory changes to mandate regulated access to France Telecom and other operators’ ducts, trenches, and poles.

Laser Accuracy

We certainly thought that a wide range of other services, as well as voice, IP, and video, would want to ride on the fibre, and in fact, developments in the US and Australia tend to bear this out. The US stimulus plan is putting serious money into both broadband and smart grid technology, while the Australian Government is very keen on the “trans-sector concept” popularised by Paul Budde, in which multiple public services and infrastructures (telemedicine, research & education, smart grid, environmental monitoring, CCTV, etc, etc) are delivered over a common fibre network. However, in practice, this has been more likely to take the form of these services being customers of the dark fibre owner, or else over-the-top applications, than being customers for sender-party pays data from the telcos.

NGNs and LLU / Wholesale Markets

Another issue regarding fibre, which grows out of its increasingly public nature, is that any move to next-generation access will be heavily influenced by the LLU/wholesale market. It is unthinkable in many markets that the incumbent operators will be allowed to turn the clock back - in fact, some incumbents, like Deutsche Telekom, agree - and in others, like Australia, the incumbents’ role in the access layer is usurped entirely. However, the alternative operators (like the UK’s bitstream/LLU DSL operators) have made real capital investments in LLU and associated backhaul, and any migration strategy will have to accommodate them in order to be acceptable.

That includes pricing; in the UK, initial OFCOM policy is that there is no need to regulate the pricing of BT’s planned wholesale fibre access product, even though it is the only one of its kind. This raises an important question about the original report: did we mistake BT Wholesale (and Openreach) pricing issues for fundamental economic ones affecting the entire industry? After all, for most UK ISPs, the immediate physical effect of surging video traffic was a bill from BT, not a fried router TCAM. Arguably, what happened was that more of the industry’s revenues were swept into the incumbent - specifically its less-regulated wholesale division - just as it shuffled as much of its cost base as possible into the highly regulated, low-margin Openreach.

Why do Telcos neglect Voice when it’s so valuable?

And while this industry-wide tactical firefighting operation was going on, not without success, did the operators take their eye off a major strategic trend? We’ve seen plenty of innovation in voice - but it’s all come from disruptive start-ups, mostly originating from the nethead side of the wire. With a couple of honourable exceptions, telcos have been letting the core voice business rot and fighting with knives over a much less profitable ISP line of business. The results are visible, as Dave Burstein points out in his current newsletter:
Ivan Seidenberg, Verizon CEO, saying “voice is dying” is a defining moment in telecom history. He didn’t use those words, but his comments at Goldman Sachs are clear: “we have to pivot and make a shift from the voice business to the data business and eventually to the video business. … we must really position ourselves to be an extremely potent video-centric asset.”
“The issue there is perhaps it is like the dog chasing the bus a little bit. So what I need to do is get ourselves focused around the following idea, that video is going to be the core product in the fixed line business. … I shed myself of the burden of chasing the inflection point in access lines and say I don’t care about that anymore.”

Verizon remains one of the most profitable companies in the world, but the wireline business is heading downhill so fast that JPMorgan writes “Action will likely be necessary to support the dividend beginning in 2012.” They won’t be able to support $5B/year in dividends without tapping the wireless business, which is 45% owned by Vodafone. Martin Peers thinks Verizon will buy a satellite TV company. Knocking out one of the four TV providers is unthinkable if the Obama team is serious about competition, but that’s not proven…

In Conclusion…

In the future, being a telco that’s good at voice won’t be enough - the link with the copper (or fibre) in the ground is going. We certainly got that right - and landed a few other good shots too.
(Ed - to catch up or contribute further, join us at the EMEA and AMERICA brainstorms, or to get the latest analysis email contact@telco2.net to register interest in our forthcoming new research reports on Fixed and Mobile Broadband End-Games and ‘Two-Sided’ Business Model Use Cases.)

To share this article easily, please click:

Cisco/Oxford Broadband Quality Study Backs Telco 2.0 on Fibre

Remember this post from April? Especially, do you remember this chart?

We identified four groups of countries from the plot of price per megabit vs. average speed:

  1. High bandwidth, low price countries (green box) - South Korea, Japan, France, Finland and Sweden have 16Mb/s average broadband capacity, enough to receive two high-definition TV streams or a variety of other services per household
  2. Moderate bandwidth, low price countries (blue) - US, Canada, the Netherlands and Germany have 4Mb/s+ average broadband capacity, enough to comfortably take standard definition TV plus other services in parallel
  3. Low bandwidth, low price countries (yellow) - UK, Spain, Italy and Hungary
  4. Low bandwidth, high price countries (pink) - Many eastern European and developing countries, which have broadband at video TV quality or worse and where price is the barrier to heavy use.
And we drew some conclusions from this -
The fourth group was essentially the poor; the second and third both consisted of markets where there was extensive unbundling or bitstream-based competition, and really they should be taken together for these purposes; and the first was an odd and heterogeneous one, whose members only had in common a strong tradition of public-sector planning and infrastructure investment. Perhaps the most interesting detail was what we didn’t find: there was no fifth group worth mentioning where FTTH was available, but only at a steep price. Below the sort of pricing you expect for leased lines, the market wasn’t providing real broadband to those who could afford it.

This week, Oxford University’s Said Business School and the University of Oviedo published a study (sponsored by Cisco) into broadband quality worldwide. Download the report here. Here’s a chart with their headline findings that seems to bear us out.

saidbqs.png

There are some differences of detail. France is in group 1 in our analysis and group 2 in theirs, but there is a very simple explanation: the SBS/Oviedo/Cisco study doesn’t take any account of price, which is a crucial variable in ours.

saidbqs1.png

Similarly, this chart of “broadband quality leaders” bears a close resemblance to the choice between technocracy or anarchy we identified in this post.

To share this article easily, please click:

Use Cases 2.0: Five new business models in action

For our forthcoming EMEA and US Executive Brainstorms, and for a new strategy report, we’ve been working on a set of five new ‘use cases’ that bring ‘two-sided’ telecoms business models to life. The ‘use cases’ describe the application in detail, and an outline business case for the opportunity.

There’s a presentation on the background here, and more on each one and what they mean for the industry below.

So what are the use cases?

  • Digital Advertising 2.0: Local Mobile Search
  • Broadband 2.0: Managed Mobile-to-Fixed data offload service
  • Digital Money 2.0: Introducing Mobile Banking to mature markets
  • Digital Home 2.0: Telcos’ role in Smart Grid
  • Voice and Messaging 2.0: SME productivity platform

1. Digital Advertising 2.0: Local Mobile Search Use Case

The first use case reflects our long standing interest in the possibilities and pitfalls of the vast resources of customer data the telecoms industry is sitting on. The possibilities in terms of social networking, advertising, marketing, and business intelligence are huge – though there are significant challenges too. For example, the first ever scientific study of public attitudes to targeted advertising shows that people appear to be highly sensitive to the use of behavioural data. This use case will explore how to tackle this combined problem and opportunity.

blackcab.png

It’s never like this when you really need a taxi

The Use Case focuses on providing immediate Local Mobile Search services that deliver relevant results to customers on the basis of their current location. The consumer proposition is a free local search SMS short-code. Sending a text containing search terms grants permission to enable the use of the Telco’s embedded location information. Responses are returned free to the user, paid for by the advertisers, and carrying paid results with greatest prominence (as per Google search). The aggregating agent for the advertising can be a partner directory service or search specialist. The Telco gains a share of the revenue, either from wholesale SMS charges or by a revenue share model.
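The flow described above can be sketched in a few lines. This is purely illustrative: `locate_subscriber` and `partner_search` are hypothetical stand-ins for the operator’s network location platform and the partner directory/search aggregator, and the result data is invented.

```python
def locate_subscriber(msisdn):
    # Stand-in for the operator's network location API; returns a
    # fixed (lat, lon) here purely for illustration.
    return (51.5074, -0.1278)

def partner_search(query, location):
    # The partner aggregator returns paid and organic results.
    return {
        "paid": ["Acme Taxis - 0.2 miles - call 0800..."],
        "organic": ["City Cabs - 0.4 miles"],
    }

def handle_search_sms(msisdn, text):
    """Handle an inbound SMS to the free search short code.

    Sending the text is taken as permission to use the sender's
    network location, per the use case's opt-in model.
    """
    location = locate_subscriber(msisdn)
    results = partner_search(text, location)
    # Paid results carry greatest prominence, as per paid search.
    lines = results["paid"] + results["organic"]
    return "\n".join(lines[:3])  # reply sent free to the user

print(handle_search_sms("+447700900123", "taxi"))
```

The commercial point sits outside the code: the advertiser pays the aggregator, and the telco takes its share via wholesale SMS charges or a revenue share.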

2. Broadband 2.0: Managed Mobile-to-Fixed data offload service

Our new Broadband 2.0 Use Case shows a new way to ease the increasing data traffic on mobile networks and the associated cost surge, and creates a new wholesale revenue opportunity for fixed broadband service providers (BSPs).

An expensive way to solve the mobile data capacity crunch is to install ever more cell sites. A potentially cheaper and easier way is to relieve the backhaul and core networks by shuffling bulk Internet traffic off to the fixed broadband network and out to the public Internet at the earliest possible stage.

We see two phases: ‘Offload 1.0’, where mobile operators use femtocells or WiFi over either their own sister company’s fixed broadband or a third party’s broadband, without optimisation; and ‘Offload 2.0’, where mobile operators deal with fixed-line BSPs’ wholesale arms (not just their own “captive” arms). Offload 1.0 is a good start, but Offload 2.0 is needed to create the network effects to offload sufficient volumes, improve operational effectiveness and “groom” the traffic in a variety of ways.
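A minimal sketch of the kind of per-flow egress policy an ‘Offload 2.0’ gateway might apply, assuming (our invention, not a real product interface) that traffic has already been classified: bulk Internet traffic breaks out to a wholesale fixed-line partner at the earliest point, while revenue-bearing operator services stay on the mobile core.

```python
# Traffic classes the operator monetises directly stay on-net.
OPERATOR_SERVICES = {"volte", "mms", "operator_tv"}

def choose_egress(flow_class, partner_available):
    """Pick an egress path for a classified traffic flow.

    'fixed_wholesale' is the Offload 2.0 case (any partner BSP's
    wholesale arm); 'captive_fixed' is the Offload 1.0 fallback
    (the operator's own sister-company broadband).
    """
    if flow_class in OPERATOR_SERVICES:
        return "mobile_core"      # keep on the mobile core network
    if partner_available:
        return "fixed_wholesale"  # earliest breakout via partner BSP
    return "captive_fixed"        # sister company's fixed broadband

print(choose_egress("bulk_internet", True))   # fixed_wholesale
print(choose_egress("volte", True))           # mobile_core
```

The class names and the three-way split are assumptions for illustration; the point is simply that the offload decision is a routing policy, not a network rebuild.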

3. Digital Money 2.0: Introducing Mobile Banking to mature markets

Nearly 200 million people live outside their countries of birth, 93% of whom migrated voluntarily or for economic reasons, driving over $300bn in international remittances per annum. Yet many in this segment are poorly served by banking services.

The Digital Money use case looks at the huge success of m-banking/money transfer services in emerging markets, and asks how operators could extend this business into the unbanked segments of developed markets - a “reverse leapfrogging” strategy. The Use Case:

  • Sizes the global market opportunity
  • Details a consumer proposition and core experience
  • Analyses major flows of international migration and the existing, highly costly, money transfer opportunities available to them
  • Analyses the costs, risks and mitigations, and overall business proposition of a mobile-based m-banking service targeted at migrants.

4. Digital Home 2.0: Telcos’ role in Smart Grid

Smart grid technology - adding rich controls to the electricity grid using modern telecoms - is emerging as a megatrend. Developed and fast-industrialising countries are pouring money into grid upgrades, driven by an increasing concern for energy efficiency and the climate.

Managing the grid better offers significant efficiency gains, and the possibility of integrating much greater percentages of variable renewable forms like wind or solar power into the mix. However, the electricity industry has been very keen to outsource its billing and measurement operations, leaving it short of the key assets it needs to build on. This may open up an opportunity for the telcos to be more than simply providers of bulk SMS/USSD messaging. Through this Use Case we explore which technologies and business models are best used and what is the best balance for each operator, region and nation in terms of sophistication vs. costs.

uc-table.png

5. Voice and Messaging 2.0: SME productivity platform

Finally, we’ve long argued for the importance of better voice & messaging for SMEs and departments within larger enterprises. Communications enabled business processes (CEBP) can help firms:

  • Cut costs through increased automation and more streamlined processes to reduce demand variability and under-utilisation
  • Increase revenues through better customer understanding (e.g. in-store feedback) and reducing the number of no-shows

For central deployments in larger organisations, businesses rely on a combination of application providers and systems integrators to get fully customised solutions that are approved by both senior management and the IT department. However, the rise of software-as-a-service applications means that many smaller firms and departments can look at deployments which add value without affecting existing systems. By integrating these solutions with communications, they can add greater value still. We believe operators are in a strong position to bring this vision to fruition as a platform for a range of CEBP applications.

For more on these Use Cases, please join us at the forthcoming EMEA and US Executive Brainstorms, or email contact@telco2.net to pre-order the new Use Cases strategy report.

To share this article easily, please click:

Voice is dead, says Verizon CEO

Thanks to Dave Burstein, who runs the excellent DSL Prime newsletter, for capturing this:

Ivan Seidenberg, Verizon CEO, saying at the Goldman Sachs Investment Conference last month that “voice is dying” is a defining moment in telecom history.

He didn’t use those words, but his comments are clear “we have to pivot and make a shift from the voice business to the data business and eventually to the video business. … we must really position ourselves to be an extremely potent video-centric asset.”

“The issue there is perhaps it is like the dog chasing the bus a little bit. So what I need to do is get ourselves focused around the following idea, that video is going to be the core product in the fixed line business. … I shed myself of the burden of chasing the inflection point in access lines and say I don’t care about that anymore.”

Verizon remains one of the most profitable companies in the world, but the wireline business is heading downhill so fast JPMorgan writes “Action will likely be necessary to support the dividend beginning in 2012.” They won’t be able to support $5B/year in dividends without tapping wireless 45% owned by Vodafone. [Some analysts] think Verizon will buy a satellite TV company. Knocking out one of the four TV providers is unthinkable if the Obama team is serious about competition, but that’s not proven…

You can meet Dave, who has lots more great insider information, at the 8th Telco 2.0 Exec Brainstorm, on 9-10 December in Orlando.

To share this article easily, please click:

October 7, 2009

ESPN: Making ‘pay-for-content’ work - a telco opportunity

Internet video is a booming business in its own right, a key driver of broadband volumes and costs, and increasingly an important component of telcos’ and other broadband service providers’ (BSPs’) packaged broadband offerings (see our recent Strategy Report, “Online Video Market Study: The impact of video on broadband business models”). The Goliath US sports network, ESPN, has just entered the UK market, and we analyse here its history, strategy, and lessons for BSPs and other content aggregators both here in the UK and elsewhere.

Introduction

In the rush to find a working model for monetizing internet video, the most obvious solution is often overlooked – the payTV model. Since 1979, when the Entertainment and Sports Programming Network (ESPN) secured an exclusive deal with the USA colleges (NCAA) to screen their sporting contests, the model has proven resilient both to the advertiser-funded free model and to changing economic climates. It has also delivered steady profits and growth to all players in the value chain – rights holders (e.g. sports bodies), content aggregators (e.g. channels) and distributors (e.g. cable systems). In the payTV model the money flows from the consumer to the distributor to the rights holders via the aggregator.

In the internet world, we are starting to see hybrids of the payTV model emerge. In this note, we analyse two of the more adventurous services: ESPN360 (USA) and SkyPlayer (UK). We also put these services into the context of both incumbent video distributors (Comcast and BSkyB) and challengers (Verizon and BT). We focus on the elements that have historically driven success for aggregators and distributors, and place those elements into the more complex internet era.

kmm-espn.png

ESPN’s History

ESPN actually started life as an ad-funded network. It only started charging the cable distributors in 1984, once it had built an audience and had been purchased by ABC – a parent with deep pockets. The rate card seems meagre these days: US$0.25 per subscriber per month in the first year, rising to US$0.30 per subscriber in the second. However, it was enough for the legendary John Malone and his TCI cable system to refuse to pay and threaten to set up a rival network. The battleground has always been the same: the relative value of controlling the home versus controlling the content.

By the mid-1990s ESPN, now owned by Walt Disney, had lucrative contracts for Major League Baseball and the National Football League. It perennially asked for, and got, double-digit rate increases from the cable networks. When ESPN wanted carriage of a sister channel, ESPN2, for extreme sports, it got it gratis. This was at a time when new and unproven channels such as FoxNews were actually paying the cable networks for carriage. As Malone said at the time, “Little Mickey had us by the throat”. Of course, the cable companies passed on the charges to homeowners and took most of the flak. Again, very little has changed over the years, with aggregators leveraging popular content to expand into new areas.

ESPN now offers the distributors much more than “live sports” channels. Two key channels are ESPN Classic, which shows replays of historical matches, and ESPN News, which provides highlights, commentary and analysis of past and upcoming events. ESPN offers its core audience a menu served up 24/7. As of September 2008, ESPN had 93.7m subscribers to its main channel, 97.3m to ESPN2, 63.2m to ESPN Classic and 67.4m to ESPN News.

The main competitor in the USA is Fox Sports Net, which launched in 1996. Fox Sports takes a regional approach to broadcasting, tailoring output to local markets. For example, Fox shows local Chicago Bulls (NBA) and Cubs (MLB) matches in competition with the ESPN national games. However, Fox Sports has never achieved the scale of ESPN and caters to a niche audience. The lesson is that there is a first-mover advantage, and scale matters both in negotiating for exclusive content and in determining the share of the distribution pie.

ESPN on the Internet

ESPNet.SportsZone.com launched in 1995 and has since grown to become the #2 sports site in the USA (it is now ESPN.com), with 24m unique visitors in August 2009 (source: comScore), behind Yahoo! Sports.

The primary focus of ESPN.com is highlights, interviews, statistics and analysis. The site offers ad-funded video, but is relatively small in scale compared to YouTube and Hulu. In 2008, it served an average of 120 million videos per month, a 32 percent increase from 2007. The real ESPN innovation is the ESPN360 website which offers live streaming of broadcast events. In the USA, this is only available to “affiliated ISPs” – those which have signed wholesale carriage deals with ESPN. The major ISPs, such as Verizon and Comcast, have already signed up. ESPN has once again left the billing and customer care relationship with the distributor. 

ESPN360 effectively mirrors the original payTV strategy – people will indirectly pay for live streaming of exclusive sports events. There are a few subtle differences for the internet era: a remote viewer option which enables people to watch events from hotels and work; a free offer to the college networks; and a free offer to the military networks. Outside the USA, ESPN offers subscription and pay-as-you-go packages direct to the end consumer.
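The “affiliated ISP” gating described above amounts to checking the viewer’s source network against the address blocks of ISPs that have signed carriage deals. A hedged sketch follows; the ISP names and address ranges are invented, and this is only one plausible way such a check could work, not ESPN’s actual implementation.

```python
import ipaddress

# Hypothetical affiliated ISPs and their (invented) address blocks.
AFFILIATE_RANGES = {
    "ExampleCable": ipaddress.ip_network("198.51.100.0/24"),
    "ExampleTelco": ipaddress.ip_network("203.0.113.0/24"),
}

def affiliated_isp(source_ip):
    """Return the affiliated ISP serving this viewer, or None.

    Affiliated viewers can stream; billing and customer care stay
    with the ISP, mirroring the cable carriage model.
    """
    addr = ipaddress.ip_address(source_ip)
    for isp, net in AFFILIATE_RANGES.items():
        if addr in net:
            return isp
    return None  # unaffiliated viewers get an "ask your ISP" page

print(affiliated_isp("198.51.100.7"))  # ExampleCable
print(affiliated_isp("192.0.2.1"))     # None
```

The commercial logic is the interesting part: the access check recreates the distributor relationship of payTV, so the ISP, not the viewer, is the paying customer.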

ESPN do not publicly disclose the rate card for ESPN360 affiliates, but no doubt it favours the distributors offering a traditional broadcast payTV service. Single-play, broadband-pipe-only providers are likely to suffer, as they don’t have the negotiating power of the likes of Comcast. For telephony incumbents such as Verizon and AT&T, the case for offering a TV service becomes ever more compelling. Similarly, distributors offering payTV via satellite are likely to suffer. ESPN360 effectively adds the option of watching events on a PC at home or on the go, as well as on the TV.

The rights-holder side of the equation is similarly shifted towards incumbent channels and away from new entrants. ESPN leverages its investment not only in acquisition but also in production, effectively pushing the same product through multiple means of distribution. Its marginal costs are limited compared to a new entrant’s, and a new entrant streaming internet-only content faces a limited future. In effect, the rights holders will end up licensing live broadcast rights as a bundle regardless of transmission medium (broadcast, internet or mobile).

The UK landscape

The payTV market in the UK followed a different evolutionary path to the USA’s. When the cable networks started to be built in the mid-to-late 1980s, the industry was very fragmented; the original networks spent the majority of their capex on infrastructure and, rather than investing in content, bought in most of their TV programming from the USA. Also, telephony capabilities were built into the networks from day one, so the operators had BT’s lucrative monopoly on home telephony revenues to target.

In 1992, the nascent satellite industry started investing heavily in sports programming by securing the exclusive rights to the UK’s Football Premier League. BSkyB effectively played both the content aggregator and the distributor, balancing investment between programming and customer acquisition. BSkyB wholesaled its channels to cable companies in much the same way as ESPN did, and had similar periodic fights over the value chain. Playing a dual role in content aggregation and distribution was, and still is, nothing new: Warner Communications invested in the earliest USA cable franchises and began investing in channels such as Nickelodeon; distributors such as TCI and Comcast have also invested in channels; and Ted Turner’s CNN was initially financed from ownership of a local TV station.

The cable industry in the UK has consolidated and restructured over the years to leave just one remaining network, Virgin Media, serving 3.4m homes and covering around 50% of UK homes. Unlike the USA, satellite penetration is much higher than cable penetration with BSkyB serving 9.4m homes in the UK & Ireland. BT, a relative latecomer to the TV market compared to Verizon and AT&T, serves only 0.5m homes.

kmm-espn1.png

The broadband market has also evolved differently to the USA with BT being forced to open its access network to third parties by the regulator. This enabled BSkyB to launch a broadband (and telephony) service in 2006 which has grown to serve 2.2m homes. This compares to Virgin Media which serves 4m homes and BT which serves 4.8m. There are also other players such as TalkTalk (4.3m homes) and Orange (1m homes) which currently do not offer a significant TV offering. 

kmm-espn2.png

In 2009, ESPN acquired some UK Premier League football rights and launched a series of channels around this content, mirroring its USA model in the UK. The previous holder of the rights, Setanta, adopted a very different, end-to-end service model – hence carrying all the operating costs of subscriber management – and ultimately failed. The ESPN model is simpler, provides incentives for distributors and, in our opinion, has a much higher probability of success.

[To read the rest of the article, covering:-

  • BT’s own label TV service 
  • ESPN360 in the UK?
  • SkySports on the Internet
  • Other European precedents
  • Conclusion - the ESPN opportunity for Telcos
  • Lessons for Telcos and other Service Providers

Members of the Telco 2.0™ Executive Briefing Subscription Service please see the full article here. Non-Members, please see here for how to subscribe, or email contact@telco2.net or call +44 (0) 207 247 5003.]

To share this article easily, please click:

The Business Case for ‘Two-Sided’ Telecoms Business Models

Many industry corporate strategists buy into the two-sided telecoms business model and now want to prove the quantifiable benefits to drive action. Here’s an update on Telco 2.0’s current data and plans.

“Show me the money!”


The ‘Two-Sided’ Telecoms Market Model

The theory behind ‘two-sided’ telecoms business models is becoming increasingly well known and accepted (see the recent presentation by Vodafone’s CEO). The focus of the industry strategy and decision-making community is therefore turning to the development of the detailed business cases that will enable the transition from theory to practice.

Many of Telco 2.0’s recent client interactions have also focused on the development of the business case, so here we summarise what has been done to date, what is currently available, and what is coming next. This article covers:

  • Strategic Context: How IP and Maturing Core Markets are changing the game
  • Market Size: The Overall Market Opportunity - Methodology
  • The Detailed Model: by Sector and Geography
  • Adding Further Granularity: Use Cases and Case Studies
  • Tracking Global Developments: The Telco 2.0 Database
  • From Theory to Practice: The New Industry ‘How To’ Guide and Roadmap 

Strategic Context: How IP and maturing markets are changing the game

The initial Telco 2.0™ Market Study, “How do we make money in an IP-based world?”, articulates the changing market and technical landscape for the telecoms industry. It identifies the broad opportunity for telcos to evolve from purely vertically integrated service providers to a more open configuration and structure, allowing increasing levels of horizontal integration and creative use of internal assets.

diagram

The Pressures on the Core Voice and Broadband Business Models

The subsequent Broadband Business Models Report, “Beyond Bundling – winning the new $250Bn content distribution game”, takes this analysis further and identifies top-level ‘two-sided’ business models and technical architectures to deliver against this new opportunity. It also articulates the growing pressure on the core voice and broadband business models in maturing markets, as penetration and revenue growth slow while rising usage and competition intensify cost pressures.

Market Size: The Overall Market Opportunity - Methodology

The full theory and analytical background of the ‘Two-Sided’ Telecoms Market Opportunity is articulated in the seminal Telco 2.0 study, “The 2-Sided Telecoms Market Opportunity; Sizing the new $125Bn platform services Opportunity”.

As well as articulating the rationale, precedents and priority areas, the report uses a detailed analytical methodology to identify economic opportunities or “pain points” that Telcos could address across 117 market sectors against the 7 functions identified. The brief five slide presentation below shows the methodology and example analysis from the Report (NB you may wish to view this at “Full” screensize, ‘Esc’ returns you to this screen).


In summary, Telco 2.0 Analysts reviewed assumptions of what proportion of the opportunities are addressable using Telco assets applied via ‘two-sided’ business models across U.S. and European markets and used this to formulate the top-level analysis. Individual assumptions were benchmarked where possible against individual examples of the costs of industry transaction “pain”.
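The arithmetic behind the sizing methodology is simple to illustrate: for each (sector, function) cell of the 117 × 7 grid, an estimated “pain point” cost is multiplied by an assumed addressable fraction and the results are summed. The sketch below is a toy version with invented figures, not data from the Report.

```python
# Toy sizing grid: $bn of transaction "pain" per (sector, function) cell.
pain_points = {
    ("retail", "advertising"): 40.0,
    ("finance", "identity"):   25.0,
    ("logistics", "billing"):  10.0,
}

# Assumed fraction of each cell addressable via telco assets
# applied through 'two-sided' business models (benchmarked, in the
# real methodology, against examples of transaction "pain").
addressable = {
    ("retail", "advertising"): 0.10,
    ("finance", "identity"):   0.20,
    ("logistics", "billing"):  0.15,
}

opportunity = sum(pain_points[k] * addressable[k] for k in pain_points)
print(f"Illustrative opportunity: ${opportunity:.1f}bn")
# prints "Illustrative opportunity: $10.5bn"
```

Scaling this mechanical step across all 117 sectors and 7 functions, with per-cell assumptions, is what produces a headline figure of the $125Bn kind cited above.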

The Detailed Model: by Sector and Geography

The model (117 sectors, 7 applications) can also be applied to national or regional geographies. Some Telco 2.0 Workshop and Consulting clients have already worked with Telco 2.0 Analysts to create their own private and bespoke market analyses and projections.

Adding Further Granularity: Use Cases and Case Studies

Telco 2.0 ‘Use Cases’ are detailed, business-level descriptions of illustrative commercial models for new Telco 2.0 B2B platform services; they help bring the theory to life and provide the next level of detail.

The Telco 2.0 Use Case Project is developing practical examples of the implementation of new business models from three sources:

  • 5 x new ‘Use Cases’ - realistic examples of potential new Telco 2.0 business models
  • 5 x Case Studies from inside the Telecoms Industry
  • 5 x detailed Adjacent Market Case Studies embodying important multi-sided business model lessons.

Each of the ‘Use Cases’ will have a next level market estimate relating to the potential opportunity. These Use Cases and Case Studies will be documented in a new 100+ page Telco 2.0 Strategy Report “The ‘Two-Sided’ Business Model Case Directory” and presented and discussed at our forthcoming Telco 2.0 Executive Brainstorm Events.

Tracking Global Developments: The Telco 2.0™ Database

In 2009 the Telco 2.0 Initiative established a database for clients and partners that documents global developments in ‘two sided’ Business Model Innovation on a project-by-project basis. This will be updated continually and be an invaluable research source for strategists and innovators to track and compare projects and activities.

From Theory to Practice: The Industry ‘How To’ Guide and Roadmap

A new Strategy Report “Two-Sided Business Models: from Theory to Practice”, compiling the lessons and updates from the market, and drawing up the new Industry roadmap to the future Telco 2.0 world is planned for early 2010. This will include updates on market estimates and the very latest on business model innovation in practice, and options for corporate re-structure, infrastructure investment, and cross-industry collaboration.

For further information on any of these activities, please email contact@telco2.net or call +44 207 247 5003.

To share this article easily, please click:

October 6, 2009

Digital Advertising & Marketing 2.0 - new consumer engagement

The video below* sums up the challenges that the marketing industry faces. How can we create a more valuable connection with the increasingly ‘digital generation’?

At the upcoming Telco 2.0 ‘Executive Brainstorms’ (EMEA, 4-5 Nov, London; AMERICA, 9-10 Dec, Orlando) we’ll be debating this topic in detail with those at the cutting edge, looking in particular at how to turn mobile into a scalable platform for engagement marketing, and how to integrate it with online.


* Thanks to Jeff Swystun from DDB for this ref.

To share this article easily, please click:

October 5, 2009

Ring! Ring! Hot News, 5th October 2009

Telco 2.0 Top Stories

Coming Up: Telco 2.0 Executive Brainstorms, Europe, 4th-5th November, and Americas, 9th-10th December

In other news: Aircom predicts $1.8bn per carrier for LTE deployment; Tony Blair fails to secure Wataniya Palestine’s spectrum; US operators drain the spectrum tub and holler for more; trust busters warn them against disorderly conduct; Telenor/Alfa row finally ends; France Telecom No.2 quits over suicides; Amazon settles over Orwell-zapping; Sony will publish anyone’s book; MapReduce available as EC2 instances; 50,000 new EC2 instances a day - cloud turns black, thundery, grows rapidly; Lotus Notes in the cloud; RIM, Apple, Google work together to mobilise Flash Player; Nortel GSM on the block; Sierra Leone gets mobile money; more war stories from OpenBTS’s David Burgess; 2 billion iPhone downloads; Spotify offers offline; Grauniad for your iPhone; beta release for Google Wave; another 20 million iPhones post-exclusivity; more BBC web traffic comes from mobiles than PCs in Nigeria; Comcast CEO “in top five overpaid execs”

Crash of the über-merger: getting a deal that would satisfy both Bharti Airtel’s and MTN’s shareholders and the aspirations of their respective management teams was always going to be hard, and integrating the two companies even harder – but getting the politics right? Really hard. And that’s what’s happened: the South African government, which owns 21 per cent of MTN, has blocked the deal on the grounds that it isn’t keen on foreign ownership of MTN.

Whilst we’re on the subject of really big operators, Telefonica has pulled the trigger on its first LTE deployments and the vendor frenzy is beginning to churn, especially as it’s scattering contracts for the test networks across the whole vendor community. Interestingly, Telefonica is asking that the six vendors involved form three teams, each of which must include one Western and one Asian vendor – some sort of weird party game. So it’s a question of “perm any two” from Alcatel-Lucent, Ericsson, and NSN on one hand and NEC, Huawei, and ZTE on the other. The tests will take place in Spain, the Czech Republic, Germany, the UK, Argentina, and Brazil.

Just to keep the vendors keen, Telefonica is “not ruling out” letting Fujitsu or Motorola on board as well. And it works - although ALU and NEC have an alliance for their LTE businesses, it’s mate against mate on the Telefonica job.

No wonder: Aircom International estimates that a tier-one US operator would spend $1.78bn in the first year of LTE roll-out. So, for four of them, that makes a $7bn cheese for the vendors to get stuck into for the first year alone – if the operators don’t just content themselves with upgrading their HSPA networks.

In the US, the Knight Commission report is in, and it says roughly what we said about broadband and fibre deployment - in the end, you’ve got to bite the bullet and build an open network. Our analysis is here. The Oxford Business School, meanwhile, published a study sponsored by Cisco that found that only Japan could consider its broadband infrastructure entirely satisfactory and the UK was teetering on the edge of failure - more interestingly, there’s a close similarity with the groups we identified in that report.

In some places, just getting GSM deployed was enough of a struggle. You may recall that Wataniya, holders of a licence to deploy in Palestine, had called in Tony Blair (for it is he!) to persuade the Israeli authorities to release spectrum they needed in order to start service. We were, if I remember rightly, somewhat sceptical of the efficacy of the former prime minister’s intervention. It looks like it didn’t work, as the Israelis are still hanging on to two tiny but crucial slivers of 900 and 1800MHz. It’s not just their fault, however; the local Orange division and the incumbent are also accused of reneging on a promise to make room in the frequency allocation table.

The US operators’ fanclub is hammering on the FCC’s door like a junkie outside the chemist’s, for its part, demanding another 50MHz for instant delivery. So it’s perhaps fortunate that the Department of Justice’s anti-trust division is promising to take a firm line on further mergers & acquisitions among US telcos.

And one of the industry’s oldest disputes, which has been going on since 2004, is over, as Telenor and Alfa Group/Altimo agree to bury the hatchet rather than split the baby. A new company, Vimpelcom Ltd., will own Vimpelcom and Telenor will hold just shy of 36% of it, while Alfa gets 44%. They could have done this much earlier, and saved the industry press literally hundreds of mind-paralysing he said she said stories…

The anger over a string of suicides at France Telecom has had a result; the head of its operations in France has resigned, and Stephane Richard joins from the Ministry of the Economy to replace him. (You can see a rather unhelpful rant from CEO Didier Lombard back in January here, in which he accused provincial employees with civil service status of going fishing for mussels rather than working.)

Meanwhile, some of us may recall that “exaflood” that was meant to cause “Internet brownouts” by 2010 unless the RBOCs got everything they wanted. David Isenberg, who rejected it at the time, points out that 2010 is only weeks away and that the world has inexplicably failed to come to an end. The author of that particular scare story, however, is still around, arguing that the non-apocalypse means no-one needs net neutrality. Perhaps it’s best just to remember that the Discovery Institute, which keeps publishing this stuff, is better known for creationism.

Leaving aside the dinosaur-dodging angle, there is a very good point in Isenberg’s post - for mobile operators, the really scary prospect isn’t so much a wave of video traffic as a trickle of alternative voice traffic, because the applications that make them money are the low-bandwidth voice & messaging ones.
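The arithmetic behind that point is worth spelling out. Using some purely illustrative tariff assumptions (the prices below are round numbers for the sake of the sum, not any operator’s actual rates), revenue per megabyte carried differs by several orders of magnitude between messaging, voice, and flat-rate data:

```python
# Illustrative revenue-per-megabyte comparison. All tariffs here are
# assumptions chosen for round numbers, not real operator prices.

def revenue_per_mb(price_usd, megabytes):
    """Revenue earned per megabyte of traffic carried."""
    return price_usd / megabytes

# SMS: ~$0.10 for a 140-byte payload
sms = revenue_per_mb(0.10, 140 / 1_000_000)

# Circuit voice: ~$0.10/min at ~12.2 kbit/s (AMR codec) ≈ 0.09 MB/min
voice = revenue_per_mb(0.10, 12.2 * 60 / 8 / 1000)

# Flat-rate data: ~$30/month against, say, 3 GB actually used
data = revenue_per_mb(30.0, 3000)

print(f"SMS:   ${sms:,.0f} per MB")
print(f"Voice: ${voice:,.2f} per MB")
print(f"Data:  ${data:,.2f} per MB")
```

On those assumptions SMS earns hundreds of dollars per megabyte and voice around a dollar, against a cent or so for bulk data - which is why a little VoIP substitution frightens operators far more than a lot of video.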

Before the courts, Amazon gets sued over the infamous affair when they zapped every copy of George Orwell’s Nineteen Eighty-Four on every Amazon Kindle in the world. Orwell’s novel describes a totalitarian future where books are illegal and all forms of media are delivered through networked touch-sensitive screen devices, everyone is constantly monitored by CCTV, and the working class is distracted with cheap gin and pornography, while a huge sinister bureaucracy controls the official version of history through its vast archives; good job nothing like that could happen in real life, eh. They also zapped his Animal Farm, in which society is ruled by pigs…hold on…

Amazon is paying a student $150,000 for accidentally zapping his notes as well as the book, but the interesting bit is that they’ve also agreed to stop zapping - however, the case doesn’t set a precedent, so this is probably worth as much as a promise from Agent O’Brien. In less sinister e-book news, Sony has announced that anyone can now upload their work to their eReader platform and be paid royalties in the unlikely event anyone reads it - Amazon already offers something similar, but only in the US.

If you’re trying to control the future by editing the entire past to match the party line, you’re going to need better IT than Winston Smith’s colleagues at the Ministry of Truth had; Amazon can help you there too. Users of Amazon EC2 can now get instances of Hadoop, the open-source implementation of Google’s MapReduce data-crunching framework, and use an SQL-like query language to find all documents referring to recently unpersoned doubleplusungood thoughtcriminals and render them into correct newspeak.
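The MapReduce pattern Hadoop implements is simple enough to sketch in a few lines of Python - here a toy, single-process analogue of what an EC2 Hadoop cluster would distribute across machines, building an inverted index of which documents mention which terms (the document texts are, of course, invented for the example):

```python
from collections import defaultdict

# Toy single-process sketch of MapReduce: map each record to
# (key, value) pairs, group ("shuffle") by key, then reduce each group.

def map_phase(documents):
    """Map: emit a (term, document id) pair for every word occurrence."""
    for doc_id, text in documents.items():
        for word in text.lower().split():
            yield word, doc_id

def reduce_phase(pairs):
    """Shuffle + reduce: group document ids by term into an inverted index."""
    grouped = defaultdict(set)
    for word, doc_id in pairs:
        grouped[word].add(doc_id)
    return {word: sorted(ids) for word, ids in grouped.items()}

docs = {
    "record1": "Oceania has always been at war with Eastasia",
    "record2": "Oceania has always been at war with Eurasia",
}
index = reduce_phase(map_phase(docs))
print(index["oceania"])  # every record mentioning the term
```

Hadoop runs the same two phases, but with the map tasks, the shuffle, and the reduce tasks spread over many nodes and fed from a distributed filesystem; the SQL-like layer (Hive) compiles its queries down to exactly this kind of job.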

And don’t the kids just love it. They’re selling 50,000 EC2 server instances a day; amusingly, IBM reckons that as applications move into the cloud, the code bloats out just like it did on the desktop as more processor cycles became available. They should know - Lotus Notes is coming to the cloud as a Web application.

Enterprise crackberry heads can rejoice: Adobe’s Open Screen Project is bringing Flash content to the BlackBerry platform, so they can finally watch all that stuff nobody actually watches. RIM, Google, and others are cooperating with Adobe on a Flash Player for mobile devices in an effort to decrease the broken-website count - Apple, notably, is not on board.

And the sad and protracted death of Nortel goes on; the GSM business is up on the auction block, and so is the GSM-R operation. That’s GSM for Railways, in case you didn’t know - what could be more French than the combination of centralised switching and 300mph nuclear-powered trains?

Here’s something interesting; South Korea’s regulator wants to encourage a higher percentage of users on flat-rate mobile data tariffs. Clearly, they find the applications/content/services layer much more interesting than the operator economy.

At the other end of the bandwidth scale, the next mobile payments launch is here, in Sierra Leone, and the next instalment of David Burgess’s series on running an open-source GSM network at Burning Man is here as well. Amusingly, one of his biggest problems was getting the provisioning/OSS-BSS side working - those grizzled old bellheads clearly didn’t ride up the Amazon on their bikes.

And Apple greets the 2 billionth download from the App Store. In other slightly overhyped news, Spotify is offering offline music for a premium - isn’t that just an MP3 download? - and The Guardian is an iPhone app, because what other device would its readers carry? Google Wave is out in beta, with 100,000 invited users. Meanwhile, Morgan Stanley analysts reckon the end of iPhone exclusivity will sell another 20 million of the things.

There’s an interesting interview here with Tom Bowden of the BBC’s commercial arm. 40% of the BBC’s Web traffic in Africa comes from mobile devices, and it’s actually a majority in Nigeria.

And finally, Comcast’s CEO makes it to the top five overpaid executives in the USA.


October 4, 2009

Leaked iPhone Commercial - the power of ‘customer data’

Below is a new ‘leaked’ iPhone commercial* that demonstrates the potential value of the subscriber data that runs through telco networks. You’ll notice that there are a few privacy issues, though…

At the upcoming Telco 2.0 ‘Executive Brainstorms’ (EMEA, 4-5 Nov, London; AMERICA, 9-10 Dec, Orlando) we’ll be debating in detail the topic of ‘Customer Data 2.0’, and sharing some quite amazing real-life examples of what you can do with customer data (within the boundaries of privacy) with those working at the bleeding edge.

