" /> Telco 2.0: January 2007 Archives


January 25, 2007

Voice & messaging survey: first impressions

We’ll be closing our Voice & Messaging survey early next week, so if you want a freebie copy of the summary results, you need to get going and complete it now. If you just do the mandatory questions it takes about 15 minutes.

We’ve had a few surprises. Either the Prozac’s been on special offer this month, or things are looking up. Overall, you’re quite positive about revenue growth in mature markets — but opinions are divided. We’ll be doing some “slice and dice” to find out who and why.

There’s a lot more appetite than we expected for operators to engage in product and feature innovation. We asked:

“In competing with Internet voice and messaging services with rich functionality (e.g. IM vs. SMS, Skype vs fixed line), rank each of the following tactics.”

We gave the following options:

  • FIGHT: Rapidly improve service capabilities to include presence and multimedia features, offer a softphone/IM service, expand interoperability efforts with other carriers, lower prices and/or offer large/unlimited tariffs.
  • EVADE: Build service around unique assets like home hubs and fixed-mobile converged products. Avoid high-priced flat-rate Internet access and sell value-based bundles of services with inclusive connectivity charges.
  • CO-OPETITION: Offer a limited partnership, and co-operate only where capabilities and services don’t overlap (e.g. access to pre-paid payments for premium Internet services). In-source selected products like mobile photo sharing to fill service portfolio holes. Revenue share search and advertising.
  • CO-OPERATION: Partner in sales and marketing, leave the advanced messaging, media and search/advertising services to the Internet partner, and focus on legacy voice/messaging services, billing and customer service.
  • RETREAT: Move to a pure pipe model of selling service. Use Internet brands as primary retail channel partners (“Google phone” etc.), and focus on underlying infrastructure and service delivery.

To which you’ve so far responded as follows:

Just under 50% of respondents so far selected “fight” as the best or 2nd best option. Given the overall lack of confidence you’ve expressed in the industry’s future based on current trends, maybe this is a message to CEOs and boards to switch from playing defence to offence? Perhaps the Apple/Cingular iPhone is a first stage of a new features and user experience war brewing in core voice and messaging services?

We then asked about where the revenue opportunities lie, giving among other options:

  • OPEN APIs: By opening up the voice, voicemail and messaging platform with APIs to enable 3rd party services and extensions, operators can generate enough new revenues from partners to significantly offset price competition in core voice and messaging services.
  • CALLING FEATURES: A large number of users are willing to pay for new advanced calling features (e.g. intelligent call routing based on time of day, calendar, recent activity with caller).
  • PRIVACY FEATURES: A large number of users are willing to pay for privacy features, such as multiple or disposable numbers or temporary identities.
  • REAL-TIME SERVICES: At least one new real-time service offered by operators (other than mobile IM) will significantly increase industry revenues by achieving mass-market consumer adoption (e.g. push-to-talk, push-to-view, voice messaging).

By far the strongest positive response was to opening up the voice platform and enabling integration with 3rd party services. Compare the desire for externally-driven innovation with that of internally-defined features and services:

When asked “How well do we as an industry understand what additional needs users have for voice and messaging products, over and above what they have today?” the overwhelming response is “not very well”. This reinforces the message that operators can compete directly against Internet giants — but to do this they must create a vibrant rival ecosystem as part of a Telco 2.0 open platform play.


January 22, 2007

Mobile NGN - a Real Telco 2.0 Opportunity?

I managed over the weekend to plough through the sixty pages of “Next-Generation Mobile Networks (NGMN): Beyond HSPA and EVDO”, the latest white paper from NGMN.org, an initiative by the CTOs of China Mobile, KPN Mobile, NTT DoCoMo, Orange, Sprint Nextel, T-Mobile International and Vodafone Group. It provides a technical requirements framework to vendors for the next iteration of mobile networks. (This feat did require the assistance of a large box of chocolate-coated gingerbread biscuits for moral support, though.)

[NB: We’ll be debating the white paper’s contents with one of its main authors, Hossein Moiin from T-Mobile, at the Telco 2.0 Industry Brainstorm in March - Ed.]

To be clear, what’s defined is just a technology toolkit. Different carriers may deploy it in different ways with varying business models and services. Until we see the business models, jubilation or damnation is premature. Nonetheless, this is an extremely important document. The “walled gardens” of 3G are starting to look like weed patches, and this is a rare chance to define a truly new Telco 2.0 approach that takes the best of the Internet and traditional telecoms models.

I’m quite pleased by the sense and clarity of the document. It avoids wild flights of fancy about sophisticated combinatorial services, and focuses on practical implementation concerns of mobile broadband. It rightly sees the mobile ecosystem as a co-evolution of devices, access and services. This offers a valid and viable parallel/alternative path to the fragmented and sometimes chaotic Internet approach. It’s clear about what generic classes of service are to be offered, and what tradeoffs are likely to be acceptable. The document also outlines a very much evolutionary approach: business-as-usual, only faster and cheaper.

And therein lie the big questions:
* Does it go far enough in addressing the forces tugging apart network access, services and devices?
* Does it react to the counter-forces that would push them back together in order to address deep architectural issues of IP and the Internet (such as weak security and low efficiency)?

Our answer based on our reading is “maybe, if deployed right” — but you need to be a bit of a Kremlinologist to read between the lines and think about what’s left unsaid.

We’ll start with the easy bit: things in the document that make sense about Making Money in an IP world. Then we can delve into the more philosophical and practical limits of that IP world and how a next-generation architecture might address them.

Plenty to praise

There are many positive improvements proposed. Some highlights might include:

  • Self-configuring networks that cost less to run.
  • Improved scheduling algorithms that focus on user “quality of experience” at the periphery of a cell site, rather than RFP-friendly numbers for maximum burst throughput standing under the cell tower at 3am on Christmas morning.
  • Flexible and modular service-oriented architecture to accommodate future change.

Put simply, whatever NGMN turns out to be, operators want OSS and BSS thought through in advance, and for vendors to take responsibility for the operator and user experience post-installation. So far, so good.

Aligned with several Telco 2.0 trends

There are also some features which come with the “Telco 2.0 Approved” stamp because of their reflection of the business trends we see:

  • The ability to share equipment and do more slice-and-dice of the infrastructure similar to MVNOs, but better. We see infrastructure sharing and new modes of financing/ownership as a key Telco 2.0 trend (as we will discuss at our forthcoming Digital Town event workstream).
  • Stronger device and end-to-end security to enable transactions of money or sensitive data. As telcos are already diversifying into the payments and identity business, this can only grow — and depends on such enabling infrastructure. DoCoMo are part of the consortium, and given their trailblazing in payments services, we’re hopeful of seeing operators elsewhere repeat their diversification successes based on these learnings.
  • Detection and mitigation of network traffic resulting from malware or attack. This we feel will be a growth area as the services become less controlled. A limitation of the “intelligence at the edge” concept is the ability of those edges to collaborate to detect and eliminate abuse. The experience of email spam and phishing tells us that not all is wonderful in Internetland.

Moving on, there are several things conspicuous by their absence.

The Internet elephant in the corner

Apart from some in-passing references in a few tables and diagrams, the word “Internet” is absent from the document. It’s a bit like Skype, YouTube and BitTorrent never happened. In fact, you can only conclude this absence is deliberate.

It could very well be that the technology defined can be deployed in very different manners, and operators may take radically different approaches — such as the contrast between 3 and T-Mobile in the UK embracing open Internet access, O2 trying to keep people on-portal, and Vodafone outright banning many popular Internet services such as IM, VoIP and streaming. Will operators want to continue to ride the “Telco 1.0” command-and-control horse, or switch to a more open “Telco 2.0” Internet-centric approach? Will the point of a future mobile network be to channel bits back at all costs to a cell tower, where they can contend for expensive backhaul to be deep-packet-inspected, metered and accounted for? Or will it complement the other infrastructure that exists?

The IMS mouse in the cupboard

Equally conspicuous by its general absence is reference to IMS. My take is that there could be a polarisation here between “service-centric” operators trying to define interoperable new services and compete against Internet players; and “connectivity-centric” operators who create “smart dumb pipes” and enabling platforms for a wide ecosystem of players. You could deploy NGMN and completely ignore IMS if you chose to do so.

Local connectivity, globally interoperable

At the other extreme of connectivity, another thing not given much ink is the explosion of highly local connectivity. For example, we’ve just passed the billionth Bluetooth-enabled device. Motorola’s Chief Software Architect, John Waclawsky, described this at the last Telco 2.0 event in October in his presentation “From POTS [telephony] to PANS [Personal Area Networks]”. The mobile network itself can still play a part in this, such as offering directories of resources. If you’re sat in Starbucks today and want to print out a document, you’re out of luck — the network can’t help you locate or pay for such services.

Given that this is an integrated vision of handset, network and service evolution, we think it may be gold-plating the long-haul connectivity vision while underspecifying the local connectivity one. The business model will also need to evolve, since there may be no billable event. It has to anyway: products like Truphone will make it ever easier for users to bypass or arbitrage network access.

What’s the commercial vision?

Naturally, the operators can’t write down a collective commercial vision (because of anti-trust), nor an individual one (due to commercial confidentiality). So you have to impute the commercial vision from the technology roadmap.

The stated requirement is for a network that’s low-latency, efficient, high-throughput, more symmetrical, good at unicast, multicast and broadcast, cheap, and interoperates seamlessly with everything that went before it. It’s a bit like low-calorie cream-topped chocolate cake. Sounds like a good idea, until you try making one.

The inevitable billion-dollar question is: what are the services and the business model that will pay for all this? The experience from 3G was that “faster” isn’t itself a user benefit of significance (particularly when it doesn’t work indoors!). In fact, given that battery technology follows a curve well below that of Moore’s Law (or its transmission equivalent), there’s the “oven mitt” problem of early 3G handsets still lurking: how to create hand-held devices that are physically capable of sourcing and sinking data at such speeds and over such distances (and high power) — and that create services users care about in the process.

Or, to put it another way, why sync my iPod over the air slowly when I can plug this USB cable into my laptop and do it at 400Mbps for free?

What is a mobile network for, exactly?

There’s a significant difference of expert opinion here that’s worth noting. There isn’t universal agreement on what wireless networks are best used for compared to wireline. For example, Peter Cochrane, the former CTO and head of research at BT, has long been keen on forgetting DSL and copper and going all-wireless. NGMN’s ambitions to match and exceed the technical and cost capabilities of DSL suggest a commercial vision of competing against fixed access.

Our take is that success is most likely to come from intelligently blending the best of fixed, mobile and media-based delivery of data, rather than an absolutist approach to any one of these. Furthermore, the unsolved user problems are more to do with identity, provisioning, security and “seamlessness” than speed or even price. Finally, users don’t generally see the up-front value in metered or fixed buckets of IP connectivity, particularly given the anxiety it causes over cost or overage. True unlimited use isn’t technically possible, so the network has to allow connectivity to be bundled into the sale of specific device or application types, where traffic is more predictable.

Stop looking for the platinum bit

The hypothesis seems to be that some bits will be blessed with “end-to-end QoS” and continue to gather super-premium pricing (by many orders of magnitude). The need for this QoS capability is repeatedly stated, even as network capacity, latency and cost improve to near-wireline levels. I think you can spot the problem. I’ve made a successful Skype call to someone 35,000 feet up on a 747 somewhere over central Asia, and there wasn’t any QoS involved.

Our post yesterday on Paris Metro Pricing attempts to challenge some of the assumptions that drive this requirement. It sounds esoteric to those from the commercial side of the business, but ignoring this small technical detail is telecom’s equivalent of the frozen O-ring. Set the price high, and at some point all the valuable bits flow around the “premium pipe” and not through it, and the commercial model fails.

NGMN could be part of the solution here, not the problem. If operators can switch to a congestion-based mode of pricing, rather than pure capacity, they could offer users a far better deal.

What are the real sources of value?

Here are some examples of requirements in the document, and how NGMN provides opportunities for product and business innovation:

  • Making user data more seamlessly accessible, blurring the line between online and offline. The specification includes standardised APIs (i.e. not operator or handset-specific) to sync online and offline data like address books, so the user doesn’t have to care so much about network connection state. This whole process could be taken much further to cover all content. This lecture video by Van Jacobson, former Chief Scientist at Cisco, points to a very different future network architecture based around diffusion of data rather than today’s packet-only networks where you have to know where every piece of data is located to find it. (Hat tip: Gordon Cook.) This isn’t a theoretical concern: wireless networks readily become congested. Maybe it’s time to reward your neighbours for delivering you the content, rather than backhauling everything across the globe. The Internet’s address space is flat, but its cost structure is not.
  • Deeper coverage, richer business models. The document talks about hub terminals (e.g. femtocells). Deep in-building and local coverage is a clear user desire. The first step is outlined, but there’s no corresponding economic model included. Companies like FON and Iliad are doing innovative things with user-premises equipment and roaming. We hope NGMN doesn’t repeat the experience of Wi-Fi, where hooks for payment weren’t included (causing a mess of splash screens) and the social aspects were neglected (am I sharing this access point deliberately?). The existence of bottom-up network deployment is an interesting possibility. You need to create new security and payment mechanisms so that local entrepreneurs can extend networks based on local knowledge and marketing. Top-down is becoming top-heavy.
  • Support for a diverse array of charging models. It’s in there, but could get lost in the deep-packet-inspection swamps. The genius of telephony and SMS is to sell connectivity bundled with service in little incremental slices. We’d like to see richer, better and simpler ways for device makers and service providers to bundle in connectivity. (See our earlier article on this for more details, and the illustrative sketch after this list.) For example, the manifest of a downloaded application could say that Acme Corp. is going to pay for the resulting traffic — and the secure handset will ensure it’s not abused to tunnel unrelated data at Acme’s expense. NGMN could enable this.
  • Uplinks vs. downlinks. Users create as much content as they consume. Devices are equipped with multi-megapixel cameras and video capture, which will be uploaded for online storage and sharing. That media is then often down-sized for viewing (if it is ever viewed at all). Yet the standards continue to emphasise downlink performance. We’ll acknowledge that from a technology perspective uplink engineering is like trying to fire bullets back into the gun barrel from a distance. Somehow this issue needs to be looked at. NGMN takes us closer, at least.
  • Peer-to-peer. A great requirement in the specification is “better support for ‘always on’ devices, with improved battery performance and network resource usage”. We’d second that. But given this requirement, where’s the peer-to-peer specification of the services those devices should host? Or do operators still believe that the purpose of the network remains distribution of professionally authored media entertainment from “central them” to “edge us”?
  • Building an identity-centric business. Another good requirement is for more advanced modes of device authentication, such as sharing a SIM among multiple devices. In some ways it defines an “identity network” that is independent of the NGMN, and potentially fixes some serious problems with the Internet. Mobile networks may happen to use those identities, but they’re equals with other uses. We’d encourage more creative thinking in this area.
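To make the sender-pays manifest idea above concrete, here is a minimal sketch of what such a declaration and its checks might look like. Everything in it (the manifest fields, the shared key, the is_sponsored helper and “Acme Corp.” itself) is a hypothetical illustration of the concept, not anything specified by NGMN.

```python
# Hypothetical "sponsored traffic" manifest shipped with a downloadable application.
# Nothing here is defined by NGMN; it simply illustrates the idea that a sponsor
# (here "Acme Corp.") declares which traffic it will pay for, signed so the
# handset and network can check the declaration hasn't been tampered with.
import hashlib
import hmac

SPONSOR_KEY = b"shared-secret-provisioned-by-operator"  # placeholder credential

manifest = {
    "application": "acme-photo-uploader",
    "sponsor": "Acme Corp.",
    "allowed_hosts": ["uploads.acme.example"],   # only this traffic is sponsored
    "max_bytes_per_day": 50 * 1024 * 1024,       # cap to stop tunnelling abuse
}

def sign(manifest: dict, key: bytes) -> str:
    """Sign the manifest fields so the network can trust the sponsorship claim."""
    payload = "|".join(f"{k}={manifest[k]}" for k in sorted(manifest))
    return hmac.new(key, payload.encode(), hashlib.sha256).hexdigest()

def is_sponsored(manifest: dict, signature: str, host: str, bytes_today: int) -> bool:
    """Decide whether a given flow should be billed to the sponsor or the user."""
    if not hmac.compare_digest(sign(manifest, SPONSOR_KEY), signature):
        return False                               # tampered manifest: user pays
    if host not in manifest["allowed_hosts"]:
        return False                               # unrelated traffic: user pays
    return bytes_today <= manifest["max_bytes_per_day"]

signature = sign(manifest, SPONSOR_KEY)
print(is_sponsored(manifest, signature, "uploads.acme.example", 10_000_000))  # True
print(is_sponsored(manifest, signature, "torrent.example", 500))              # False
```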

Summary thoughts

Overall, it’s a good piece of work. Change doesn’t happen overnight, and given a 3-5 year time horizon, the world will not change beyond recognition. Nonetheless, without a parallel vision of business model evolution, much of the investment in NGMN could become just as stranded as that in 3G. With the right vision, it could make the “mobile Internet” really work, since the “real Internet” continues to be a polluted, expensive and frustrating experience for users.


Re-thinking QoS: Paris Metro Pricing

We’ve been reviewing some documentation from network operators recently. A recurring theme is the need for “end-to-end QoS (Quality of Service) guarantees”. We’re not convinced this is such a compelling business requirement any more. Business is about creating or capturing a scarce resource or capability and charging for access to it. We’d urge those involved in commercial functions at operators to question the QoS orthodoxy and what the network engineers are commissioning. What in your business is really creating value that users are willing to outbid each other to access?

Is the user willing to pay for more than best-effort delivery?

One way of looking at IMS is that it is a system for assigning finite network capacity to users based on the technical and economic needs of their applications. Since those applications have radically different throughput, jitter, latency and resilience needs, the claim is that only through active management by the network operator can they be made to co-exist peacefully. Thus for a decade and more the traditional telecom outlook has been that “quality will win out” over Internet competition. Meanwhile, Internet absolutists reject any possibility that anything other than best-effort non-discriminatory packet delivery is needed. The existence and popularity of Skype should give the former pause for thought; the latter should be concerned about the conspicuous success of services like SMS, which vertically integrate identity, payment, access and application service.

We’d like to point to a “third way” that offers a better solution than either. The particular example we will discuss is Paris Metro Pricing. It brings into question the fundamental objectives of IMS as an industry-wide platform beyond PSTN replacement.

Squeezing the user through the toll gate

Two of the essential parts of the value proposition of services delivered over any IMS architecture are reliability and quality. To differentiate themselves from pure Internet players and turn a profit, fixed and mobile operators perform two actions. Firstly they create circuits or channels with attached bandwidth guarantees. This creates value to end users which can be charged for. They then try to eliminate or discourage alternative “best effort” means of communication (to maximise profit). This can be done by technical or contractual means, subject to market power and regulatory constraints. For example, Verizon reserves most of the capacity on its residential fibre network for its own video service and forces its own restrictive equipment into the user’s internal home network; and Vodafone bans VoIP from 3G services through contract terms.

IMS involves the reincarnation of circuits as sessions to be managed by the service delivery platform. (IMS enables other value-add functions such as smart call routing, but that is a separate issue to QoS. In the absence of QoS issues, other architectures become competitive.)

What gets priority? And who decides?

The problem is a deep and fundamental one. The network operator isn’t necessarily best-placed to decide what is most important to the user (and thus deserves priority). Furthermore, an inflexible vertically-integrated architecture fails to adapt to different needs, and itself becomes an artificial bottleneck whose sole purpose is the creation of billable events.

The Guardian newspaper report on the Mumbai bombings is all-too-typical of what happens in extremis. Just when people are willing to communicate via any means available, and at almost any price, the network fails them.

Witnesses reported body parts littering the railway tracks. TV news channels broadcast footage of bystanders carrying victims in driving rain to ambulances and searching through the wreckage for survivors and bodies. Confusion and panic was compounded when the local mobile phone network collapsed.

The same happened in New York, Madrid and London. The design decision to give priority to voice traffic—due to the constraints of 1990s technology—fails to reflect user need. For every voice call that goes through, ten are turned away. No nines, not five nines.

“Intelligence at the edge” applies to people as well as machines

This idea that the edges of the network are the ones to call the shots isn’t new. Indeed, it’s the same observation that underlies the 1981 paper that introduced the end-to-end principle that forms the foundation stone of Internet design.

They state:

functions placed at low levels of a system may be redundant or of little value when compared with the cost of providing them at that low level. Examples discussed in the paper include … duplicate message suppression … and delivery acknowledgement.

… The function in question can completely and correctly be implemented only with the knowledge and help of the application standing at the end points of the communication system.

In other words, only the edge points have the context needed to implement these functions. (There are clearly some limits to the applicability of these ideas that we won’t cover here, such as security or broadcast content.) Protocols like TCP/IP were thus born, and best-effort internetworks created with no need for common agreement on flow control or handling dropped packets. (Contrast this with the reams of technological specifications and testing needed to peer two IMS networks.)

In the disaster scenario, users were unable to engage in a graceful service degradation from wideband audio, to barely comprehensible audio, to push-to-talk, through to IM and eventually to store-and-forward email. The services and their priority were set in stone. Furthermore, every user may have different preferences and needs, all of which change dynamically. Only the users knew what should be given preference, and how much they were willing to pay to get through.

The same issues apply in more mundane everyday situations, just less dramatically and quickly.

How to preserve the best of the Net - and cast off the worst?

Fortunately, we already have a technique to deal with distributed information and resource contention: the market. The issue becomes: how can we create “spot markets” in connectivity in a manner that doesn’t burden the user with decisions?

This area has been explored in depth in the past. One solution proposed and patented by former AT&T Labs researcher Andrew Odlyzko is Paris Metro Pricing (PMP). There are several other competing “less than best effort” proposals, each with its own merits. PMP deserves study because of its minimalism; it may not be the best form for a particular application or operator, but it gives us the best insight into what’s wrong with the IMS approach to QoS.

The idea is very simple indeed. Partition the connectivity into multiple “dumb pipes”, each with its own price per packet. You’ve not baked in any particular assumptions about the technical nature or user value of any particular application. Nobody needs to speak IMS dialects of SIP if they don’t want to. Performance degrades gracefully, as long as the applications place their traffic down the right virtual pipe.

The name comes from the former ticketing system on the Paris Metro, where first and second class carriages and service were identical, bar the price. The double cost of a first-class ticket created a self-regulating system of congestion control.
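To see why the price difference alone is enough to regulate congestion, here is a minimal toy simulation in Python, with invented numbers. It is only a sketch of the mechanism, not a reproduction of Odlyzko's model: each user repeatedly picks whichever of two identically sized pipes is cheapest for it once its own sensitivity to congestion is taken into account.

```python
# Toy Paris Metro Pricing simulation (illustrative only): two identical pipes,
# the premium one carrying a price surcharge. Each user repeatedly re-chooses
# the pipe that minimises its own cost (price plus the congestion it cares
# about), and the price gap alone keeps the premium pipe lightly loaded.
import random

random.seed(42)

SURCHARGE = 1.5          # extra price per packet on the premium pipe (invented)
CAPACITY = 100           # packets per pipe before congestion starts to bite
users = [random.uniform(0.0, 10.0) for _ in range(200)]   # delay sensitivity

choice = ["standard"] * len(users)          # everyone starts on the cheap pipe
load = {"standard": len(users), "premium": 0}

for _ in range(50):                         # sequential best-response rounds
    changed = False
    for i, sensitivity in enumerate(users):
        load[choice[i]] -= 1                # take this user's traffic off its pipe

        def cost(pipe: str) -> float:
            congestion = max(0, load[pipe] + 1 - CAPACITY) / CAPACITY
            price = SURCHARGE if pipe == "premium" else 0.0
            return price + sensitivity * congestion

        best = min(("standard", "premium"), key=cost)
        changed = changed or (best != choice[i])
        choice[i] = best
        load[best] += 1
    if not changed:
        break

# e.g. roughly {'standard': 125, 'premium': 75}: the premium pipe stays below
# capacity, and the most delay-sensitive traffic self-selects onto it.
print(load)
```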

The challenge to IMS

PMP is implementable today with off-the-shelf open standards such as diffserv. It potentially offers a completely different future and architecture than IMS. No longer do you have to provide a “customs declaration” to the network of the traffic’s content in a telco-specific dialect of SIP. Since the edge device selects the quality needed directly, the network doesn’t need to know what the traffic is to reserve the bandwidth. That means peer-to-peer architectures, at a fraction of the cost of expensive central switches, become very attractive.
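As a small illustration of “the edge device selects the quality”: with diffserv, an application can mark its own packets with a DSCP code point and leave the per-class treatment (and, under PMP, the per-class price) to the network. The DSCP values below are the standard ones; mapping each class to a differently priced pipe is our PMP reading, not something the diffserv RFCs define.

```python
# Minimal sketch: an edge application marking its own traffic with a DiffServ
# code point (DSCP), so the network can treat (and, under PMP, price) each
# class differently without inspecting the payload. Works on Linux/macOS.
import socket

# Standard DSCP values; the idea of a per-class price is ours, not the RFCs'.
PIPES = {
    "best_effort": 0x00,   # DSCP CS0  - cheapest pipe
    "assured":     0x0A,   # DSCP AF11 - mid-priced pipe
    "expedited":   0x2E,   # DSCP EF   - premium pipe
}

def open_marked_socket(pipe: str) -> socket.socket:
    """UDP socket whose packets carry the DSCP for the chosen (priced) pipe."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    tos = PIPES[pipe] << 2          # DSCP occupies the top 6 bits of the TOS byte
    s.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, tos)
    return s

# The application, not the operator, decides which pipe this traffic deserves.
voice = open_marked_socket("expedited")
voice.sendto(b"20ms audio frame", ("192.0.2.1", 5004))
```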

Peer-to-peer file transfer dominates Internet traffic today. Best-effort real-time services like VoIP work well most of the time. The main user concern is security. Given the alternative of PMP (or its many brethren), you have to seriously question how much value IMS’s QoS infrastructure is really adding.

Nothing in life is free, not even bandwidth

There are numerous ways in which the simplicity of PMP would have to be tempered with the reality of actual networks and customers. For example, in mobile networks there may be additional costs to re-authentication (after long idle periods) and channel set-up and tear-down. PMP doesn’t model these. The user interface to applications may have to change to enable users to dynamically change their preferences. “The network is congested and your call cannot proceed. Are you willing to pay 20¢ per minute to complete the call?” There are implementation and accounting costs.

Any form of QoS ultimately destroys network capacity and throughput, never creates it. The compensation, we hope, is that the lost and delayed traffic is worth less than the incremental gain to the highest priority traffic.

Above all, there are the human factors of marketing, customer expectation, and support to deal with.

Being a “pipe” operator is a good option for many operators, but they are afraid of the “dumb pipe” label and of losing the price discrimination ability that service provision gives you. Re-thinking QoS enables you to innovate in the pipe business model. You’re selling priority, not fixed capacity; abundance rather than expensively metered scarcity. Let each user decide whether the online backup or video stream is the more important. Just make them pay for the privilege of displacing other users’ traffic.

None of this is to say that the network operator isn’t actively involved in allocation of bandwidth. As the issuer of a home hub, set top box, media server or dual-mode handset you have considerable scope for integration of these items into a packaged and integrated bundle. It’s just that you need an economist to help you do it, not just a network engineer.


January 18, 2007

Digital Home - The Opportunity for Telcos

As you’ll have seen with ‘Digital Town’ here, we create detailed hypotheses as a brief for participants at the Telco 2.0 brainstorm. Next up is the Digital Home work stream on Day Two, which looks at the impacts of high speed broadband and wireless access, and the parallel evolution of other technologies, on the home environment. We’re delighted to have senior execs from BBC, Belgacom, BSkyB, FT-Orange, Slingbox, Telecom Italia, Telefonica, Telenor and others preparing special stimulus presentations for this. (Thanks again to Alan Patrick from Broadsight, who is working with us on this, for his input.)

So, below is what we think is the ‘Situation’, ‘Complication’, ‘Key Questions’, and ‘Way Forward’ for ‘Telcos in the Digital Home’.

The Situation today is interesting to say the least:

- Broadband takeup in recent years has been phenomenal in many countries, driving a shift in the way all media is consumed in the home
- The Digital home has a high penetration of separate devices - PCs, PDAs, TVs, Mobile, MP3s, CCTV, Games Machines and other intelligent home devices
- There have been a number of mixed media Mobile/VoIP home services launched
- The overall economics of the business world are increasing the amount of home working, and home workers, as HomeComms gets better
- Trends identified in the 1990s such as “Cocooning” ourselves in our homes show little sign of abating.
- With e-commerce, e-learning, e-health and e-government it’s easier now to live without leaving the house at all; you even socialise with your e-Friends on your favourite Social Networking site (case in point - my home shopping arrived this morning while I did my tax return online and researched and wrote this article).
- Researchers at MIT and other places have shown how immersive and ambient Home Networks may work, ushering in the new dawn of the Smart Home

However, there are some complications to this e-dream. Clearly a Telco wants to be a major player in the Digital Home, but there are major barriers.

- Can Broadband really deliver big bandwidth if everybody is using it - bandwidth limits, contention and xDSL speeds will all be pushed to the maximum
- The commercial models for a supplier of home web traffic are unclear - a Web TV service will use up an entire monthly allocation of ISP bandwidth in an hour in the UK, for example (see the rough calculation after this list).
- Mobile / IP mixed media services have not yet been resounding success stories - in fact, mobile 3G overall has not really lived up to its promises, leaving the field open for a variety of other wireless approaches - WiFi, WiMax and others - thus muddying the aether considerably.
- The home is a “Digital Mess” today - everyone wants to be the Home Hub, but no one wants to be the Tragedy of the Commons, so there is little commercial interest in making different devices interwork easily. (Some people are amazed that we can connect a PC to a TV.) This possibly creates a barrier to mass adoption.
- Telcos still, by and large, don’t “get” customer centric multi-service provisioning - their structures are built around discrete stovepipe product provisioning, so many of them fail to delight customers in a multi service world.
- Consumers are limited by choice and budget - in the “Attention Economy”, does a Digital Home actually help people free up time or money?
- With respect to Home Working, Cocooning and the ease of e-Hermiting overall, it is becoming clearer that we are social animals and for our (and our societies’) health we should get out more… will counter-trends and possibly regulation militate against the Digital Home?
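As a rough sanity check on the Web TV point above, here is the back-of-the-envelope arithmetic. Both the stream bitrate and the monthly cap are assumptions, typical 2007-era figures rather than any specific ISP's numbers.

```python
# Back-of-the-envelope check: how long a streaming TV service takes to exhaust
# a typical 2007-era UK broadband usage cap. Both figures are assumptions.
stream_rate_mbit_s = 4.0          # assumed video stream bitrate, Mbit/s
monthly_cap_gb = 2.0              # assumed "fair use" cap, GB per month

gb_per_hour = stream_rate_mbit_s * 3600 / 8 / 1024   # Mbit/s -> GB per hour
hours_to_cap = monthly_cap_gb / gb_per_hour

print(f"{gb_per_hour:.2f} GB per hour of viewing")      # ~1.76 GB/hour
print(f"cap exhausted after {hours_to_cap:.1f} hours")   # ~1.1 hours
```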

Key Questions

For a Telco and the partners in its ecosystem, there are four fundamental questions: will “Digital Homes” ever be a mass market; if so, what will they look like; what will be the Telco’s role in the Digital Home (and why so); and finally, what does the Telco have to do to serve its customers in this new environment? The Telco 2.0 Digital Home session will address these fundamental questions:

- What types of Digital Homes will emerge - what are the drivers and trends for, and against, them emerging en masse?
- What will they look like - what will people do in them, how, and by when?
- What role does a Telco play - pipe provider, service provider, e-Commerce enabler - and where is the benefit likely to be?
- Similarly, what roles will its partners play in this environment - and are there new partners needed, old ones to be dropped?
- What do Telcos and their partners have to start doing today to serve their customers - what options to take, services to start, structural changes to make?

Towards a Solution

Our Hypothesis, which we will be debating, is that:

- The Digital Home will arise, but it will be a patchwork quilt depending on location, demographics, lifestyles and conflicting needs - flexibility will be key
- Technology adoption will be a patchwork quilt but in the main will be driven by people’s desires for Communication, Entertainment and an easy life - they will live with unsolvable problems rather than solutions that are too hard to use.
- The role of the Telco is to use all its assets across the value chain to make it as convenient as possible for its customers, and their suppliers, to use its capabilities
- To make this a success, Telcos and their partners need to become more customer-centric across the whole customer lifecycle. They will all have to expose more of their end to end capabilities to customers and potentially each other - and other service providers - now.

We are seeing the space polarise into “experience providers” and “experience enablers”. Our hypothesis is that very few telcos will be experience providers. It’s up to them how much of the enablement they want to participate in — there’s plenty of opportunity, and correspondingly little action today. Come and debate the issues at the Telco 2.0 event in March.


January 17, 2007

Advertising Opportunities: Do Telcos Understand the Value of their Assets?

Knock Me Down with a Feather…Another Telco 2.0™ Survey?
In the last of our Autumn and Winter series of surveys, we have now opened the survey on Telcos and Advertising. Why 3 surveys? Well, they add real insight to the 3 Telco 2.0™ Strategy Reports we are publishing: Telco 2.0 Market Study, Strategies for Growth in an IP based World (updated January 2007); Consumer Voice and Messaging 2.0: Telephony and Messaging meet Skype and Yahoo!; Telcos and Advertising: How to Generate Value. And anyone who completes the survey gets a FREE set of summary results.

For the Advertising survey, we have been very lucky that the GSM Association have sent the link to mobile entertainment and advertising executives in their database and OgilvyOne have circulated the link internally to 400 people in their Telco Taskforce. We should, therefore, get some terrific input from the advertising community (see below for some early indications).

Telcos and Advertising Industry Workshop: 30th March, London
We will be presenting the results from the survey and our analysis at the workshop on how telcos can generate value from advertising on Day 3 of the Telco 2.0™ Event. There will be a number of senior stimulus presenters from the Telco and Advertising communities and we will be exploring what the attending influencers and decision-makers need to do to realise the opportunity.

The event is invitation-only and invitations will be going out in a couple of weeks.

If you would like to join us, or feel there are people who should be part of the debate, please email me.

The Advertising Landscape
In our survey, we have been asking respondents to consider advertising opportunities for Telcos in fixed and mobile. Essentially, we see these falling into 4 quadrants:

  • Telcos funding or subsidising their service portfolio (voice, messaging, content) to their own customer base with advertising (quadrants 1 and 2 below).
  • Telcos leveraging their assets to create enabling services that improve the overall efficiency and effectiveness of digital advertising channels (quadrants 3 and 4 below).

The survey (and the report) explores opportunities and challenges in each of the 4 quadrants.
[Figure: Telco Advertising Market quadrants]

Do Telcos Underestimate their Advertising Assets?
The survey has recently launched and so far around 65 Telco 2.0 cognoscenti have completed our beautifully crafted efforts. I pulled off some results when we reached 50 respondents yesterday. Interestingly, we have almost a 50:50 split between the Telco and Advertising communities, which makes comparing viewpoints on similar questions very interesting.

For example, in the survey we ask respondents to rate assets and skills of Telcos that could be valuable to Advertisers from ‘Very High Value’ through to ‘Little/No Value’. The following assets and skills are evaluated as potentially valuable to Advertisers:

  1. A big customer Base - Reach
  2. A broad customer base - Demographic Mix
  3. A record of customer usage & preferences - for campaign planning & targeting
  4. Real-time information on customer activity - location, calling, internet usage - for delivering contextually relevant ads
  5. Other customer data - name, address, credit history etc.
  6. A fixed portal presence
  7. IPTV/Video-on-Demand products
  8. Voice and Messaging (SMS, MMS, Voicemail) products
  9. A mobile portal
  10. Mobile device menu/idle screen
  11. High street retail presence (shops)
  12. Customer billing relationship
  13. Customer service - call centre/in-store
  14. A strong and trusted brand

Below is a chart of average results for Telco respondents and Advertising respondents:
[Figure: Value of Telco Assets - average ratings from Telco and Advertising respondents]

A couple of things strike me about these results:

  • Although the results between the 2 groups are closely correlated (r² = 0.86 for the mathematical amongst you - see the short calculation after this list), overall Telco management perceive telco assets and skills to be substantially less valuable to Advertisers than Advertisers themselves do
  • In particular, Telco respondents were substantially less positive than Advertisers about the value of:
    • The Telco Brand. I know several telcos have been getting a bit of bad press about customer service (e.g. Talk Talk) and results (e.g. Deutsche Telekom) but it seems that Advertisers feel that the Telcos have a strong and trusted brand with which they would like to be associated.
    • Customer Data. Ha! We have been banging on about this for a while and I’m delighted to see that Advertisers agree - Telcos underestimate the value of the customer addresses/demographics/credit history etc. which they hold.
    • Voice Products. Seems like Advertisers would like to be associated with Telco voice products (or to use them specifically as a delivery channel). As we have already said, this represents an opportunity but also a cannibalisation risk to existing Telco subscription revenues.
    • The Mobile Portal. Advertisers obviously feel that the success of internet portal advertising is replicable on the mobile portal; Telco respondents are less confident.
    • IPTV and Video-on-Demand. Many respondents from the Advertising community have already commented on the near-term opportunities for growth in IPTV and Mobile TV advertising because it combines a well-understood format (by both advertisers and consumers) with the potential to offer greater targeting and personalisation.
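For the mathematically curious, the r² quoted above is simply the squared Pearson correlation between the two groups’ average ratings across the fourteen assets. A quick sketch of the calculation follows; the ratings are invented placeholders, since the actual survey data isn’t reproduced here.

```python
# How the r-squared quoted above is computed: square the Pearson correlation
# between the Telco and Advertiser average ratings across the 14 assets.
# The ratings below are invented placeholders, not the actual survey results,
# so they will give a different value from the article's real figure of 0.86.
import numpy as np

telco_avg      = np.array([4.1, 3.8, 3.9, 3.5, 3.0, 2.6, 3.2, 3.1, 2.8, 2.7, 2.5, 3.6, 3.0, 3.3])
advertiser_avg = np.array([4.3, 4.0, 4.2, 3.9, 3.6, 2.8, 3.8, 3.7, 3.4, 2.9, 2.6, 3.7, 3.2, 4.1])

r = np.corrcoef(telco_avg, advertiser_avg)[0, 1]
print(f"r^2 = {r * r:.2f}")
```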

It may not be a knight in shining armour, but for operators that want to provide premium content and develop a future revenue source, advertising remains an attractive (if unproven) option. It could enable operators to increase the volume of content AND reduce pricing, and so avoid the criticism that has been levelled at Virgin Mobile in the UK with its TV offering.

The survey is jam-packed with questions about Advertiser needs and Telco attitudes to Advertising and challenges respondents to think through WHAT NEEDS TO BE DONE TO MOVE FORWARD. I’ll share more snippets over the next week or two but, in the meantime, here’s the survey link again - get cracking cos it closes midnight on Friday 26th January!!


Unbundling the telco

The telecoms industry is torn between two forces:

  • the desire to maintain control over every stage of production: retail, services, network delivery and user equipment.
  • the reality of the Internet and open standards/open source forcing these things apart.

We’re finding it interesting to compare and contrast some of our own analysis with that of leading thinkers in the field. John Hagel is one of the “must read” writers on the matter, and he recently re-capped his thoughts on unbundling of the enterprise. You can go read the original, but the heavily edited summary is:

“I believe that most companies are an unnatural bundle of three very different types of businesses: Infrastructure management businesses, Product innovation and commercialization businesses, Customer relationship businesses. These three business types have very different economics, skill sets and even cultures, yet they are tightly integrated into most companies today.”

So, what business are you in, again?

What we’re really debating is where the “cleave” points will be, should telcos start to break themselves up and/or re-structure and re-focus. So far, operators have mostly been trading vertically-integrated parts of themselves (Sprint and Verizon’s local networks to Embarq and FairPoint Communications respectively, or BT’s spin-off of its mobile arm, O2, only to be picked up by Telefonica). They’ve also been spinning off ancillary businesses like yellow pages directories.

Our take has been that there are four generic strategies for the unbundled Telco 2.0 world:

  • digital lifestyle brand (which very few telcos are in a position to execute on, and as most have indifferent brands, we don’t tend to dwell on it),
  • protection/product (extending and expanding the current product set),
  • platform (relinquishing some of the customer relationship to focus on the nuts and bolts of delivery of 3rd party devices and services)
  • pipe (which can be very attractive if done right, as there’s plenty of mileage left in creating new ways of tying flows of money and bits).

You could maybe add a fifth which is “diversify”, which we’ll come back to later.

Too much duplicate, competing infrastructure

John’s vision of telecom is already partly true, in that much of the cellular infrastructure is managed by specialists like Towerstream and Crown Castle. Ericsson has done conspicuously well by moving into network management services, and taking that function off the telco. Many telcos have outsourced their IT. Indeed, he states this himself:

“The first wave of outsourcing can be understood as the systematic carving out of the infrastructure management businesses from companies…”

He goes on to predict:

“…we’re just on the cusp of a second wave that will unbundle product innovation and commercialization businesses from customer relationship businesses.”

Show me the product

If Apple’s iPhone is significant for one thing, it is that their Visual Voicemail capability suggests the end of telcos and NEPs as the primary drivers of core product features. Apple’s product didn’t come from an interminable sequence of standards committee meetings in Geneva and Hawaii. Nor did Skype. The complete integration of entertainment, information and communications functionality is more likely to be executed by a Nokia, Sony or Apple in partnership with services companies like Yahoo!.

Our still-open Voice & Messaging survey is also yielding some interesting results. By far the most popular response to the question “Where are there opportunities to raise additional revenue from voice and messaging services?” is to open up APIs into the OSS and BSS systems. Amazon would be the classic example of such a business, with own-brand retail complemented with delivery of multiple 3rd party retail sites using white-label web services and logistics.

New faces, old business models

John also uses The Gap as an example of how you can’t climb your way out of a strategic hole by just changing the guy holding the shovel. One of the findings in our recent survey was how keen respondents were on bringing in fresh blood from outside the industry into the “marzipan layer” of middle management. Probably a necessary action, but not sufficient if the vertical structure is maintained.

Top dog or tired mutt?

Developing the theme of structural change, the influential John Kay writes in the Financial Times about how companies that are at the top of their field rarely regain that position once lost. He muses:

Can telecom giants reinvent themselves where control of the last mile of wire is no longer decisive? When the historic sources of underlying competitive advantage have gone, businesses rarely return.

The exceptions he quotes are well-known but nonetheless important. Most notable perhaps is IBM, which moved from a product business to a customer service one where a server and a copy of WebSphere just turned up as part of the deal without having to be sold separately. The nearest telecom equivalent is probably BT, who have diversified into enterprise services and solutions. Their strategy is not to win RFPs for network services, but to bypass the whole process by being in a smaller field of services businesses — ironically butting up against IBM in the process.

We’ve structured our next Telco 2.0 event to debate these very issues, and you’ll be able to hear from and challenge participants from leading network operators. We’ve also picked on one promising area for diversification, namely advertising, which has its own work stream at the event, plus a telco-centric market report being published next month.


January 16, 2007

Meet the Telco 2.0 team

We’re doing our bit for global warming over the next month or so by emitting all the CO2 we can whilst the opportunity is still there.

Our Chief Analyst, Martin Geddes, will be part of a panel at the FTTH Council Europe’s event in Barcelona on 7 February, together with a collective of fellow bloggers. He will also be a keynote speaker at O’Reilly Emerging Telephony on 1 March in San Francisco, as well as debating the merits of IMS at the same event. This is a particularly good event, which we’re proud to endorse.

The whole Telco 2.0 team from STL will be at 3GSM in Barcelona from 12-15 February, presenting findings and running workshops on the research we’ve done in partnership with the GSMA - covering ‘Voice & Messaging 2.0’ and ‘Role of Telcos in the Advertising Value Chain’. We’ll be in Hall 7 (precise location to be confirmed) - come and find us.

Then we’re off to St. Anton (high in the Austrian Alps) on 16th March for an interesting ‘workshop-and-ski’ organised by Jnetx on Next Generation Service Platform architecture.

Back in London, of course, for the big Telco 2.0 Industry Brainstorm on 27-29 March, and then zipping across to New York to support the GSM Association’s ‘Mobile Entertainment & Advertising Summit’ on 30 March (details to follow).

If you are interested in meeting us at any of these events, please do contact us.


January 15, 2007

Digital Town hypothesis: the future of access networks

We’re working in the background preparing our next Telco 2.0 brainstorm event. We avoid the word “conference” as the format is so different: interactive technology, brief pre-screened stimulus presentations, real-time expert analysis, and after-event reviews. One of the ways in which we depart from the conference format is we frame up issues for debate — and also have a point of view ourselves.

One way we are trying to improve on the conference format is by giving all speakers and attendees a brief session “hypothesis” that provides STL’s view on the topic of discussion.

First on the launchpad is our Digital Town work stream, which looks at how to improve the economic and social well-being of municipalities via high speed broadband access. In a nutshell, we don’t think investors are happy with “business as usual” for redundant, competing access networks — unless they can capture monopoly rents and also keep the regulator at bay. The uncertainty of regulatory intervention ultimately works against the carriers, as it drives away risk capital. Are there better ways of dealing with the problem? We think so, and successful “pipe” providers will examine and embrace change in network funding models.

We’ll be debating these issues with a line-up of speakers representing communities, altnets, users, incumbents, new entrants, innovators, and technology disruptors. Do join us!

The state of the access market today

  • Retail prices for broadband in competitive markets have lowered substantially, stimulating growth.
  • Commercial FTTH seen in a few markets (Verizon Fios, Japan, Denmark), but generally limited compared to cable and DSL.
  • Variable results from muni network projects: lots of different service models, plenty of early lessons.
  • Investment in access technologies slowly recovering from the telecom slump; wireless (notably WiMax) looking healthier than optical.
  • Copper and coax networks benefit from new technologies (VDSL, DOCSIS 3.0).
  • Partial or total structural separation existing or proposed in many markets (UK, France, Ireland, Denmark, Japan).
  • Wider social and economic benefits of improved connectivity and broadband seen as strong.

The status quo leaves few participants happy

  • Competing wired access providers result in lower take-up rates, and risk a bankruptcy cycle whereby the losing network comes back without debt.
  • Funding from services revenue (TV, telephony) risks regulatory intervention and/or capture (most notably in the US market)
  • Copper assets are being “sweated” and prices may be too low to fund access investment
  • Reach of true high-speed access is limited, and asymmetric networks don’t reflect increasing volumes of user-sourced traffic (home video, online backups, P2P file sharing).
  • Public sector connectivity purchases are highly fragmented, raising costs and limiting general public benefit
  • Emerging industries dependent on universal high-speed access are stifled
  • Social divide problems with access to social care, e-government and information/education facilities
  • Investors are scared of asset confiscation through anti-trust if “winner takes all” happens

What’s the issue?

How can we better fund more abundant access networks using the best of demand-driven market means and private risk capital - without the downsides of monopoly or eternal regulation?

STL’s proposed answer

  • More models for paying for connectivity than just “all you can eat broadband” — ad-funded, device-embedded, service-funded, free (social tax-funded), etc.
  • Different ownership and operational models (e.g. OPLANs, co-ops, muni, private shared like cell towers)
  • Innovation in the fiscal vehicles used for networks, appealing to long-term infrastructure investors (e.g. pension funds) seeking annuity-type returns
  • Diversification of network funding sources to include more of the beneficiaries: real-estate owners, public sector, local commerce, consumer electronics, etc.
  • Better co-ordination of public sector efforts: rights of way, attachments, commissioning, service provision, etc.

We’ll be debating these issues in-depth at the Telco 2.0 Brainstorm in the Digital Town workstream.


January 12, 2007

Telco 2.0 in the Press and on TV

It’s been good to get some press coverage of our recent Telco 2.0 research. The journalists certainly seem to think that the initiative is refreshing…

And, then, a nice interview with the extremely nice Guy Daniels of Telecom TV today - view here (under ‘Web 2.0 threat’). (They’ll be covering our workshops at 3GSM as well, I think).


January 11, 2007

GSM Association Collaborates With Telco 2.0 Initiative

STL and the Telco 2.0 Initiative are delighted to announce a collaboration with the GSM Association (GSMA).

The GSMA is expanding its range of services for its members and has recognised the cutting-edge thinking and practice that the Telco 2.0 Initiative is bringing to the market, especially around catalysing clearer understanding and change in response to the opportunities and threats for mobile operators in an increasingly IP-based world.

So, we’ll be doing the following:

- Undertaking joint research into new business models and new service areas, specifically: ‘Telco 2.0 Strategies’, ‘Voice & Messaging 2.0’, ‘Ad-Funded Services’.
- Supporting GSMA events (inc. 3GSM) with our ‘Mindshare’ interactive brainstorming processes.
- Collaborating on the Telco 2.0 Industry Brainstorms programme.

GSMA members, of course, will get preferential access to this activity.

Watch this space for updates.


Telcos in Advertising Value Chain - Threat or Opportunity?

We’re delighted to be working with Alan Patrick of niche consultancy Broadsight on our Ad-Funded Services programme, which we’re running with the GSM Association. (Alan will also be ‘analyst in residence’ in the Digital Home stream at the Telco 2.0 event in March). Alan has been exploring developments in online advertising for many years based on his experience of working with companies like BBC, British Telecom (OpenWorld and Ignite), AOL Time Warner, ntl and UPC and consulting at McKinsey.

Below he gives us ‘Some thoughts on the impact of Online Advertising on the Telco 2.0 Landscape’. He says:

As we know, over the last few years, the penetration of broadband internet has risen rapidly across much of the world (see here for stats). China and India have recently started to catch up, with China predicted to overtake the US about now.

There is a link between Internet penetration, e-Commerce, and Online Advertising which I have described here. As broadband connections have risen, online shopping and e-Commerce have rapidly followed, due to the ease of use. Online advertising has risen with them, up from a very low percentage of all global advertising in 2002 to c. 6% by 2006.

This “rush to online advertising” has also had a knock-on impact on online mobile advertising - SMS advertising has been around for quite a while and has grown respectably, but the rise of 3G services has led to a plethora of partnerships (Yahoo / Vodafone for example) in 2006 between Telcos and Search Engines to gain search engine advertising revenues from mobile portals.

However, the sheer scale of online advertising is also potentially a risk for Telcos of all stripes, for two main reasons:

Firstly, a number of new services that Telcos are looking to launch (IPTV and various Mobile multimedia services, for example) are finding that competing solutions are using advertising revenue to subsidise the services, thus undermining the subscription-based business cases on which they were often predicated.

Secondly, a number of non-Telco players are adding Telco services as a free (or at least heavily subsidised) adjunct to increase the stickiness of their existing businesses (e.g. Skype/eBay, Yahoo and Google mobile services, and Apple - no longer an IT company).

The risk for Telcos is clear: outside players will give away for free the very services that Telcos rely on for the bulk of their revenue. The scale of subsidy now possible from online advertising makes the problem material - and it will only get worse.

So what is a Telco to do?

We will be addressing the overall issues around online advertising in more detail in our upcoming report, ‘Ad-Funded Services and Role of Telco in Advertising Value Chain’, which we’re producing in partnership with the GSM Association and which is being jointly written by STL and us at Broadsight (available to order now). However, one very interesting area to examine, and one we thought would be fruitful to blog about and get people’s comments on, is the issue of online metrics.

The online advertising medium has dramatically changed the metrics needed to measure and understand advertising ROI.

Pre-internet it was all so very simple: media was sold to mass audience blocs that were fairly predictable, and ad pricing was fairly simple as well. Audience reached, prime (or otherwise) position within the piece of media and standardised targeting data allowed marketers to get fairly predictable results, and relatively unsophisticated feedback methods were good enough.

The early internet took off on the same lines, with banner ads, classifieds and those annoying pop-ups kicking it all off. Cost per thousand impressions (CPM) pricing was a simple yet “good enough” metric. As online experience increased, Cost Per Click (CPC) pricing became the new approach, with advertisers only paying when the customer clicked on the ad. Click-through rates (CTR) were initially very high (c. 10%) but dropped as customers became more inured to the ads.

Where it all changed was when search engines, especially Google, started to match the ad served to what the customer was searching for, with increasingly granular targeting - the first real low-cost, mass-production “pull” advertising systems. Differential pricing, then auctions, and detailed analysis of the user’s digital footprint all emerged in rapid profusion, until the measures started to move to Cost Per Action (CPA) pricing approaches, such as Cost per Sale made (CPS) and now Cost per Revenue obtained (CPR). Microsoft has now started to trial a next-generation system that uses identity and demographic data to increase accuracy even further. Add to that the impact of Web 2.0 technologies such as RSS, which make established online metrics such as pageviews increasingly inaccurate. See notes here, here and here or even here for commentary on these issues.
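To make the difference between these pricing models concrete, here is a back-of-the-envelope sketch in Python. The campaign figures are entirely invented for illustration; the point is simply that CPM, CPC and CPA terms can be compared on a common “cost per thousand impressions” basis:

    # Comparing the pricing models above. All campaign figures are invented, not market data.

    def effective_cpm_from_cpc(cpc, ctr):
        """Cost per thousand impressions implied by a CPC deal at a given click-through rate."""
        return cpc * ctr * 1000

    def effective_cpm_from_cpa(cpa, ctr, conversion_rate):
        """Cost per thousand impressions implied by a CPA deal (pay per sale or action)."""
        return cpa * ctr * conversion_rate * 1000

    impressions = 1000000
    cpm, cpc, cpa = 5.00, 0.50, 20.00     # hypothetical prices in dollars
    ctr, conversion_rate = 0.01, 0.05     # 1% click-through, 5% of clicks convert

    print("Flat CPM cost:              ", impressions / 1000 * cpm)
    print("CPC cost for the same reach:", impressions * ctr * cpc)
    print("CPA cost for the same reach:", impressions * ctr * conversion_rate * cpa)
    print("Implied CPM of the CPC deal:", effective_cpm_from_cpc(cpc, ctr))
    print("Implied CPM of the CPA deal:", effective_cpm_from_cpa(cpa, ctr, conversion_rate))

The same arithmetic shows why advertisers like CPA: the risk of poor targeting moves from the buyer of the ad space to the seller, which is exactly what drives the hunger for more granular data discussed below.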

Mobile went a slightly different route, via text-based SMS and (much less so) MMS adverts, and as 3G penetration increased there was a growing focus on internet-style search-based ads. In 2005/6 there was a large rise in trials of advertising approaches for mobile multimedia, including mobile TV - but the result is much the same: the increased targeting requires far more granular metrics.

This may be an excellent opportunity for Telcos to create some further value from online advertising.

Running these sorts of metrics and services effectively requires large amounts of granular data, some of which these players do not have at present. Telcos, however, hold billable user identities (i.e. detailed demographics) and have been provisioning the end services and measuring their usage, so they have a wealth of this detailed data. Mobile Telcos can add location and on-the-move service usage data.

The first step is to understand its value. The next question is what to do with it. How best to maximise the potential benefit - sell, partner, procure - or something else? We’ll be exploring these issues in our new online survey (live later tonight) and at the Ad-Funded Services Workshop at the Telco 2.0 event in March.

January 8, 2007

Deconstructing Microsoft’s “Telco 2.0” approach

I’m an admirer of Microsoft. I have friends who are current and ex-Softies. Microsoft has probably been the single largest source of privately-generated consumer welfare in the last 20 years. Although it’s now largely forgotten, their success came from making more functional, more usable, and consistently more affordable products than the entrenched competition.

But they struggle to achieve the success you’d expect in the telecom space. That’s probably because of a mismatch of cultural expectations and motivational incentives, as well as their propensity to treat telecom as “just another vertical”.

The following story is not apocryphal, but we’ll keep it anonymous to protect the innocent. It’s 2002. Microsoft are doing a tour of telcos demonstrating their upcoming Live Communications Server product. A crowd of senior telco execs are in the room watching the demo. “And now, the user just chooses who she wants to talk to based on her ID, and she picks the carrier from this drop-down box…”

Someone in the audience asks: “How does she know which one to pick?”

“Well, I guess she’d choose the cheapest.”

Errr, sorry, and how do we, the telco, build a customer relationship and brand here? And how are we supposed to be differentiating ourselves? I don’t think they understood why the reception seemed quite so frosty.

They recently put out an interesting press release on their “Telco 2.0” strategy. (If they’re having problems doing trademark searches, we can recommend a good lawyer or two to them — anyhow imitation is the sincerest form of flattery). Let’s see how they’re progressing: is Microsoft the telco’s enemy, friend … or lover?

Nobody got fired for buying a Microsoft Exchange server

Here’s what’s at the core of their thesis:

In the Telco 2.0 era, operators will be able to offer hundreds of services that bring together numerous applications and types of content from a variety of sources to form composite services; in this world, the combinations of potential new services are nearly limitless.

What’s interesting is who those services are targeted at: business users. These already have Microsoft desktop and messaging/collaboration systems. Microsoft’s hosted messaging product is a challenge to the IMS and telco-centric vendors. The message to operators is surely this: do something special to add value in distributing or customising this solution, or go to commodity minute hell.

IPTV found wanting?

Microsoft also push their IPTV offering as part of their Telco 2.0 suite. This had well-aired deployment issues in the past, which seem to have been addressed recently. But the bigger problem remaining is probably one of not knowing which horse to ride: they’ve missed the “long tail” of YouTube, whilst not capturing the economics of P2P video distribution and user-driven recommendation and search. Theirs is a monolithic application, closed and controlled. Not very ‘TV 2.0’, I’m afraid, especially at a time when even the Director General of the BBC says “We’re less than 5 years away from fully individualised drag and drop TV stations”.

[Editor’s note: We’re currently gathering together the great and the good of the ‘Digital Home’ world to debate real sources of value for telcos around home entertainment and communications at the Telco 2.0 brainstorm in March - Belgacom, BT, Orange, Slingmedia, Telecom Italia, Telefonica, Telenor etc. Watch the site for updates on stimulus presenters]

Nokia and Motorola will never surrender

Finally, there’s their Sisyphean task of getting the world to adopt Windows Mobile. “If only everyone used this operating system, your compatibility and delivery issues would go away.” But the problem is exactly that the “familiar Windows® user experience” doesn’t seem to be what people want from a communications-centric device. In 2003 I personally saw Steve Ballmer promising that they would ride down the Moore’s Law curve, and the $700 Windows smartphone would come to dominate the market. In the process, they totally missed the opportunity to put the Microsoft brand into the hands of two billion new telephony users, who sought out more basic features and functionality.

Another example of the PC mindset and assumptions: mobile IM has been around for ages, and fails to set the industry alight because the answer isn’t migrating PC applications to mobile devices, no matter how hard you hype the service. A truly mobile-centric solution would include richer presence, as shown by people like Packetmobile.

What should they do differently?

As the biggest single force in IT bar none, Microsoft’s business and product strategy is endlessly dissected, so we’ll keep this brief. They’ve spent the last 5 years focusing their engineering effort on Windows and Office. These are both tied to the PC platform, whose importance is declining in comparison to the web and mobile platforms. The real action should have been with Microsoft’s online services, which were belatedly rebranded Windows Live. That platform could be extended to better embrace the telcos, rather than disintermediate them. This is particularly true given the enormous growth of the mobile industry, where Microsoft remains comparatively weak.

Microsoft’s Telco 2.0 strategy will remain a laggard until it explicitly acknowledges the forces “horizontalising” the industry. They need to offer a capability of supporting the three core Telco 2.0 strategies: pipe, platform and product.

Dumb pipes can make smart business

A Microsoft approach that supports a telco pipe strategy would, among many other things, have done the following:

  • Distributed superb Wi-Fi finder tools to every PC user.
  • Made it easier to discover and pay for value-add services like VPNs.
  • Increased security to avoid problems like “evil twin” hotspots that steal login credentials.
  • Made moving around between networks simpler.
  • Smoothed the process of contacting and interacting with network support.

Some of this gets better in Vista, but there’s no consistent telco-friendly approach. How come Nokia get to supply all the critical PC apps that make my mobile work with my PC, and not Microsoft?

APIs are everything

Microsoft is possibly the only IT player with a complete self-sustaining ecosystem around its product set. They could have leapfrogged efforts like Parlay and defined API sets that ISPs and telcos could have used to expose their messaging, billing, and customer services platforms. It’s what .NET could have been: the API of the Internet, not of the networked PC. The opportunity was to put the “third leg” on the stool: 1.) the device OS, 2.) the user data (and services in the cloud), … and finally 3.) the telco OS. Plug in a Microsoft device anywhere, and with a bit of bootstrapping and auto-discovery the provisioning, payment, profile, security, etc. would all self-configure painlessly. By leaving the telcos out of the picture, and pretending they had no assets of value bar a dumb pipe, they cut themselves off from much of the marketplace.

Telcos are very much in need of a “platform” enabler, but they’re suffering from a “chicken and egg” problem that only someone like Microsoft can break: why expose lots of network APIs if there are no devices to access them? (For that matter, how come PCs don’t offer standard APIs for peer-to-peer services, for example? We’re stuck in a time-warp with the Windows file-sharing and printing services.)
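For the avoidance of doubt, no such interface exists today from Microsoft or any operator; but as a purely hypothetical sketch, a third-party-facing “telco OS” might look something like this (all names, calls and fields are invented for illustration):

    # Hypothetical sketch of an operator platform API: provisioning, payment and
    # messaging exposed behind simple calls that a partner service could use.

    class TelcoPlatform:
        def __init__(self):
            self.subscribers = {}    # msisdn -> prepaid balance (the billable identity)

        def provision(self, msisdn, balance=0.0):
            """Bootstrap/auto-discovery step: register the device and its account."""
            self.subscribers[msisdn] = balance

        def charge(self, msisdn, amount, item):
            """A third-party service asks the operator to bill the user for a premium item."""
            if self.subscribers.get(msisdn, 0.0) >= amount:
                self.subscribers[msisdn] -= amount
                print("Charged %.2f to %s for %s" % (amount, msisdn, item))
                return True
            return False

        def send_sms(self, msisdn, text):
            print("SMS to %s: %s" % (msisdn, text))

    # A partner's photo-sharing service plugging in and self-configuring:
    telco = TelcoPlatform()
    telco.provision("+447700900123", balance=10.0)
    if telco.charge("+447700900123", 1.50, "photo-sharing month pass"):
        telco.send_sms("+447700900123", "Photo sharing activated.")

The point of the sketch is the shape, not the detail: the asset the operator exposes is the billable identity and the delivery channel, and the partner never needs to see either.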

A telco-centric product vision

Microsoft’s pitch to operators on revenue-generating services would have more clout if it had a clearer vision of what those services should be. We’re of the general opinion that the type of product development that operators should engage in is not “feature-based” innovation. This is better left to Internet innovators and niche vendors. Rather, we prefer coupling telcos’ unique assets (identity, devices, delivery and payments) together in different ways - see Market Study for how. SMS short codes would perhaps be the best such example. It’s unclear why a telco would align tightly with Microsoft as a technology supplier if there’s no tangible sell-through or sell-with opportunity via the corporate channel or product set.

As Tomi Ahonen reminds us, the mobile industry is huge: just the SMS product is bigger than pretty much all content industries put together. Microsoft haven’t (to date) given the telecom industry the attention it deserves, and aren’t making the impact they ought to.

But with a more creative analysis of the telcos’ (real) assets and needs, a more productive and cooperative relationship could emerge…

January 5, 2007

2007 Telecom Predictions - Review of the Best

We’ll leave you to decide. Does a collection of a dozen sets of 2007 telecom predictions make a wise crowd or a demented mob? Anyhow, we’ve had a bit of fun pulling together some of the highlights and lowlights of the analyst, media and blogosphere outlook for 2007.

Outlook from the ‘Blognoscenti’

We aren’t going to review every item, just pick some of the more interesting ones. VoIP industry godfather Jeff Pulver kicks us off with this one:

While the hype surrounding Fixed Mobile Convergence (FMC) will grow during 2007, the FMC marketplace will continue to stagnate until such time that software becomes widely available for dual-mode phones that offer seamless roaming across unaffiliated wifi/wimax hotspots.

This seems to be a common theme, and we’re not alone in suspecting that FMC is a dead-end for consumers in its current form. Extending cellular coverage in-building (the bit users care about) looks like something best left to… well, cellular technology. Until you start adding features and capabilities, IP is just adding overhead. It’ll take an innovator like Apple to kick-start that process.

He also has some good news for equipment vendors:

TDM services will continue to be “end-of-lifed”. All Telecom operators will be IP-based within 5-7 years.

We’d agree, but then as we’re about Making Money in an IP World, we would, wouldn’t we? But then again, if you’re a TDM switch vendor, you’re in the interesting position of having the right sales force, support, brand and customer relationships … to sell Cisco IP gear to telcos. [Boss — are we allowed to say this out loud?] Jon Arnold’s probably right that there are too many vendors chasing too few large capex deals.

Fellow blogger Dean Bubley predicts the emergence of the corporate MVNO. You can imagine an operator with a good network but poor distribution to corporates doing this in alliance with a systems integrator or IT behemoth (or both). This would fit our “pipe” and “platform” Telco 2.0 strategies. The smarter the operator, the better they’ll do at “white label” handset logistics, support and billing — with the pipe just being a hygiene factor in any deal. One to watch.

He also suggests:

[Location-based] navigation becomes rather more important on mobiles. Mobile search doesn’t.

This fits with our outlook too. Communications beat content. As we said before, the specific case of rendezvous in time and space is important. These generally trump information and entertainment. The GSMA found the same thing in their research. Mobile search will take a long time to mature: users are looking for information to advance them to their next objective, and today’s search just isn’t geared to it yet. A big opportunity for a future year, or an already solved problem?

In a shock finding, we’re going to disagree with one of Delphic Keith McMahon’s prognoses:

With data access, I see 3G completely nailing WiFi as prices of data access tumble.

Normally, coverage beats speed every time in telecom. “Distribution” is the winning meta-strategy. However, in this case there’s a tussle between physics and economics. WiFi needs a lot of access points (costing more), but like a crowded room at a party, they whisper to neighbours rather than shout across the room, allowing for more overall traffic. WiFi also doesn’t suffer from the overhead of usurious spectrum costs. We’d probably see a co-evolution of both technologies, with ultimately a closer alignment of provisioning, security and payment methods, and users will care less and less about which is which.

Another we’d agree on that crops up a lot: mobile extension of social networking and youth entertainment services will be big.

The role of local government in enabling local connectivity remains an electric issue. We’ll be delving deeper into this in our Digital Town session at the next Telco 2.0 event. Our take: whilst “municipal networks” per se will fade from the headlines, government procurement will get more organised, and investors will be less tolerant of duplicative infrastructure, resulting in new modes of network funding and ownership emerging.

Maybe it’s not important, but a lot of blogger lists for 2007 barely mention the giant telecom industry. Are you background noise, your products boring, your brands not worth talking about? Just a rumour of the icon for Apple’s purported new spreadsheet app was big news today. You don’t see the same customer adoration of telcos anywhere.

Enter the analysts

In-Stat have made their #1 2007 prediction easy to come true by choosing something that’s already happened: HDTV content downloaded via the Internet. On a less barbed note between competing analysts, they pick up on the WiMax trend, specifically noting:

At least one successful WiMAX “d” vendor will go out of business as the transition to “e” comes to a head in 2007.

WiMAX will be more widely deployed in Europe and Asia than in North America.

In other words, mobility will win, and it’s expensive to lay underground cables in old cities. We’d buy that. Forbes go further in suggesting that the realities of deployment will delay profitability.

Controversially, In-Stat suggests that…

Longer term than 2007, there will be a backlash against so-called “network neutrality” as the rising amount of rich media content on the Internet slows down other traffic. Today’s proponents of net neutrality will begin to plead for tiered service—the option of paying more money for priority on the backbone.

Whilst a telco dream, this is very unlikely to happen. However, alternative forms of tiering (e.g. local vs international traffic caps; blanket “business” traffic priority levels; Paris Metro Pricing) are possible future phenomena. Network neutrality will, however, not be a hot potato in a year’s time, since it’s really a mis-labelled bucket of other phenomena (like antitrust) that are better addressed individually.

We’re also somewhat sceptical that “The Venice Project’s P2P video project [i.e. Skype for video] will be the big viral media sensation of 2007”. We think this will prove more difficult to deploy than expected in the face of ISP traffic shaping, bandwidth usage caps, and user habit. As a long-term play it’s the right direction, but the jump from YouTube to “TV on the Net” is too big to make yet for the mass market.

Gartner and Telecomweb are determined to prise large amounts of money from you to hear their predictive insights. One of the Gartner predictions is that:

Through 2011, enterprises will waste $100 billion buying the wrong networking technologies and services.

We use Skype internally at STL, and probably get the same or a better experience than the average megacorp user at a tiny fraction of the price. Their number looks large, but in fact may be conservative. Luckily, the definition of “waste” is pliable enough for such prognostications never to be disproven. Although, given that Gartner also predicts the peak of blogging in 2007, we’ll predict the peak of over-priced, hazy analyst reports.

Media predictions

A couple of Infoworld’s guesses have telecom angles to them. The idea of Cisco getting into the anti-virus game — at the network switch level — is an interesting thought. Former (pre-merger) AT&T CTO Hossein Eslambolchi was often quoted in the press saying that (in our wording) the centralised function worth embedding in a “smart” network was the detection and elimination of negative-value traffic, as the edges find it too hard to co-ordinate the information and response. Food for thought for the Level 3s and Global Crossings, plus the incumbents with their national backhaul networks. Maybe a “clever dumb pipe” isn’t such a bad thing after all?

Another of their forecasts is that “Mashups Meet SOA”. (SOA is “Service Oriented Architecture” — basically a way of breaking up monolithic IT systems and processes into re-usable chunks that still maintain the integrity of business rules.) Most telco IT departments will already have had their doors battered down by vendors bearing SOA architectures. Few will have thought through the possibilities for changing the business model to embrace external 3rd parties and be part of a wider platform.

Forbes predicts consolidation. We’d agree, as does the blogosphere, but the more interesting part might actually be “deconsolidation” as some telcos break themselves apart (e.g. into retail and wholesale) or are plundered by private equity or leveraged buyout. Maybe 2007 will be a comeback year for Gordon Gekko?

Oh, and our predictions? Well, one of us has already spouted off elsewhere, and we probably couldn’t top this set anyway.

January 4, 2007

Are telcos communications companies?

The return-to-work week at the beginning of January is always a time for reflection and self-examination. We continue the theme of “What industry are we really in?” with some thoughts about personal communications.

The question for operators is this: to what extent should they be innovators in their core voice and messaging products? After all, these services still account for the majority of the revenue of most telcos, be they fixed or mobile.

It’s an old debate, but one which in 2007 will take on a lot more significance as the march of technology releases us from the constraints of legacy infrastructure. Broadband has become mass-market in developed countries. Some smaller operators like Telio already have all-IP infrastructures. The leading-edge incumbent operators are getting close to launching IP-based replacements for much of their legacy equipment. Mobile operators are making similarly heavy technology investments to enable fixed-mobile products. Wi-Fi marches on and the power, QoS, security and provisioning issues start to ease. WiMax reaches the market, and creates new possibilities in markets with weak or no fixed access.

Customer expectations are rising

The rising Digital Youth generation of users aren’t going to stick with 1990s telephony and messaging products forever. The decision time is approaching:

  • Invest in core communications service innovation, define differentiated software and devices, build channel.
  • Partner with someone else who has these capabilities, and be a platform enabler for payments, service, logistics, etc.
  • Exit the services space, and focus on pipes — or diversify into other areas.

At the moment we see a confusing mix of these in the market — as our survey results have confirmed. So, if you do engage in services innovation, where should you focus your money? We think we’ve got a slightly different angle on the problem.

A great deal of blog discussion has been prompted by Alec Saunders’ pre-Christmas manifesto on the future of presence-driven communications services. Alec is the CEO of Iotum, a start-up that builds intelligent call and message routing services generally targeted at mobile hyper-connected professionals. He has therefore been pondering the problem for a while. Similar thoughts and roadmaps have previously been seen from companies like IBM and Microsoft, who have interests and products that fill a lot of the boxes.

Since your boss is taking an extended beach break and your inbox is unusually quiet, go take a quick look at Alec’s essay. If nothing else, scroll down to the diagram half-way through, which has the three critical buckets: profile, context and relationships.

By the time the phone rings, the money-making is done

The slant we’d put on it is that communications services providers are (amongst other things) in the business of enabling rendezvous. As connectivity gets cheaper, more of the value comes from the set-up of the conversation: the right participants, time, place, devices and medium. They should act collectively in making this process work. Service providers should then compete individually on the features and functions of the communications service itself.

Presence — the part Alec focuses on — or even “New Improved Presence 2.0”, is just fodder to feed the rendezvous process.

Upstream of rendezvous is the brokering of relationships themselves. This is a domain-specific problem best left to partners such as dating, employment and e-commerce sites. Next comes the poorly-understood part of real-time relationship enablement. This is still a research project. Then comes the bit the communications industry (whoever that turns out to be) needs to focus on next: rendezvous of people who know they want to interact. The downstream part of message or call delivery is commoditised to death. Indeed, one notable phenomenon is that VoIP is the least important part of this, and that some of the best click-to-call solutions simply use the existing switched telephony system and everyday purpose-made phone handsets.

The money’s in the social interactions

We would therefore suggest IM networks should federate their presence data, but need not allow a Yahoo! user to directly IM her MSN buddy: clicking on the MSN buddy would bring up the MSN client (together with ads). Likewise, telcos could finesse some of the issues of federation and (cough) “customer ownership” by federating presence-type data but retaining the user-facing interface (with its branding, up-sell and advertising potential).
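A toy sketch of what we mean, in Python. Yahoo! and MSN here are just placeholder names, and this is not how either network actually federates: presence is shared across networks, but the conversation itself stays within each network’s own client.

    class PresenceNetwork:
        """Shares availability with federated peers, but keeps messaging in its own client."""
        def __init__(self, name):
            self.name = name
            self.status = {}     # user id -> "online" / "offline"
            self.peers = []      # federated networks

        def set_status(self, user, status):
            self.status[user] = status

        def lookup(self, user):
            """Look locally first, then ask federated peers for presence only."""
            if user in self.status:
                return self.status[user], self.name
            for peer in self.peers:
                if user in peer.status:
                    # The caller sees the buddy is online, but must open the peer's
                    # client (ads, branding and all) to actually send the message.
                    return peer.status[user], peer.name
            return "unknown", None

    yahoo = PresenceNetwork("Yahoo")
    msn = PresenceNetwork("MSN")
    yahoo.peers.append(msn)
    msn.set_status("bob@msn", "online")
    print(yahoo.lookup("bob@msn"))    # -> ('online', 'MSN')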

A YouTube viewer should be able to right-click a video and send it to his MySpace friend. Both parties keep their part of the user-facing experience.

Iotum’s business is about getting the timing and medium of a rendezvous right. But there is also the problem of getting together in space, with people frequently using mobile devices to manage ad-hoc or hazily-defined meetings. Today a lot of calls are made to manage the timing and place of such meetings. Given some calendar and location information, your device could do a far better job of brokering these connections. Users get the most value from calls that never even have to be made!
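A trivial illustration of the point, with made-up calendar and location inputs: the device works out whether the “I’m running late” call is needed at all, and if so could push the context to the organiser without a voice call ever being placed.

    from datetime import datetime, timedelta

    def rendezvous_advice(meeting_start, now, distance_km, speed_kmh=30.0):
        """Toy broker: compare the estimated arrival time against the meeting start."""
        eta = now + timedelta(minutes=distance_km / speed_kmh * 60)
        if eta <= meeting_start:
            return "On time - no call needed."
        late_by = int((eta - meeting_start).total_seconds() // 60)
        return "Notify organiser: running ~%d minutes late (ETA %s)." % (late_by, eta.strftime("%H:%M"))

    meeting = datetime(2007, 1, 4, 10, 30)
    print(rendezvous_advice(meeting, now=datetime(2007, 1, 4, 10, 5), distance_km=18))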

Keep it simple

The mega-successful telecom services have been noteworthy for their simplicity: telegrams, telephony and SMS. Those that added even small incremental complexity, such as telex, fax and MMS, came to serve narrower (but still broad) audiences.

The kinds of innovation or improvements we’d like to see are those which help keep the rendezvous process simple:

  • When you call me back after hearing my voicemail, my caller ID says “Bob returning your voicemail”.
  • When my mobile shows my next meeting, and I call the organiser to say I’m late, he sees “Jane calling about 10.30am meeting”.
  • The hard-to-use parts of the experience like voicemail retrieval would be replaced with simpler multi-modal interfaces, and we’d deliver the messages down to the device just like emails. Indeed, recording a voicemail would be done locally in wideband audio, and nobody would ever suffer a drop-out.

There are plenty more ways of limiting telephone tag and improving the user experience — it just takes a little imagination.
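To show how little machinery the first two examples above actually need, here is a sketch of a call carrying a small “reason for calling” payload. The CallContext structure is invented for illustration; a real implementation would need a standardised, trustworthy way to carry this in call signalling.

    class CallContext(object):
        """Invented structure: context a caller's device attaches to call set-up."""
        def __init__(self, caller_name, in_reply_to=None, about_event=None):
            self.caller_name = caller_name
            self.in_reply_to = in_reply_to      # e.g. "voicemail"
            self.about_event = about_event      # e.g. "10.30am meeting"

    def caller_id_line(ctx):
        """What the called party's screen would show instead of a bare number."""
        if ctx.in_reply_to:
            return "%s returning your %s" % (ctx.caller_name, ctx.in_reply_to)
        if ctx.about_event:
            return "%s calling about %s" % (ctx.caller_name, ctx.about_event)
        return ctx.caller_name

    print(caller_id_line(CallContext("Bob", in_reply_to="voicemail")))
    print(caller_id_line(CallContext("Jane", about_event="10.30am meeting")))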

Innovate early and often

There are a few operators, like Vodafone and Telenor, with interesting portfolios of operator holdings in developing economies. If we were looking for somewhere to experiment with “better telephony”, those are the places we’d start.

We’ve been doing plenty of thinking about the voice and messaging industry, and you can look forward to reading more (for a modest fee) when we publish our report in a few weeks’ time.
