Telco 2.0: July 2008 Archives


July 31, 2008

New Telco 2.0™ Manifesto - Second Edition Preview

The current telecoms business model is approaching its ‘end of life’. Today, we’re previewing here on our blog an updated Telco 2.0™ Manifesto which we hope will provide a cogent reference point for creating a vibrant new business model at the heart of the digital economy.

This second edition reflects the changes in our thinking over the last two years since we launched the Telco 2.0™ Initiative. It is based on output from four major Telco 2.0™ ‘executive brainstorm’ events, multiple consulting engagements around the world, and our formal research programmes. We’d like to thank the many people who have wittingly (or unwittingly) provided input to it.

The Manifesto is relevant to:

  • Those developing strategy across the telecoms, media and technology (TMT) sector.
  • Corporate managers in all vertical industry sectors looking to improve efficiency and effectiveness through Information and Communications Technology.

We believe it provides new insights into future business models for the ‘information economy’ at large. The Manifesto seeks to answer eight critical questions:

  1. What are the fundamental properties of today’s telecoms business model?
  2. Why do these create challenges for future growth?
  3. Why are current efforts to find a new business model too limited in scope?
  4. What are the real issues that need to be addressed?
  5. What are the key principles behind the new business model?
  6. What are the core products and services of a Telco 2.0™ business model?
  7. How big is the opportunity?
  8. What does the journey to this new business model look like?

We very much encourage your feedback, either via the comments tool below, or directly to contact@telco2.net.

Boom, bust and bundling

The ‘Telco 1.0’ business model has been stable from the inception of the telegraph right through to the mass adoption of the mobile telephone. This model has two pillars:

  • vertical integration, where the network owner controls the services on the network, and repays the capital investment by billing for them. This control can be created either by (i) embedding the service in the network (as with voice or SMS), or (ii) through control over the edge devices — handsets, set top boxes, network storage devices, home hubs. Device control can be direct via technical means, or indirect, such as via subsidies of devices with preferred configurations.
  • a one-sided revenue model, where the telco buys equipment and content from suppliers (‘upstream’), integrates them, and bills the end user for services (‘downstream’). The upstream side is a cost; the downstream side is revenue.

This model survived both technological revolutions (e.g. fibre optics, digital switching, microwave radio, and spread spectrum wireless) and regulatory change (e.g. divestiture, privatisation, and unbundling). It has been very successful, particularly in emerging markets. All aspects of a service, from sales to support, are conveniently packaged in a single easy-to-buy proposition to the end user.

The arrival of Internet access as a mass market consumer product in the 1990s challenged these two pillars. Users can acquire content and services independently of the network operator — a horizontal market structure. Furthermore, the business model of many Internet content companies is a two-sided market (more here and here). They acquire a ‘downstream’ audience using either cheap or free content. Advertising is the primary revenue source, coming from the ‘upstream’ side of brand owners and merchants.

The demand for Internet access sparked an infrastructure boom, which ran in parallel to the mobile boom.

Following the subsequent dotcom bust, telcos have been focused on three activities:

  • Managing the explosive growth of the mobile business (especially in emerging markets), as well as (mostly fixed) broadband in developed markets.
  • Bundling the voice, video and data services together as the ‘triple play’, to reduce churn.
  • Consolidation via M&A, to maintain prices.

All three have placed huge strain on the back office systems, and attention has largely been focused internally on operational issues, rather than strategic or structural ones.

Triple play trouble

The telco business model is under strain. The hyper growth phase in mobile and broadband is over in developed markets. The underlying tensions between the telco and Internet models are no longer masked. We are seeing increased price competition. The regulatory environment is becoming less favourable, with reduced termination fees, capped roaming rates and effective unbundling rules.

Most importantly, each element of the triple play bundle — voice, video and data — has problems with either growing revenues, or the cost of service delivery, or both.

Voice: Slower growth precedes decline

In developed markets, increased usage of voice is no longer sufficient to compensate for price deflation. Revenues are starting to peak and fall. High-margin mobile voice and SMS services are vulnerable to arbitrage (e.g. roaming SMS, international calling). This is particularly true when IP is used as a signalling system independent of the telco network and charging regime.

Complete ‘over the top’ replacements for telco voice and messaging services have achieved adoption in some markets (e.g. MXit vs. SMS in South Africa, Skype for small businesses replacing long-distance, international and conference call revenues). These services remain at the periphery at present, but are still growing fast.

Video: Harder to enter than expected

Both fixed and mobile video are failing to generate the level of profit anticipated. In both cases, telcos lack the content acquisition, packaging and promotion skills that more mature media players have long perfected. The Internet market is driving rapid innovation in content aggregation at a speed telcos cannot match. Telco forays into becoming a media business have generally been underwhelming. For mobile video, user interest isn’t matched by a willingness to pay. The only exception in media is ringtones, a market that is itself maturing and facing decline. Music may also flourish for a short time, although that industry is deeply troubled by piracy.

Data: Not a golden goose after all

Users fail to intuitively understand megabytes and megabits, and would prefer ‘postage and packing’ to be included with the device or content. There is minimal differentiation between ISP plans. Pricing, usage and value are disconnected, since price discrimination is difficult. Price competition is the norm. Online video is driving the need for fresh capital investment, as well as operational expense. Some of this can be justified by reduced operational costs (as fibre is cheaper to maintain than copper, and LTE/WiMAX have more capacity than 2G/3G) but there remains a significant funding gap. Mobile data usage is growing very rapidly, but typically over 90% of traffic is from laptops, which don’t generate commensurate revenue. Continued growth may result in congestion and spectrum exhaustion in urban hotspots.

New sources of value remain elusive

Network operators are aware of these issues and are experimenting with new business models. Media products, as noted above, have met only patchy success; yet operators are heavily investing in servicing the media and entertainment sector. Advertising-funded services exist, but the entire online advertising industry — including Google — is still under 2% of global telecoms revenues. Advertising alone cannot significantly impact the telco business model.

Advertising is too small to be the basis for a new business model

Meanwhile, in the economy at large, there are inefficient business processes in every industry, through every stage of production from creating a customer relationship and promoting the offer, via service delivery, through to billing and customer care. Typical examples might include delivering parcels, authenticating banking customers, or servicing welfare recipients. These often waste labour and energy, and tie up working capital. Could the telco be in a position to optimise these time and trust sensitive processes?

The trillion dollar re-think

Given these issues and opportunities, it is necessary to answer two questions: What is the purpose of a telecommunications service provider? And what does the future business model look like? To answer these we need solutions to the following problems:

  • How should the underlying infrastructure be funded, and what is the role of the telco in this?
  • How do we protect and evolve the core voice and messaging products?
  • How do we turn online video distribution into a profit driver, rather than a cause of cost inflation to the ISP?
  • How do we create more value from our current assets, both physical (e.g. networks and IT systems) as well as those less tangible (e.g. brand, trust, customer data)?
  • How do we find new classes of customer to service, and therefore revenue sources?

In answering these questions, we see a need for change in the industry to reflect a new world increasingly unlike that experienced before.

Telco 2.0: A new vision

To resolve these issues, telcos must learn from the structural changes that have taken place in other industries where vertical integration was weakened. There has to be a change of priorities:

  • A shift to revenue growth via wholesale and business-to-business services, rather than consumer retail.
  • A shift to evolving the core personal communications products — a conduit for businesses to interact with the customer — rather than media services to temporarily fend off user boredom.
  • A shift to treating customer data as a valuable by-product, not a form of digital waste.
  • A shift to servicing universal business processes performed across many industries, rather than competing with services specific to verticals (e.g. finance, entertainment, IT services) that inevitably compete with entrenched suppliers.

Defining the new business model

The future telecoms industry structure comprises the following functions (although not all may necessarily be found in any one telco):

  1. Infrastructure services. Our view is that in the long term passive infrastructure becomes part of a completely different multi-utility business, and not part of the telecoms industry at all. Rather than build duplicative competing access networks, capital has to be freed up to invest in ‘network edge’ assets. Telcos should aggressively pursue network sharing and outsourcing initiatives, and co-opt municipal or open access models.
  2. A retail arm, which deepens the intimacy of the customer relationship by offering packaged ‘digital lifestyle’ products and services. For example, a wireless picture frame for the grandparents is the perfect complement to a picture messaging phone. Innovation is centred on the core personal communications products, which must be integrated closely with the online services the customer prefers. The retail arm invests in the home and office network, and these assets form part of the network that the wholesale division can re-package. It also broadens the range of goods and services on offer, with each integrated into a single e-commerce, identity, billing, operations and support infrastructure. To do this it has to learn new skills by emulating the best of the retail sector, such as grocers.
  3. A rich wholesale delivery platform. Compared to today this platform addresses a much broader range of online delivery problems, on behalf of a much broader range of commercial customers, using a broader range of delivery assets. It turns ‘over the top’ competitors into customers, rather than threats to revenue and sources of cost.
  4. A business process platform that creates value-added services (VAS) that extract more value from the customer data assets of the telco. These services address a wide range of cost, efficiency and effectiveness problems in the economy at large, again for a broad range of commercial customers.

We have identified seven core B2B value-added services:

  • Identity, Authentication & Security;
  • Advertising, Marketing Services & Biz Intelligence;
  • E-Commerce Sales;
  • Order Fulfilment - Offline;
  • Order Fulfilment - Online (E-content);
  • Billing & Payments;
  • Customer Care.

Other complementary businesses may legitimately exist - systems integration, managed IT services, hosting, money transfers - but are not core to the Telco 2.0 model.

Each component of the ‘triple play’ is impacted:

  • Voice minutes are sold at wholesale and re-packaged under a wide variety of branded services, where voice is just an integrated feature of that service, not the whole product.
  • For video delivery, the telco video platform focuses on supporting consumer electronics companies, content providers and aggregators in creating the user experience. Revenue comes from taking pain and cost out of their businesses, and enabling value to flow along the whole chain. The telco should stop trying to run gatekeeper portals for video.
  • Data products are diversified beyond the simple retail ISP, with a mixture of hybrid consumer/business products (e.g. home worker services), and fixed/mobile products (e.g. femtocell backhaul over landlines). The retail ISP works to offer a ‘connected lifestyle’, not a series of disjointed broadband products tied to particular places, devices or times of use.

Two key enablers are (i) sender party pays data, where the upstream party pays for delivery of voice, video or data; and (ii) communications enabled business processes (CEBP), which optimise interactions between consumers and merchants through the voice and messaging tools. CEBP demands that new (wholesale B2B) capabilities are added, such as the ability to directly deposit an interactive voice message into a voice mailbox.
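The ‘sender party pays’ idea can be sketched as a trivial charging rule. This is a toy model with invented names and an invented per-megabyte rate, not any operator’s actual billing logic:

```python
# Illustrative 'sender party pays' charging rule (hypothetical model):
# the upstream party that originates a delivery, not the subscriber,
# is billed for the traffic.

PRICE_PER_MB = 0.05  # assumed wholesale delivery rate, currency units per MB

def charge_for_delivery(sender, receiver, megabytes, sender_pays=True):
    """Return (billed_party, amount) for one content delivery."""
    amount = round(megabytes * PRICE_PER_MB, 2)
    billed = sender if sender_pays else receiver
    return billed, amount

# A video service pushes a 40 MB clip to a subscriber: the service pays.
print(charge_for_delivery("video-service", "subscriber-42", 40))
# Classic retail model for comparison: the subscriber pays.
print(charge_for_delivery("video-service", "subscriber-42", 40, sender_pays=False))
```

The same delivery generates the same wholesale revenue either way; what changes is which side of the two-sided market bears the cost.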

Size of the opportunity

We have modelled the potential size of the opportunity, as shown below. Our model suggests that by 2017 (ten years out) this is around $250bn for new wholesale services and $125bn for the VAS.

[Chart: size of the opportunity]

We have deliberately excluded a revenue opportunity from the retail side. In a two-sided market, a key feature is that the charges for using the platform have to be balanced between the two sides to get the right size of audience. So newspapers sell at a price that barely covers their print and distribution costs. This maximises revenue from advertisers, who are less price sensitive than readers. Just as with advertising, telcos will be forced to adjust their retail pricing to maintain mass audiences: retail is the tool you use to acquire a customer relationship that can be monetised through activities such as CEBP.
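The pricing balance can be illustrated with a toy model (all numbers invented for illustration): as the retail price falls the audience grows, and past a point the extra upstream revenue per reader outweighs the lost retail margin.

```python
# Toy two-sided pricing model: total platform revenue as a function of the
# retail price, with a simple linear demand curve and a fixed amount of
# upstream (advertiser) revenue per reader. All figures are invented.

def total_revenue(price, ad_revenue_per_reader=1.0, max_readers=1000):
    readers = max(0, max_readers * (1 - price / 2.0))  # audience shrinks as price rises
    return readers * (price + ad_revenue_per_reader)

for p in (0.25, 0.50, 1.00):
    print(p, total_revenue(p))
```

In this sketch the revenue-maximising retail price is well below the price that would maximise retail revenue alone, which is exactly the newspaper logic described above.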

Making the journey to Telco 2.0™

The seven key steps to approaching this challenge are:

  1. Divest yourself of infrastructure assets and activities not strategically aligned with the Telco 2.0™ model.
  2. Create new financial and operational metrics to measure the progress of your organisation.
  3. Update your processes and product pipeline gating criteria to reflect your new priorities.
  4. Focus your retail business on (i) evolving the voice and messaging products, (ii) creating the necessary edge assets for video distribution, (iii) engaging a wider range of partners and products, (iv) benchmarking yourself against the leading non-telco retailers.
  5. Considerably enhance your wholesale product portfolio, as this is the new growth engine.
  6. Add in the two-sided value-added services based on your strengths and local market conditions.
  7. Work with existing partners, and collaborate across the industry, to create the transaction networks needed to support these value-added and two-sided market services.

You are very welcome to join us at our fifth Telco 2.0 Industry Brainstorm in London on 4-5 November to discuss these ideas in more depth with your senior industry colleagues.


Guest post: Why Ribbit is worth $105m to BT

We’ve long believed that the real reason you build an open telco platform is to facilitate interactions between merchants and users, not to enable a supply chain of media or entertainment products. The telco industry uniquely has relationships with nearly every economically active person, a means to reach them, and customer data that nobody else has. We’ve asked Thomas Howe, an authority in the space of communications enabled business processes, to explain the real significance of BT’s entry into the telco platform space — the start of a global telecom platform war.

As I reported a few weeks ago, Ribbit has indeed been sold to BT. The selling price — $105 million — has caused some surprise. However, it makes complete sense to me.  Here’s the math that makes that work:

  • BT has relationships with several thousand global companies in Britain and beyond: British Airways, the BBC, HSBC, Barclays, Royal Dutch Shell, BP, RBS…  you get the picture.  Each of these companies will one day demand (if not already) that their telecom provider offer APIs so that they can integrate their business process with the communications infrastructure.  Thus, the BT Web21C APIs are born.
  • As a round number, assume that each large company has between twenty and forty large applications that require integration and management. We can count six areas that all large companies have right off the bat: CRM, ERP, HR, logistics, inventory management, IT automation. It’s no stretch to imagine that each area has several applications in it, or that different divisions have different needs, etc.
  • Again, as a round number, business efficiencies of 20% are commonly seen in CEBP applications, providing ample reason to integrate communications systems with enterprise applications.
  • So, from simple multiplication, we have several thousand companies with 20 to 40 applications each, giving us about 30,000 CEBP applications for BT’s large customer base alone.
  • From a world-wide market perspective, just multiply that number times the number of large carriers.

So, what are the chances that there are 30,000 CEBP engineers in the world? Would you say about… zero? I would. What happens when you have a tool that any web developer can use, like Flex or Flash? You’ve got a fair sight more than the 100 or so CEBP engineers that exist now. By acquiring Ribbit, BT acknowledges that there’s a severe go-to-market issue with CEBP deployments: there aren’t enough engineers to do them.
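The back-of-envelope sums above can be written out explicitly, using the low end of “several thousand” companies and the midpoint of the 20-40 application range (all inputs are the guest post’s rough round numbers, not firm data):

```python
# Back-of-envelope estimate of CEBP applications for BT's large customer base.
companies = 1000                         # low end of "several thousand"
apps_low, apps_high = 20, 40
avg_apps = (apps_low + apps_high) / 2    # midpoint of 20-40 applications

cebp_apps = companies * avg_apps
print(cebp_apps)  # about 30,000 CEBP applications
```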

[Ed - You can see Thomas present at our next Telco 2.0 Industry Brainstorm in London on 4-5 November.]


July 30, 2008

Close to Boiling Point: ISPs, Aggregators and Music Rights Holders

The current EC review of Telecoms Law and the UK government consultation on Illicit P2P Downloading (announced last week) threaten the ISP’s relationship with its customers. Legislation alone will not solve the content industries’ problems with the internet - ISPs have capabilities they can bring to the table to help ease the pain.

Throughout history, whenever an industry is in meltdown, bullets of blame are sprayed everywhere and the industry players turn towards governments and the legal system for protection. These days, the music industry is in meltdown and the bullets of blame are targeting the ISP industry. Failure to react could cost the ISP industry dearly. We examine some of the options below.

Note that music is not suffering from a demand-driven meltdown: consumer demand for music appears as strong as ever. The problem appears to be music is being increasingly delivered by the internet, and it is proving difficult to monetise this demand across the whole of the value chain.

All the evidence seems to point towards a significant number of consumers sharing music without the rights holders’ permission and without compensating them. Demand for legal online services pales into insignificance compared to the illicit demand. Even worse, consumer behaviour and expectations seem to have changed — free sharing of music is becoming the default mode. Not surprisingly, music companies are looking to the law for protection, and other content industries fearing a similar fate are jumping onto the bandwagon.

The Legislative Pipeline

In Europe, the EC is proposing amendments to European telecommunications law which will allow the monitoring and blocking of services by ISPs. The amendments will also permit ISPs to sanction users by suspending or terminating Internet access. Our friends at TelecomTV are so concerned about this that they have started a “throttle the package” campaign.

In the UK, the Department of Business Enterprise and Regulatory Reform (BERR) has issued a ‘Consultation on Legislative Options to Address Illicit Peer-to-Peer (p2p) File-Sharing’. This consultation seems to be driven by the Gowers report, specifically:

“Recommendation 39: Observe the industry agreement of protocols for sharing data between ISPs and rights holders to remove and disbar users engaged in ‘piracy’. If this has not proved operationally successful by the end of 2007, Government should consider whether to legislate.”

Any new legislation will pose interesting questions about the potential loss of individual privacy, opening the door to censorship and digital disenfranchisement. However, the Telco 2.0 angle is less about civil liberties, as we are more interested in the impact on the value chain and business model.

Enforcement won’t be cheap

In France, the Olivennes Agreement recommends that the majority of costs be borne by a government agency employing 30 people, sending 3 million infringement notices per annum and costing £15m per annum.
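The implied cost per notice is easy to check from those figures:

```python
# Cost per infringement notice implied by the Olivennes Agreement figures.
annual_cost_gbp = 15_000_000
notices_per_year = 3_000_000
print(annual_cost_gbp / notices_per_year)  # 5.0, i.e. about £5 per notice
```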

In the US, the rights holders seem to bear the brunt of costs of the Digital Millennium Copyright Act (DMCA) — monitoring aggregator sites (eg YouTube) and issuing take-down notices. This can take a significant amount of resources. The DMCA seems ineffective for file sharers, and obviously is inapplicable to aggregators based outside the USA.

There is no doubt that the ISPs will have to install equipment on their networks to monitor traffic. Yet detecting and differentiating between legal and illegal traffic is a formidable challenge. Most UK ISPs have already installed Deep Packet Inspection (DPI) equipment on their networks, but a technological arms race seems inevitable and ongoing costs will be significant.

In all probability, enforcement will over time reduce the amount of consumer file sharing on the internet. But the internet is far from the only way to share files. A recent British Music Rights (BMR) survey showed that sharing and ripping of CDs is more prevalent today than file sharing on the internet.

Without wishing to delve deep into the cannibalisation debate, we suspect that legislation alone will probably not generate any new money to the music industry.

Changing Behaviour

The BMR survey shows just how much people love and value music, and highlights that a significant amount of that value is currently unmonetised:

“It forms when fans really connect to a piece of music or to an artist. They develop a bond and will be prepared to pay more for a specific item: the original CD, band paraphernalia, or concert tickets. The value to the music consumer in this case rests in the item itself or to the individual who produced it”

It seems clear that there is value in proving a legitimate purchase of music has taken place — a digital token proves support of an artist. Lessons from retailers with loyalty schemes allowing discounts on associated products are clearly appropriate for content aggregators.

Another key finding of the BMR survey is that people enjoy experimenting with music:

“It is about trying-out, searching, exploring, investigating, giving something a go, rating, and recommending to others. The value to the music fan is in access to a large range of music for experimentation, and participation in a community of like minded music lovers, rather than in any one track.”

A new generation of services is becoming available to meet this need - last.fm and Pandora are just two examples. I have been personally experimenting with a service called Spotify and can see great advantages compared to managing a digital music library on my hard disk.

Simplifying Rights & Access to Catalogues

In the EU, there is no simple, streamlined system for clearing music rights. The EU is trying to change the status quo, which would allow licensing across Europe and also competition between collecting societies.

In the UK, the MCPS-PRS, which is the major UK music royalty collection society, has a standard Joint Online Licence which costs music aggregators around 8% of Gross Revenue, but has some minimum rates which could push up the bill. For details see here. These royalties are paid to music writers, composers and publishers.
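As a rough illustration of how minimum rates can push the bill above the headline 8%, consider the following toy calculation (the 0.5p-per-stream minimum is purely hypothetical, not the actual JOL tariff):

```python
# Sketch of a revenue-share licence with a per-stream minimum: the licensee
# pays whichever is higher, the revenue share or the per-stream floor.
# The minimum rate here is an invented figure for illustration only.

def royalty_due(gross_revenue_gbp, streams, rate=0.08, min_per_stream_gbp=0.005):
    return max(gross_revenue_gbp * rate, streams * min_per_stream_gbp)

# An ad-funded service with thin revenue per stream hits the minimum:
print(royalty_due(gross_revenue_gbp=10_000, streams=500_000))   # 2500.0, not 800.0
# A service with healthy revenue pays the straight revenue share:
print(royalty_due(gross_revenue_gbp=100_000, streams=100_000))  # 8000.0
```

For low-revenue, high-volume services the floor dominates, which is why minimum rates matter so much to ad-funded aggregators.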

It seems obvious that the more difficult it is to license music, the higher the probability that someone will infringe either accidentally or deliberately. Price is also a big factor, for example the robot-powered music recommendation and streaming service Pandora closed down in the UK complaining that royalty rates were too high.

Getting access to the record companies’ catalogues is an even bigger problem. Even a company the size of Nokia, with a presumably huge budget, seems to be finding it difficult to get access to all catalogues.

Some artists who own their own rights even refuse to put their material online. The Beatles are probably the most famous such case.

We firmly believe that rights holders should be free to choose their own distribution channels and set their own prices. However, the current market problems point towards more flexibility being required.

ISP incentives

BSkyB’s announcement of a music partnership with Universal is also interesting, although there are more questions than answers about the service at this early stage.

The trend is set and it won’t be long before all of the major UK fixed ISPs are offering some sort of music service. We strongly believe that partnerships are the way forward here for ISPs, rather than building vertically integrated solutions. ISPs should focus on generic capabilities which can be offered to any type of content:

  • Making content delivery cheap and efficient — by building CDNs, caching content close to the edge and making P2P delivery more network-aware. There needs to be a menu of choices of content delivery, for example off-peak data — not just ‘premium’ options like assured QoS streaming.
  • Helping content aggregators build a sustainable economic model (and reduce their need for underpriced ‘over the top’ distribution). It should be straightforward to charge to your monthly ISP bill; or offer an ad-funded model with the ISP using its customer knowledge to target ads; or offer separate per-megabyte charges for delivery to the user for specific partners or applications. Then offer the billing and collections service on behalf of the third party.
  • Market services to the base. The ISP should know better than most which of its base likes which type of content!

Future Challenges

Douglas Merrill, ex-Googler now head of digital media at EMI, sums up the challenge of how to make money in a digital age perfectly:

“We don’t know the answer, and that’s kind of exciting, when you don’t know the answer, you try a bunch of things, and in careful ways you measure the result. We’ll figure out how to make money later.”

We don’t believe there is a silver bullet on the horizon either. We are certain that carrots are needed as well as sticks. And, we are certain that participation amongst parties in the value chain is the only way forward.

[Ed - Telco 2.0’s new ‘Content Distribution 2.0’ research practice will continue to track these developments and suggest solutions. Watch this space for launch later in the summer]


Case Study: Qualcomm digs itself into a (very good) hole

Qualcomm, best known for owning a lot of patents on cellular radio tech, has an interesting new product out — and it involves a large yellow backhoe loader. Not quite what you’d expect! But a regular theme in Telco 2.0 is that the real value for the telecoms industry isn’t in the consumer entertainment applications everyone loves, but in the enterprise. That means working with people and things, driving out labour, energy and working capital costs.

First we’ll go through what the product does, before diving into what it means for the business models of both Qualcomm and telcos wishing to apply the same lessons.

Diggers are a product, digging holes is a service

The world’s leading manufacturer of backhoes and the like is JCB (for whom we have a nostalgic fondness). They have started to add Qualcomm GPS/GSM modules to some of their vehicles. These communicate with a Web application back at headquarters, also Qualcomm’s work. If you buy one, and pay the extra service charge, you get to log in to the Web page and see the data the device is collecting. Obviously, it regularly reports the digger’s location, which is handy in the event it gets stolen, or if its operator decides to nip off for a crafty pint of beer.

But it’s not just location information — a common flaw in a lot of LBS projects — but rather location added to data of some other kind. This could be a map or network diagram visualisation of the data, or data retrieved or filtered by geographical criteria. So it should be easy to get reports from a remote station that could include any kind of data you want, which could include location, and to send things back. JCB and Qualcomm’s service does just this. It provides among other things details of whether or not the engine is running, and if not, when it last ran, how long it has been running, and how long the vehicle has until it needs servicing. The significance of whether or not the engine is running is of course that if the engine is running, the machine is probably working. As well as that, it also monitors engine instruments — oil pressure, temperature, RPM, exhaust lambda sensors — so that the service schedule itself can be adapted depending on how well the digger is getting on.


But the real user value comes from the system’s user-configurability. As well as defining areas on a map and requesting an alert every time a digger enters or leaves one, you can arrange for an alert to fire when a vehicle whose time-before-overhaul has reached less than a working day enters a geofence around the maintenance workshop, and therefore get the vehicle into the shop when it passes by. You could direct the alert to the driver’s mobile phone, instead, and do yourself out of a job.
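The alert rule described above can be sketched in a few lines. The field names, the rectangular geofence, and the eight-hour working day are our own illustrative assumptions, not the actual Livelink schema:

```python
# Sketch of a user-configured maintenance alert: fire when a vehicle that is
# close to overhaul enters the geofence around the maintenance workshop.
# All field names and thresholds are hypothetical.

def needs_service_alert(vehicle, workshop_fence, working_day_hours=8):
    """Return True if the vehicle is inside the fence and nearly due for service."""
    in_fence = (workshop_fence["min_x"] <= vehicle["x"] <= workshop_fence["max_x"]
                and workshop_fence["min_y"] <= vehicle["y"] <= workshop_fence["max_y"])
    return in_fence and vehicle["hours_to_overhaul"] < working_day_hours

fence = {"min_x": 0, "max_x": 100, "min_y": 0, "max_y": 100}

digger_nearby = {"x": 50, "y": 60, "hours_to_overhaul": 3}
digger_remote = {"x": 500, "y": 60, "hours_to_overhaul": 3}

print(needs_service_alert(digger_nearby, fence))  # True: in fence, nearly due
print(needs_service_alert(digger_remote, fence))  # False: outside the fence
```

The alert could then be routed to the fleet manager or, as the article notes, directly to the driver’s phone.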

So JCB is taking waste out of an everyday business process of driving diggers around: downtime for maintenance, under-utilisation, and the cost of repairs. It does this by reducing lag in business processes (e.g. getting the digger into service at the right time) using data by-products of the business it is already engaged in. Rather than selling diggers — a business input — in price-based competition with other manufacturers, JCB instead sells outcomes — more holes dug and filled per unit of labour and capital.

Qualcomm, no longer the default radio technology, switches business model

It’s also an example of something else. Beyond the world of standardisation, there is a further layer of standards which are in a sense much more important, namely the assumptions in everyone’s head. You can see one in JCB’s promotional material for Livelink — the geographical interface for the system is Google Maps. (Or something with a near-identical user interface — Yahoo! and Microsoft Virtual Earth also look a lot like Google Maps.) That is how we expect a map on computer screen to look and behave, and the format used to display geographical data on the map is the same (Google’s Keyhole Markup Language and GeoRSS). Similarly, the default operating system for a new IT project is some form of Linux, the default network protocol is Ethernet, and the default GUI is a Web page.

And, of course, the default mobile telecoms system is GSM/UMTS. We can’t be the only ones to have noticed that UMTS and GSM networks are still being built, sometimes replacing IS95 or CDMA2000 ones, existing UMTS systems are being rapidly retrofitted to HSPA standards, and where another network technology is used, it’s invariably WiMAX. Whatever LTE turns out to be — and it may well just be HSPA+ with a flatter network architecture — it’s going to be the default standard, and WiMAX will be the only real alternative. Nobody is deploying Qualcomm’s 4G design, UMB (1xEV-DO Rev.C as was). Even their traditional best customers, Sprint-Nextel and Verizon Wireless, are going their own ways, to WiMAX and LTE respectively.

Now there’s a problem: what do you do when you stop being the assumed standard? Qualcomm settled its row with Nokia. It tried and failed to derail IEEE 802.16e standardisation in favour of 802.20. It will, of course, continue to collect its rent on UMTS silicon. But it seems clear that its real strategy for the future is to specialise in services, applications, and platforms — to become the most developer-focused vendor in the telecoms industry. JCB Livelink is just one product of this drive to exploit the power of combinations.

Telco products are missing the magic service ingredients

JCB, meanwhile, remains the digger everyone thinks of when they think “digger”, but they have only managed to stay there by continuous innovation. Having invented the first backhoe-loader was never going to be enough. Livelink represents their move towards integrating their machines into a product-service system. And it’s about time too; the property bust has cut JCB’s order book by 20 per cent.



So what innovation is missing in the telco space? Today’s basic telco voice and messaging products are peaking in usage and revenue in mature markets. The answer is to turn them into conduits that facilitate interactions between users and merchants. Telcos have a number of advantages here. What they supply is communication, not just location. Furthermore, the telco already has a relationship with pretty much everybody who is economically active, via their landline or mobile phone. Rather than aiming at the ‘digger’ business process or vertical, there is a set of generic business processes (see slide 16) that they should tackle.

Note that BT just bought Ribbit, the new voice & messaging firm whose developer friendliness we praised so much. BT is doing very well at coping with the end of its role as default telco; Qualcomm is on the same road.

This is an example of something we keep banging on about: the reason for exposing the network APIs required to handle the seven questions isn’t to make each one a discrete product like a telephone call. Instead, it’s to make it possible to recombine them into new applications. Rather than trying to guess what users want, it’s all about making it possible to build bits of the telco into new things and processes that telco management couldn’t even imagine. (“I know — an application for managing large yellow backhoes!” Try that at Vodafone HQ.)

So in this case, if the normal driver of the backhoe is in Spain on holiday (well, they’re not doing much construction there any more…) then the message can instead be relayed to his boss, or someone else. The API tells us if that person is roaming, for example. Rather than the employer having to issue everyone with a phone, the existing personal infrastructure of the users is adopted. Suitable privacy controls and opt-ins are put in place. The telco is in a position to arbitrage away the difference in cost between “one phone per application” and not spending anything on hardware.
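A toy sketch of that routing step, with the roaming lookup stubbed out. No operator exposes exactly this API; the function names, the escalation list, and the stub data are all hypothetical.

```python
# Stub standing in for a real operator presence/roaming API.
ROAMING = {"driver": True, "supervisor": False, "depot": False}

def is_roaming(person):
    """Stand-in for a real network API call."""
    return ROAMING[person]

def route_alert(message, recipients):
    """Deliver to the first recipient who is not roaming; if everyone is
    abroad, fall back to the last name on the escalation list anyway."""
    for person in recipients:
        if not is_roaming(person):
            return (person, message)
    return (recipients[-1], message)

target, _ = route_alert("Digger JCB-042 due for service",
                        ["driver", "supervisor", "depot"])
print(target)  # supervisor: the driver is on holiday in Spain
```

The point is that the routing decision uses network state the application developer could never gather themselves — which is precisely the asset the telco can expose.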

And mobile has a special role here, because of its unique ability to extend the reach of IT systems into the physical world. So we certainly ought to be thinking about yellow diggers, offshore wind turbines, shipping containers, and perhaps even cows. (No, we don’t mean Cells on Wheels.)


July 28, 2008

Ring! Ring! Hot News, 28th July 2008

In Today’s Issue: All the Vodafone that’s fit to print; just what’s in that tall glass of mobile data?; the Spanish builder menace; AT&T discovers principled objection to mergers, porcine aviator sighted; Sprint flogs towers; Sprint’s multi-gigabit radio backhaul, departure from the NGMN; is MediaFLO short of spectrum?; frantic open-source activity; Nokia pays for friends; Intel dumps Ubuntu from its mobilinux; Win95 on a Nokia N810; better voicemail for all; Bundesnetzagentur’s odd idea of regulation; BT begins to move on fibre

Vodafone found this week that once the stock market doesn’t like you, there’s very little you can do about it. You wouldn’t imagine that interim results including the phrases “first-quarter £9.1bn revenue” and “expecting full-year profits around £11bn” could scare the markets, but that’s what happened — vodashares were marked down by around 11 per cent. The monster carrier responded by offering to buy back a billion pounds’ worth of stock. Yet if the best repartee is a parliamentary majority, as Prime Minister Disraeli once suggested, the best trading statement is usually a bag of cash.

Perhaps the markets were reacting to the enigmatic surge in data revenues? These were up by some 29 per cent, and are marching steadily towards the billion pound mark — which you’d think is great news for Vodafone. However, like all carriers, Vodafone has managed to get its non-SMS data traffic moving by the simple expedient of slashing prices and broadening the offer to emphasise its role as a mobile ISP. (There’s a good reason why the star mobile data product across all the UK operators is a little Huawei E220 radio modem for your laptop.) Back in the dotcom boom, the argument for mobile data was a) that it would be additional to voice and messaging revenues and b) that it would be a high-margin branded product in its own right rather than just being bulk IP traffic.

However, what we’re actually seeing is quite the opposite - voice and messaging are the high-margin products, non-SMS data services are substitutes for them, and the top selling service is a pipe in the sky. How much of the extra data usage is made up of instant-messaging or social network usage that competes with Vodafone SMS and branded/portal IM? How much is Skype traffic from laptop users? Perhaps the City recognises these issues. Or perhaps they just thought “No-one ever got fired for selling Vodafone (not since 1999 anyway, and that’s before my time)”? Alternatively, they noticed this bit in Arun Sarin’s statement:

According to the group, economic and competitive effects particularly impacted Spain. Sarin added that the British mobile phone group had been hit by the decrease in economic migrants who had been working in Spain, often in the construction industry. He said the slowdown in Spain’s construction industry had also resulted in a reduction in the number of builders who used mobile broadband devices when working on sites.

Tradesmen and small businesses have traditionally been a huge market for the industry; this was one of the biggest surprises at the very beginning of Vodafone’s history, when they expected bankers and got a surprising number of plumbers. It’s also the same pattern we see in the emerging markets, where so much growth and creativity comes from small independent traders. So this probably isn’t good news.

Speaking of monster carriers… AT&T is objecting to the Sprint-Clearwire WiMAX deal, on grounds that it would lead to an undesirably high concentration of ownership in the market for…wait for it….leased surplus spectrum from the educational fixed television sector. Years ago, the FCC assigned some spectrum in the 2.5GHz band for the use of educational institutions who wanted their own little TV station. Not many use it, and the ones who don’t often lease the spectrum out. Obviously, the lessees have to be people who can use the 2.5GHz band. And who does that but Sprint/Clearwire?

Perhaps it’s a real concern, but it certainly sounds a lot like chutzpah coming from the operator formerly known as SBC/BellSouth/Cingular/AT&T Wireless/AT&T (Ma Bell) and probably some other mergers we forgot. Relatedly, Sprint has also parted with the ownership of thousands of cell towers; rather than the kind of giant network-outsourcer deal we often talk about, however, it’s more of a financial exercise.

Meanwhile, here are some details about how Sprint/Clearwire plans to backhaul its WiMAX base stations: by provisioning them 1.6Gbits/s of point-to-point microwave Ethernet, apparently. It’s a reminder that radio may be black magic, but there are times when it can even rival fibre. If you already have towers, it saves so much trouble digging up the road. Sprint, meanwhile, has quit the NGMN, apparently on the grounds that nobody cares about Qualcomm’s UMB any more, and anyone who doesn’t want LTE will go for WiMAX.

An interesting piece at Daily Wireless raises the question of whether MediaFLO has been left standing in the race for mobile TV spectrum, and touches on the possibilities of ultra-localisation as a way of making mobile TV actually interesting. It also reminds us all how smart the old IPWireless team, now with NextWave, really are. (They’ve also just taken their original good idea about UMTS-TDD as a mobile TV medium and applied it to WiMAX.)

The febrile activity in the mobile OS and developer platform world continues, and it’s like a bathful of angry octopi down there. Linux folks at OSCON were threatening to move on from their 18% share of the embedded OS market to attack the 43% or so held by proprietary and non-Microsoft products. The president of the Symbian Foundation is promising to push for mass developer adoption.

He’d better. Nokia bewails that its developer community is mostly people they pay. This isn’t necessarily a bad thing, though; hackers need a business model, too, and so far there isn’t much in the way of a market for Symbian apps outside Nokia and the operators.

Linux people like nothing more than a good row about the content of an OS distribution: and Intel obliges, by removing Ubuntu components from its own Moblin mobile Linux implementation. Expect more tensions between closed and open innovation models to come. Journalists, meanwhile, like nothing more than a really improbable merger tale, and analysts obliged, suggesting that Android and Symbian would merge. They didn’t say how, and getting an octopus to embrace a squid doesn’t produce a beautiful dolphin — just a mess of tentacles. And Nokia’s Linux shop gets MacOS and Windows 95 running on an N810. You have to ask why, don’t you… still, nice to be paid to do something in these economically challenged times.

That’s perhaps enough techieness for the time being. Here’s a service that replaces your carrier voicemail and routes messages into your IMAP e-mail, thus rendering voicemail somewhat less user-loathing. Naturally, there’s no business model for it yet, except for asking readers of The Register to make suggestions. And if that doesn’t work, we think they’ll start going to VC meetings with a kitten, a meat cleaver, and a banner reading BUY ME BEFORE I KILL AGAIN. (The Reg says that their lack of a business is “like any self-respecting Telco 2.0 company”. Cheeky little monkeys.)

A spectre is haunting Europe; the spectre (or possibly sceptre?) of Viviane Reding. The EU Commission slaps down the German Federal Networks Agency; it can’t force a new entrant to the market to pay an “unjustified” charge to the incumbent. This is one of those things where the only possible answer is “I should bloody well think so too”, as it seems crazy, but the German regulator did actually try to make new DSL operators pay a tax to Deutsche Telekom. Now that’s what I call regulatory capture.

And finally: it’s happened! BT wants to spend £1.5bn on fibre-to-the-cabinet. Watch this trench for more details…


July 17, 2008

Big Cheese Interview: Tony Rallo, CTO, Televisa

Telco 2.0 is running a series of in-depth interviews with senior people in the Telco-Media-Tech sector. To start with, to support our summer research programme on new Content Distribution business models, we caught up with Tony Rallo, CTO of Televisa, the Mexico-based media giant which is also the world’s biggest creator and distributor of Spanish-language content.

Tony has deep expertise in digital media (previously he worked for Apple in Europe). The global Spanish-language market he serves (LATAM, Spain, US Hispanic) is huge and complex, with a GDP of $2.4 trillion, bigger than that of China. Televisa creates over 50,000 hours of video content per year (more than ABC, NBC and CBS combined) and has been at the forefront of exploiting it through the multiple media platforms it owns (four TV channels, cable companies, soccer teams, live event venues, websites, merchandising operations) and through selling it to others worldwide.

Tony has a strong belief that Telcos potentially have much to offer the content industry…if only they’d think differently. Here’s what he had to say to us about the current state and future direction of content distribution - his personal views, not his company’s - over a beer as he passed through London:


Q1: Is the movement of video content into the online world an opportunity or a threat for media companies?

A1: It’s a big opportunity for all players in the value chain but you have to be aware of some important usage factors when creating an effective strategy. Firstly, the terminals. The PC, the TV and the mobile support different types of experience that are affected by time of day and type of content. This is more subtle than the normal debate around ‘lean-back’, ‘lean forward’, ‘participation’ and ‘snacking’. They all have different impacts on revenue and cost models.

For example, advertising models need to be adjusted depending on whether the content is live or time-shifted. Sports content is consumed (and monetised) in different ways to entertainment and sub-categories like soap operas (‘telenovelas’). Mobi-sodes and other made-for-mobile content are going to be very important for the majority of mobile phones, but when you get to 3.5” screens (iTouch/iPhone) and bigger (Mobile Internet Devices) then normal video content can work fine.

Q2: What are the best business models for monetising video content online?

Advertising is still king for non-movie Video on Demand. Paying for individual content is not going to work on a mass scale partly because users just won’t be able to manage all their media files effectively and will get frustrated. Clearly there will also be a “Download to own” market within walled gardens like iTunes or XBox Market Place, but compared to the advertising and sponsorship models, I believe it will be small.

Renting will be a better model, especially for movies. Syndication and merchandising are obviously important - Televisa generates huge revenues from live events and merchandising around telenovelas like Rebelde [link]. Bundling with other communications products may also be an effective approach. For example, our cable network in Mexico City - Cablevision - allows us to add voice to our packages as well as giving us another reverse channel for our online content. [Ed - It’s a similar model to how Comcast competes with Verizon in the US.] We’ve sold 27,000 voice lines on our triple-play bundle since December 2007. We’ve also recently made investments in two other cable operators, Cable Mas and TVI, which we think is important given the nature of the Mexican telecom market.

In terms of our bundling with mass entertainment brands like Rebelde, we take a 360-degree approach that goes beyond merchandising and live events to include special Rebelde content for pay TV, premium SMS (ringtones, etc.), a custom-made Rebelde magazine, e-marketing through databases of registered fans, specific online activities and exclusive (‘behind the scenes’) content online, an official website, and a home video division which sells DVDs with all kinds of extras on too!

Q3: What’s the key role of IP (internet protocol)?

A3: IP really helps address the issue of matching a multitude of different user experiences with a growing number of devices and content types. New technology helps us see what people are watching, which in turn helps us target advertising. For example, we are currently evaluating Black Arrow for our online delivery platform. In cable we track 10,000 set-top boxes per day in Mexico City via a custom-made tracking application.

Q4: Broadcasting across frontiers - how well does content originated in Mexico play in other Spanish-speaking territories?

Telenovelas and similar drama-based entertainment seem to have universal appeal that transcends geographical and cultural borders. While Televisa is a Mexican-based company, we believe we are a global player. We have been selling content for more than 40 years. Today we sell in over 90 countries around the world, either via cable signals to MSOs or as ‘canned’ content to broadcasters. Most of the content is dubbed into 15 languages: from Mandarin to Turkish to Italian, Portuguese and French. But, as part of our expansion into new business areas, we are now locally co-producing successful formats like ‘Ugly Betty’ in countries such as China and France, producing them with local talent in the native language.

Q5: What are your views on piracy and security issues around online video?

Firstly, the industry must accept that hackers will never be beaten. Secondly, we need to think creatively about ‘traffic shaping’. Rather than the current approach of stopping people doing what they want, we need to think about differentiating services and access prioritisation based on consumer needs.

For example, one approach might be to improve throughput of important business traffic like email during office hours and then let other activities like P2P file transfers happen unfettered at night. Ultimately, every service provider will have to develop an appropriate approach based on their local telecom market situation and regulatory environment. Comcast has been in the press a lot recently for its battle with BitTorrent. Maybe a telco should develop a service proposition that charges differently depending on what priorities of service the user wants? This has not been resolved at all, and it goes all the way back to the net neutrality discussion everyone is having. [Ed. - we couldn’t agree more. The two-sided telecoms business model addresses this head on. See here.]
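A toy version of the time-of-day policy Tony describes could look like the following. The traffic classes, hours, and priority values are purely illustrative, not any operator’s real configuration.

```python
# Business traffic is favoured in office hours; P2P is throttled by day
# and runs unfettered overnight.
OFFICE_HOURS = range(9, 18)   # 09:00-17:59

def priority(traffic_class, hour):
    """Return a scheduling priority: higher numbers are served first."""
    if traffic_class == "email":
        return 3 if hour in OFFICE_HOURS else 2
    if traffic_class == "streaming":
        return 2
    if traffic_class == "p2p":
        return 0 if hour in OFFICE_HOURS else 2
    return 1  # default for unclassified traffic

print(priority("email", 10))  # 3: business traffic in office hours
print(priority("p2p", 10))    # 0: throttled during the day
print(priority("p2p", 2))     # 2: unfettered at night
```

The interesting commercial question, as Tony says, is whether such a policy is imposed by the ISP or chosen and paid for by the user.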

DRM (Digital Rights Management) is another issue we haven’t seen well resolved so far. In delivery platforms such as pay TV (i.e. Sky, cable) the digital rights are managed de facto, since the content is encrypted throughout the distribution chain. But many content owners are “covering the sun with one finger”: today my HD digital signal can be transcoded to analogue and back to a digital file in a format like Windows Media or QuickTime at very high quality, and then uploaded to any website such as YouTube. Any normal PC with good horsepower and Windows Media Center Edition can record a lot of content from an analogue signal.

Therefore I expect DRM on the business to Consumer side to be a long, complex debate. Steve Jobs recently proposed making iTunes DRM-free, but the industry hasn’t fully processed the importance of what he was suggesting. My view is that if we get our pricing right users will always be willing to pay for high quality protected content just as they do for high quality software.

YouTube is great for user-generated content [see Chinese Backstreet Boys example below!], but Telco 2.0 readers should keep a close eye on what is happening with Hulu in the US. There you have high-quality content delivered in a high-quality fashion. There are predictions that within two years Hulu will have larger revenues than YouTube. It is very hard for YouTube to justify advertising against content which it hasn’t developed and to which it doesn’t own the rights. [Ed. - cf. our analysis of YouTube vs iPlayer in the UK here].

Q6: How do you see telcos’ role in content creation?

Telcos have proven throughout the world, time and again, that they are not good at creating content. Telefonica made an interesting move in buying Endemol, but that seems to have been a temporary blip.

For one thing, they don’t really appreciate how complicated a critical issue like digital rights management is. All the contributing elements of a piece of commercial video (actors, musicians) have different types of rights attached to them, which is extremely difficult to manage.

Secondly, technology choices: a recent survey showed that the best penetration rates for IPTV were typically only 10% of the DSL installed base. Why would users switch from cable if the cable service is up to par or better? My view is that content distribution via IP is an expensive option compared to broadcasting via antennas or satellite, even though the cost of broadband is dropping dramatically in a number of key markets. We are trying to do our share in Mexico, since broadband penetration is directly correlated with economic prosperity, especially in the small and medium-sized business market, which is a very important part of the social fabric of the country’s economy.

For consumer entertainment, though, DTH (Direct-To-Home) satellite is a much better choice for a telco trying to create a national footprint, especially in a country with large geographic size.

Approaches to Mobile TV need some re-thinking too. The network technology is just not ready yet. Mobile TV certainly has a future as a broadcasting platform, but only for certain types of content. DVB-H, MediaFlo and DMB are very good complementary solutions for content owners and wireless operators to prevent streaming video saturating operators’ GSM and 3G networks and allowing 14 (or more) good quality video channels to be broadcast to a handheld device. [Ed. See more analysis of video on mobile here and of MediaFlo here].

Q7: So, telcos should focus on content distribution?

Yes, they should become a new type of MSO, but using a different platform and architecture. [Ed - We’ll be exploring what this might look like in a report coming out in September].

Q8: And what about the huge costs of over-the-top content distribution that ISPs are currently incurring (cf. iPlayer example in the UK)?

In the past users consumed roughly 30% of their daily bandwidth allowance, but were charged for 100%. Now, with P2P file sharing, and increasingly with services like the iPlayer in the UK, Hulu in the US, and high-quality YouTube they are starting to consume 100% (and more!). So, ISPs need to wake up and smell the coffee - they should charge their consumers more cleverly for different levels of content and service quality, as I mentioned before.

In Mexico, for example, pricing has been a big problem, but that is more because of local monopoly issues. Having broadband at USD$30 a megabyte won’t be sustainable in the long run!

Q9: So, what propositions could telcos develop to support your online video business, and make a decent profit in return?

Usage data to help targeted advertising? Yes, they could do a lot more here. If you consider how much we pay for unscientific usage data today, we’d certainly be willing to pay for anything more accurate. However, as it stands today, we have received no propositions from telcos in this area.

Billing and payments, to collect content charges indirectly? Yes, but mobile operators are being far too greedy today. They are asking for 50% of the transaction, when Amex want 6% and Visa 4%! As a result we’re looking at alternative (pre-paid) approaches to avoid telco charges.

Content delivery services that reduce distribution costs and improve QoS? Yes, theoretically. But companies such as Akamai and LimeLight already have a very holistic offering. Their footprint and intellectual property in this area is very strong. Telcos are not good at maintaining server racks in the way Akamai or Limelight are. Perhaps they should buy a CDN (Content Delivery Network) company!

Q10: What’s your big message to the telco industry?

Telcos need to create a new type of brand proposition for the content industry - a new B2B2C ‘Lovemark’. It needs to be based on a win-win, two-way street which helps the consumer receive a high quality of service, quality of content and quality of experience.

Telcos should be talking to content providers a lot more than they have been to date. Otherwise situations like the iPlayer in the UK will continue. We need to avoid situations where the content provider launches a new platform or service and the telco infrastructure becomes stressed. Put simply, both sides need to communicate a lot better.

Ultimately, telcos need to help content owners move from ‘Network’ Television to ‘Networked Television’.

Telco 2.0 take-aways:

Telcos clearly have a role in distributing video content. More and more content players we speak to see telcos as an additional distribution channel, fulfilling the content owners’ need to reach as many eyeballs as possible. Satellite, cable, fixed-line and mobile all have a place in their distribution plans.

Content security, quality of viewing experience and usage metrics are clearly features that content owners value, and they appreciate that telcos have potential supporting capabilities in these areas. However, Tony’s view that telcos are, relative to the CDN players, not strong in managing racks of servers and software should be a real worry - we believe that this will be a core competence for content distribution in the future.

We are not surprised that Tony highlights telcos’ lack of success in creating content. We believe that the vertically integrated approach, whereby telcos buy content rights, is the wrong strategy and one doomed to failure (a view supported time and again in our survey results).

Our analysis suggests that the two-sided business model offers the best prospects going forward for both content owners and telcos. And there are some good early models to learn from (cf. Telenor’s Content Provider Access programme).

Most importantly, the interview with Tony highlights the central conundrum facing online video distributors: broadcasting is a cheaper way of reaching mass markets and attracts higher advertising rates, so how can online video become profitable - through interactivity, personalisation, or just as a niche play?

[Ed - we’ll be debating this issue at the November Telco 2.0 event]


July 16, 2008

Online Video Usage Scoreboard: YouTube thrashing iPlayer

Online Video consumption is booming. The good news is that clearer demand patterns are beginning to emerge which should help in capacity planning and improving the user experience; the bad news is that an overall economic model which works for all players in the value chain is about as clear as mud.

We previously analysed the effect of the launch of the BBC iPlayer on the ISP business model, but the truth is that, even in the UK, YouTube traffic still far outweighs BBC iPlayer traffic in the all-important peak-hour slot - even though its bitrate is far lower.

Looking at current usage data from a UK ISP, we can see that the number of concurrent YouTube users is roughly seven times that of the iPlayer. However, our analysis suggests that this situation is set to change quite dramatically as traditional broadcasters increase their presence online, with significant impact for all players. Here’s why:

Streaming Traffic Patterns

Our friends at Plusnet, a small UK ISP, have provided Telco 2.0 with their latest data on traffic patterns. The important measurement for ISPs is peak hour load as this determines variable-cost capacity requirements.


iPlayer accounts for around 7% of total bandwidth at peak hour. The peaks are quite variable and follow the hit shows: the availability of Dr Who episodes, or the latest in a long string of British losers at Wimbledon, increases traffic.

Included within the iPlayer 7% is the Flash-based streaming traffic. The Kontiki-P2P based free-rental-download iPlayer traffic is included within general streaming volumes. This accounts for 5% of total peak-hour traffic and includes such applications as Real Audio, iChat, Google Video, Joost, Squeezebox, Slingbox, Google Earth, Multicast, DAAP, Kontiki (4OD, SkyPlayer, iPlayer downloads), Quicktime, MS Streaming, Shoutcast, Coral Video, H.323 and IGMP.

The BBC are planning to introduce a “bookmarking” feature to the iPlayer which will allow pre-ordering of content and hopefully time-of-day based delivery options. This is a win-win-win enhancement and we can’t see any serious objections to the implementation: for the consumers it is great because they can view higher-quality video and allow the download when traffic is not counted towards their allowance; for ISPs it is great because it encourages non-peak hour downloads; and for the BBC it is great as it will potentially reduce their CDN costs.
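The scheduling behind such a bookmarking feature could be as simple as the following sketch. The off-peak window is our assumption for illustration, not the BBC’s or any ISP’s actual configuration.

```python
# A pre-ordered programme is downloaded in the ISP's quiet window rather
# than streamed on demand at peak hour.
from datetime import datetime, timedelta

OFFPEAK_START = 2   # assumed quiet window starts 02:00
OFFPEAK_END = 6     # and ends 06:00

def schedule_download(requested_at):
    """Return the next off-peak window start after the bookmark is placed."""
    start = requested_at.replace(hour=OFFPEAK_START, minute=0,
                                 second=0, microsecond=0)
    if requested_at.hour >= OFFPEAK_START:
        start += timedelta(days=1)   # today's window has started; wait for tomorrow's
    return start

bookmark = datetime(2008, 7, 16, 21, 30)   # viewer bookmarks at 9.30pm
print(schedule_download(bookmark))         # 2008-07-17 02:00:00
```

Everything downloaded in that window is traffic the ISP effectively gets for free, which is why the feature is win-win-win.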


YouTube traffic accounts for 17% of peak-hour usage - despite YouTube streaming at around 200kbps compared to the iPlayer’s 500kbps. There are about seven times as many concurrent YouTube users as iPlayer users at peak hour. Concurrency is important here: YouTube viewers watch short clips, whereas iPlayer viewers watch longer shows of broadcast length.
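The "seven times" figure can be sanity-checked from the quoted shares and bitrates: the number of concurrent users scales as (share of peak traffic) divided by (per-user bitrate).

```python
# Back-of-envelope check using the rounded figures quoted above.
youtube_share, youtube_kbps = 0.17, 200   # 17% of peak traffic at ~200kbps
iplayer_share, iplayer_kbps = 0.07, 500   # 7% of peak traffic at ~500kbps

ratio = (youtube_share / youtube_kbps) / (iplayer_share / iplayer_kbps)
print(round(ratio, 1))  # 6.1 from the rounded percentages; "about seven" as quoted
```

The rounded inputs give six-and-a-bit rather than exactly seven, which is consistent with the approximate percentages in the Plusnet data.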

P2P is declining in importance

The really interesting part of the Plusnet data is that peak-hour streaming, at around 30%, far outweighs P2P and Usenet traffic at around 10%. Admittedly, peak-hour P2P/Usenet traffic at Plusnet is probably far lower than at other ISPs, but it goes to show how ISPs can control their destiny and manage consumption through open and transparent traffic-shaping policies. Overall, P2P makes up 26% of Plusnet traffic across a 24-hour window - the policies are obviously working, and people are doing their P2P and Usenet downloading when the network is not busy.

Quality and therefore bandwidth bound to increase

Both YouTube and the iPlayer are relatively low-bandwidth solutions compared to broadcast-quality shows in either SD (standard definition) or HD (high definition). However, applications are emerging which are real headache material for the ISPs.

The most interesting emerging application is the Move Networks media player. This player is already in use by Fox, ABC, ESPN, Discovery and Televisa — amongst others. In the UK, it is currently only used by ChannelBee, which is a new online channel launched by Tim Lovejoy of Soccer AM fame.

The interesting part of the Move Networks technology is its dynamic adjustment of the bitrate according to the quality of the connection. It also does not seem to suffer from the buffering “feature” that unfortunately seems to be part of the YouTube experience. Move Networks achieves this by installing a client, in the form of a browser plug-in, which switches the video stream according to the connection, much as TCP adapts its sending rate. We have regularly streamed content at 1.5Mbps, which is good enough to view on a big widescreen TV and is indistinguishable to the naked eye from broadcast TV.
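A sketch of that TCP-like adaptation: probe upward one step at a time, but drop as far as needed when the connection can’t keep up. The bitrate ladder and the 0.8 safety margin are our assumptions, not Move Networks’ actual parameters.

```python
# Available encodings of the same content, lowest bitrate first.
LADDER_KBPS = [300, 700, 1100, 1500, 2200]

def next_bitrate(current_kbps, measured_throughput_kbps):
    """Pick the highest encoding safely below measured throughput, climbing
    at most one rung per decision (additive-increase flavour) but dropping
    straight down on congestion (multiplicative-decrease flavour)."""
    safe = [r for r in LADDER_KBPS if r <= 0.8 * measured_throughput_kbps]
    target = safe[-1] if safe else LADDER_KBPS[0]
    if target > current_kbps:
        idx = LADDER_KBPS.index(current_kbps)
        return LADDER_KBPS[min(idx + 1, len(LADDER_KBPS) - 1)]
    return target

print(next_bitrate(700, 3000))  # 1100: room to grow, climb one rung
print(next_bitrate(1500, 900))  # 700: congestion, drop straight down
```

Cautious climbing and aggressive backing-off is what avoids the stop-and-buffer cycle: the player almost never finds itself committed to a stream the connection can’t sustain.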

Unlike Akamai’s, there is no secret sauce in the Move Networks technology, and we expect other media players to start using similar features - after all, every content owner wants the best possible experience for viewers.

Clearing the rights

The amount of iPlayer content is also increasing: Wimbledon coverage was available for the first time, with the Beijing Olympics and the British Golf Open coming up. We also expect that the BBC will eventually get permission to make content available outside the iPlayer’s seven-day window. Clearing the rights to the BBC’s vast archive will take many years, but slowly and surely more and more content will become available. This is true for all major broadcasters in the UK and the rest of the world.

YouTube to shrink in importance

It will be extremely interesting to see how YouTube responds to the challenge of the traditional broadcasters — we can’t see a future where YouTube’s market share is anywhere near its current level. We believe watching user-generated content, free of copyright, will always be a niche market.

Online video distribution and its associated economics are a key area of study for the Telco 2.0 team. We plan to produce a full report in time for the next Executive Brainstorm in November.


Verizon’s P4P initiative: will it support the value chain effectively?

The Telco 2.0 research team is undertaking some detailed business modelling around ‘Rich Media Distribution’ over the summer. We’ll also be debating this with industry leaders on 4-5 November at our next event in London. More on both of these anon. In the meantime, here’s some analysis of Verizon’s P4P next generation file swapping initiative:

We’re not sure how it happened, but Verizon appears to be turning into one of the most interesting telcos around. For a start, there’s the fibre - but then again, even AT&T has an FTTH roll-out of sorts going on. Then there’s ODI, their developer platform initiative. The whizzy portal-like Dashboard application Verizon Wireless is putting on its LG Chocolates has a publicly available API so people can do evil things to it. But perhaps the most significant change at Verizon is P4P, an attempt to reconcile the huge RBOC with the world of peer-to-peer applications, using a technology developed at Yale University as Haiyong Xie’s PhD research project.

We’ll start by noting that a lot of people read “P2P” and think copyright. Of course, the means by which you distribute something don’t determine its content, and certainly not its intellectual property status, so this is a red herring. Anyway, we’ll recognise this and move on - we’re interested in the telecoms aspects, not the record industry.

Why don’t telcos/ISPs like P2P?

Theoretically it should be one of the most efficient ways of delivering heavyweight content like video, music and big datasets: as more people want a specific file, so more sources of it become available. The scaling process is that of a mesh network. But of course it doesn’t work like that. The Internet isn’t actually a random mesh network - it just looks like one to its participants. The underlying topology is very much scale-free, with more mesh-like areas interconnected by unusually critical and heavily-used links.


OpenP4P chart showing traffic by type

Thinking of it in economic, rather than technical, terms, this comes out even more strongly; the cost of delivering bits varies sharply as they make their way across the Net, depending on the markets for various kinds of lower-layer connectivity, the distinction between peering and transit, regulatory issues, time, and geography. The problem with P2P clients is that they tend to hammer away exactly as if they were part of a dense mesh network, where it doesn’t really matter where traffic comes from or goes to - but in fact, their behaviour can cause serious economic problems for ISPs if it means a high-cost sector of the network is heavily used.

This could be literally anywhere. For an Australian or New Zealand operator it could be a congested transpacific cable; for an emerging-market one, an expensive international satellite link; for a British operator, the BT-owned local loop under IPStream or their BT Wholesale backhaul links; for an operator in a small but highly connected country like the Netherlands, metro connectivity; and for an FTTH operator it might be their peering relationships at their friendly local IX. What’s certain is that there’s always somewhere - but it’s never the same place.

It’s Not That Simple

Some P2P clients now attempt to prefer local peers, but this is where the second half of the problem comes in: what is excellent optimisation for one network will be poison for another. Trying to maximise local traffic is exactly what the British or Dutch examples don’t want. The British ISP will have to fork out much more to Openreach and/or BT Wholesale; the Dutch one would much rather push traffic out to the abundant and cheap international connectivity of AMS-IX.

The problem is really that the business models of both parties to the game don’t work. Both ISPs and P2P users are constrained by their assumptions to behave as adversaries: one desperately trying to stop the flood of traffic or sting it for more money, the other desperately trying to evade them. What they really need to do is to co…well, whatever the verb from “co-opetition” is. If there were a way for ISPs to announce details of the network’s cost structure, so that P2P clients could programmatically adapt their behaviour, this problem could be overcome. Rather than imposing crude preferences on users through QoS and deep packet inspection, the ISP would play a tune for the clients to dance to.
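As a rough sketch of what playing that tune could look like: the ISP publishes relative costs between network regions (the P4P literature calls these “pDistances”), and the client simply sorts candidate peers accordingly. The region names and cost values below are entirely hypothetical:

```python
# Sketch of the P4P idea: the ISP's tracker publishes relative costs
# between network regions, and the P2P client weights its peer choices
# accordingly. All region names and cost values here are invented.

# Hypothetical cost map published by the ISP: lower = cheaper for the ISP.
COST = {
    ("metro-A", "metro-A"): 1,    # same metro: cheap for this ISP
    ("metro-A", "metro-B"): 5,    # inter-metro long lines
    ("metro-A", "external"): 10,  # peering/transit: most expensive
}
DEFAULT_COST = 10  # unknown paths treated as expensive

def rank_peers(my_region, peers):
    """Order candidate (name, region) peers cheapest-first per the ISP's costs."""
    return sorted(peers, key=lambda p: COST.get((my_region, p[1]), DEFAULT_COST))

peers = [("peer1", "external"), ("peer2", "metro-A"), ("peer3", "metro-B")]
print(rank_peers("metro-A", peers)[0])  # the same-metro peer wins
```

Note that the same client code serves every topology: a British IPStream ISP would simply publish the opposite cost map (local loop expensive, external peering cheap) and the clients would dance to that tune instead.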


OpenP4P Results

This, in a nutshell, is the aim of OpenP4P. Here are the results of Verizon, Telefonica, Pando Networks, and Yale’s field trial of the system. They are impressive, on the surface at least; but we’d note that so far, they’ve only tested it in the context of Verizon and Telefonica’s network topology. Obviously enough.

The data showed a dramatic cut in the traffic hitting VZ’s external peering links and also a big cut in traffic on their long lines between metro areas; unsurprisingly, given that the project’s aim was to localise traffic, the utilisation of local loops and metro backhaul was dramatically increased (this went from 6% of the total P2P to 57%). We don’t know how well it would work if the optimisation target was different - for example, to maximise traffic on LLU lines and lay off the backhaul, or to minimise internal traffic and maximise external.


OpenP4P results on Verizon: a metric of hop count over time for P2P and P4P traffic

However, Telefonica’s half of the trial does suggest that it works; they saw a 57% cut in the number of hops a P2P packet traversed, compared to a fivefold reduction for Verizon, but saw a much greater increase in the amount of traffic served within their local networks (it increased by a factor of 36, compared to a factor of 10 at VZ).

More Unanswered Questions

There are also some issues of security and trust that still need to be cleared up; the “trackers” that provide the network data to clients are in a very responsible position, which hackers would give their eye teeth for. A malicious tracker would be able to steer all the P2P traffic on the network down the most critical link, for example.

And how is this going to make money? By definition, if we’re publishing this information to the Internet at large, we can’t really restrict who reads it. More fundamentally, we don’t want to do that, because it doesn’t serve our interests at all. Refusing some group of applications the information would just add them to the heap of undifferentiated P2P traffic that’s clogging the lines.

Conclusions: Fundamentally Two-Sided

But there is an implicit two-sided market here. Users’ cooperation is rewarded with improved throughput and latency; content providers’ cooperation is rewarded by better quality delivery; the telco is rewarded with lower costs. Trade, in a sense, has been facilitated by the creation of a new network API. P4P is precisely what telcos and ISPs ought to be doing, faced with this coordination problem. However, we would recommend considerable caution in deploying technologies like this until the engineering and operations aspects of security have been clarified. The best way to clarify them would, of course, be to participate in the Working Group.


July 14, 2008

Ring! Ring! Hot News, 14th July 2008

In Today’s Issue: Some phone or other launched; “ZZZPhone” debunked; Verizon ODI dip stick; Launchcast “Dashboard” open to hackers, in a good way; Verizon - dangerously interesting?; Sprint pushes push-to-talk; iPhone Truphone; NTT DoCoMo on the unwise monster acquisition trail again; unwitting private equity fund sups with Richard Li, helps 3UK double its customer base; pass the separator, Mme Reding; Comcast in trouble with the FCC; open search at Yahoo!; the coming mobile data boom?

Apparently the 3G version of some device or other launched today… but beyond such trivia, there were far more interesting things going on in the industry. For a start, wouldn’t you like to design your own phone? It’s a cool idea, but you’re probably best starting with an OpenMoko; the ZZZPhone’s devices turn out to be a job lot of old ZTE stock, and the orders tend not to be fulfilled.

Here’s something more, ah, fulfilling: the first Verizon ODI device is out! And it’s something genuinely interesting - a module that fits into a tank of liquid and reports the level of the contents by text message. That’s actually useful, and could actually make money for operators and users, which is why nobody’s going to promote it very much.

Meanwhile, Verizon Wireless shipped the first gadgets with its “Dashboard” portal on board, aka Adobe Launchcast. More interestingly, details of it are being published so developers can make things to fit in with it; is Verizon gradually turning itself into one of the most interesting telcos around? FTTH, P4P, ODI, and this…which’ll come in handy, as they surrender to the falling price of SMS.

Quietly, Sprint gets some work in on its core voice & messaging products, rolling out its push-to-talk service to 47 new markets around the US. On the other hand, Truphone launches an implementation of its SIP client for the iPhone, but you can’t use it over-the-top on the 3G data network, because Apple doesn’t want its users burning the bundled data connectivity competing with the operators who both buy and subsidise the Jesus Phone.

Mike Elgan of Computerworld, meanwhile, asks the right question. Mobile phones have improved beyond measure in the last ten years - but what about calls? Hear, hear….literally.

The first rule of telecoms investment: don’t buy a Japanese network operator. The second: don’t buy a Japanese network operator’s good idea for use elsewhere. The third: if you are Japanese, don’t buy a foreign network operator. It is with a heavy heart that we read that NTT DoCoMo wants to buy stakes in foreign telcos again; oh dear.

Another piece of good advice is not to become a minority shareholder in anything Hutchison-related; there have been so many asset trades inside the empire that result in one-off gains for them and dilution for the others. Here go some private equity funds, looking at a 45% share of HKT. After all, 3UK will be wanting some capital if it’s going to double its business by 2012.

In the fixed world, meanwhile, they are about to feel the force of Viviane Reding for a change; so far the mobile operators have been the prime target, but now the European Commission is looking to push functional or structural separation across Europe. It might hurt at first, but it’s for your own good…

In other regulatory news, the FCC isn’t happy about Comcast and their Chinese Firewall-style use of spoof TCP RST messages to spork BitTorrent users.

They’re trying to buy Yahoo! again - that’s Microsoft and Carl Icahn, who’s apparently temporarily had enough of kicking Motorola around the pub car park like a sack of waste. Perhaps this is why? Yahoo! is introducing a platform to let third parties build their own specialised search service using Yahoo! APIs….which sounds cool.

And Informa crystal-ball merchants reckon we’re going to see mobile data volume race past voice in 2011.


July 9, 2008

Two-sided markets: why do they matter?

In a previous article we provided an introduction to what we believe is the template for future growth in telecoms: two-sided markets. Having laid out the basic facts, we can now take a closer look at some of the consequences of moving from one-sided to two-sided markets.

Two-sided markets in a nutshell

A brief reminder of what we’re talking about. In a one-sided market, merchants buy in equipment and services, taking on inventory risk. They combine them in some value-adding way and sell the result on to end users (or other intermediaries in a value chain). The suppliers and customers do not interact directly. Most of telecoms works within a one-sided model today.

In a two-sided market, the middleman facilitates interaction between two groups via some platform. This lowers transaction costs and builds scale. Critically, the price structure of using the platform is set to encourage participation from the most price-sensitive side, maximising platform revenues overall rather than separately for the two groups. For example, a newspaper typically charges a cover price well below that which would maximise reader revenue alone, because it needs a big audience to attract advertisers.
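The newspaper case can be made concrete with some invented numbers. With a linear demand curve for readers and a fixed advertising yield per reader, the profit-maximising cover price drops once the advertiser side is counted:

```python
# Toy two-sided pricing model (all numbers invented): readers are the
# price-sensitive side, advertisers pay a fixed amount per reader reached.

def readers(cover_price):
    """Linear demand: fewer readers as the cover price rises."""
    return max(0, 1000 - 400 * cover_price)

def platform_profit(cover_price, ad_yield_per_reader=0.5):
    """Total platform revenue: cover-price revenue plus advertising revenue."""
    return readers(cover_price) * (cover_price + ad_yield_per_reader)

# Search a grid of candidate cover prices for the profit-maximising one.
best = max((p / 100 for p in range(0, 251)), key=platform_profit)
print(best)
```

In this toy model, reader revenue alone peaks at a cover price of 1.25, but adding the advertiser side pulls the optimum down to 1.00 - the subsidy to the price-sensitive side that defines two-sided pricing.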

A good example of a company that moved from a one-sided to two-sided model is online bookie Betfair. Originally they only offered their own set of bets and odds online. They then allowed other bookies to offer bets in competition with one another, at which point Betfair’s business took off very rapidly. (Another in-depth example on job sites is here.)

In telcoland, i-mode is an example of a two-sided market, joining application developers to users. Our hypothesis is that operators need to focus on developing capabilities and services that facilitate a much wider range of business processes than content retailing. In each case, ‘upstream’ parties wish to interact with the telecommunicating public in some way, and the telco takes friction out of this process. The data by-products of the current triple/quad play products are key enablers, along with assets in the ‘edge’ devices (handsets, home hubs, set top boxes, smart meters, etc).

Elephants are born big

The first observation we have about two-sided markets is that they always need scale. That means you need to kick-start the market in some way to make it attractive to your initial customers and users. Typically some kind of trend-setters or marquee users are used to pump-prime the platform. For example, an upmarket shopping centre looking to attract both retailers and shoppers will want a John Lewis or Nordstrom as anchor tenant. Every music platform wants the Universal Music Group catalogue. In telecoms we see this effect with peering and interconnect sites, where large anchor tenants attract the smaller players.

This leads us to conclude that telcos are better advised to try building new two-sided revenue streams off their existing core voice, messaging and broadband businesses. This contrasts with the current approach of building whole new propositions, particularly around entertainment media, from a base of zero. Ultimately a collection of transaction platforms — for advertising, payments, and customer service — will be converging from multiple industries, such as online search, e-commerce sites or banking. Better to compete off a strong base when engaging such powerful rivals.

Gasoline, girls and guys

It would be nice to think that users will appreciate your wonderful new entertainment products and gladly surrender some more money for value-added services. Sadly, they seem to have a budget in mind that they want to stick to, and you end up dissipating a lot of your profit in marketing expenses.

A more compelling proposition is to create efficiency and cost savings in a broader range of industries. In particular, labour and energy costs are obvious targets. It is possible to create one-sided solutions for specific verticals, e.g. a fleet tracking solution using cellular location. However, we feel that a two-sided market offers a more defensible opportunity, which means business process services (e.g. advert targeting) that involve interacting with the mass retail customer base in some way.

This is not to disregard the traditional focus of telco platform efforts, which is to optimise the supply chain of purely digital applications and content to users. This is necessary, but ignores the wider opportunity, and the consequent chance of spreading the risk and cost of building the platform across a much larger range of revenue sources. It’s the “analogue” world of people, trucks and raw resources where most of the economy still lives.

What do you know about the customer?

The ‘upstream’ party that wants to interact with the telco customer will have some data on that customer and their own relationship. However, this is likely to be significantly different from what the telco knows about the customer. The data assets of the telco, and the permission to use them derived from the customer relationship, are every bit as critical to optimising business processes as the ability to transmit data over distances via networks.

Where this data is most valuable is the ‘real time’ intelligence the telco can provide. Every time UPS delivers a yellow sticky ‘sorry you were out’ notice, instead of a parcel, resources are burnt to no productive end. The telco’s role is to use customer information/data — such as whether your mobile is associated with your home femtocell, if you’re in the middle of a call, or are roaming abroad — to help time interactions and facilitate transactions.

This is a common feature across two-sided markets: the more the middleman can help personalise the interaction, the more the platform can charge for its services. Operators need to reconsider how they manage such data assets, gather user consent, and extract value from them.

No bronze medals

Another property of two-sided markets is that they tend to follow ‘winner-takes-all’ economics, with a small number of dominant platforms: Windows and Mac OS; Visa, Mastercard and Amex; or Google and Yahoo!. In general, individual telcos will struggle to achieve scale in two-sided markets. The implication is that they will need to co-operate to either create clearing houses or hubs themselves, or work through existing aggregators or transaction networks (e.g. mBlox).

The goal for the platform must therefore be either to aim for general monopoly (in much the way the PSTN/PLMN voice network is the platform for all personal communications), or differentiate to dominate a niche.

The danger is that the telco is enveloped by other transaction and commerce platforms. A naive approach to exposing location, presence or other data could not only leave value on the table but, even worse, leave telcos subservient to a small number of powerful external intermediaries.

A regulatory misfit

In competitive industries, and over the long run, prices reflect underlying costs. Competition will weed out inefficient players, making costs and prices converge.

Two-sided markets don’t work this way, and this can cause serious regulatory and competition issues, as one-sided rules are applied to two-sided markets. The heart of the problem is the way the platform subsidises one side to get them ‘on board’ in order to more than make it up on the other side.

The classic case that’s made its way through the courts is the EU’s anti-trust action against interchange fees on the Mastercard network. When you walk into a store and pay a merchant with your credit card, the merchant’s bank has to pay a fee to the bank that issued you the card. This fee doesn’t reflect any specific operational cost being incurred, and the contention of the EU was that this is anti-competitive and welfare-destroying. The reality is that it is necessary to ‘bribe’ the public with zero-fee cards, low introductory rates, cashback rewards etc. to get them to adopt credit cards and use them. Without mass adoption, the cost of the interchange fee becomes moot, since merchants lose sales (as customers don’t carry enough cash with them, or need credit the merchant doesn’t offer) and costs rise (as everyone has to handle lots of cash).
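As a back-of-the-envelope illustration of the flow of funds (the rates below are invented - real interchange varies by card type, country and merchant category), working in integer cents:

```python
# Back-of-the-envelope flow of funds on a card purchase, in integer cents.
# The interchange and acquirer rates (in basis points) are invented
# assumptions, not Mastercard's actual published rates.

def settle(purchase_cents, interchange_bp=110, acquirer_bp=40):
    """Return (interchange paid to the card issuer, amount the merchant keeps)."""
    interchange = purchase_cents * interchange_bp // 10000   # issuer's cut
    acquirer_fee = purchase_cents * acquirer_bp // 10000     # acquiring bank's cut
    return interchange, purchase_cents - interchange - acquirer_fee

# On a $100.00 purchase, the merchant's bank hands $1.10 to the issuer -
# revenue that funds the zero-fee cards and cashback on the cardholder side.
print(settle(10000))  # -> (110, 9850)
```

The merchant side pays above cost so that the cardholder side can be paid to participate - the cross-subsidy that one-sided cost-based analysis misreads as anti-competitive.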

We see the same dynamics in telecoms, with roaming and termination fees. These exhibit some of the properties of two-sided markets. Imagine there are two groups, businesswomen (who travel internationally) and househusbands (who don’t). The high price of roaming could be seen as a ‘tax’ on the price-insensitive businesswomen, which can be used to subsidise pre-paid service for the househusbands, driving mutual benefit of service adoption. Likewise, calling-party-pays and high termination fees in Europe have driven adoption far higher than in the superficially more ‘equitable’ North American model.

Competition law concerns

These competition and regulatory issues will be a major battleground. You can charge more than cost, and it’s not necessarily a sign of lack of platform competition. You can charge below cost, and it’s not necessarily predatory. (You can read more on this in this academic paper.)

Indeed, even having competing platforms may end up dividing the market into two sub-scale platforms, destroying value for all users. Whether platform predator or prey, it is common for a dominant platform to reach such scale that it starts to cause wider competition concern. We see this with Microsoft and Google, for example. But there may be no public welfare gain from breaking these platforms up or constraining the scope of their activities. Standard competition law doesn’t apply here.

Where competition law does become more of an issue is where a successful platform adds on features and leverages its existing base to move into other areas. We’ve seen this with Microsoft (Windows plus browsers, media players and security suites), and it’s an ongoing issue with companies like the BBC using distribution of public service broadcasting content to enter the pay-per-view market, much to the objection of commercial rival Sky.

Not a binary choice

You can have elements of both one- and two-sided business models simultaneously. Wal-Mart acts as a retail platform (two-sided) as well as a traditional merchant that buys inventory at wholesale to resell (one-sided). The two-sided mode is favoured when the middleman can either eliminate uncertainty and risk for the upstream parties wishing to interact with the users, or has distribution economies of scale. The driver is the balance between the inventory and risk of the one-sided merchant mode and the cost of affiliating with and using the platform.

We see this in Blyk, which gives away a limited amount of free calling in return for receiving targeted adverts, but continues to charge for higher usage. Telco business models will be complex hybrids for the foreseeable future.

A pricing challenge

The pricing of platform services can be a tricky subject. The obvious question is how much to shift cost away from the price-sensitive side. You can’t charge less than zero, but you can start to give away products and services (e.g. Google’s search, mail, and other content sites). Where do you stop? Could all those service delivery platforms end up being used not to create new consumer value-added services, but instead to create the ‘give away’ services you can trade for customer data and permission?

There may also be a need for ‘signalling’ of commitment by one side or the other. Job sites for high-end positions can filter out under-qualified job applicants by charging to become a member. Normal dating sites typically charge both sexes, but adulterous men have to pay a huge premium to signal intent to wayward women. It remains to be seen how these kinds of exceptions might play out in identity, payment and content services offered by telcos, but collecting cash from people merely so they can signal intent sounds like a profitable proposition.

Finally, where you have competing platforms they typically need to follow opposite pricing models. So platform A charges end users and gives away service to the upstream parties, and platform B does the opposite. This lets them co-exist in two distinct segments, rather than over-dividing the pie.

A very different kind of business

As we’ve seen, two-sided markets require a very different way of thinking about telcos and their role in the economy. They have very different economics to most of today’s telco products, and break the assumptions behind today’s regulatory and competition rules. They require new skills, partners, channels and organisation structures. However, they also offer an escape route from the problems of today’s one-sided telco business model, and since they principally re-sell data (packaged in special ways), they don’t require masses of capex to implement and are often highly profitable.

For more information on the opportunity for telcos to build two-sided market business models, see our report, The 2-Sided Telecoms Market Opportunity.


July 7, 2008

Ring! Ring! Hot News, 7th July 2008

In Today’s Issue: OFCOM moves towards BT’s line on fibre; £31,000 phone bill; GTalk for mobile, where’s the talk?; iPhone bank run; pity about the OS security patches, though; cross-platform widgets; Files On Ovi; MTN-Reliance rift; EU offers more 2.6GHz TDD, WiMAX Forum delighted, Ericsson furious; RIM shares dive after profits double; how do you value mobile ads?; GSMA’s funny figures

Big news: OFCOM director Ed Richards spoke to the UK IT trade association, Intellect, in terms that suggest he’s leaning towards offering BT concessions on regulatory pricing in exchange for deployment of fibre in Openreach’s access network. This is an instance of the two-level bargaining process we described here; it looks like BT is still succeeding in monopolising influence on the regulator, but the next step will be to see how this can be made compatible with the existence of a competitive ISP/Altnet market in the UK. Details are to be published in September. Here’s a telling quote:

One thing is certain: the government is very keen that taxpayers don’t shell out to make Britain’s internet infrastructure competitive with more advanced networks in countries such as South Korea and France.

The horror….the horror…

Whatever OFCOM concedes to BT, it surely won’t be anything like some of the pricing people still encounter for mobile data roaming. Marvel at the £31,000 phone bill, to say nothing of the “IT consultant” who downloaded a whole episode of Prison Break on a roaming UMTS link and was surprised when he got a huge bill…

Meanwhile, here’s another case of Voice & Messaging 2.0. Google has developed a browser-based Google Talk implementation for iPhones, which looks likely to dig into iPhoners’ SMS traffic. However, despite the fact that GTalk has basic voice capability and uses standard protocols like XMPP, there’s no voice yet; one wonders what the reason for this is.

O2 UK’s…err…keen iPhone pricing caused the carrier to run out of stock when the 3G Jesus Phone went on sale. No surprise there: consumer surveys in the US show that 55% of those currently after a smartphone want a fruity one.

How many of them realise that the iPhone’s OS is several months behind MacOS X on its security patches? Could be a PR accident waiting to happen.

In a sense, this gets less and less significant; here’s Access offering a widget interface for Windows Mobile phones, as well as its own Linux system. Back at MWC this year, we noted that the LiMo community was already running various Symbian things on top of their Linux system. As Adobe and others develop universal clients, will anyone care what OS is underneath?

Meanwhile, Nokia launches a “GoToMyPC” clone as part of Ovi. Well, that’s nice, but will they let you put scripts or other executables there? Mobility implies you’re not necessarily on line, so one of the major barriers to mobile/desktop integration is the need for some sort of persistent presence on the Net that can act like a POP3 mail server, and transfer your stuff to wherever you are (and not just view a screen remotely). A solution for this with activation energy low enough for everyday users is badly needed.

The MTN-Reliance deal is falling apart, due to the dispute between the brothers who own the founding stake in Reliance Industries. Stand by for more Vodafone rumours.

Two worlds collide; the European Commission, with the assistance of the usual spectrum allocation alphabet soup, has changed the rules regarding the 2.5-2.69 GHz band in Europe to give more space to time-division duplex radio technologies like WiMAX, UMTS TDD, and perhaps LTE. The WiMAX Forum is delighted (and probably OFCOM, which is keen), Ericsson, as befits a vendor focused ruthlessly on UMTS, is furious.

RIM shareholders gave them the bird, after the company only doubled its profits; apparently no-one believes in their new consumer-focused strategy. Neither do we, really; they could be improving their voice features and integration with communications-enabled business processes, rather than trying to get into the ‘shiny’ market just as Sony Ericsson finds it’s drying up.

Open Gardens has some interesting thoughts about how to value mobile advertising opportunities. As is traditional over there, they manage to work IMS in somehow. Not so sure about that…

And finally, the GSMA tells off the EU over regulation. Apparently mobile operator CAPEX has fallen from 13 per cent of revenue in 2005 to 11 per cent today. Well, that’s nearly a whole 0.7% a year!

More seriously, could this possibly have something to do with the fact that voice revenue, coverage, and penetration were still rising in major Western European markets in 2005, and now they’re not - simply because there are only so many phones one person can use, there is only so much land to cover, and voice is competing with people who offer it FREE? Perish the thought.


July 3, 2008

Symbian goes open — or does it?

The big news in mobile this week is that Nokia has bought Symbian, the mobile operating system provider, so that it can give it away. In the first article on this news we looked at the deal from the viewpoint of shareholders and competitive threats. In this second article we take an in-depth look at the nuts and bolts of software licensing and governance, to see if Symbian really lives up to its ‘open’ headline.

Symbian: open in parts

With operators being pushed towards ‘open’, what can they learn from Nokia and Symbian’s approach?

A ship with a small hole still sinks

Like it or not, technocratic choices of licensing schemes really matter. The plan is to release the entirety of the Symbian OS and S60, UIQ, and MOAP under the Eclipse Public Licence by the end of 2009. What is the Eclipse Public Licence? It’s a development of IBM’s Common Public Licence, originating when IBM Canada released the software development environment known as Eclipse. It differs from the purist open-source GNU General Public License in that it doesn’t require work released under it to be purely open-source. For example, you can build something containing both Eclipse-licensed code and code you wish to keep as your intellectual property, and release the whole thing under the EPL.

This implies that you can’t include anything licensed under the EPL in a “more open-source” project — the GPL bars you from imposing restrictions on the user beyond the requirement to maintain the GPL status of the work, so the fact that Eclipse licensing permits IPR restrictions would preclude it. Similarly, as you cannot legally assert restrictions on GPL code, you can’t include it in an EPL project. You must choose.

So this is somewhat less open than Google’s rival OS, Android, whose guts are subject to the Apache licence. It is also significantly less open than the mobile Linux OSes LiMo and OpenMoko. Both of these are GPL, and the latter is open enough to satisfy the most rigorous free software fundamentalist. It’s glatt kosher software. On the other hand, it’s a big step forward in openness from most vendor OSes so far.

Even if only a very small part of the code is tainted with restrictions, it is likely to be a new and important feature with IPR held by some third party. As Linux users know to their cost, it doesn’t matter if every other part of your PC is working if you can’t get a driver for your particular network or graphics card.

Who is really steering the ship?

Software licensing and IP law, despite or even because they are the sexy issues here, are far from the whole story. There are a lot of other questions. Governance of the project is one, and another is the status of the developer infrastructure around it. The Nokia announcement allows us to know quite a lot about the organisation structure…

The members of the foundation’s Board of Directors will also have seats in each council, with additional seats in the councils available for other foundation members. The foundation will operate as a meritocracy [sic]. Device manufacturers will be eligible for seats based on number of Symbian Foundation platform-based devices shipped, with the other board members selected by election and contribution. The initial board members will be AT&T, LG, Motorola, Nokia, NTT DOCOMO, Samsung Electronics, Sony Ericsson, STMicroelectronics, Texas Instruments and Vodafone.

That’s certainly an interesting definition of ‘meritocracy’, one you could easily mistake for ‘plutocracy’. Note the detail that, as well as shipments, you can get on the board by “contribution” - this seems to mean that the more lines of code your developers check in, the better you are. (All the traditional objections to measuring code value by the line are well and truly in effect here.) And what are Moto doing in there, as a company that has a proprietary OS, some Windows Mobile products, and a major role in LiMo — but is about to sell its handsets division and cut half its R&D staff?

It’s also worth noting that two pureplay chip makers are involved; perhaps their low-level microcode is the proprietary treasure the Eclipse licence is meant to protect.

Who is navigating the ship?

So what will this big carrier/big vendor/big software house club be responsible for?

The foundation will be responsible for managing the software roadmap and releasing the software platform, with the source code available to all foundation members. The development of the platform will be the responsibility of the foundation members, with the foundation coordinating development projects and managing the master code line.

So, not that open; if you’re not on the board, it doesn’t look like you’ll have much input, even if you’re an important stakeholder, like the operator who has to subsidise these products. And does the provision regarding the source code being “available to all members” only last until the 2009 release to Eclipse? Or will there be chunks that the Foundation keeps to itself? It’s been pointed out that most Symbian Foundation people will be Nokia employees, but this doesn’t worry us much — OpenOffice, various Linux distributions, and Sun’s Java are all maintained to a large degree by Sun, Novell, and IBM coders.

Keeping the ship sea-worthy

A huge issue in all open-source projects is the status of the developer infrastructure. Things like standardisation working groups, the process through which new software releases and updates are handled, preparation of things like software installation packages and Windows installers, tech support, and the tool chain — software development kits, documentation, IDEs, debugging tools and compilers — often define their culture and success as much as the headline stuff about GPL & Co. In fact, a large part of the very first Free Software Foundation project, GNU, was the development of free utilities in order to make GNU independent of proprietary software throughout.

The Platform will be completely free and open to developers whether enthusiast, web designer, professional developer or service provider. Of course, membership is not required to develop services and applications on the platform. The Symbian Foundation’s developer program will provide a single point of access for developer support, providing a wide offering of tools and resources - most available free - including:
  • Software Development Kits (SDK’s)
  • Documentation
  • Sample code
  • Knowledge base
  • Application signing program
  • Incident based technical support

Well, that sounds pretty good — except for the “most” tools and resources. Which ones won’t be? We suspect it’s probably the Symbian Signed process, which isn’t free (either in terms of free beer or free speech) today, and is widely hated by developers. Symbian, or should we say Nokia, is keen on keeping it going as a barrier to mobile malware, but it’s hard to see how it fits in an open-source project that’s a development platform rather than an application. Think of it like this: obviously, open-source means that I can do anything to my copy of Firefox or Linux I damn well like, and further that I can submit the alterations to the Mozilla Foundation.

Beware of customs checks and duties on arrival

Obviously Mozilla is within its rights to make me go through its development process before they let any of my code into the version that is available to the public, and certainly no-one would expect them to let J. Random Hacker offer their version of Firefox as an official Mozilla product through their release process. If I really wanted to, I could fork the project, and offer my own browser under some other name.

But the Symbian signing process doesn’t just apply to changes to Symbian itself — it applies to all applications developed for use on Symbian, at least ones that want to use a list of capabilities that can be summed up as “everything interesting or useful”. I can’t even sign code for my own personal use if it requires, say, SMS functionality. And this also affects work in other governance regimes. So if I write a Python program, which knows no such thing as code-signing and is entirely free, I can’t run it on an S60 device without submitting to Symbian’s scrutiny and gatekeeping. And you thought Microsoft was an evil operating system monopolist…
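
The gatekeeping described above is easier to see as code. Below is a deliberately simplified sketch of capability-based install gating, loosely inspired by Symbian’s platform security model; the capability names and the single signed/unsigned split are our illustrative assumptions, not the real Symbian Signed policy, which has several certification tiers.

```python
# Toy model of capability gating at install time. Capability names and the
# two-tier rule are illustrative assumptions only.

# Capabilities an unsigned, self-signed application may use freely
SELF_SIGNED_OK = {"ReadUserData", "WriteUserData", "LocalServices"}

# Capabilities gated behind the (paid-for) signing process
REQUIRES_SIGNING = {"SendSMS", "Location", "ReadDeviceData"}

def can_install(requested, symbian_signed=False):
    """Return True if an app requesting these capabilities may be installed."""
    if symbian_signed:
        # a signed app is trusted with everything it declared
        return True
    # an unsigned app is rejected as soon as it touches a gated capability
    return not (set(requested) & REQUIRES_SIGNING)

print(can_install({"ReadUserData"}))                  # fine unsigned
print(can_install({"SendSMS"}))                       # blocked: needs signing
print(can_install({"SendSMS"}, symbian_signed=True))  # allowed once signed
```

This is why a free runtime alone doesn’t help: the moment a script needs a gated capability such as SMS, the application carrying it has to have been through signing.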

May require proprietary fuel to operate

What else should a Symbian developer be careful of?

Furthermore, the platform will embrace the runtime technologies already used by the developer community allowing for efficient use of existing assets and skills. The platform will support an extensive offering of development environments including native Symbian C++, POSIX C and C++, Python and Web Runtime based on WebKit.

WebKit is open; so is Python, although the Nokia-contributed bits like the GUI toolkit and the wrappers for the Symbian API are debatable. There’s no mention of the Carbide IDE in there, nor of the standard Symbian C++ developer toolchain. It’s also not clear what influence the Foundation’s directors will have on the signing process.

Curiously, the mobile industry is adopting this sort of semi-open model just as the IT industry has given up on it. Sun has just finished open-sourcing the entirety of the Java world, and has even chosen to use the open-source version of Solaris instead of its own. But both Google and Nokia are choosing to be slightly less than open, or not quite open in the same way. Criticisms of Google over Dalvik (their homebrew Java virtual machine, which underlies Android but, unlike Java itself, isn’t GPL-licensed) are perhaps overdone - Dalvik is covered by the Apache licence, and nobody really doubts the openness of Apache.

But Symbian could be much more open, in more ways than one. Is Nokia under pressure from the chip vendors or the carriers to maintain certain special restrictions? And what would have happened if Psion had gone open source back in 2001?


July 2, 2008

GupShup — ad-funded mobile services done right

In our Voice & Messaging 2.0 Report we listed over 70 new services we’d come across in our travels. One that’s come to our notice since publication is SMS GupShup. Whilst many of the services we reviewed were ‘me too’ Internet messaging services, this US/Indian start-up is noteworthy for its business model.

As operators find voice and messaging markets mature and revenues stagnate, they are looking for new growth. One route is to try to create elaborate new services and persuade consumers to part with money for them. The other is to find ‘upstream’ parties willing to pay to interact with telco retail customers, with the telco as an intermediary. Advertising is the starting place for many such initiatives, and GupShup as a template for this raises the question: what is the role of the operator? Bit pipe, enabling platform or complete services provider?

Humans are tribal creatures — hairless monkeys with a grooming instinct

Today’s core telephony and messaging products suffer from many limitations, but perhaps the most central is that humans live and interact in groups, and that these products don’t support such activities well. Conference call systems are notorious for their poor user interface, and your phone’s address book never seems to learn that you message the same three people over and over.

But perhaps the most critical limitation is pricing: telcos are determined to scale price linearly with the number of participants in the conversation, but the sender of a message doesn’t see the value scale the same way. In emerging markets, where alternative Internet and PC-based forms of communication are much more limited in penetration, this forms an important barrier to usage.

Fixing the pricing problem with adverts

GupShup is an SMS (and Web) based group messaging service available only in India, with 7 million active users. Superficially the functionality is similar to Web 2.0 poster child (or enfant terrible) Twitter, minus the downtime. Users can send messages to a group, and can subscribe to multiple groups. Messaging is push-pull — you send the message into the cloud, but recipients have total control over what they receive. Like Twitter, the result is a stream of banal human existence. Fortunately, that’s what the users want, and it is the basis for an SMS industry worth approaching $100bn/year.

The service has a diversified revenue model, comprising:

  • Premium content services, with GupShup handling the billing and payments.
  • “Business class” service, with priority message delivery and fewer restrictions.
  • Ad-funded service.

What makes it special is how the advert is managed and how every advert immediately provides value to the user. Group messages are limited to one hundred characters, with the remaining 60 in an SMS given over to brand advertisers. Sending a message costs the same as your usual mobile rate for one message, but the cost of forwarding that message is then picked up by the advertiser. Everyone wins, and unlike media advertising’s bait-and-switch, there’s a powerful social driver behind it, and the potential for personalisation and innovation.
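
The character split is simple to model. The sketch below uses the figures described above: 160 characters in a standard SMS, 100 for the group message and 60 for the sponsoring advert; the function and variable names are ours, purely for illustration.

```python
# Splitting a standard SMS between the user's group message and the advert
# that pays for its delivery. Limits follow the description in the text.

SMS_LENGTH = 160   # characters in a standard single SMS
USER_LIMIT = 100   # the cap on the group message
AD_LIMIT = SMS_LENGTH - USER_LIMIT  # 60 characters left for the brand

def build_sms(user_message, advert):
    if len(user_message) > USER_LIMIT:
        raise ValueError(f"group message limited to {USER_LIMIT} characters")
    # the advert is trimmed into whatever room the standard SMS leaves
    return user_message + advert[:AD_LIMIT]

payload = build_sms("Match moved to 5pm, bring the kit", "Brand X: chai time!")
assert len(payload) <= SMS_LENGTH  # always fits in one billable message
```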

With ads, everyone really must win prizes

This approach contrasts with the greedy attitude of carriers in the developed world. Many are trialling ad-serving technology that personalises adverts based on the user’s demographics and click stream. Such trials have been secretive, and failed to get user opt-in. Most importantly, they never answer the user’s issue: so, what’s in it for me? The user feels they’ve already paid the full rate for a broadband connection, and what are you doing wiretapping my Web browser and fiddling with the ads?

No wonder the result is a PR disaster and carriers are back-pedalling fast.

The lesson is simple. You want to use the customer’s data and the customer’s device and create new revenue streams from them. Note the apostrophes — it’s not ‘customer data’. You’ve got to offer something in return for what you take. And there’s nothing better than the reward being immediate.

So what should carriers do?

As we wrote in our report Telcos’ Role in Advertising Value Chain, overall we are sceptical of operators trying to provide completely ad-funded services, or generate their own advertising inventory. Operators like Blyk are addressing a narrow, high-risk market. That said, as GupShup only cannibalises a rarely-used feature — messaging to multiple recipients — and is likely to stimulate new usage, it could be one worth emulating.

Alternatively, we would consider differentiating our retail pricing by (at least pretending that) there’s no more cost in sending to multiple recipients than to one (with the reality being that you’d probably drop your bucket size). Another approach would be wholesale deals with online services that are heavy SMS users, again to facilitate some creative retail pricing to undo the “group penalty”.

However, a more Telco 2.0 approach is to ask not how to compete with such services, but how to become a supplier to them. Indeed, those very same ad-serving technologies become a lot more attractive in this scenario. Services like age verification, cash collection, credit management, customer care — there is a long list far beyond just the bit pipe. [Ed - which of course you can read all about in our report on The 2-Sided Telecoms Market Opportunity.] It just requires a new mindset around high volumes, ‘horizontal’ business process and value creation — not rent-seeking on the underlying access assets, or dazzling media services.


Telco 2.0 Use Case: Trading Hub for the Transport Industry

Telco 2.0 readers will be well aware that we’re very keen on any application that uses telco capabilities to remove friction and inefficiency from the wider world of business - perhaps the fundamental insight in the 2-sided business model is that the telco doesn’t only sell telephone calls as a finished product to end users, but also a much wider range of functions for upstream businesses to integrate into their production process. In terms of economics, these communications-enabled business processes usually exist to reduce transaction costs and thus facilitate trade that would otherwise not happen. Alternatively, they help larger enterprises to overcome their internal diseconomies of scale.

This use case is of the first kind: the telco platform as a trading hub, allowing the many companies that could never build the mass-production IT systems their bigger competitors use to benefit from the same increasing returns to scale.

You’d be surprised how significant backloads are for the transport industry and the economy more broadly. At first sight, the economics of a truck route would seem trivially simple; you pay for diesel and wages and capital, collect rates from customers, and costs scale by the mile - right? But it becomes a lot more interesting when you realise that the profitability of a marginal load is highly dependent on whether it is carried on the way out or back; as the truck has to come back anyway, the costs of both trips must be accounted for in the price of the trip out, so the margin on the trip back can be 100%.
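
A quick back-of-envelope calculation makes the point; every figure below is invented for illustration.

```python
# Why a backload is almost pure margin: the outbound rate has to cover the
# round trip, so any revenue on the return leg drops straight to profit.
# All numbers are invented for illustration.

cost_per_leg = 400.0    # diesel, wages and capital for one direction
outbound_rate = 900.0   # priced to cover both legs plus a small margin

# Returning empty: the whole round-trip cost sits on the outbound job
profit_empty_return = outbound_rate - 2 * cost_per_leg

# With a backload: even a cut-price rate is incremental profit
backload_rate = 300.0
profit_with_backload = profit_empty_return + backload_rate

print(profit_empty_return)    # 100.0
print(profit_with_backload)   # 400.0, from one cheap return load
```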


This can have profound consequences - the phenomenon of fresh fruit and vegetables from eastern Africa showing up in European supermarkets, for example. Airlines providing cargo service to Kenya and other places noticed that the return trip tended to be empty, and unsurprisingly the answer was to drop freight rates dramatically. Any price at all was better than simply shipping air. It therefore became possible to export produce on a large scale, which has become one of these countries’ biggest sources of foreign exchange.

At a more micro level, the problem for an individual firm or owner-driver is finding a backload in the first place. Unless you can arrange it in advance, this is a major source of uncertainty, and one that may grant bigger companies increasing returns to scale - they can afford a monster IT system to track all their vehicles and customers and match them up, and they have more locations, staff, and vehicles out there looking for backloads. They can dedicate salesmen full-time to searching out and buttering up regular backload customers.


The text-book approach to this is to start an exchange; if everyone brings their supply (i.e. trucks looking for loads) and demand (i.e. loads looking for trucks) to the same place, the chances of a good match are dramatically increased for everyone. The more business the exchange does, the better it gets; prices are more stable, the range of deals on offer wider, and you can have greater confidence that you can buy or sell when you need to. Liquidity goes to liquidity. This dynamic is well-known, and can be perceived in stock markets, Internet exchanges, ports, produce markets, Web search engines, and dive bars. It’s interesting that some of the very first commercial exchanges of this kind were for freight - specifically shipping. There’s a good reason why the London freight bourse is called the Baltic Exchange when it trades in shipping to every port on the planet; when it started, that was where the trade went.
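
The mechanics of such an exchange can be sketched in a few lines. This is a deliberately naive model that matches only on (origin, destination) lanes; a real freight exchange would also match on dates, weights, equipment and price.

```python
# Naive freight exchange: trucks with spare return capacity and loads
# seeking carriage are posted to one hub, and lanes are matched directly.

trucks = [
    ("Leeds", "London"),      # returning empty from Leeds to London
    ("Dover", "Manchester"),
]
loads = [
    ("Dover", "Manchester"),
    ("Leeds", "London"),
    ("Bristol", "Glasgow"),   # no truck on this lane: stays unmatched
]

def match(trucks, loads):
    available = list(trucks)
    matched = []
    for load in loads:
        if load in available:      # a truck already runs this lane
            available.remove(load)
            matched.append(load)
    return matched

print(match(trucks, loads))  # the two matchable lanes are filled
```

The more participants post to the hub, the more lanes have a counterparty, which is the liquidity effect described above.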

Telcos might create such an exchange, have a share in it, or simply be suppliers to it - this will vary between markets and territories, just as it makes sense for an operator to be a bank in Kenya but not in Germany, or it’s possible to make money from MMS in a human-machine application but not in a human-human one.

But doing this raises some big technical challenges, as set out in this slide from the 2-Sided Business Model report.


If you’re Maersk Logistics, these aren’t such big issues. You can afford to build this kind of capability, and you can pay IBM Global Services to do a lot of it (which is what Maersk did). And you’re a big enough customer that your friendly local telco is likely to be receptive to a wholesale deal. If you’re Charlie Cox, not so much. But Telco 2.0 could change this. It’s all about commercialising the core telco assets by making the key capabilities that spring from them available in forms which allow third-party businesses to use them in new ways, right? In this case, we’re using the telco’s secure messaging capability, its location capability, its identity capability, and its payments capability, just as all of these would be used to deliver an SMS message; we’ve just taken apart that finished product and built something new out of the parts, which we can only do if the telco doesn’t supply it in a sealed box.

While we were researching this for the 2-Sided Business Model report, we calculated that improved backload finding could be worth up to £218m a year in the UK, on the basis of a 5% improvement in load factors. So there’s serious money to be had in there. It’s not surprising we also encountered a number of start-up freight exchanges trying to do just that - one of them is even offering an application for Windows Mobile devices in order to mobilise their IT system. That sounds like an ideal partner for a telco.

This is a special case of our general model for the telco future. At the bottom of the stack, telcos own huge legacy assets which we’ve characterised as pipes, packets, and platters. These produce certain functional capabilities that grow out of them - we’ve described those as the seven questions and various other things. Traditionally, these were then combined into a standardised product by telco engineers and commercialised by telco internal marketers direct to end users. In the future, however, we think they will be sold in three ways - as plain APIs for third-party developers, as integrated products created by third parties in partnership with the telcos, and within products the telco offers under its own brand.



Symbian — Its Role in the Mobile Jigsaw

The recent purchase of Symbian by Nokia highlights the tensions around running a consortium-owned platform business. Obviously, Nokia believes that making the software royalty-free and open source is the key to future mass adoption. The team at Telco 2.0 disagree, and believe the creation of the Symbian Foundation will cure none of the governance or product issues going forward. Additionally, Symbian isn’t strong in the really important bits of the mobile jigsaw that generate the real value for the end-consumer, developer or mobile operator.

In this article, we look at the operating performance of Symbian. In a second article we examine the “openness” of Symbian going forward, since “open” remains such a talisman of business model success.


Symbian’s core product is a piece of software code that the user doesn’t interact with directly — it’s low-level operating system code to deal with key presses, screen display, and controlling the radio. Unlike Windows (but rather like Unix) there are three competing user interfaces built on this common foundation: Nokia’s Series 60 (S60), Sony Ericsson’s UIQ, and DoCoMo’s MOAP. Smartphones haven’t taken the world by storm yet, but Symbian is the dominant smartphone platform, and thus is well positioned to trickle down to lower-end handsets over time. What might be relevant to 100m handsets this year could be a billion handsets in two or three years from now. As we saw on the PC with Windows, the character of the handset operating system is critical to who makes money out of the mobile ecosystem.

The “what” of the deal is simple enough — Nokia spent a sum of money equivalent to two years’ licence fees buying out the other shareholders in Symbian, before staving off general horror from other vendors by promising to convert the firm into an open-source foundation like the ones behind Mozilla, Apache and many other open-source projects. The “how” is pretty simple, too. Nokia is going to chip in its proprietary S60, and assign the S60 developers to work on Symbian Foundation projects.

Shareholding Structure

The generic problem with a consortium is that its members are typically not equal and almost certainly have different objectives. This has always been the case with Symbian.

It is worth examining the final shareholder structure which has been stable since July 2004: Nokia - 47.9%, Ericsson - 15.6%, SonyEricsson - 13.1%, Panasonic - 10.5%, Siemens - 8.4% and Samsung - 4.5%. At the bottom of the article we have listed the key corporate events in Symbian history and the changes in shareholding.

It is interesting to note that: Siemens is out of the handset business, Panasonic doesn’t produce Symbian handsets (it uses LiMo), Ericsson only produces handsets indirectly through SonyEricsson, and Samsung is notably permissive towards handset operating systems.

SonyEricsson has been committed to Symbian at the top end of its range, although it has recently added Windows Mobile for its Xperia range, targeted at corporates.

Nokia seems almost entirely committed, though it has recently purchased Trolltech — a notable fan of Linux and the developer of Qt.

The tensions within the shareholders seem obvious: Siemens was probably in the consortium for pure financial return, whereas for Nokia it was a key component of industrial strategy and cost base for its high-end products. The other shareholders were somewhere in between those extremes. The added variable was that Samsung, Nokia’s strongest competitor, seemed hardly committed to the product.

It is easy to hypothesise that the software roadmap and licence pricing for Symbian were difficult to agree, and that is before considering the user interface angle (see below).

Ongoing Business Model

Going forward, Nokia has solved the argument over licence pricing — it is free. Whether this is passed on to consumers in the form of lower handset prices is open to debate. After all, Nokia somehow has to recover the cost of an additional 1,000 personnel on its payroll. For SonyEricsson, with its recent profit warning, any improvement in margin will be appreciated, but this doesn’t necessarily mean a reduction in pricing.

It also seems obvious that Nokia will control the software roadmap going forward. It seems to us that handset vendors using Symbian will be faced with three options: free-ride on Nokia; pick and choose components and differentiate with self-built components; or pick another OS.

We think the chosen licence (Eclipse — described in more detail in the next article), the history of Symbian user interfaces, and the dominance of Nokia all point towards other handset vendors producing their own flavours of Symbian going forward.


Nokia may have bought Symbian, even without competitive pressures, purely to reduce its own royalties. However, the competitive environment adds an additional dimension to the decision.

RIM and Microsoft are extremely strong in the corporate space, and both share two strengths where Symbian is currently extremely weak: they excel at synchronising with messaging and calendaring services.

Apple has also raised the bar in usability. This is something the Symbian OS itself has steered clear of, and it is certainly not one of the strengths of S60, the Nokia front end. The wife of one of our team — tech-savvy, tri-lingual, with a PhD in molecular biology — couldn’t work out how to change the ringtone, and not for lack of trying. What do you mean it’s not under ‘settings’? Some unkind tongues have even speculated that the S60 user interface was inspired by an Enigma Machine stolen to order by Nokia executives.

Qualcomm is rarely mentioned when phone operating systems are talked about, and that is because it takes a completely different approach. Qualcomm’s BREW would be better classified as a content delivery system, and it is gaining traction in Europe. Two really innovative handsets of last year, the O2 Cocoon and the 3 Skypephone, were both based upon Qualcomm software. Qualcomm’s differentiator is that it is not a consumer brand and develops solutions in partnership with operators.

The RIM, Microsoft, Apple and Qualcomm solutions share one thing in common: they incorporate network elements which deliver services.

Nokia is of course moving into back-end solutions through its embryonic Ovi services. And this may be the major point about Symbian: it is only one, albeit important, piece of the jigsaw. Meanwhile, as we’ve written before, Ovi remains obsessed with information and entertainment services, neglecting the network side of the core voice and messaging service. Contrast this with Apple’s opening move, Visual Voicemail.

As Jim Balsillie, co-CEO of RIM, said this week: “The sector is shifting rapidly. The middle part is hollowing — there are cheap, cheap, cheap phones and then it is smartphones to a connected platform.”

Key Symbian Dates.

June 1998 - Launch with Psion owning 40%, Nokia 30% & Ericsson 30%.
Oct 1998 - Motorola Joins Consortium

Jan 1999 - Symbian acquires Ronneby Labs from Ericsson and with it the original UIQ team & codebase.

Mar 1999 - DoCoMo partnership

May 1999 - Panasonic joins Consortium. Equity Stakes now: Psion - 28%, Nokia / Ericsson / Motorola - 21%, Panasonic - 9%.

Jan 2002 - Funding Round of £20.75m. SonyEricsson takes up Ericsson Rights.

Jun 2002 - Siemens Joins Consortium with £14.25m for 5%. Implied Value £285m

Feb 2003 - Samsung Joins Consortium with £17m for 5%. Implied Value £340m.

Aug 2003 - Five-Year Anniversary. Original Consortium Members can now sell. Motorola sells its stake for £57m to Nokia & Psion. Implied Value £300m.

Feb 2004 - Original Founder Psion decides to sell out. Announces sale of its 31.7% for £135.5m, with part of the payment dependent on future royalties. Implied Value £427m. Nokia would have > 50% control. David Potter of Psion says total investment in Symbian was £35m to date, so £135.5m represents a good return.

July 2004 - Preemption of Psion Stake by Panasonic, SonyEricsson & Siemens. Additional Rights issue of £50m taken up by Panasonic, SonyEricsson, Siemens & Nokia. New Shareholding structure: Nokia - 47.9%, Ericsson - 15.6%, SonyEricsson - 13.1%, Panasonic - 10.5%, Siemens - 8.4% and Samsung - 4.5%.

Agree to raise cost base to c. £100m per annum and headcount to c. 1,200.

Feb 2007 - Agree to sell UIQ to SonyEricsson for £7.1m.

June 2008 - Nokia buys the rest of Symbian at an Implied Value of €850m (£673m), with approximate payouts of: Ericsson - £105m, SonyEricsson - £88.2m, Panasonic - £70.7m, Siemens - £56.5m and Samsung - £30.3m. Note, Symbian had net cash of €182m. The price quoted by Nokia of €262m is the net price paid by Nokia to buy out the consortium, not the value of the company.


