" /> Telco 2.0: November 2008 Archives


November 26, 2008

Videos from Telco 2.0 event, November 2008, London

A big thank you to Telecom TV for their continued support of the Telco 2.0 ‘executive brainstorm’ series. We’ve already communicated some of the general sentiments coming out of the event here. Now videos of some of the stimulus presentations are available on the Telco2.0TV site (here).

Below is an initial selection of our favourites: Amazon’s ‘flywheel’ for growth, the two-sided telecoms Business Model, Telephony 2.0, Vodafone’s new open mobile internet platform, Long Tail de-bunked, Turkcell walking the talk, Impact of the Credit Crunch on investor sentiment, Clarion call for Change:

First, Dr Werner Vogels, CTO of Amazon, describes Amazon’s platform business model and its ‘flywheel’ for growth (see analysis of the key lessons for telcos here):

Chris Barraclough, co-founder of the Telco 2.0 Initiative, describes how the ‘two-sided’ telecoms business model works, and what’s needed to turn it into reality:

Martin Geddes, Chief Analyst of the Telco 2.0 Initiative, articulates the untapped opportunity for growth in the core telephony business (more on this here):

Pieter Knook, Director of Internet Services, Vodafone Group, unveils his vision for a ‘next generation open mobile internet platform’:

Cenk Serdar, Chief VAS Officer, Turkcell on how his company is blazing a trail with identity and banking services.

Evidence that challenges the Long Tail theory (more on this here).

James Enck, Senior Associate Analyst at the Telco 2.0 Initiative on the impact and opportunity of the Credit Crunch for telcos (more on this here):

And a stirring speech by Trudy Norris-Grey, MD Strategy & Transformation at BT Wholesale, challenging the telecoms sector to make the change needed to realise the $375bn Telco 2.0 growth opportunity.

Many more videos and interviews from the event, featuring Cisco, Telecom Austria, Alcatel-Lucent, BT, Sicap, Phorm, Nokia Siemens Networks, Openet, Habbo Hotel, Intel and SAP, are available (here).


Telco 2.0 reader survey - help us improve our value to you

Dear Reader,
We would really appreciate your input to refine the direction of the Telco 2.0 Initiative’s research programme for the next 12 months.

Please give us your input using the short online questionnaire here. (It takes just 5 minutes).

An incentive? Apart from better research output from us, we’ll donate up to £1,000 to Oxfam this Christmas. £1,000 will buy 40 goats for communities in poor countries or care for 20 vulnerable kids. We’ll give the equivalent of a goat’s worth to Oxfam for every 10 of you who complete our reader survey. Thank you in advance for stimulating our generosity as well as our insight.
(Full link to survey here: http://www.surveymonkey.com/s.aspx?sm=BJHNA2inDLokplN0PpEBGA_3d_3d).


Use Case: Telco customer data for advertising in OTT video

In our upcoming report on Two-Sided Business Model Use Cases in Advertising and Marketing (part of a new Advertising & Marketing 2.0 research practice), we examine more closely how telcos can leverage their data assets to improve their positioning in the advertising and marketing value chain via a platform strategy.

As a taste of our ongoing research, we presented a short example from the over-the-top video world at the 5th Telco 2.0 executive brainstorm at the beginning of the month, to test out some of the concepts with the audience. Interestingly, the presentation seems to have resonated with participants, who were asked to rate current telco strategies for supporting the advertising and marketing industries.

They responded with an overwhelming bias towards “weak”. Commentary in the feedback session included observations that telcos are in an ideal position to add insight into their customer base, but have failed to data-mine properly or develop monetization strategies. The audience also identified three next steps for the industry: understand the data, respect the customer, and open the system.

enck-eventNov08-1.png

Below is a use case scenario for telco customer data in the context of an over-the-top video and advertising platform:

The event participants’ feedback really goes to the heart of our approach. A central tenet of our work around the Two-Sided Business Model is that telcos could, and should, do much more with customer data beyond the purely operational contexts in which it has been used historically (see article on the Customer Data Revolution).

This requires that telcos understand not only what data they have, but also what significance that data may have for an upstream partner or customer. This may not always be obvious, and indeed, much of this data has been viewed by telcos historically as the by-product of delivering connectivity, voice and data - the wood shavings on the workshop floor, left to be swept away at the close of business. The opportunity, as we see it, is to harvest, analyse and put that data to work.

First of all, we need to briefly revisit what data assets we are talking about here. As frequent readers may recall, we have previously defined these along eight category lines, illustrated below. Each of the eight is of varying value to the telco in provisioning and maintaining the services underlying the customer relationship; but how might an upstream partner or customer view the same data, and what value might it hold for them?

enck-eventNov08-2.png

The answer will vary depending on the objectives of the upstream player in question, but broadly speaking, we have defined seven questions which upstream customers might commonly seek to answer, and aligned against them telco protocols and assets which might hold the answers, or at least give additional insight as to what we can infer about likely answers.

enck-eventNov08-3.png

Because the range of potential platform customer needs is so vast, let’s narrow the discussion down and look at a specific potential upstream customer of telco data assets and consider what it might seek to gain from having access to them. We have chosen to look at a theoretical use case involving Move Networks.

Move Networks is a venture-backed private company based outside Salt Lake City, Utah, in the U.S.A. With investors including Comcast Interactive Media, Cisco, Microsoft, and Grupo Televisa, and clients including ABC, Fox Networks, Televisa (see interview here) and ESPN.com, Move Networks might be considered a poster-child for how traditional media and players in the distribution value chain will seek to harness and monetize internet video in future. Move Networks’ offering to content publishers is in fact both a distribution and advertising platform, with a number of differentiating facets:

  1. Upon content ingestion, content is split into many smaller segments, with each segment encoded at a variety of bit rates to suit publisher targets for viewer experience across a range of access technologies (dial-up, broadband) and end user devices (mobile, computer, television). This content is stored on multiple standard HTTP servers rather than proprietary media servers.
  2. Rather than a dedicated desktop client or media player, Move Networks uses a lightweight browser plug-in, the presentation of which varies depending on content partner, and can be branded and re-skinned as required. The Move Media Player plug-in constantly provides the HTTP servers with a view of the network conditions, throughput and latency at the client side, allowing the servers to seamlessly synchronize and insert lower or higher bit-rate content segments dynamically, as required (a rough sketch of this adaptation loop follows the list). This is Move’s solution to buffering, and reflects the company’s strategy to satisfy traditional media partners’ demand for a broadcast-like experience in consistent near-HD quality - a key consideration given that some of its media customers are offering content on a subscription basis.

  3. Given that the media player plug-in reports a variety of information back to the servers, Move has the ability to track and analyze viewing histories, day-part viewing patterns, abandonment rates, pause patterns, etc., which provides the content publishers and media buyers with a granular view into viewer behavior. This can be useful in designing and refining programming, and particularly, advertising formats. The Move player is also able to return high-level geo-location data about viewer location, and presumably high level browser data such as language settings and operating system.
  4. Beyond the granular behavioral analysis which Move affords advertisers, it has the added advantage of being “TiVo” proof, in the sense that viewers cannot skip ads. For “must-see” content, this gives advertisers the confidence to sponsor entire television episodes, as was the case with the original “soap opera” format in the early days of radio. One demo we have seen of Move advertising capability includes a “sponsored intermission” for Toyota, with one frame in the window reserved for an ad avail for the local Toyota dealer, as inferred from the viewer’s geo-IP data.
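To make points 1 and 2 concrete, here is a minimal sketch, in Python, of the kind of segment-by-segment bit-rate adaptation described above. The bit-rate ladder, safety margin and thresholds are illustrative assumptions, not Move Networks’ actual algorithm.

    # Illustrative sketch only: segment-based bit-rate adaptation of the kind
    # described in points 1 and 2. The bit-rate ladder, safety margin and RTT
    # threshold are assumptions, not Move Networks' actual algorithm.
    BITRATES_KBPS = [300, 700, 1500, 3000]    # pre-encoded renditions of each segment

    def pick_next_bitrate(reported_throughput_kbps, reported_rtt_ms, safety_margin=0.8):
        """Choose the highest pre-encoded bit rate the client can sustain.

        The player plug-in periodically reports measured throughput and
        round-trip time; the next segment is then served from the matching
        rendition, so quality adapts without re-buffering.
        """
        effective_kbps = reported_throughput_kbps * safety_margin
        if reported_rtt_ms > 200:              # penalise long paths slightly
            effective_kbps *= 0.9
        candidates = [b for b in BITRATES_KBPS if b <= effective_kbps]
        return candidates[-1] if candidates else BITRATES_KBPS[0]

    # A client reporting 2.4 Mbit/s and 80 ms RTT gets the 1500 kbit/s rendition:
    print(pick_next_bitrate(2400, 80))         # -> 1500

The design point is that the intelligence sits at the edge: the servers remain plain HTTP servers, and the client’s own measurements drive which rendition of the next segment is fetched.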

enck-eventNov08-4.png

In some respects, Move’s offering would naturally be of interest to traditional content publishers and media buyers, as it delivers something akin to a conventional broadcast-like experience, but with a greatly enhanced behavioral tracking and analysis capability, which gives them the ability to continually experiment with and optimize advertising formats. Moreover, it theoretically allows advertisers to more accurately infer user preferences in pursuit of more finely tuned targeting.

For example, if Move’s analysis of a viewer’s history shows frequent views of sports or automotive-related content, late at night, in a city which is home to a number of universities, it may
be a decent bet to conclude that the viewer in question is a male student, and Move can then serve a suitable ad from its inventory.

Move’s platform is considered by many to be the envy of the industry, and it certainly seems to be further along the path of mainstream media world acceptance than many
of its competitors. So what, if anything, might a telco do for Move?

  1. Guaranteed QoS? - This is already the case to some extent, at least in the sense of traditional CDN services. Move has relationships with a number of CDN providers, among them two offerings from telcos AT&T and Level(3). However, we question whether any more is really needed within the individual telco networks (as in network-caching solutions). Given that the Move browser client is the intelligence which manages bit-rate dynamically, the current approach is probably acceptable for the foreseeable future, notwithstanding congestion of a very extreme and persistent nature.
  2. Co-marketing? - Not likely. In the absence of some revenue share agreement between telcos and publishers, there is no incentive for the telcos, and in the absence of any added value being delivered by the telcos, there is no incentive for publishers.
  3. Market intelligence? - Quite probably. As we mentioned above, at the moment Move actually only has a limited amount of information with which to infer viewer profiles. Undoubtedly, a bank of telco customer data parsed on a more granular basis (location data, payment and credit-related histories, calling patterns, web session data) could make Move’s targeting capability more sophisticated, which should translate into higher CPMs on its platform.

It’s important to note that we’re not talking here about data identifying individuals which is handed over to Move, at least not without a very robust and transparent opt-in system. Sensitivities over privacy aside, it may be, as we heard from one company in the analytics space recently, that for some upstream customers, too granular a dataset introduces too much complexity and negates the benefit of the data.

Rather, we think it might be sufficient for Move or a similar player to have access through telco APIs to anonymized customer segment data which defines customer attributes into buckets or categories, which could then be matched against both Move’s own analysis and third party data sources as appropriate to an advertiser’s own targeting criteria. This approach could form the first step towards a telco-specific Mosaic of sorts.

Specifically, we think the following telco data assets would be relevant to Move Networks’ targeting efforts, while keeping customer data in the non-controversial arena with regard to privacy (please note we have excluded mobile data from this as Move currently is only compatible with Internet Explorer 6 and 7, and Firefox 2 and 3, which makes mobile largely a non-issue for the immediate future - though a range of mobile data assets would definitely be attractive to a comparable player with a mobile delivery capability):

Product data - A telco API could allow the Move player to query a database of anonymized telco customer data sorted by product mix. Depending on the targeting criteria defined by Move or its advertisers, the range of product choices by telco customers can inform ad targeting decisions. What proportion of the customer base is top/mid/low tier? What proportion takes multiple services? What proportion makes use of advanced options like voicemail, ring-back, etc.? What proportion are pay TV households? Of those, how would the average channel bouquet be classified? What proportion of the customer base has made service changes recently (upgrade, downspin, or cancellations)? Married with other telco datasets such as location, this manner of segmentation could be very interesting to an advertiser in defining audience segments by inferring “wallet size”, technology adoption, media consumption and other attitudes and preferences, as it might fill in some information blanks left by other targeting tools.

Session/call-flow data - For advertisers to whom factors such as sociability and “connectedness” are relevant, telco data is a treasure trove. For the entire customer base, or for specific sub-segments as defined by other telco datasets, what is the ratio of inbound to outbound calls? How large is the average household’s calling circle? What is the frequency and duration of calls? What proportion of the customer base could be considered above average in calling habits? What is the time distribution of calls? What proportion of the broadband customer base could be regarded as “super-connected”? From each of these questions, which telcos can answer quite definitively, advertisers and brand managers can make inferences about sociability and “connectedness” which inform ad targeting decisions, particularly if cross-referenced with other datasets, both telco and non-telco in origin.

Payment history - Again, depending on the criteria defined by the advertiser, insight into the telco customer base’s payment history and relationship with the telco itself can potentially yield valuable insight, particularly when linked to other subsets of data. What proportion of households pay by monthly direct debit vs. manually or by phone? What is the timeliness/reliability profile of the customer base or sub-segment? What are the correlations between delinquency rates and product mix? How has that trend changed over time? All of these can be mixed with other criteria, such as location, to triangulate customer segment profiles.

Loyalty measures - Similar to payment history, just as brand advertisers might be keenly interested in targeting based on attributes around ability and willingness to pay, loyalty is another key metric. Churning from telco services is much more painful for the consumer than changing brands of soap, and thus may not be comparable data from the view of some advertisers. However, as part of a broader consumer profile, data around length of customer relationship, attempted churn/churn pre-emption/win-back history and churn risk level (both as defined by telco CRM software) would no doubt be of interest to advertisers in other industries where the pain of churn is high for the customer, but equally devastating for the supplier (banks, insurers, utilities).

Location data - Married with all of the above, more specific location data is probably an El Dorado for a company like Move Networks. A telco API could allow the Move player to query a database of anonymized telco customer data by local exchange. This could then be referenced against either third party data (census, credit scoring, lifestyle surveys) or other internal telco datasets (product mix, CDRs, broadband usage intensity, payment history, loyalty measures) to create customer segment profiles by location, which could then be included in the targeting decision algorithm. For example, the output to the targeting algorithm might effectively say (a rough code sketch of this kind of lookup follows the list):
  1. “50% of customers connected to exchange #2008 exhibit high sociability levels (they make and receive lots of calls);
  2. 40% are classified as high spenders (they take top tier or multiple services, or recently added or upgraded services);
  3. 30% are super-connected (they spend significantly more time online than the average for this area);
  4. 70% are credit-risk code green (they’ve never missed or been delinquent with a payment);
  5. 35% are classified as brand loyal (they’ve been with us for more than three years).”
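To make this concrete, below is a minimal sketch, in Python, of how an upstream player might query a hypothetical telco segment API for exactly this kind of anonymized, exchange-level profile and fold it into an ad-selection decision. The endpoint, attribute names and weights are assumptions for illustration only; they do not describe any real operator API.

    # Illustrative sketch: querying hypothetical, anonymized telco segment data
    # by local exchange and using it in an ad-targeting decision. The endpoint,
    # attribute names and weights are assumptions, not a real operator API.
    import json
    from urllib.request import urlopen

    TELCO_SEGMENT_API = "https://api.example-telco.com/v1/segments"    # hypothetical

    def exchange_profile(exchange_id):
        """Fetch aggregate attribute buckets for one local exchange, e.g.
        {"high_sociability": 0.50, "high_spend": 0.40, "super_connected": 0.30,
         "credit_green": 0.70, "brand_loyal": 0.35} - no individual is identified."""
        with urlopen(f"{TELCO_SEGMENT_API}/exchange/{exchange_id}") as resp:
            return json.load(resp)

    def choose_ad(profile, inventory):
        """Pick the ad whose advertiser-defined criteria best match the profile."""
        def score(ad):
            return sum(profile.get(attr, 0.0) * weight
                       for attr, weight in ad["criteria"].items())
        return max(inventory, key=score)

    inventory = [
        {"name": "premium_auto_ad", "criteria": {"high_spend": 1.0, "credit_green": 0.5}},
        {"name": "student_broadband_ad", "criteria": {"super_connected": 1.0, "high_sociability": 0.5}},
    ]

    # profile = exchange_profile("2008")           # the exchange from the example above
    # print(choose_ad(profile, inventory)["name"])

Only aggregate proportions ever leave the telco in this sketch; any matching against an individual viewer would happen on the platform side, using its own session data.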

The ad ultimately served will depend on a wide variety of criteria stipulated by the advertiser or media buyer, but we believe there is no doubt that the non-invasive, anonymous telco customer data employed in generating this decision has immense value to the decision makers.

There are, however, many open questions surrounding this use case, which we are grappling with and hope to answer in the final version of our report. For example, would advertisers be interested in this sort of data if it only related to one operator in isolation? Our view is probably not. This raises the question of whether telcos need to federate their data to provide as representative a sample as possible by country.

This, in turn, raises the question of whether the new data platform should be outsourced to an independent entity, be that an existing player in the data management space or a newco joint venture between telcos. We welcome your comments, criticisms and suggestions on this work in progress, as we firmly believe it should involve those who live and breathe the complexities of telco reality on a daily basis.

[Ed. - The Use Case Report will be available as a large report in Feb 2009, or available in installments via the new Telco 2.0 Executive Briefing subscription service].


November 25, 2008

Amazon CloudFront: Lessons for telcos in content distribution

Following our articles on the Future of Online Video and on Amazon’s platform business model, we look here at another interesting product from Amazon - CloudFront - to tease out some more lessons for telcos:

Everyone’s impressed by the low low prices Amazon Web Services is offering for content delivery from its new content delivery network (CDN), CloudFront. Heavy users might pay as little as $0.09 a GB! This feeds into another story — the paradox of CDNs’ increasing importance, but decreasing profitability. Nobody seriously considers working with heavy traffic sources like video without using a CDN of some description. This is chiefly because their importance has led to a wave of new entrants - both VC-funded startups and telcos who integrate CDN capability in their networks - and a price war. Surely, Amazon’s entry to the market must mean a further wave of price-slashing?

But there are reasons to suggest that CloudFront is considerably less revolutionary than it sounds:

That’s not a heavy user - this is a heavy user!

For a start, there is one significant exception as regards CDN profitability - Akamai Technologies, the original CDN operator, is making more money than it ever has done. And this is quite likely to remain the case, even with Amazon offering low, low prices — as Dan Rayburn points out, what Amazon considers a heavy user (over 150TB/month) is nowhere near as heavy as what Akamai considers a heavy user (closer to 800TB/month).

Further, the only way CloudFront accepts requests is as HTTP GET requests; we tend to describe things like YouTube as “Web video”, but it would be more accurate to say that they are video wrapped in the Web. In fact, the actual streaming occurs between the Flash application running in your browser and the video server, using their own streaming protocol. This is how PlusNet were able to characterise iPlayer traffic so easily. Video streamers won’t get much benefit from CloudFront yet - it is possible to stream over HTTP, but it’s noticeable that nobody does when they have any other options. By comparison, Akamai (and friends) positively specialise in handling seriously large volumes of music and video.

All CDNs are not the same

In effect, CloudFront isn’t a CDN so much as a mirror server, the old-fashioned way of distributing static files. CloudFront would be a great way to deliver images, software distributions, or video for download, as long as you weren’t planning to do Akamai-like volumes and you weren’t particular about latency. Going by the published list of CloudFront server locations, it seems that Amazon has opted for a deployment strategy that places servers at major Internet exchange points, rather than pushing the CDN further down into the ISP infrastructure. For example, the European locations match the top three IXen - LINX, AMSIX and DECIX, and the North American ones match up with PAIX, NAP of the Americas, Equinix Ashburn, 111th St NYC, and the two in Seattle. This is cheaper in every way, but it faces fundamental limits.

Here’s a BGPlay visualisation of an Amazon Web Services prefix from their European data centres:

AWSEuNetwork.png

Obviously, content from a macro-CDN like Limelight or CloudFront must transit your internal network to reach the access network. Further (as Akamai pointed out at last autumn’s Telco 2.0 Executive Brainstorm), as bandwidth increases, round-trip latency accounts for a bigger chunk of the time it takes a particular chunk of content to arrive, making CDNing more important. The haul from the CloudFront server in London could easily be of the order of 400 miles for subscribers in the UK and Ireland; as UK webhosting is so concentrated in London, this is arguably little better than naive serving.
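The arithmetic behind the latency point is worth making explicit. The numbers below are illustrative, not measurements: as access bandwidth grows, the time a chunk spends on the wire shrinks, so a fixed round-trip time becomes a proportionally larger share of total delivery time.

    # Illustrative arithmetic: the share of delivery time attributable to a fixed
    # round trip as access bandwidth grows. Figures are examples, not measurements.
    CHUNK_MB = 1.0      # size of a content chunk
    RTT_MS = 30.0       # round trip to a distant CDN node

    for mbps in (8, 20, 50, 100):
        transfer_ms = (CHUNK_MB * 8 / mbps) * 1000     # serialisation time only
        total_ms = transfer_ms + RTT_MS                # ignoring TCP slow start etc.
        print(f"{mbps:>3} Mbit/s: transfer {transfer_ms:6.0f} ms, "
              f"RTT share {RTT_MS / total_ms:5.1%}")
    # At 8 Mbit/s the 30 ms RTT is ~3% of the total; at 100 Mbit/s it is ~27%,
    # and that is before counting the extra round trips TCP needs to ramp up.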

Akamai’s deployment strategy is the opposite - they endeavour to put content servers as far into the eyeball ISPs’ networks as possible, so as to shorten the round-trip time and save traffic on metro networks as well as backbone networks. Also, Amazon doesn’t seem to be planning a Google-like peering strategy, as its Internet presence is entirely accounted for by its transit providers (NTT America, Level(3), Qwest and Tiscali). Compare the list here, for Amazon, and Akamai’s European network.

Here’s a similar visualisation of Akamai’s London network:

AKALONnetwork.png

Cache is king

Further, it doesn’t yet provide for dynamic content (like enterprise applications, gaming, user generated content, e-commerce etc), where most of each page has to be generated by an application on request. As CloudFront requires the use of Amazon S3 storage, we’re obviously thinking in terms of all-Amazon Web Services development here.

Consider the case where some application has a Web server running as an Amazon EC2 instance. Now, the server could also be an application server itself, or it could be the frontend to an app server running elsewhere in EC2. It doesn’t matter. When the server receives a request, it has to either process it or call the app server and wait for the callback to return, depending on the details. But once it has the response, it would have to then write it to an S3 bucket, and then call the CloudFront API to register the new content, handle the response, remake the Web page accordingly, and then do the same process again to put the page on the CDN, and redirect the user there.
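A rough sketch of that round trip, using stub functions in place of the real S3 and CloudFront API calls (the actual AWS SDK calls differ; these names are assumptions for illustration), shows how many steps sit on the request path:

    # Illustrative sketch of the workflow described above. The helpers are stubs
    # standing in for real S3/CloudFront calls (assumptions, not the AWS SDK);
    # the point is simply how many steps, each with its own latency, are involved.
    def store_in_s3(bucket, key, body):                    # stands in for an S3 PUT
        print(f"PUT s3://{bucket}/{key} ({len(body)} bytes)")

    def register_with_cloudfront(bucket, key):             # stands in for a CDN API call
        print(f"register s3://{bucket}/{key} with the CDN")
        return f"http://d123.cloudfront.example/{key}"

    def handle_request(path, render):
        body = render(path)                                 # 1. app server builds the page
        key = f"pages/{abs(hash(path))}.html"
        store_in_s3("my-app-content", key, body)            # 2. write it to an S3 bucket
        cdn_url = register_with_cloudfront("my-app-content", key)   # 3. register it with the CDN
        return f"302 redirect -> {cdn_url}"                 # 4. send the user there

    print(handle_request("/catalogue", lambda p: f"<html>{p}</html>"))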

There are a lot of moving parts there, and a lot of latency-generating operations — and you have to use S3, a storage service, rather than Amazon’s message queue service. Akamai, however, has a whole specialised CDN (Edge Computing) devoted to dynamic content and application servers. The big CDNs also provide a lot of other functionality for your money - greater or lesser integration with your DNS, use of your own origin servers, and access to raw server logs or to data provided by their analytics program.

So if it’s not suited to streaming or dynamic content, it doesn’t reach down into the eyeball networks, and it doesn’t offer anything special in terms of analytics, what is it for?

Keep it simple, stupid

The first thing is that it’s a simple solution for simple problems. Got a startup? Are you yourself a startup? Got a credit card? You’ve got yourself a distributed Web-serving infrastructure. We can certainly imagine small software companies - small in the sense of one or two people - which typically have to put out lots of very big files whose downloads can easily break, finding this very useful. Another wonderful feature of all the AWS products is that the notion of “bursting” your bandwidth allowance is disposed of. Haven’t you noticed that popular sites stopped disappearing behind “Contact Our Billing Department - SomeISP.com” splash pages around the time S3 was launched?

The second is that it’s easy for Amazon. They have to have a presence at major IXen to keep their infrastructure going, and they have to push out really enormous numbers of graphics files to generate Amazon.com pages. Why not extend it a little, create yet another way to interact with their core infrastructure, and generate even more of those lovely transactions?

Conclusion: it’s another spin of the flywheel

The big take-away from this is that Amazon are following their classic “flywheel” strategy that we wrote about last week. They reduce prices and costs of entry to the “long tail” of merchants and application developers. Capabilities that only the “big boys” could access before are democratised. This drives volume across the whole of their platform infrastructure.

Whether CloudFront is profitable or even competitive against Akamai is almost a moot point; it’s like asking whether a particular robot in a Toyota car assembly plant is profitable. The question makes no sense in their strategic worldview. Amazon’s platform is more than the sum of its parts, and it just added an important new part. Telcos should not only learn from this, but also take heart - quality distribution is still an area with big opportunities for those who are willing and able to provide it.

[Ed. - the new Telco 2.0 report on the Future of Online Video Distribution - Fixing the Broken Value Chain is out in December.]


Monaco Media Forum and Ofcom conference: Telecoms opportunities abound

Clearly, we’re pleased to see the industry moving so enthusiastically to grapple with the realities of Telco 2.0 principles, but two valid questions to ask are: to what extent does this reflect the specific audience which Telco 2.0 attracts, and how well, if at all, do the challenges addressed by Telco 2.0 concepts map across other segments of the value chain?

Luckily, we received invitations to two interesting conferences which pleasingly echoed many of the same issues we are striving to address: the Monaco Media Forum and the OFCOM International Conference, for which our senior associate analyst, James Enck, was also asked to prepare a short paper on the challenges of funding next generation access.

In this article, we present some of the key takeaways and their relevance to the Telco 2.0 message. We’ve also gathered some must-see videos of key presentations:

Lubricating the digital economy

Both conferences came during a period of unprecedented turmoil and uncertainty, as the banking crisis evolves into a broader economic crisis. However, there were a number of positive messages regarding new opportunities for the broad ICT spectrum. Generally, we see these messages as broadly supportive of our view of the telco segment as a relatively defensive sector with a huge opportunity to innovate and emerge from the current economic crisis as a very different beast.

At the OFCOM event, Eric Bresson, the French Minister for Communications, made an inspiring speech in which he made a strong case for ICT as a magnet for investment and a driver of economic development and stability, and outlined some of the forward-looking steps which France is taking under its 2012 Digital Blueprint. Key points:
  1. ICT in Europe accounts for 25% of total economic growth today, and should represent around 30% of growth over the next five years.
  2. The increased contribution of ICT to the economy means that it will become even more indispensable as a driver of job creation and defensibility (e.g., you can’t “offshore” a datacenter or FTTH network).
  3. The promise of greater long-term economic competitiveness makes ICT one of the most attractive investments which national governments can make. The current turmoil gives us a great opportunity to respond with a digital revolution.
  4. France is taking ambitious steps with its 2012 programme, including universal access to government e-services, but also implementing a robust mechanism for citizens to manage their “data shadow” (a theme we have written about recently).
The Monaco Media Forum contained a number of similarly positive statements, particularly regarding the growing significance of mobile and online advertising:
  • A case study involving a European Cup tie-in campaign for Puma shows a 10:1 ratio of accuracy of consumer phone numbers in comparison to standard web registration forms, demonstrating the power of tying telco customer identity to lead generation and market intelligence applications. The same session contained some discussion of pouring telco data (CDRs) into the profiling process.
  • Publicis Group confirmed that its campaign planning for 2009 on behalf of many major clients contains allocations for mobile campaigns, for the first time ever. This was also echoed by AdMob, which has a number of brands in its stable committing to multi-million dollar mobile advertising campaigns in 2009.
  • A number of presenters alluded to the relatively low proportion of online in total ad spending (currently 6 - 7%) versus direct response marketing (17%), which points to huge upside if the medium can deliver the same level of relevance and granularity.
  • Maurice Levy, CEO and Chairman of Publicis Group, contrasted the dynamics of the dotcom crash with the current environment, predicting that offline budgets will be the first victims of cutbacks in this cycle, as advertisers favour the more granular analytics available in the online world.
  • Jeffrey Cole of USC’s Annenberg Center, in his wide-ranging and entertaining presentation, cited an important figure - that the average US household today spends $260 per month on telecom services which didn’t even exist a generation ago. Even at the poverty level, $180 average monthly expenditure demonstrates the extent to which telcos have created an essential feature of modern life.

The customer data dilemma

Despite the perception that they are formidable masters of data, the online players face the same set of dilemmas as telcos in the area of balancing stewardship of data with the need to drive forward with better audience definition, profiling and targeting. A number of presentations took the view that the online world and consumers are in the evolving process of reaching an understanding:
  • Jeffrey Cole observed acutely that the pressures on the content world force it towards ad-funded distribution models, which inevitably leads the consumer to the same compromise made with television in the 1950s - the cost of “free” content is advertising.
  • The distinction between the television compromise and the online situation is the perception of greater invasiveness in an online context. Our premise remains that over time, as ad content proliferates, consumers will trade a lower level of anonymity for more relevant content. The panel discussion at the MMF following Avinash Kaushik’s presentation (found in our next section) contained some interesting discussion of stewardship and strategies to promote trust, so that consumers will find this a trade they can live with.
  • Nikesh Arora of Google again underlined the data dilemma and possible compromises in both his MMF and OFCOM presentations. His central point on the issue was that there is a generational aspect at work which makes “millennials” much more relaxed about privacy, particularly with regard to the perceived benefits of self-expression. He also made the case that consumers have been giving away personal metadata via credit card use for decades with relatively little pushback. (He may have a horse in this particular race, but his underlying point is a valid one.)

The challenges of personalization/customization/optimization

There was no shortage of coverage of these issues, primarily from the perspective of advertising, but we all acknowledge the importance of the same to telcos, who have largely struggled here to date. This is particularly ironic given telcos’ direct and personal relationship with large numbers of customers. However, with the evaporation of traditional sources of growth, personalization, customization, and optimization arguably matter more to telcos now than ever, so it pays to listen to the experiences and approaches of those engaged in this space. Innovators were not in short supply at the MMF event, and we were treated to a number of presentations by interesting companies, whose value proposition is entirely rooted in analyzing and organizing data in such a way as to make it more relevant, and thus more compelling, to the end user:
  1. Calvin Lui, CEO of Tumri, discussed his company’s approach to better ad performance through customization. Tumri’s technology deconstructs the various elements of online ad creative content (color scheme, call to action, special offer, background, etc.) and reconstructs presentation in response to its own analytical data and the target demographics identified by an advertiser, all in under 100 milliseconds. This allows advertisers huge latitude to experiment with various configurations and messages across multiple target groups and assess performance to continually optimize effectiveness.
  2. Amiad Solomon, CEO of Peer39, described his company’s take on improved targeting through semantic analysis. Peer39, whose team includes the original developers behind AdSense, seeks to make online advertising content more relevant by analyzing the entire semantic context of webpages, including comments sections and user forums. This holistic approach to context is quite different from a key-word based approach. After all, what good is serving ads against key words when the context may be entirely erroneous? Consider an extreme example: a website covering “drag racing in Queens, New York” - for which a keyword approach might serve an ad related to “drag queens in New York,” which would most likely be unwelcome to the target audience.
  3. Dave Sifry introduced Offbeat Guides, which embraces personalization throughout. Offbeat Guides are personalized travel guide books, each unique to the individual traveller, put together by the travellers themselves on the company’s site from a wide variety of sources on the web, and customized along a number of parameters such as date/season of travel, point of departure and various flavours of special interests and preferences. Customers may buy either a .pdf version, physical print version, or both, and soon Offbeat Guides will be available in HTML format to allow easy viewing on mobile devices.
  4. Avinash Kaushik, head of analytics at Google, presented a very well-received talk entitled “Experiment or Die.” It was an interesting mixture of personal experience and observations from the analytics world, coupled with cautionary statements towards old media that it must find and hire talented people who understand analytics and optimization, or face the consequences. He discussed a number of experiments which various sites have carried out on presentation, advertising footprint (interestingly, one example achieved a 40% uplift in ad revenues by reducing the ad footprint by 25% - see the quick calculation after this list), and pricing structures. His main messages, as we interpret them, were: companies have to experiment in order to learn and improve, defining the right performance metrics is key (website “hits”, he said, are “How Idiots Track Success”), and the company has to free itself from constraints placed on it by HIPPOs (“Highest Paid Person’s Opinion”) in order to innovate.
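One of those figures is worth a quick back-of-the-envelope check, using only the two numbers quoted above: a 40% revenue uplift from a 25% smaller ad footprint implies that the yield per unit of ad space nearly doubled.

    # Quick arithmetic on the experiment cited above: 40% more ad revenue from
    # 25% less ad footprint implies yield per unit of ad space rose about 87%.
    revenue_after = 1.40        # revenue relative to before the change
    footprint_after = 0.75      # ad footprint relative to before the change

    yield_change = revenue_after / footprint_after - 1
    print(f"Yield per unit of ad space: {yield_change:+.0%}")    # -> +87%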
Adapting the old to the new and remaining relevant

This was really at the heart of both conferences, from different perspectives. For the OFCOM audience, largely regulators and their ecosystem, the challenge is to adapt to rapid technology change and remain relevant to the market in that context. For the old guard among the MMF participants, the challenge to adapt and remain relevant to shifts in media definition and consumption has never been more pressing. One moderator in a breakout session noted that, at many media conferences, the old guard broadcasters and new media types tend to congregate along tribal lines, and in many cases don’t speak the same language, when in fact their interests are probably more aligned now than ever before:
  • James Murdoch’s star turn in a solo interview touched on many of the themes of tension between old media and new media. In particular, his comments about not using the web to simply replicate newspaper content online, but rather to engage the audience and drive consumption of the core product, resonated quite strongly.
  • Ben Silverman from NBC Universal and Mark Thompson from the BBC touched on similar themes with regard to online video. Mr. Silverman stated that Hulu is 50% ahead of where NBCU originally expected viewing to be, and both he and Roma Khanna (representing NBCU at the OFCOM conference) discussed the demonstrated potential of online video to actually boost, rather than depress, viewing of “destination TV” content. In other words, online and offline are not necessarily locked in a binary relationship, at least not when executed with some joined-up thinking.
  • Maurice Levy from Publicis Group made some very interesting statements regarding customer-centric advertising and brand evangelism in his panel with Niklas Zennstrom. In particular he alluded to the case of a major client which has created over 600 TV ads (a lot in television terms), as against over 20,000 which have been created by individuals. Whether these will always be aligned with the brand values envisaged by the company is a concern, and companies are increasingly having to grapple with the balance between retaining control and letting the consumer “own” the brand.
  • The contentious issue of piracy and the role of telcos and ISPs in combating it was never far away in either conference, though more so at the OFCOM event. One interesting anecdote which arose from the MMF music breakout panel was that of TDC, the private-equity-controlled Danish incumbent. Faced with commercial pressure from utility FTTH projects, and legal pressure from rights group KODA over file sharing, TDC earlier this year took a very clever decision. In a perversion of the sender-pays model, TDC struck a deal with KODA and a large number of music labels to pay the equivalent of what KODA collects each year, in return for unlimited access to music for its customers via legal download and a cessation of legal pressures against TDC and its customers. So, TDC pays the original sender to become the new sender, thus avoiding having to police its customer base, delivering a generous-looking gift to the end user, and gaining some ground on customer retention (the DRM wipes the content if the customer leaves TDC).

Big picture takeaways

We appreciate that this article represents a lot to absorb, and we hope that many of the key messages are self-evident, but, if not, our view is that there is much here for telcos to interpret positively:
  1. Connectivity and interactivity are definitively embedded in the lives of consumers as indispensable elements in a very uncertain and volatile world. Consider what that could, and should, mean for your business, and how you will capitalise on it.
  2. The media world, far from being an enemy or parasite, will share many of the same challenges as telcos, as broadband becomes more ubiquitous and advertisers/content publishers grow more selective and sensitive to the economic realities at hand. Your opportunity is to cross the conceptual and industry language barriers to define how you can help - because it seems clear that you have customer and data assets which could be used, responsibly, to the benefit of both parties.
  3. Take the chance to learn from the approaches of innovators in analogous parts of the value chain. Just because you’re not directly involved in web analytics, site optimization, or the travel guide publication industry doesn’t mean that you can’t gain something from the experiences of people who are.

Now, of all times in living memory, is the time to think beyond the traditional comfort zone.

It’s been done by others; just look at Monty Python.


November 24, 2008

Transactions: telcos’ future in one word

As well as customer data, a huge theme at the last Telco 2.0 event was transactions. So many of the delegates were interested in how telcos can facilitate them, earn money from them, bill for them, secure them, authenticate them… we’ve said in the past that the future of value added services, at least, is in creating a huge transaction processing platform. For the first time, we had an excellent practical example.

Amazon_CTO_Werner_Vogels_Flywheel.JPG

Amazon.com CTO Werner Vogels’ presentation was an inspiring example of just what can be achieved by understanding both transaction processing, and the business model you need to generate the transactions.

Recap: Economics of Telco 2.0

To begin with, let’s go back to some Telco 2.0 economics. We often talk about “trading hubs”. This can sound like empty business school jargon. But they are real, and they have some well-characterised properties. Specifically, they emerge in markets which exhibit high transaction costs, usually due to asymmetric information. And they tend to show increasing returns to scale.

Our standard examples are stock exchanges, which exist because otherwise, the cost of trading stocks would be very high indeed. (Imagine trying to find another individual to buy your shares in company X without one.) Further, asymmetric information would make it very risky to trade in securities at all; without an exchange, you’d be forced to accept whatever price a buyer you found offered. There would be a very good chance of getting ripped off. But if you could go to any one of many other brokers, there would be little point in trying to buy or sell at a price very different to that prevailing. That given, it turns out that the exchange works better the more people use it; liquidity goes to liquidity.

Increasing returns mean there is no going back

Another standard example is a container port; since containerisation at the end of the 1960s, more and more of world trade has moved through a small set of very large ports known as load centres. These emerged because the new and extremely expensive ships had to go where there was enough trade to fill their capacity. If there was more traffic in Rotterdam than Felixstowe, that would be where the ships went, and therefore anyone wanting to ship goods would get more choice of routes and times and better rates there. So the volume of trade there would grow, and that in itself attracted more ships. A very similar process occurred with the emergence of Internet exchange points at the end of the 1990s.

These principles are astonishingly powerful; one of their most important consequences is that transient changes at the right moment can have dramatic long-term impact. Paul Krugman won the Nobel prize for economics for applying them to international trade and economic geography. New York, for example, was a port not much bigger than Boston or indeed any of the other ports of New England until the construction of the Erie Canal. The canal was only used seriously for a few years, but it immediately made New York the leading port of the Northeastern U.S., which also made it the industrial centre of the region. As soon as the railway was built, traffic on the canal fell away - but because of the canal, the railway was built to New York, and the rest we all know. Activity attracts activity, in a self-similar process of growth.

In our own industry, there is no shortage of examples of this — there is, after all, only one Google, only one iTunes, and only one Amazon.com. Similarly, the US only had one AT&T before the government decided that was one too many; since then, however, the RBOCs have been doing their level best to return to the days of one Ma Bell.

Amazon - building a giant platform

Amazon.com is founded on a very similar insight. Vogels describes the two main drivers of the business as selection and low prices (selection here might be better put as “choice” or “range”). By offering much more selection — aiming to be the world’s biggest catalogue — Amazon makes it much more likely that transactions will occur. This attracts customers, which further fuels the process.

Egad! A flywheel with two sides!

But the crucial step was becoming two-sided — inviting other merchants to sell through Amazon. They were attracted by access to Amazon’s customer base and IT systems, Amazon by further additions to its selection of goods. Similarly, other merchants can also sell products from Amazon’s catalogue through their own Web sites, still using the same infrastructure. And until a transaction actually occurs, it’s free. It’s also free, both to Amazon and to the affiliate, to link to books (or anything else) on Amazon.com; but when a transaction happens and someone buys it, the affiliate makes a small but significant percentage. It pays to send Amazon more traffic. This “flywheel” drives growth in the volume of transactions; scale and operational excellence drive cost savings and hence low prices.

Telco%202.0%20-%20Werner%20Vogels.jpg

The upshot is not only Amazon’s success, but the fact that now there really isn’t much point competing with them directly, just as building a new container port next door to Singapore would be insane. Other major retailers — Marks & Spencer, to take an iconic and large example — now use Amazon’s IT systems to run their own businesses. (As a result, although not one cash register exists within Amazon, its software and business processes have to provide for them.)

This further suggests that telcos have to get cracking, before someone else cracks it and closes the opportunity off for ever. You can be certain that Amazon, Google, or someone we’ve never even heard of yet, will move faster than a telco industry standardisation group; you can be certain that four to six different and incompatible transaction platforms in each market will be a disaster.

The good news, however, is that we do have a historical example of how to build a huge platform business collaboratively - the major inter-bank consortia that operate card payments systems. Visa and Mastercard’s internal pricing was specifically designed so that it benefited each bank that joined and each merchant that accepted the cards, whilst also subsidising the issue of cards in order to maximise the customer base.

The LINK network of ATM operators in Britain originated with the UK’s small mutually-owned banks (called “building societies”). As their competitors, the national commercial banks (the clearing banks, in British parlance), installed more and more ATMs hoping to achieve nationwide coverage, the building societies were faced with a serious problem. As (mostly) small local institutions no one society could hope to offer national card service, and it could be expected that the clearing banks would exact a high price to let a small competitor use their system. However, precisely because their territories rarely overlapped, a society that joined LINK hugely increased its coverage at once. More members meant more value, and also helped spread the costs of the shared infrastructure. Eventually, it was the clearing banks who had to swallow their pride and join LINK.

LINK is also an interesting case study because it demonstrates that platforms behave differently in terms of competition law than almost anything else. It could be argued that the societies were illegally agreeing not to compete; it could also be argued that they were cross-subsidising ATM service to end users and therefore competing unfairly with the clearing banks. And in fact it was, as the banks made a last-ditch attempt to make LINK impose charges on users. But even if LINK was anticompetitive, the result was that any British card holder could use any British ATM without paying a fee, and you try arguing that’s against the public interest.

And, of course, there’s GSM roaming.

Your IT architecture is who you are

Another interesting feature of Vogels’ presentation was the way some principles of IT architecture pervade the company. In a sense, Amazon is the data model which describes its giant catalogue, customer base, and partner relationships. Its business strategy is based on creating as many ways of perceiving and interacting with this model as possible. Geeks call this a REST architecture (here’s an article specifically about REST and Amazon), or a Model-View-Controller system, depending on which kind and generation of geek you’re talking to. What they mean is that the core model is strictly separated from the ways its content is represented, added to, and interacted with. The aim is to make it as reusable and adaptable as possible, so it can be integrated with other systems, remixed to suit the user, given different user-interface skins. Exhibitor Infonova has built a next-gen BSS infrastructure around this “white label wholesale” model.
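As a toy illustration of that separation - a sketch of the pattern, not of Amazon’s actual systems - the same core model can be exposed through several independent representations:

    # Toy illustration of the model/representation separation (REST / MVC style).
    # One core catalogue model; each "view" is just another way of rendering or
    # addressing the same data. All names and data here are invented.
    import json

    CATALOGUE = {                                    # the core model: one source of truth
        "item-001": {"title": "Example Title", "price_gbp": 7.99},
    }

    def as_json(item_id):                            # machine-readable API view
        return json.dumps({item_id: CATALOGUE[item_id]})

    def as_html(item_id):                            # human-readable storefront view
        item = CATALOGUE[item_id]
        return f"<h1>{item['title']}</h1><p>£{item['price_gbp']}</p>"

    def as_affiliate_widget(item_id, partner):       # the same data, re-skinned for a partner
        item = CATALOGUE[item_id]
        return f"[{partner}] {item['title']} - £{item['price_gbp']}"

    print(as_json("item-001"))
    print(as_html("item-001"))
    print(as_affiliate_widget("item-001", "example-merchant"))

Adding another way to interact with the model - a partner widget, a mobile skin, a new API - does not require touching the model itself, which is the property being described here.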

Can your telco change its skin?

Can people do this with your company? In essence, a telco is a big pile of CDRs and certain specific technical capabilities, which we either address through E.164 telephone numbers or not at all. The aim of Telco 2.0, it is becoming increasingly clear, is to create as many ways as possible to interact with these core assets, so that as many transactions as possible happen, on which the telco can collect what Matt Bross of BT Innovate described as a “thin layer of value”.

However, it’s worth remembering that even if the layer of value is thin in terms of the user’s margin, it’s pretty thick in terms of the telco’s margin — the activity on the network that results is like HLR traffic rather than (say) streaming video, so the marginal cost of a transaction is extremely low. Like SMS, the margin for the telco could actually be very high. Using Turkcell’s Mobile Signature as a guide, we estimated that the total VAS opportunity worked out to 103,000 transactions a second across the world. The two questions you need to answer are “How can I maximise my share of those transactions?” and “How can I reduce the marginal cost of a transaction?”
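To give a rough sense of what a ‘thin layer of value’ at that volume could mean, here is a back-of-the-envelope calculation. Only the 103,000 transactions-per-second figure comes from the estimate above; the fee per transaction is purely an assumption for illustration.

    # Back-of-the-envelope scale check. 103,000 transactions/second is the estimate
    # quoted above; the fee per transaction is an assumed figure for illustration only.
    TX_PER_SECOND = 103_000
    FEE_PER_TX_USD = 0.01            # assumed "thin layer of value" per transaction

    per_day = TX_PER_SECOND * 86_400
    per_year = per_day * 365

    print(f"Transactions per day: {per_day:,}")                             # ~8.9 billion
    print(f"Revenue per day:      ${per_day * FEE_PER_TX_USD:,.0f}")        # ~$89 million
    print(f"Revenue per year:     ${per_year * FEE_PER_TX_USD:,.0f}")       # ~$32.5 billion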

Conclusion: customer intimacy through operational excellence

Platform businesses are a fine example of the first fundamental strategy — operational excellence. The product of a trading hub is usually very close to the homogeneous product of classical economics; trading stocks is very much the same in Frankfurt or Amsterdam. But unlike the classical economist’s world, rather than many equals competing, the effects of increasing returns to scale result in an oligopoly. Among the oligopolists, then, there is essentially no scope for the second fundamental strategy — new product. Pricing is restricted by oligopolistic price stability, and is anyway entirely dependent on operational excellence to cut the underlying costs.

Customer intimacy is deliberately abstracted out of the transaction platform: we’re leaving it to people who really know their customer to generate the transactions, and providing them with the tools. Note that Amazon.com works internally to build its customer relationships, but also provides the same tools to other parties. This means that not only do the upstream customers — merchants in Amazonspeak — benefit from Amazon’s knowledge of its customers, but Amazon benefits from the merchants’ superior closeness to their own customers.

Similarly, future Telco 2.0 carriers’ capabilities will be available on three fronts: to upstream customers, who use their own intimacy with their customers to generate transactions; to the telco’s internal service developers and marketers, who use its data assets to generate traffic for its own account; and to upstream customers who want to use the telco’s customer intimacy to generate transactions in their own businesses.

In this way, a crucial paradox is resolved. How can you have “customer intimacy” with telco-sized customer bases? The answer is simply that it grows out of operational excellence. And Amazon is the quintessential operational excellence company.


Ring! Ring! Hot News, 24th November, 2008

In Today’s Issue: Internet forecast wars on again; Odlyzko fights the nonsense; experimental high-def YouTube, and how to get it; BT: OFCOM ate my homework; Amazon’s CDN has landed; Telefonica wants a spaceship or two; T-Mobile UK is down; T-Systems blows the German secret service’s cover; VZW peeks at BHO’s CDRs; SearchWiki, another Google web-hoover; Ubuntu for mobiles; Lotus Notes for Nokia; Nokia and Yahoo!; Nokia and TD-SCDMA, possible faster Chinese rollout; HOWTO manage devices OTA in S60; GPS SIMs coming; Qualcomm’s WLAN LBS; CTIA fights for lucrative convict market; Clearwire-Sprint JV signed, shares tank; Indian consolidation coming; T-Mobile USA’s digiframe comes with data but no music; a cautionary tale about age verification.

It’s another round in the Internet traffic forecast wars. The vendors’ side last week published research claiming that a coming exaflood would lead to “Internet brownouts”; as TelecomTV points out, not only did they use identical language to everyone else who’s predicted this over the last 16 years, but, just as always, world authority Andrew Odlyzko disagrees and is probably right (his MINTS project claims that backbone traffic actually fell recently).

The access side is another matter; but then, the DSL operators’ problems aren’t so much that they can’t handle the traffic as that they can’t pay for it. The latest wave of disruption heading for the DSL industry is YouTube’s plan to start offering higher-quality - and therefore much more traffic-heavy - versions of its videos. According to Wired, they seem to be real H.264-encoded, 720p MP4 streams. Interested? Veteran hacker Jamie Zawinski has instructions on how to try it out before Google actually launches it. Some news sites are suggesting that you’ll be able to download as well as stream the videos - marginally better news for DSL backhaul provisioners.

Nobody has confirmed or denied that, but Firefox users can install this user script and do it anyway.

BT, meanwhile, is blaming OFCOM for slow progress on its fibre-to-the-yet-to-be-determined rollout.

Speaking of things content-delivery, Amazon.com’s Werner Vogels told the Telco 2.0 event they didn’t need to worry about an Amazon telco platform. Don’t play him at cards. Here’s AWS’s brand new cloud-based CDN, CloudFront. As usual with AWS, it’s all transaction-based pricing, web service APIs, and magic URLs; you pour data into an S3 bucket, call the CloudFront API to register the location of your data, and they fix their internal workings to send requests for your URL to the topologically-nearest server farm. US and European users can be served up to 10 terabytes of stuff for $170/TB.

It’s a curious paradox that CDN pricing is being driven through the floor, at the same time as CDNing has become absolutely essential for any Web service with serious numbers of users. The service already boasts of integration with EC2 cloud computing, but you have to expect that the integration will probably go further and let EC2ers push their application instances down to the CDN, like Akamai’s Edge Computing service. We’ve said before that CDNing is wonderful because it’s a solution which is native to the Internet, rather than working against it; note also that the list of Amazon CDN locations bears a close comparison to a list of major Internet exchanges.

Telefonica and Vivendi are buying a satellite TV network in Spain. Killer Telco 2.0 quote:
Telefónica has a pay-TV business, Imagenio, which it feeds to clients through its broadband network. Some executives believe Digital Plus’s satellite platform, with 2m subscribers, would complement this.
As Telco 2.0 analyst Keith McMahon so often reminds us, nothing beats satellite for efficient mass video distribution. Imagenio already has an impressive STB capability - is Telefonica planning an integrated satellite broadcast-Internet video solution?

Meanwhile, T-Mobile UK experienced a massive (300 kilosubscriber) outage after a database was corrupted. In ironic contrast, sister company T-Systems leaked the IP address ranges secretly assigned to the German intelligence service. They really need that new vice-president of data protection…

In other spooky news, Verizon Wireless admitted that some of its employees illegally looked up Barack Obama’s call details. CEO Lowell McAdam made a public apology. Google, meanwhile, launched a new function in its core search business which lets users comment on search results - the FT Tech Blog points out that it’s just another source of data.

We don’t know how well LiMo is likely to do now Motorola’s handset operation is concentrating on Google Android, but we do know that Linux on gadgets is still a major story. ARM announced this week that they want to port the very popular Ubuntu distribution to run on their chips, thus making it the fourth mobilinux after OpenMoko, LIPS/LiMo, and Android. Fragmentation? You bet, although UIQ is apparently leaving us.

Nokia, meanwhile, taps into the only IM user base that counts (if you want to make money, that is): the 140 million users of IBM Lotus Notes. They are integrating it with the S60 e-mail client. It’s available for all the N- and E-series (obviously enough) but also for a whole gaggle of recent feature phones. Meanwhile, why doesn’t Nokia buy Yahoo!, we are asked. Perhaps because they’ve got more sense?

But seriously folks. Nokia’s efforts to build its own Web services ecosystem, Ovi, have so far been a bit of a mess. Yahoo! would bring in a lot of interesting stuff - a major suite of cloud services, the so-called Y! OS 1.0, the OneConnect unified messaging platform, FireEagle stalkerware, and a lot of local-search information. There’s not a bad search engine in there either, they say. And, of course, there are ad revenues. In the meantime, the same analyst reckons Nokia might open up the Nokia Maps API, which is certainly a good idea, as the current Google Maps Mobile app doesn’t support any of the wonderful user-generated content functionality of Google Maps itself. In other Nokia news, they are lining up a TD-SCDMA device in order to make the Chinese government happy. Which is handy, because the Chinese government’s economic crisis plan may include a faster 3G rollout.

Still further Nokia news for geeks: here’s a tutorial on using device management under S60.

Sagem’s SIM-making side supports satellite-searching silicon, specifically BlueSky’s GPS-on-SIM technology. On the other hand, Qualcomm is now licensing a WLAN location-based service platform, which basically uses a big database of WLAN hotspots and some data from the WLAN radio itself to position you when you’re somewhere GPS doesn’t work. Like your office, in all probability - or jail. The CTIA is fighting a proposal to use RF jammers to stop convicts phoning - they reckon it’s illegal. They can’t make that much ARPU from jailbirds, can they?

Over in the Telco USSR, the Clearwire-Sprint WiMAX joint venture went through, causing Clearwire stock to drop like a stone. This perhaps isn’t that significant - in a market like today’s, pretty much anything is an excuse to make the stock tank - but investors probably fear that Sprint will somehow stick Clearwire with most of the rollout costs, and probably get them involved in some kind of horrible legal dispute.

And the CEO of Bharti Airtel says consolidation is coming in India. He’s not wrong; mobile operators of various sizes and technologies have shot out of the ground, and although they are still in the fast subscriber acquisition phase, the presence of big Indian industrials (like Reliance and Bharti) and Vodafone implies a coming squeeze on small operators.

You’ve heard of Comes With Music; we prefer Comes With Data, like the Amazon Kindle. T-Mobile USA launched a digital picture frame/player gadget with bundled 3G connectivity this week. Telephony Online isn’t convinced.

And Bruce Schneier has a cautionary tale about age verification. Being a trusted steward of customer data requires you to be, ah, trustworthy. And some of your customers may have distinctly untrustworthy plans for your data assets.

To share this article easily, please click:

November 17, 2008

Ring! Ring! Hot News, 17th November 2008

In Today’s Issue: Fibre from the home, says David Isenberg; DTAG humbled over VDSL rollout; Nortel reorgs yet again, keeps fibre unit; Telephony Online covers the backstabbing in real time; regional separatists rock the Telco USSR; cuts at Vodafone and BT, profits down at Telefonica; BT Vision gets ITV content; drive your Sky+ box from your iPhone; Mobilkom deploys femtocells; Hulu vs YouTube == sausage vs rose?; more MediaFLO; Qualcomm-powered netbooks will eat our cities; the standards wars are over; first GSM/WiMAX gadget; the emerging Adobe/ARM/Qualcomm mobile OS; app stores, competitive arena of tomorrow; O2 UK, T-Mobile USA fire up dev ecosystems; Facebook gets OTT messaging; Hutch adapts; Bubley the revolutionary! The first thing we do, let’s kill all the vendors…

AT&T alumnus David Isenberg has an idea; if FTTH (fibre to the home) is difficult, then what about fibre from the home, with homeowners, developers or communities building their own fibre links out to the exchange? It’s a micro version of a munifibre deployment, and you can see how it might play well with an incumbent determined not to go beyond fibre-to-the-cabinet/node/local exchange. (Naming no names.) He also has more here regarding the maths of fibre deployment.

After all, this makes a lot of sense; one of the main reasons we need fibre is that it provides ample uplink, which is the real long pole in the tent when it comes to user generated content, peer-to-peer distribution, and the like. Meanwhile, how are the mighty fallen! Time was when Deutsche Telekom wouldn’t hear of letting competitors use its planned VDSL network; they lobbied the government, they hammered on the doors of the Berlaymont, they groaned and held up the deployment. Now things are looking very different indeed. First they offered to move towards open access, and now they are asking other carriers to “help” get the fibre out there.

Nortel Networks may have made its name producing optical Ethernet kit, but things are pretty grim there. First they announced they were planning to sell the fibre unit; then someone presumably grabbed the CEO’s lapels and yelled “What were you thinking, man?” Not only would it be hard to sell the division, but it’s the best business Nortel has… Now they’re reorganising (again) and cutting jobs (again). Ed Gubbins at Telephony Online tracks the thuds and screams filtering out of Nortel HQ and makes the sad and telling points that most of the execs who have just been fired were hired within the last two years, and they were all brought in from outside, whereas their replacements are all internal promotions. Among other things, Nortel is now planning to do without a CTO, which is somewhat worrying for a company founded entirely on T.

In other bad news, the Telco USSR has lost a Soyuz. Or rather, as Telephony Online’s Kevin Fitchard reports, the Illinois Supreme Court has ordered Sprint to shut down its iDEN network in the state, the week after general secretary of the Sprint Party Dan Hesse sent fraternal greetings to the downtrodden toiling masses of Nextel and Motorola and announced a new party line, under which the iDEN system’s unique voice & messaging capabilities would take a leading role in their global strategy.

The problem appears to be one of the myriad disputes between mainline Sprint-Nextel, as the airline business would call it, and its many, many regional affiliates. The merger with Nextel triggered reams of litigation with these companies, which mostly ended with Sprint forking out the greenmail to make them go away. But they still aren’t all resolved; Sprint is like the USSR in many ways, and these territorial conflicts bubbling under the surface are just another. Sprint is left with the unenviable choice of selling a lone iDEN network (to whom?) or shuttering it and bulk-transferring the subscribers to mainline Sprint PCS, which will in all probability mean kissing a lot of them goodbye, and also put a Chicago-shaped hole in their iDEN coverage.

It’s all been a bit grim so far; but then, it has been a grim week. Vodafone is trying to slash £1bn off its cost base and crank higher margins out of the subscribers; BT announced 10,000 layoffs, or 7% of its global work force. Profits fell 50% at Telefonica, although this was partly because last year’s figure was inflated by one-off items.

So here’s some positive news. ITV signs up with BT to put its content on the BT Vision IPTV system, which makes the telco the first UK IPTV operator to carry all the main TV networks’ stuff. It sounds like the on-demand bandwidth system Aepona presented at Telco 2.0 is going to be getting some work…meanwhile, fans of integrated video distribution might want to check out the iPhone app that controls your Sky+ box, and Mobilkom’s femtocell deployment.

Relatedly, the FT wonders if Hulu will “catch YouTube”. But do their businesses really intersect? Hulu is essentially a re-implementation of traditional broadcast TV, streaming TV studios’ and Hollywood’s content base on the Web. YouTube is a UGC aggregator that acts as a traffic-generating flywheel for Google’s ad brokering core business.

Whilst we’re on the topic of video distribution, MediaFLO is expanding into more markets. Its parent, Qualcomm, however, is very proud of its netbook built on mobile phone chips, and seems to be planning a major contest with Intel in the netbook/MID market. (Personally, I prefer to think of them as “cheaputers”.) Part of the reason they are so proud is probably that their 4G technology, CDMA 1xEV-DO Rev.C aka UMB, is being given the humane killer. After Verizon Wireless went LTE and Sprint went WiMAX, there wasn’t much hope for it; this just confirms it. Speaking of WiMAX, the first GSM/WiMAX dual mode gadget is here, and it’s going to a greenfield carrier in Russia.

The standards wars are over. I repeat, the standards wars are over.

So Qualcomm is going to have to make a go of applications and central processors, which means competing directly with Intel and either competition or coopetition with ARM. For this they have a strategy. ARM and Qualcomm have been working with Adobe to make Adobe’s AIR development environment run directly on their chips, while Qualcomm has launched an SDK for BREW developers working in Flash rather than C. This implies that Qualcomm and Adobe are about to become yet another mobile platform.

The mobile OS wars are well and truly on. As are the app store wars; they are the new deck, it says here. In fact, carriers ought to hope they are nothing like the deck, which is a failed business model; app stores, so far, look a much better idea than the handset portal obsession of the early 2000s. This hasn’t stopped carriers paying search engines for…something.

Meanwhile, O2 UK and T-Mobile USA are the latest carriers to get an app store and a developer ecosystem. Note that Adobe pops up here, too, pushing the idea of open access to app stores. They are certainly confident they are right.

Voice and messaging 2.0 - Facebook just got its own over-the-top messaging network, in addition to its existing integration with SMS. Rather like the WiMAX handset above, this is an example of delayed innovation in the developed world; MXit and QQ were doing this sort of SMS/GPRS price arbitrage to their carriers years ago in South Africa and China. But beware — it’s coming to you. Improve your core voice and messaging before someone else does.

Like Hutchison, who are bundling various social networks and other apps (including Facebook) in a new device made by a low-cost ODM in China…note that it’s using BREW and Adobe Flash. It sounds almost as if someone had a plan…

Finally, Dean Bubley calls for a mob of phone users with torches and pitchforks to force sense on the vendors and carriers.

To share this article easily, please click:

November 13, 2008

‘Two-Sided’ Telecoms Business Models - Hunger for Adoption Now

We are currently analysing the huge amount of material generated by the participants at the 5th Telco 2.0 Executive Brainstorm last week - both qualitative and quantitative, captured by our ‘Mindshare’ method. We will share all of this with the participants next week, and highlights of it with readers of this blog over the next few weeks.

But in the meantime, below are the results of two important votes taken among the 250 senior execs at the event, which demonstrate the growth in the perceived relative importance of the ‘two-sided’ telecoms business model versus the existing telecoms business model. It’s useful to compare this with some of the output from the 4th Telco 2.0 event in April 2008 (described here) and then to reflect on the changes in attitude by forward-thinking people in the industry in the last six months alone. One of our associates, Dean Bubley, a seasoned industry analyst who runs Disruptive Analysis, summed this up very well in a note he sent us this morning, which we’ve published in full below. Here’s the first chart:

[Chart: t2november08-slide1.PNG]

[Chart: t2november08-slide2.PNG]

Strategies, left to right: M&A in existing geographies, expand into Internet markets, expand into IT, expand into adjacent telephony markets, expand into entertainment, expand into emerging markets, enhance retail, enhance new wholesale distribution, offer more retail ICT, new VAS platform

[Chart: t2november08-slide3.PNG]

What are these charts telling us? Dean Bubley’s thoughts:

I’ve been to several of the Telco 2.0 Brainstorm events over the past two years. Last week’s was a bit different - the mood had changed from curiosity to hunger, perhaps overlaid with the slightest hint of fear. The survey responses from the event tell the story: especially with the current economic situation, telecoms operators are aware that the prospects of future growth are dim, especially if they continue to push today’s strategies towards their inevitable commoditisation.

One thing I find particularly striking is the apparent desire to “leapfrog” existing retail/wholesale business models (even new, Telco 2.0-enhanced ones) and jump straight to the much more complex and less-defined B2B2C VAS propositions.

This ties in with a huge industry obsession with opening up APIs, and finding value in the nooks and crannies of “network capabilities” and repositories of customer data. By comparison, simply extending the “retail” model to third party services, or selling wholesale (and unbranded) capacity to be embedded in other voice/data/video/mobile players’ propositions, seems a little bland.

I agree that the really advanced VAS services have huge long-term appeal, but I fear that it will take a while to get the specifications, standards, business models and management philosophies in place - particularly given the likely conservatism of management teams over the next 18 months. In the shorter term, I think there is a lot more that can be done with improved retailing and wholesaling.

It is notable, for example, that Hutchison 3’s INQ handset division has announced its “Facebook phone” this morning. Using the handset as an advanced retail storefront is not a new concept - but previous heavy-handed approaches have been more about forcing users to confront never-ending exhortations to download paid content.

The 3/INQ approach is different - it’s about getting average users to sign up for data plans, and potentially getting a stream of dedicated users who might generate even more revenue in future. The iPhone AppStore is another prime example of improved retailing in telecoms.

The wholesale side is trickier. Most mobile operators still guard their brands religiously. But third-party-pays, or “sponsored”, data is a must, on both handsets and other devices [Ed. - this concept was presented at the event by Andrew Bud - more to come]. Not everyone will want to sign up for a 24-month HSDPA modem subscription for a laptop, while most prepaid handset subscribers still lack data access because they don’t understand the costs.

There need to be options for other people to pick up the tab - governments could sponsor free mobile data for people to check social services websites, conference organisers and café owners could offer “free 3G” the same way they currently offer “free WiFi”. Broadcasters could send users “free” mobile TV shows, supported by adverts in the same fashion as terrestrial TV broadcasts.

To share this article easily, please click:

November 12, 2008

Exclusive Interview: The ‘Long Tail’ Interrogated (part 2)

Last week the fifth Telco 2.0 Executive Brainstorm continued its theme of business model innovation at the intersection of telecoms, media and technology by welcoming back Will Page, Chief Economist at the MCPS PRS Alliance, a copyright collection society that represents over 50,000 songwriters and 5,000 publishers.

[Photo: Will_Page_Long_Tail (Small).JPG]

Will took the opportunity to present, exclusively to Telco 2.0, new research - based on an unprecedented analysis of digital music sales data gathered over a year - that calls into question the received wisdom around the ‘Long Tail’ theory, and helps to re-define what it actually means and for whom. The presentation created quite a stir at the event, in the media and blogosphere. Here, Telco 2.0 discusses at length the presentation and the reaction to it with Will Page.

Telco 2.0: At previous Telco 2.0 executive brainstorms you’ve covered file sharing, the economics of two-sided markets and now the long tail. Coming from outside the Telco world, how useful do you find the events?

Will: Very! It’s ironic that we live in a world of convergence, yet too many of the disparate camps like to remain in their pigeonholes and preach to the converted. What Telco 2.0 events do, by allowing someone from a copyright collecting society to speak openly to an audience predominantly made up of ISPs and telco operators, is invaluable to both content and connectivity businesses.

You can see the importance of this more and more now. We have a truly awesome CEO of UK Music (the newly formed music industry trade body) in place now - Feargal Sharkey - and he’s spending an increasing amount of time at Ofcom. Two years ago, that simply would not have been happening. So, in many ways, what the Telco 2.0 Initiative does in terms of bringing different industries together is ahead of the curve.

Last week’s Long Tail presentation was a good example of this. I first met Andrew Bud, Executive Chairman of MBlox (and now Chair of the Mobile Entertainment Forum) and a key collaborator on my analysis, at a Telco 2.0 event back in October 2007. I can’t think of another ‘platform’ where our paths would have crossed, even though both our businesses share surprisingly similar characteristics.

Telco 2.0: For those who follow Telco 2.0 but missed the presentation, could you bring them up to speed on what exactly you, your colleague Gary Eggleton and Andrew Bud have been working on over the past few months?

Will: Sure, it’s worth going back to the beginning as it’s been an interesting journey. Firstly, like so many others, I read the original Wired article on the Long Tail in December 2004 and was genuinely inspired by it. For the next two years I was active in the blog-to-a-book website and have to credit the concept as being one of the principal reasons behind moving to London to work in the music industry in the Summer of 2006, ironically when the book came out.

I guess the presentation I gave last week reflects what I’ve learnt in the two years since from working in a collecting society, an organisation which by default is in the long tail business. Indeed, the Performing Right Society (PRS) has been dealing with long tail markets since 1914. The whole purpose of constructing and offering a collective licence is that it doesn’t matter whether a song is a hit or a niche: all the tracks have been licensed under a blanket agreement.

Given the clear relevance of collecting societies to the ‘long tail’ debate, I was surprised to see so little mention of them in Chris’s book - or the blogs that followed. For example, our US equivalents, ASCAP and BMI, don’t appear once in the book’s index.

For those who weren’t there, let me break the presentation down into three parts. It began by looking at the evidence in terms of actual historical data. I drew upon a great expression that I learned whilst in the Government Economic Service, which is to always strive for ‘evidence-based policy making’ and resist the temptation of ‘policy-based evidence making’. Increasingly, when I hear those words “here’s another great example of the long tail at work”, I’m inclined to expect that claim to lean towards the latter of the two.

I made the point that looking at volume-based Rhapsody data, which much of the long tail application to music has been ‘built’ around, is like a glass half empty - at best. We need to also consider value, and by that I mean not just retail spend, but marginal profitability in terms of what gets back to the artist and songwriter, and also ‘displacement’.

One achievement of my two years at the MCPS PRS Alliance is to get ‘displacement’ into the everyday lexicon of the UK music industry. Is every digital track sold to be celebrated (a P2P user now gone legitimate)? Or regretted (a £9.95 album sale lost)? The reality of the long tail is now being uncovered by many stakeholders in the music industry’s head (hitmakers) and tail (poor sellers).

The second part revisited ‘histograms’ as a way of plotting the long tail. Andrew Bud, who’s been like a Professor to me throughout this project, put me onto a fantastic book published in 1956 by Brown, entitled ‘Statistical Forecasting for Inventory Control’, which described the importance of the log normal distribution as an analysis method.

This concept is not radically new and is still discussed today (for example, Chris Anderson refers to log normal distributions in his speeches and blog too). But for me this book was fifty years ahead of its time.

Using this approach our team constructed log normal intervals and plotted an unprecedented amount of digital music data over a significant time period. The basic shape of consumer demand for digital music clearly fits the log normal distribution, “with eye-watering accuracy”. It was really striking. There are many new schools of thought, but the old rules seem to hold truest.

The difference between a Pareto-style distribution and a log normal is neatly summarised by Chris Anderson in his recent response to my analysis, below:

” …The two distributions look similar at first glance, and you have to plot them log-log (or fit them with a statistical package) to tell the difference. Long Tails are “heavy-tailed” distributions, where a lot of the total volume is the tail, while lognormals are more like the classic top-heavy hit distribution…”
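
To make that distinction concrete, here is a small, self-contained Python illustration - emphatically not Will’s confidential dataset. We draw synthetic per-track sales from a log normal and from a Pareto distribution, with parameters invented purely for the sketch, and compare how much of the total the best-selling 3% of tracks captures in each case.

```python
# Sketch: both distributions look hit-dominated; telling them apart needs a
# log-log plot or a proper statistical fit, not eyeballing a sales chart.
import numpy as np
from scipy import stats

N_TRACKS = 100_000

# Synthetic per-track sales; shape/scale parameters are made up for the sketch.
lognormal_sales = stats.lognorm(s=2.5, scale=5).rvs(size=N_TRACKS, random_state=1)
pareto_sales = stats.pareto(b=1.2).rvs(size=N_TRACKS, random_state=2)

def top_share(sales, fraction=0.03):
    """Share of total sales captured by the best-selling `fraction` of tracks."""
    ranked = np.sort(sales)[::-1]
    k = max(1, int(len(ranked) * fraction))
    return ranked[:k].sum() / ranked.sum()

for name, sales in (("log normal", lognormal_sales), ("Pareto", pareto_sales)):
    print(f"{name:>10}: top 3% of tracks take {top_share(sales):.0%} of total sales")
```

Both samples show heavy concentration at the head, which is exactly why volume charts alone cannot settle the argument.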

The third part of my presentation at Telco 2.0 last week concluded with two important slides. The first one plotted two heads. The first ‘head’ was the concentration of tracks which sold very little or none at all. It questioned whether the net revenue generated from these tracks covers the real sunk and commission-based costs of a.) getting the song there and b.) getting money back out of the system.

The second ‘head’ was to show the effective average revenue per track in each ‘bin’ (or statistical grouping of data). This was a crude averaging method but it proved highly illustrative. The inequality in revenue between hits and niches was jaw-droppingly stark, justifying Andrew’s observation that “in this tail, you starve”.

For example, we found that only 20% of tracks in our sample were ‘active’, that is to say they sold at least one copy, and hence 80% of the tracks sold nothing at all. Moreover, approximately 80% of sales revenue came from around 3% of the active tracks. Factor in the dormant tail and you’re looking at an 80/0.38% rule for all the inventory on the digital shelf.

Finally, only 40 tracks sold more than 100,000 copies, accounting for 8% of the business. Think about that - back in the physical world, forty tracks could be just 4 albums, or the top slice of the best-selling “Now That’s What I Call Music, Volume 70” which bundles up 43 ‘hits’ into one perennially popular customer offering!

This chart really drove home the theme of the presentation: what does the ‘long tail’ actually mean, and for whom?

If you’re a for-profit aggregator, it means one thing, if you’re an individual copyright holder, it means another. Again, this is something the debate has largely overlooked to date, yet everyone ‘down here on the ground’ increasingly recognises it.

As a not-for-profit, membership-governed collecting society, I’m extremely fortunate to be in a unique position to make a balanced interpretation of the facts, and of what they mean to both sides - individuals and firms. My interpretation is in no way gospel, but I can at least build an argument that’s based on evidence from the coal face.

My argument, in summary, was that the future of business is definitely not selling ‘less of more’. Scale matters. To tee up the interactive debate with the brainstorm participants I concluded with a final slide posing the dilemma facing firms in the content value chain as regards their investment strategies. Thanks to the way Telco 2.0 allows the participants to ‘blog live’ with the presenters using laptops and special software [Ed. - we call the format ‘Mindshare’], a mountain of comments and questions flooded in.

Telco 2.0: Of the (literally) hundreds of questions the audience threw at you, your colleague Gary Eggleton and Andrew Bud, which would you like to answer in more detail here?

Will: Firstly, I’m genuinely grateful for that ‘blogging facility’ you have at Telco 2.0. The day after the presentation, your CEO emailed me every question, idea and challenge your audience threw at me - that’s a fantastic facility, a “free lunch” of excellent feedback and advice. My thanks to everyone who posted.

Now, I think there are three themes which I can draw from your excellent interactive facility at the conference and expand on here: (i) the black market, (ii) digital inventory costs and (iii) scarcity.

1.) The Black Market (P2P)
There were lots of really insightful questions on this topic which can be summarised by this one: ‘is the P2P market more or less concentrated around hits than the legal one?’

To help answer it I would direct readers to a now infamous paper I published with Eric Garland, CEO of Big Champagne, titled ‘In Rainbows, On Torrents’ (pdf).

My hunch, based on the evidence we presented in that paper (pointing out the 2.3 million illegal downloads of Radiohead’s new album when it was also available ‘for free’ on their own website), is that the black market is even more hit-centric.

As Eric would argue, popular music is popular wherever it is popular, in that you can’t be a hit on iTunes without being a hit on BitTorrent… and vice versa. For further reading, I’d suggest the sociologist William McPhee’s groundbreaking theory of exposure, found in his 1963 book ‘Formal Theories of Mass Behavior’.

2.) Digital Inventory Costs
These costs are often overlooked by those claiming the long tail is a ‘panacea’ for artists. Making recorded music involves many independent costs: some have gone down, others have gone up (what economists call ‘cost disease’).

Similarly, there are administrative costs to uploading tracks onto digital sales platforms and getting the money back to the creator. For example, indie ‘niche’ labels need ‘aggregators’ before they can join the main digital music platforms, which is a wholesale market, just like in any other business, digital or bricks and mortar. The same old rules of transaction costs and economies of scale apply there too.

I wanted the audience last week to consider another old rule of economics - cost-benefit analysis. Do the net benefits outweigh the costs (both upfront and commission-based) of joining the long tail? It’s a simple question to ask, yet few bother to ask it, and hence it’s infuriating when you read propositions like ‘all you need is 1,000 true fans’.
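
As a back-of-envelope illustration of that cost-benefit question, the sum is trivial to write down, which makes it all the more surprising how rarely it is asked. Every figure below is invented for the sake of the example.

```python
# Sketch: does the expected payout from a 'long tail' track actually cover the
# cost of getting it onto the digital shelf? All numbers are illustrative.
def net_benefit(expected_unit_sales, retail_price,
                aggregator_commission, upfront_admin_cost):
    """Expected payout to the rights holder minus the cost of joining."""
    gross = expected_unit_sales * retail_price
    return gross * (1 - aggregator_commission) - upfront_admin_cost

# A niche track selling 30 downloads a year at £0.79, with a 30% commission and
# £40 of one-off admin/aggregation cost, is under water:
print(net_benefit(30, 0.79, 0.30, 40.0))   # roughly -£23
```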

3.) Scarcity
There was a wonderful comment from one anonymous participant in the room who said:
“…Scarcity forces a ‘competition’-like structure to pass the cut-off point, which paradoxically creates value by increasing the effort of content suppliers to win….”

This really sums up the point of my presentation. What I said was not particularly new; in fact it’s basic common economic sense, about which we sometimes need a reminder. This quote points to where I’m going to take the research next. Not the ‘head’, nor the ‘tail’ - but what happens to the ‘body’?

My hunch is that without scarcity, the body is underexploited. The quote provides context for a point I made on stage and in an article in The Register: “Is the ‘future of business’ really selling less of more? Absolutely not. If Top of the Pops still existed, it would feature the Top 14, not Top 40.”

Telco 2.0: You’ve definitely got the debate started as there’s been significant press coverage since the publication. How have you found the reaction in the media by those who were and weren’t there?

Will: Tricky. Many journalists have agendas - some you agree with, some you don’t. Sometimes the meaning of your work gets lost in the differences within those agendas. My agenda is an academic one, as the great Scottish philosopher Hume would have wanted, one of conjectures and refutations. Let’s take a theory and put it to an impartial evidence-based test.

Firstly, it was great to see Eric Schmidt, CEO of Google, putting forward a strikingly similar argument to mine on McKinsey’s website recently.

In addition, I was pleased to see The Register picked up on the role of a collecting society, an institution that receives surprisingly little coverage in Long Tail debate, yet has pioneered the creation of long tail markets for musical copyright through patent pooling and blanket licensing for almost a century.

It was great to finally get a mention on Chris Anderson’s blog - given that I published my first set of long tail statistics (pdf) back in November 2006!

Given that the source of my data cannot be disclosed at this stage and the slide deck from the Telco 2.0 event cannot be circulated (and I’m genuinely grateful to those in the audience for understanding this point), I thought he did a pretty good job at blogging about a presentation he wasn’t present at. Hopefully this interview will help him fill the gaps.

However, I still think he’s focusing his arguments on the less-relevant volume-based data, and not looking at value in all of its definitions. Or, as an impatient CEO might say, ‘show me the money!’ Volume-based discounting always has been, is and always will be prevalent in any market, be it online or offline. It is a simple and widely accepted fact, and once you accept it, it becomes increasingly difficult to hold a conversation about why the future of business is selling less of more.

On the downside, one of your participants published a blog article that was so far off the mark, it made me wonder if he was actually paying attention (or, more likely, whether he understood the complexity of the music industry). For example, he describes my focus on individual sales as very ‘old economy’. Yet the erosion of the unit value of musical copyright is the biggest issue facing the membership of the MCPS PRS Alliance.

Why? Because we’re a membership-governed not-for-profit organisation that licenses, collects, processes and pays out royalties for our 50,000 individual songwriters. He also goes on to say I used a data set where the concept of margin is irrelevant, which is the complete reverse of what I actually did. I presented data and then introduced the concepts of marginal costs (the real costs of managing and processing digital inventory) and marginal benefits (how much of that unit value actually gets back to the creator) from the outset.

Trying to have a balanced debate about the long tail, and avoiding knee jerk reactions and hysterical claims, is hard, very hard! Everyone immediately becomes an expert in a specific market or a statistical rule that they actually know relatively little about.

I feel for Chris, who pioneered this concept, on this, as he must have had to stare this problem in the face for a lot longer than me. In the music industry, which has experienced the force majeure of disruptive technologies like no other, and for over a decade now, you get a little tired of armchair critics telling you what to do with the benefit of hindsight, and little understanding of what options were available at the time.

Nevertheless, as the legendary Peter Jenner would always say every time meetings between the content and connectivity industries collapsed into disagreement and disarray: the important thing is that we keep all the parties talking, exchanging ideas, evidence and advice.

Telco 2.0: Finally, how transferable is your work? What does it mean for other areas which Telco companies are looking to get involved in, such as Television, Film and Books as well as applications?

Will: Very! Discovering a log normal distribution in one area of media provides a template for evidence-based gathering, interpretation and decision making in others.

I’d stress caution though - you need to order the questions correctly. Just as when you look at international comparisons in order to learn lessons for domestic issues, you need to ask ‘what works over there?’ and ONLY then ask ‘of what works over there, what could work here?’.

There’s been far too much decision-based evidence making to date along the lines of “the long tail must work, so find me a great example of it working”. That’s in no way the fault of Chris Anderson. He (like myself) goes to great pains to correct people’s knee-jerk reactions, but it’s standard fare when a new economic theory comes along: people get a wee bit hysterical about it.

I think that the most important lesson to come out of this work is a real back-to-basics question: is scarcity a constraint or is it a discipline? I think that you can ask this question from the outset, regardless of what type (and what size) of media inventory you intend to carry across your network.

Finally, I’d like to reemphasise the importance of impartial evidence-based analysis. My work is not about trying to prove anyone wrong. I’m looking at how well their case stands up when presented with evidence. On that note, perhaps it would be apt to end with a suggestion to those proponents of the long tail theory, by drawing upon a quote from the late great John Maynard Keynes: “When the facts change, I change my mind. What do you do, sir?”

[Ed. - After the interview we asked the Telco 2.0 analysts to comment on the transferability of Will’s analysis and the ‘so what’ for telcos:

Chris Barraclough, Consulting Director: Value comes from: catalogue breadth/depth + distribution (which includes searchability and multiple customer touchpoints, including affiliates) + evaluation. Amazon has market power because it works hard on both distribution and evaluation. You can exploit a longer tail than your competitors if you can a.) lower your cost base further than them (ie afford to carry more inventory than them) and b.) price cleverly (link price to volume so that lower volume items are priced higher).

Martin Geddes, Chief Analyst: We must appreciate the uniqueness of different content types and distribution networks. On the latter, for example, iTunes differs from last.fm due to pricing policies and content recommendation systems. In terms of content types music is weak in metadata (people don’t write much about individual songs), unlike richer media like movies (where there are lots of reviews and information on the participants), games, software apps, and TV shows. So, my advice is be careful about extrapolating lessons between different media and indeed even between sub-genres within the same medium. Sport, porn, and news video all have very different dynamics, for example.

The big ‘so what?’ for telcos is that a lot of ‘long tail’ content may have no commercial value, but may have considerable social value to users (e.g. photos of your kids). It still needs to be transported. This makes it all the more important to cater not just for ‘high QoS’ material like streaming HD movies, but also to be able to dynamically subload other content. Watch this space for interesting developments in this latter category…!

James Enck, Senior Associate Analyst: It would be useful to analyse how the value of content changes over time. Van Gogh didn’t sell much during his lifetime, Grateful Dead fans favor bootlegs rather than studio recordings, and we all know the story of the Arctic Monkeys. In other words, it’s conceivable that content moves from the tail to the head over time, and those who don’t spend time in the tail will always be surprised at what appears in the head.

The depressing truth for telcos is that replicating the head does nothing for differentiation. Moreover, if content strategies are geared to churn reduction rather than incremental revenue, then what disincentive to churn is there if all competitors have the same 4,000 film library? Long tail content can be highly appealing as a differentiator if it maintains a local flavor. www.pod3.tv is one example in the UK, which to my knowledge, no telco has sought to engage with. I’m baffled as to why Telekom Austria seems to have stopped the Buntes Fernsehen project, which to my mind was a very interesting way to differentiate on long-tail content in a way that’s highly relevant to a local customer base.

Keith McMahon, Senior Analyst, Content Distribution 2.0: The key message for me is that there is no silver bullet in merely loading content onto the net. The challenge is beyond simply distribution. The promotional aspects will be a really hard skill for telcos to replicate over a wide range of content. They are probably much better partnering, developing ‘two-sided’ enabling business models and shifting the demand risk to parties who know better.

Alan Patrick, Senior Associate Analyst, Content Distribution 2.0: I did Mechanical Engineering at University and studied inventory management theory. The thing I recall is that nearly every inventory-based demand curve was log normal. The big issue in the online world is the lower transaction costs, which support a “positive returns” power-law dynamic, i.e. the big get bigger. This drives an increased rush to the ‘Hit Head’. In other words, any service which had a long tail distribution would rapidly move to a bigger hit head in any online world.]

To share this article easily, please click:

Voice telephony: death or glory?

At our most recent Telco 2.0 brainstorm, the second session concentrated on the business opportunity in the core voice and messaging business. Here we review the key messages, and explore some of the future business model scenarios.

The timing of this discussion is rather apposite. Despite our belief in Vodafone’s long-term strength, they have just announced that their core voice business has stagnated:

The performance of the company’s European operations suffered from the tough economic climate with margins decreasing from 38.2% to 36.2% on revenues that were down 1.1% on an organic basis. The company blamed ongoing price pressure on core voice and messaging services.

As we said before, if you haven’t improved your core product at all since launching digital networks, and assume two-sided Internet business models won’t have any effect on you, you get all you deserve. [Ed - if you feel you deserve better, why not invest in our new Voice & Messaging 2.0 report?]

Re-thinking dialling, voicemail and freephone for 2-sided markets

The lead-in to the session was by Chief Analyst of STL Partners, Martin Geddes. His thesis is a simple one: telcos have consistently abandoned their core product, and are ignoring new business models, whilst pursuing fool’s gold in media content. The old model — charging users for software services that have no marginal cost or barriers to entry — is dying. That doesn’t stop initiatives like Rich Communications Suite (RCS) from trying.


Martin Geddes, Chief Analyst, STL Partners

To illustrate future business models he gave three examples of how money could be made in future. Each of these focused on different aspects of the consumer to call centre interaction. As you may remember, customer care is one of the key B2B2C value-added services in a Telco 2.0 platform. [For full details, see our report The 2-Sided Telecoms Market Opportunity.]

The first of these was from a Canadian start-up we’ve profiled before, called Fonolo. It exquisitely demonstrates that the value is in integration of telephony and the Web, as well as moving from the call itself to the set-up of the interaction. We asked their CEO, Shai Berger, to tell us more in this video clip:


Shai Berger, CEO, Fonolo

Note that their current business model is a mixture of advertising and end-user premium fees. This is being positioned as a traditional consumer VAS, with a sprinkling of two-sided markets via advertising. The question, however, is who benefits more: the consumer, or the call centre? We think that it’s the latter, and the consumer is the price-sensitive side. The call centre wants the maximum rate of self-care, high customer satisfaction, and the web site offers the ability to do all kinds of enhanced multi-modal interactions that a 0-9*# keypad can’t do well. Even basic things like showing where you are in the queue, and a picture of the person you’re talking to, would make for a far better user experience.

Therefore in our two-sided market world, we’d get telcos to distribute and promote this tool (on their fixed, mobile and on-device portals). They would then sell these enhanced capabilities to call centres.

The second example Martin gave was around outbound calling from call centres. Today the typical experience is something like the following. The call centre operator has to wait for the phone to ring, finds it goes to voicemail (up to 80% of calls to business users go to voicemail), and then leaves a message asking the user to call back to complete the business process. By the time the user gets the message, the call centre may be closed. Or the user simply never responds. So you’re burning labour on leaving these messages, in a process that is both ineffective and inefficient. According to Oracle, customer service representatives making outbound calls typically spend 20-30 minutes per hour talking to customers. The rest is wasted.

A better experience would be simply to deposit a VoiceXML document directly into the user’s voicemail system. “The product you have requested is now in stock. Press one to have it shipped immediately, two to reschedule your delivery, three to cancel.” The business process completes right there inside your voicemail system. And the telco collects an order of magnitude more revenue than it would get from a few cents of termination fee.
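
As a sketch of what that interaction might look like in code, here is the call flow only; the voicemail-deposit and keypress-capture calls are hypothetical stand-ins for whatever interface a telco or IVR platform would actually expose.

```python
# Sketch: a 'self-completing' voicemail interaction. The voicemail_api object
# and its methods are hypothetical; only the call-flow logic is the point.
MENU_PROMPT = (
    "The product you have requested is now in stock. "
    "Press one to have it shipped immediately, "
    "two to reschedule your delivery, three to cancel."
)

ACTIONS = {
    "1": "ship_immediately",
    "2": "reschedule_delivery",
    "3": "cancel_order",
}

def complete_order_via_voicemail(voicemail_api, subscriber_msisdn, order_id):
    """Deposit an interactive prompt and act on the keypress, with no agent involved."""
    # Hypothetical call: drop the prompt straight into the subscriber's voicemail box.
    session = voicemail_api.deposit_interactive_message(subscriber_msisdn, MENU_PROMPT)

    digit = session.wait_for_dtmf()          # hypothetical: returns "1", "2", "3" or None
    action = ACTIONS.get(digit)
    if action is None:
        return "no_response"                 # fall back to a human follow-up call
    return f"{action}:{order_id}"            # hand the outcome back to the business process
```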

The third and final example was more futuristic, looking at how Paypal-like services could be brought from the Internet to telephony, taking out the errors, cost and fraud on today’s information and transactional exchanges to call centres.

These were just a few examples of how to re-imagine telephony to serve the needs of call centres. There are many more such examples, and many more business processes to integrate. Telephony could easily become a growth engine again for telecoms, if only telcos would wake up to the new two-sided business model.

BT: From phone company to business communications platform

The next speaker was JP Rangaswami from BT. Excluding the (important) access line revenue, BT only makes a small fraction of its revenue from telephony. Nonetheless, it has embarked on a multi-billion pound programme to create the ultimate voice and communications platform with its 21CN network initiative. Under JP’s guidance, BT has also recently bought Ribbit, a platform that extends telephony integration to Web developers. Clearly BT understands there’s life in being a “phone company” yet (as long as you’ve a two-sided business model, naturally).


JP Rangaswami, Managing Director, BT Design

What JP proposed is that voice is very much a feature, not the product. Using a Dali image to emphasise the strangeness and difference of the world we find ourselves in, he told his story through the history of two other media: the printed word, and photography.

In both of these cases, we’ve seen a mass democratisation and de-centralisation of the technology. Printing presses were centralised, and printing became an industry unto itself. It was a tool of control. For a while, there was a central printing shop in every company to do reprographics. Now, we see a “Print this page” icon on your screen, and the printing press under your desk can smudge some ink on paper fibre for you in a moment, at a cost low enough you don’t even think about it. “Print”, therefore, has simply been embedded into every other application. Likewise, imaging has gone from an industry into a feature. You don’t need to go twice to the photo shop, once to drop off your films, and again to pick up your prints. “Upload image” is a standard feature of many web applications. It’s two clicks to share one from your photostream.

The message is that the model is undergoing fundamental change, and voice is following the same trajectory. Calls will increasingly be launched from within Web applications. Whoever can capture that context, enrich those interactions, and (particularly) ensure business processes complete will make the money. Carrying the data from A to B and counting minutes is not the model.

BT therefore clearly understands the nature of future business models, even if they are keeping their cards close to their chest in terms of execution. If their CEO can explain this to investors in a way they can grasp, and they can demonstrate some real revenues, then BT is seriously onto something.

Voice as a spice, not the meal

Our third presenter, Thomas Howe, is an independent consultant and blogger, and brings a hands-on perspective to using telco voice and messaging APIs to build a business.


Thomas Howe. (Apparently Barack wears a Thomas Howe tie.)

Thomas spoke about the new business model he sees for telcos. He sees that the value is increasingly in knowing things about the customer, not in doing things like moving bits around. Doing has become cheap and easy due to continued exponential improvements in technology. It’s hard-to-replicate data that provides business advantage, and telcos have that by the bucketload. In particular, telcos can combine the data with the network to offer new capabilities. It’s not particularly useful to know someone’s latitude and longitude. It is very useful to know if someone is at home, for example to take a delivery. That means understanding “someone”, “at” and “home” — i.e. who are you, where are you, and what is “home”? (This reflects our analysis on the seven questions any telco platform must answer.)
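
A toy example of the point: turning a raw network location fix into an answer to “is this customer at home right now?”. The data sources here (a stored home location, an opt-in consent flag, a live network fix) are all assumed purely for the sake of the sketch.

```python
# Sketch: combine identity, consent, a stored 'home' location and a live
# network fix into a single yes/no answer a delivery firm could actually use.
from math import radians, sin, cos, asin, sqrt

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def is_at_home(customer, network_fix, radius_km=0.5):
    """Return True/False if we may answer, or None if the customer hasn't opted in."""
    if not customer["consented_to_location"]:      # privacy gate comes first
        return None
    home = customer["home_location"]
    return distance_km(home[0], home[1], network_fix[0], network_fix[1]) <= radius_km

customer = {"consented_to_location": True, "home_location": (51.5074, -0.1278)}
print(is_at_home(customer, (51.5080, -0.1270)))    # True: a delivery can be attempted
```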

Going to market

In the attendee feedback, there were three clear messages:

  • People like the ideas, and see the value in these new capabilities that telcos can offer businesses who want to take friction out of interacting with their customers.
  • Everyone wants to know pricing, volume and revenue models.
  • There are concerns over privacy, brand positioning, and ability of telcos to execute co-operatively.

For readers interested in getting answers to these questions, and how to execute these ideas, we’ll be diving into these issues in our research in the run-up to the next Telco 2.0 brainstorm in the spring.

To summarise, existing voice platform initiatives like Parlay/OSA are network-centric, and what is needed is a business-process-centric approach. There are a few global commerce platforms emerging, and none of them are from telcos. Yet there are already great telco successes in two-sided markets, such as SMS short codes and premium SMS. Telcos have to continue to build on these to service a wider range of business processes and upstream customers.

Meanwhile, astute attendees will have picked up the protestations of earlier keynote speaker Werner Vogels, CTO of Amazon. “And finally — our telecoms platform. Don’t worry, this is no threat to you.” But he would say that, wouldn’t he?

To share this article easily, please click:

The Future of Online Video - new hypothesis

One of Telco 2.0’s key associates in the ‘Content Distribution’ space is Alan Patrick from Broadsight (see his excellent blog). He’s been working with us on a new report on Future Business Models for Online Video Distribution, which will be published later this month. Alan presented some of this analysis at the Telco 2.0 event last week. We asked him to sum up his thoughts:

Over the last two months we’ve created a hypothesis on how the online video market may evolve - based on desk research, interviews, online questionnaires, a workshop with the avant garde new media users at the Tuttle Club and a “Wisdom of Crowds” session at the Telco 2.0 Brainstorm last week. First, here’s the stimulus presentation I made:

[SlideShare presentation: The Future of Online Video]
Now, here is some explanation to bring this to life for you:

To understand the industry evolution, it’s worth outlining the background to the online media world. What we can see is that there is massive and disruptive change across the video media supply chain:

Content Creation - There have been 2 major shifts in the last 5 years:

(i) Falling costs of content recording and production equipment (hardware and software) have reduced the costs of capture and creation, and the resulting emergence of User Generated Content (UGC) has reduced content prices in media where amateur and professional differentiation is low (photography, for example), as well as driving a huge (volume-wise) new market in video production, mainly in short form content (a few minutes).
(ii) The digitisation of large amounts of video libraries, both by the owners and increasingly by amateurs with low-cost copying equipment, has made a huge back catalogue of video available online. This has aided not only the original media operators, but also the market in “User Copied Content” (also known as piracy in some quarters).

Aggregation - The traditional media aggregation functions of publication - content finding, content editing, and content marketing - have been replaced by algorithm-based online systems. The content-finding function was invaded by search engines (eg Google) and now increasingly by social media, where networks of friends discover new content. These social networks have also taken over much of the editing functions of rating and recommending content. Marketing costs have also fallen in a networked world, to the extent that the transaction costs of some items make it cheaper to give them away than to charge for them.

Distribution - Moore’s Law, Open Source software, usable “de facto” webservice standards and a glut of bandwidth and hardware buildout from the dotcom failures have meant that the cost per megabyte, teraflop and kilobaud has plummeted since 2000.
Over the last few years, distributors have engaged in vicious price cutting to fill their huge empty pipes.

Customer Environment - The inexorable march of Moore’s Law, increasing adherence to open architectures, and increasing device interchangeability and application flexibility have led to the total cost of ownership falling for hardware, software and services.

Predicting the impact of all these axes of change is quite hard - we found that it was easier to group these axes into a number of scenarios. After some winnowing, 3 main ones emerged:

Old Order (eg BBC, Hollywood Movie Studios) - they win if they can:
• Re-establish content rights
• Maintain control of the sources of funding (Ads, Subscription)

Pirate World (YouTube et al) - these players win if there is:
• No control of rights, so Free wins
• Enough offset-based funding to keep the costs of these businesses paid

New Order Players - they win if:
• New copyright models allow some form of pricing control by these new aggregators / creators
• Control of the sources of funding (Ads, Subscription) migrates from the Old Order

Interestingly enough, technology shifts cease to make strategic differences once you look at the “big picture” outcomes above. By and large the technology drives the opportunity, but the prediction of industry evolution resolves itself around the economics and sociopolitical reactions. In this space, the most material factor is the ability to manage - and monetise - the copyrights.

Also, when we started looking at these scenarios, we realised that the likely evolution was not either/or, but more likely an evolution of the market from the Old Order, through a disrupted, disaggregated “Pirate World”, to an emerging New Order, as shown in the chart on page 9 of the slide deck above (in the chart, the X axis shows time, the Y axis shows relative market share).

Why evolution?

It is fairly clear that the Old World structures have costs that are being attacked already (and have also been in similarly structured but lower bandwidth media - print and music - to devastating effect). But our view is that although “Pirate World” is disruptive and will disaggregate much of the existing structures, it is not sustainable itself in the longer term, because:

- Firstly, there is just not enough advertising money, nor enough offset funding money from all the Web behemoths, to indefinitely fund the growth of an industry as large as the video media industry. TV and cinema is a c. $0.5 trillion industry worldwide - roughly the size of the total global ad industry - whereas the current online ad industry is c. $50bn. This was true pre-crunch; it is even more true now.

- Secondly, all the evidence is that advertisers and funders want to fund high-quality, longer-form content, not the vast tranches of User Generated Content which are the main output of the current online video media industry.

- Thirdly, given the size of the market at stake, we find it difficult to believe that counterplays to increase the difficulty of piracy for the average person will not occur.
In other words, we imagine a scenario of creative destruction. What is then interesting is to think about how the New Order could emerge. In our workshops and questionnaires, 4 likely approaches came out as the strongest:
(i) The “iTunes Play” - someone like Apple creates an end-to-end, owned but beautifully designed supply chain and starts to charge a reasonable amount - most people prefer to pay that rather than risk copyright infringement.

(ii) The Trusted Network - New Media networks emerge that people trust as the best pointers to desirable content. They could be social media based or algorithm based (Think Last.fm or Pandora, but for Video).

(iii) Existing “Pirate Players” go mainstream, typically due to legal and financial pressure - this week’s announcement that YouTube is to start showing ad-funded, legal, long-form content is thus very interesting.

(iv) Old Media players make the decision to eat their own lunch and transition - think Hulu or BBC iPlayer here.

As to timings, these are the hardest to estimate - you can draw all sorts of adoption curves from past evidence and build simulation models - but Bill Gates’s maxim that things change less in 2 years than you think, and more in 10, has held good for the last 10 years of media, so why not here?
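
For what it’s worth, the simplest of those simulation models, the Bass diffusion curve, makes the Gates maxim visible in a dozen lines. The coefficients below are invented purely for illustration.

```python
# Sketch: Bass diffusion model, dF/dt = (p + q*F)(1 - F), integrated with small
# Euler steps. Coefficients p (innovation) and q (imitation) are made up.
def bass_adoption(p=0.03, q=0.4, years=10, steps_per_year=12):
    """Return cumulative adoption at the end of each year."""
    dt = 1.0 / steps_per_year
    F, path = 0.0, []
    for step in range(years * steps_per_year):
        F += (p + q * F) * (1.0 - F) * dt
        if (step + 1) % steps_per_year == 0:
            path.append(round(F, 3))
    return path

# With these made-up coefficients: under 10% adopted after two years,
# over 80% of the market by year ten.
print(bass_adoption())
```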

At the Telco 2.0 event, we tested two main assertions of this model:

- Was our assumption that Pirate World was not sustainable valid?
- Would the current media distributors (ISPs, Broadcasters and Telcos) be able to transition? (We held that some would, but - as can be seen from the chart - value destruction would be significant over 5-10 years.)

We asked people to vote and comment on whether we were too pessimistic or optimistic, or about right.

For the timings, c. 33% of people felt we were optimistic - i.e. Pirate World would come sooner and/or last longer - while 50% felt we were about right and 17% felt we were too pessimistic.

For the Old Order’s ability to transition, about 25% thought we were optimistic (i.e. there would be more value destruction), 60% thought we were “about right” and about 15% thought we were pessimistic.

We’ll be working on our real number estimates in the report, but in the meantime we would be fascinated to hear your thoughts.


Vodafone 2008 Results - new CEO at the crossroads to Telco 2.0?

The Telco 2.0 team looks at operator results through a different lens to most analysts. Rather than focusing on the minutiae of data trends, we look for hints of changes in corporate direction and the pursuit of, or potential of pursuing, more sustainable growth strategies based around a ‘two sided’ business model.

It was with this perspective that we listened in on the Vodafone 2008 half-year results call. Despite being characterised by some on the call as a “large, cumbersome beast”, the Telco 2.0 team can see the possibility of Vodafone emerging as the Telco 2.0 poster child, just as it has been the Telco 1.0 star. Here’s why:

Vodafone will add the brain to the Pipes

It was heartening to hear the CEO of a major operator confessing that “there is nothing wrong with being a bit pipe as long as it is efficient bit pipe”; even better was the caveat that Vodafone wants to convert the raw bit pipe by adding “billing, profile, and to a certain extent, location” intelligence.

This is a fundamental starting point of the two-sided business model: efficient bit delivery is not enough; adding intelligence is what creates the value and the return on the network.

Market Segmentation

Another important step on the Telco 2.0 journey is the realisation that you can’t possibly serve every market niche and all of its communications needs. Part of being a great retailer is picking your strengths.

The biggest take-away from the Amazon story presented by their CTO at our event last week is that it is possible to leverage your platform by using partners to improve the product catalogue and to stimulate traffic. (There’s a detailed case study on the lessons of Amazon to telcos in our ‘Two-Sided Telecoms Market Opportunity’ sizing report).

Vodafone is quite explicit that it is targeting future growth in the SME and SoHo segments for customer penetration, and in broadband - both fixed and mobile - for product enhancement.

Differentiation at the Edge of the Network

The Telco 2.0 team also likes Vodafone’s actions to differentiate itself through equipment. The BlackBerry Storm, with its lifetime exclusivity and joint development, is probably the most high-profile device. Just as important are the low-cost handset partnership with ZTE and the SoHo-targeting convergent Vodafone Station built with Huawei.

In the Telco 2.0 world, differentiation at the edge of the network, allied with an efficient, intelligent pipe with wide geographical coverage, is a killer combination.

Partner Network Model

Vodafone is the only mobile operator we know of that has the strength of brand, products and customers to enable it to sign up partner networks - especially since the international demise of i-mode. This is a great innovation - expanding geographic reach and increasing the size of the footprint without the need to buy every asset in the world. In the recent quarter, Vodafone added the Russian/CIS operator MTS to the family. The Vodafone Journey neatly summarises the scope of the network.

Free Cash Flow - a key metric

In financial terms, Vodafone is expecting to generate an incredible £5.2bn - £5.7bn of free cash flow for the year. The Telco 2.0 team feels that Free Cash Flow and Return on Capital are the key metrics for measuring telco performance. Some common measures, such as EBITDA margin, actually discourage two-sided business models, where margins are much smaller than on typical voice services but which grow overall absolute profitability and sweat the network assets.
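
A deliberately simplified, hypothetical illustration of that point (every figure below is invented): adding a low-margin two-sided revenue stream drags the blended EBITDA margin down even while it adds to absolute EBITDA and cash generation.

    # Hypothetical illustration only - every figure below is invented.
    core_revenue, core_ebitda = 100.0, 40.0        # traditional voice/data at a 40% margin
    platform_revenue, platform_ebitda = 30.0, 4.5  # new two-sided services at a 15% margin

    margin_before = core_ebitda / core_revenue
    margin_after = (core_ebitda + platform_ebitda) / (core_revenue + platform_revenue)

    print(f"EBITDA margin before: {margin_before:.1%}")      # 40.0%
    print(f"EBITDA margin after:  {margin_after:.1%}")       # ~34.2% - looks "worse"...
    print(f"Extra absolute EBITDA: {platform_ebitda:+.1f}")  # ...yet there is more cash to sweat the assets

Judged on margin alone, the platform business looks like a step backwards; judged on absolute cash generation from assets already in the ground, it is clearly additive.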

All is not rosy

Vodafone is not firing on all cylinders in the UK and Turkey. Implementation of two-sided business models could certainly help in the UK; but in Turkey the market leader is already implementing some of these techniques, which will make recovery much more difficult.

UK moves into reverse gear

It must be incredibly frustrating for the worldwide mobile leader to be trailing the leader in its home market - and the gap has probably grown wider, with a 1.7% decline in revenues in the current quarter. The UK is now by far the weakest major European market in the Vodafone portfolio in terms of profitability and cash generation (the UK generated £391m of operating cash flow in the first six months, compared with Germany at £1,147m, Italy at £850m and Spain at £391m).

Turnaround in the UK will be difficult for Vodafone. It is one of the markets that would probably benefit from in-market consolidation of the current list of five MNOs and countless MVNOs. There is also over-penetration in the SoHo, SME and corporate sectors. This was the market data when Vodafone UK presented its “Winning in the Market” strategy in March 2007:

[Image: vod-newCEO-1.PNG - Vodafone UK “Winning in the Market” market data, March 2007]

The UK’s new CEO, Guy Laurence, has a difficult task ahead. In his previous job, he made a success of the similarly difficult Netherlands market. The Telco 2.0 team would urge an examination of two-sided business models. One of the specific actions mentioned in the results call was the expansion of wholesale. There is much more that can be done beyond typical voice reseller arrangements (such as those with Lebara and Asda). We would specifically recommend, as a starter, a trip across the pond to Verizon Wireless to bring its Open Development Initiative (ODI) to the UK.

Not Quite Turkish Delight

Vodafone entered Turkey in May 2006, paying £2.6bn for the second-placed operator, Telsim, with 12.2m customers. A mere two years later, Vodafone has written off an incredible £1.7bn of that amount, admitting that fixing the network and improving distribution is proving difficult. It is hardly surprising given the dominance of the market leader, Turkcell:

[Image: vod-newCEO2.PNG - Turkish mobile market data showing Turkcell’s dominance]

But network and distribution are just the first two hurdles for Vodafone to leap; Turkcell is a real innovator and has built a really impressive line-up of value-added products that provide extreme stickiness and anti-churn properties. Turkcell’s Chief VAS Officer, Cenk Serdar, presented some of their payments and authentication services at last week’s Telco 2.0 event. Vodafone will be playing catch-up for some time. Serdar is effectively the deputy CEO, which should tell you just how seriously Turkcell takes this stuff.

The target Vodafone business segments are currently highly penetrated. Turkcell will in all probability be a more than capable opponent.

[Image: vod-newCEO3.PNG - Penetration of Vodafone’s target business segments in Turkey]

Grounds for optimism

There was a wonderful section in the conference call where Vittorio Colao delivered his personal perception of the industry:

“In the context of turbulent times, we should not forget that we operate in an industry which continues to be able to generate strong and consistent cashflow, basically on delivering very compelling services that serve a fundamental human need. This to me is the crucial point.

“Some people believe that voice is increasingly a mature market in developed market, but there is the data opportunity and the data opportunity is strong…

“We can all see that delivering ubiquitous connectivity is basically a one way road - once you start using it you can’t give it up, and for sure I’m sure that no-one will give it up for saving the price of a couple of drinks in a month…

“The cost/benefit of ubiquitous connectivity is incredibly compelling”

The Telco 2.0 team heartily agrees - there is a fantastic opportunity in end-user connectivity, but the icing on the cake is leveraging this connectivity and adding services which allow upstream players to connect intelligently to end-users. To achieve this, there will need to be some significant organisational and cultural changes at Vodafone.

[Ed. - Pieter Knook, Vodafone’s new Internet Services Director hired in from Microsoft, has an important role to play in this. At the Telco 2.0 event last week he revealed publicly, for the first time, his vision for a ‘next generation open mobile internet platform’ and the cross-industry activity that he’s driving (starting, interestingly, with China Mobile and Softbank). We’ll be reviewing this in more detail in future posts on Vodafone.]


November 10, 2008

Guest Post: RatPlug! It started at Telco 2.0 with a USB cable, a HomePlug, and a phone…

We like it when we manage to stimulate innovative new product ideas. Here’s one from Jeremy Penston, previously a consultant and now, thanks to a convergence of stimuli at a previous Telco 2.0 event, a consumer electronics entrepreneur. Here Jeremy describes his product, which he demo’d to as many people as he could at the 5th Telco 2.0 Exec Brainstorm last week in London:

I would like to thank the Telco 2.0 team for the opportunity to write this post. Telco 2.0 has been the source of a huge amount of insight and inspiration for me as we have developed the product that I’ll describe in this article. The RatPlug addresses the Achilles heel of the mobile ISP - video.

[Image: ratplug-logo.gif - RatPlug logo]

The RatPlug is an intelligent charger. It acts as an internet access point for your mobile devices, automatically saving and sharing your pictures, downloading YouTube videos and podcasts while you charge the battery of any USB device.

It uses the time that you spend charging the battery to sideload your portable devices. The RatPlug uses powerline communications to connect to your home broadband, and USB to connect to your device. The whole process is automated, so the upload and download start simply by plugging the device in to charge. There is no software required on the device.
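
To make that flow concrete, here is a rough sketch (in Python) of the sideload-on-charge loop the description implies. This is our illustration, not Omniplug’s code: every helper function below is a hypothetical stand-in.

    # Illustrative sketch only - not Omniplug's implementation. Every helper below is a
    # hypothetical stand-in for the behaviour described in the post.

    def wait_for_usb_device():
        # Hypothetical: the real plug would block here until a device is connected to charge.
        return "device-1234"

    def lookup_owner(device_id):
        # Hypothetical: the service platform maps each uniquely identified device to its user.
        return "jeremy"

    def fetch_sync_queue(user, device_id):
        # Hypothetical: queued podcasts, videos and photo uploads, fetched over home broadband
        # (reached via powerline communications) while the network is quiet.
        return ["podcast_episode.mp3", "youtube_clip.mp4", "upload:camera_photos/"]

    def transfer(device_id, item):
        # Hypothetical: in reality the files would move over USB while the battery charges.
        print(f"syncing {item} with {device_id}")

    def sideload_while_charging():
        device_id = wait_for_usb_device()               # plugging in to charge triggers everything
        user = lookup_owner(device_id)                  # no software required on the device itself
        for item in fetch_sync_queue(user, device_id):  # fat files move quietly in the background
            transfer(device_id, item)

    sideload_while_charging()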

When your battery is charged up, so is your memory card. Mobile phones, PSPs, digital cameras, photo frames, MPx players, internet car radios can all interchangeably use one RatPlug because each device is uniquely identified and linked to the user by our service platform. The user is at the centre of our system.

The RatPlug eliminates friction. It gets the files where they need to be - simple as that. We are not going into the content space - that is not our business. We complement devices too, making it easy for users to get stuff to play on them.

Our company, Omniplug Technologies, is a data logistics company. “Data logistics” is one of many Telco 2.0 concepts that we have based our thinking on. We move the fat files from A to B like Martin Geddes’ shipping containers analogy from the October 2007 Telco 2.0 event. We deploy intelligence at the edge of the network - an old James Enck concept.

The idea for the RatPlug was the result of the second Telco 2.0 conference in March 2007. At that event we heard two contrasting stories: one described how mobiles were going to have 40GB of storage; the other, that operators were asking themselves why they should go down the same route that ISPs had pioneered - all-you-can-eat gluttony.

The answer was simply an accident of circumstance - all these conversations going on while I had a HomePlug in my bag, along with my phone and its USB cable.

The idea was to use the existing broadband networks when they were quiet at night and the device was plugged in to charge. Move the fat files around quietly in the background. Use existing user behaviour. Create something simple, approachable and affordable.

The content we are focused on is what we call disposable media: the sort of stuff that you would like to have, but if you can’t get it easily or have to pay for it, you won’t bother - the sort of stuff that is out of date if it is more than a few days old. If you produce or are trying to distribute such media, you will already know that getting it to mobile users is full of friction. Why spend 90 seconds downloading a video that only lasts 5 minutes?

We aim to give the user stuff to fill the snippets of free time that we all have as individuals - waiting for the train, the bus or for your friends to show up. “Snippets of free time” was another takeaway from Telco 2.0 - Dawn Nafus of Intel gave a presentation in the 2007 Digital Youth section that was inspirational and has led us to where we are now.

Which is where, exactly?

We have a prototype, we have applied for IP protection and we are showing this to anyone who is interested. We are going to launch it next year.

What we really like about the RatPlug is that it fits so nicely with what everyone else is trying to do.

Being a Telco 2.0, a Web 2.0, a CE Manufacturer 2.0, or a pick-your-industry 2.0 is all about openness. It is about the linking elements in the value chain that tie all the pieces together so that everyone wins. It is about moving data around so that everyone can build the best possible services that customers find really easy to use.

So much of the inspiration for the RatPlug has come from Telco 2.0 that it is only right that we are starting to tell people about it through this blog. Werner Vogels, CTO of Amazon.com, may just have answered another big question for us in his presentation at last week’s event…

If you want to know more about Omniplug or the RatPlug, please contact me: jpenston@omniplug.net


Ring! Ring! Hot News, 10th November 2008

In Today’s Issue: Your churning handset market; Apple beats RIM into third; horrible quarter in the Telco USSR; astonishingly trivial jailbreak for Gphones; iPhone emergency call only function lets you call any number; AT&T’s iPhone-as-router; cap watch; MobileMe sporked; some kind of election in US spikes SMS service; CEP is your new favourite TLA; Virgin Media struggles, Iliad soars; Rio gets really fast Internet service; Orange cans IPTV; DTAG feeling better now; Turkcell stars at Telco 2.0, boosts profits 50%; 900MHz 3G in Finland; Vodacom = Vodafone Africa; Nortel MetroEthernet sale off; C&W split forever delayed due to unexpected good news; YouTube eats the world

The handset market is churning frantically, as Samsung unexpectedly races into the lead in the US and elsewhere. Motorola is the biggest loser, even with last week’s good news from Sprint. Here’s more on the devices, especially the Omnia iPhone clone. Apple has overtaken RIM for smartphone shipments.

We said “last week’s good news from Sprint”; good news is a relative term. It’s been yet another sapping quarter for them. Unlike some other Kansan stories, this time there’s little prospect of waking up to find it’s all just a bad dream.

And those pesky kids have beaten the restrictions on the Google Android G1. The hack is alarmingly simple — it requires you to install a terminal client and telnet to the device’s own IP address from within your /bin/system/ directory. That’s a total of three commands to get full root access: cd /bin/system/, netstat (or ifconfig), telnet your.ip.address.here… and you’re done. Of course there’s a patch coming, but you really have to wonder about Android’s security if it’s that simple. Has anyone tried to telnet into it from another IP address? For geekier readers, the original XDADevel thread is here; it gets interesting when they start talking about running Jabberd and the curious fact that everything on a G1 runs in a hidden console as root…
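
As a purely illustrative aside - and assuming, as the thread suggests, that the handset’s telnet daemon really is listening on the standard port - a few lines of Python are enough to start answering that “from another IP address” question, by checking whether port 23 on the phone accepts connections from elsewhere on the LAN:

    # Illustrative sketch: check whether a handset on your LAN accepts telnet connections
    # on the standard port (23). Assumes you already know the phone's IP address.
    import socket
    import sys

    def telnet_port_open(host, port=23, timeout=5.0):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    host = sys.argv[1] if len(sys.argv) > 1 else "192.168.1.101"  # replace with the phone's address
    print(f"telnet on {host}: {'open' if telnet_port_open(host) else 'closed or filtered'}")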

Relatedly, it appears that the iPhone’s password lock allows thieves to call any number they like. There’s more on AT&T and the death of the iPhone-as-router app, too. AT&T is also apparently planning to cap heavy users, with a Comcast-like big bucket approach (150GB a month for the top speed bracket).

Apple’s been caught short of infrastructure to support its cloud activities before, and this week it happened again, with MobileMe going down for seven hours.

In the US, meanwhile, apparently there was some sort of election. Telephony Online reports that Barack Obama’s election spiked SMS traffic by around 10% (“She cannae hold it much longer, Cap’n!”). Worthy of note:

The power of the mobile phone was a prominent theme throughout Obama’s campaign, which included a dedicated mobile effort with the ability to download ringtones, wallpaper or receive text updates on the issues. The campaign reached across nearly every major social network and even called upon geographically and demographically targeted advertising messages over Quattro Wireless’ network to encourage voters in key states to vote early.

There’s more here.

Interesting new buzzword watch: Complex Event Processing. It looks like it ties into a lot of Telco 2.0 themes.

Meanwhile, a tale of two telcos. Virgin Media’s net loss doubles; Iliad sees profits rise by 30%. Even stripping out the effect of acquiring Alice France, they were still up 17%. It’s rather depressing that out of the three fundamental business strategies, the one that the UK ISP industry hasn’t tried is “operational excellence”. And come to think of it, “new product” has barely been touched in the general dash for price leadership.

In Rio de Janeiro, they’re trialling 60Mbits/s cable service. Dude, where’s my fibre? Orange UK, meanwhile, is cutting back on investment and canning its IPTV rollout; apparently it’s too similar to BT Vision. Or, for that matter, to all the other IPTV and cable TV operators in the world.

Deutsche Telekom peeked out of the hospital this week with unexpectedly good results. “Improved processes” were given as one of the reasons, which certainly sounds like an attempt at operational excellence to us. But one of the stars of Telco 2.0 last week, Turkcell, matched that with a 50% boost to profits.

900MHz spectrum refarming is coming: DNA in Finland announced a major deployment of 3G base stations in the old GSM band. Meanwhile, Vodafone took a controlling stake in Vodacom. It seems that Vodacom is now going to be developed as the centre for Vodafone’s activities in Africa, with the Ghanaian and Kenyan stakes rolled up in it.

Fixed-line voice is dying in Hungary. Everyone mourned Nortel’s announcement that it was selling its optical networks business; but this week it looks like that’s off - nobody can afford to buy it!

Cable & Wireless isn’t going to divide itself in two after all, or at least not for a while, in a “things not so bad after all” storm.

And finally, this is interesting — a new study shows that Web video has overtaken P2P filesharing as a traffic generator. Come on, you’re not seriously proposing to block all web traffic too…


November 5, 2008

CDR = Customer Data Revolution

The opportunities and pitfalls of the telcos’ vast stash of CDRs (Call Detail Records) and phone bills have been a top theme here at the Telco 2.0 event. Last year, you may remember, we said on this blog that in the future, so many new applications will need contextual data to function that we’ll need to think of how subscribers will take their data shadow with them when they churn. It looks like this is going to be more important than ever.

Paul Magelli, head of subscriber data management at Nokia Siemens Networks, just gave a presentation in which he argues that telecoms needs to invest in understanding customers in the same way it invested in monitoring and instrumenting networks over the last 10 years; we’re going from a “network driven world” to an “information driven world”. Otherwise, how would we know that someone like Magelli - with two mobiles, a BlackBerry and a laptop dongle - is still one and the same person?

Operators, he argued, enjoy a relationship of trust with their subscribers, as opposed to (say) Google’s “relationship based on openness”. The importance of contextual data is difficult to overestimate. Thomas Howe’s presentation on day one was all about adding voice “as a spice” to other business processes and their hard-to-replicate data assets, and what could be achieved with an API that takes more arguments than an e164 telephone number; Martin Geddes’s talk in the same session centred on integrating other kinds of context around the voice call; and JP Rangaswami argued that being able to keep the context of a call — “TiVoising voice” — would be a transforming event.
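
As a rough sketch of what “more arguments than an e164 telephone number” could mean in practice - our illustration, not Thomas Howe’s actual API, and with every field name invented - picture a call set-up request that carries business context alongside the numbers:

    # Hypothetical sketch of a context-rich call set-up request. Our illustration only;
    # field names are invented and do not correspond to any vendor's API.
    from dataclasses import dataclass, field

    @dataclass
    class CallRequest:
        callee: str                                   # the traditional e164 argument...
        caller: str
        business_process: str = ""                    # ...plus the surrounding context
        callback_url: str = ""                        # where to post the outcome of the call
        preferred_window: str = ""                    # e.g. "today 14:00-18:00"
        context: dict = field(default_factory=dict)   # order numbers, account IDs, etc.

    request = CallRequest(
        callee="+447700900123",
        caller="+447700900456",
        business_process="delivery-reschedule",
        callback_url="https://example.com/call-outcomes",
        preferred_window="today 14:00-18:00",
        context={"order_id": "A-1029", "depot": "Croydon"},
    )
    print(request)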

But this raises some crucial and difficult questions. Arguably, CDRs — “the real social network” as Howe calls them — are the creation of the subscribers, just as the content in Wikipedia or the links Google counts are. Carriers facilitate this, but only keep them for their own billing purposes (and because sinister government agencies want them to). Everyone at least mentioned the need to respect users’ privacy, but there was little said about what this meant in practice. Is it really true that operators enjoy a “trusted stewardship” status in the eyes of subscribers? Just as one of the barriers to new VAS is the fear of a disastrous service launch, one of the barriers to new uses for this stuff should be the fear of a privacy or security Chernobyl that would destroy this trust once and for all.

Perhaps the guiding principle should be that operators respect subscribers’ data sovereignty. That would mean subscribers would have to explicitly and effectively choose what data to release and how; it would also mean that they would have to be rewarded for uses of it that mainly benefit the operator, like ad targeting. The reward, however, doesn’t have to be money. It could be quasi-monetary — lower rates — or it could be access to new and compelling applications. Carriers would have to make it easy for churners to take their data shadow with them as they go out the door. Perhaps, as someone suggested today, this is yet another reason to deploy ENUM. That sounds grim, but the flipside is that you’d also need to make it easy to import data, which is all good if you consider the CDR pile to be a strategic asset.
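
What might a portable “data shadow” actually look like? Here is a purely hypothetical sketch - structure and field names invented by us - of an export a churning subscriber could carry to a new operator, with explicit consent flags attached and raw CDRs reduced to aggregates:

    # Purely hypothetical sketch of a portable "data shadow" export; the structure and
    # field names are invented for illustration.
    import json

    data_shadow = {
        "subscriber": {"enum": "+447700900123", "exported_at": "2008-11-05"},
        "consent": {
            "ad_targeting": False,           # the subscriber opts out of operator-benefiting uses...
            "share_with_new_operator": True  # ...but chooses to carry the shadow to the next carrier
        },
        "usage_summary": [
            # aggregates rather than raw CDRs, to limit the privacy exposure discussed above
            {"month": "2008-10", "voice_minutes": 412, "sms_sent": 187, "distinct_contacts": 9},
            {"month": "2008-09", "voice_minutes": 365, "sms_sent": 140, "distinct_contacts": 8},
        ],
    }

    print(json.dumps(data_shadow, indent=2))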

One business which has made maximising its customer data pile its main aim is Amazon.com. Its CTO, Werner Vogels, spoke at Telco 2.0 yesterday. Amazon believes that its huge customer base, and its vast resource of data on their purchases, is a crucial asset. Similarly, it makes it its business to maximise its holdings of information about upstream customers — that is to say, its catalogue. Vogels described their decision to make listings free and to open the platform to upstream customers (merchants in Amazonspeak) as being explicitly intended to increase the information pile. Eventually, he said, they aim to treat each customer as a segment unto themselves. In a way, Amazon is a machine for generating and matching user profiles and SKUs and then settling the transactions that result.

In a similar way, Voice & Messaging 2.0 is all about reducing the minimum segment size it’s possible to develop services for, right down to individuals. But all of this depends on respecting information sovereignty: if you want to create passionate users, and even more so developers, you’ve got to respect the work they put into creating all that data. Which means not being evil, providing a user interface that makes that data shadow manifest and controllable, and providing the APIs and terms of business that will help your upstreams to invent things with it.


November 3, 2008

Ring! Ring! Hot News, 3rd November 2008

We’re often noticeably keen on BT; but we’re not always right. This week, it happened. BT issued a profit warning combined with the message that it might have to chip in more cash to its pension scheme; the shares duly tanked. The hit to profits came at the company’s growth centre, BT Global Services, as its enterprise clients cut back on their IT spending. Perhaps, however, that’s a good problem to have; at least compared to those telcos whose core telecoms business is spiralling rapidly downwards.

But it can’t have helped that France Telecom announced numbers this week that were entirely satisfactory, with mobile broadband helping to compensate for the steady decline of the fixed voice business. (Or is it cannibalising it?)

Back in the Telco USSR, meanwhile, Sprint-Nextel management is making a virtue of necessity. They have cancelled the attempt to sell Nextel, three years and $36bn in writedowns after they bought it. Now, due to the financial crisis, they’re talking about “rejuvenating” the network that gave you Push To Talk and specialist public-safety voice & messaging. They tell us:

“The iDen network is a key differentiator for Sprint,” said Dan Hesse who took over as Sprint Nextel’s chief executive at the start of this year. “It allows us to offer products and services no other carrier in the industry can match.”

Someone has finally noticed that the money’s in enterprise voice and messaging. How different things were a few months ago, when all Nextel did was get in the WiMAX rollout’s way. In support of this, there was some good news for fellow crisis-club member Motorola, as Sprint is renewing the longstanding agreement under which Motorola supports iDEN technology.

And Moto needs the good news; the numbers at its handset division are so bad that it looks unlikely that they can actually get rid of it. Instead, there’s going to be one last try, either to turn the business around or at least to put it in a condition to sell. As a result, many of Moto’s OS platforms are being axed: only its proprietary low-end platform, Google Android and Windows Mobile will be supported. Supposedly, hundreds of engineers have already been reassigned to work on Android. This leaves the LiMo mobile Linux looking a little shaky. Relatedly, the latest release of Ubuntu is apparently full of mobility features, so maybe we’ll see the PC platform go mobile, just not in the way Bill Gates would ever have hoped for.

Apple, meanwhile, smashed its way into the handset-shipping top 10; incredibly, what with the revenue-sharing agreement, the whacking handset subsidy and the $900m hit to profits, AT&T is now giving its iPhone subs free WLAN. Just what did Steve Jobs do to those people? Meanwhile, iPhone developers are in short supply — this may be a first for the mobile software business, and one which really marks out Apple’s success with the Jesus Phone.

Yahoo!, meanwhile, has lashed its various APIs together and called it Yahoo! OS v1.0. And over at Forum Nokia, Robin Jewsbury doesn’t think much of Steve Ballmer’s thoughts on mobile.

Via David Isenberg, we learn that Verizon claims that fibre-to-the-home cuts maintenance truck rolls by 39%. Just what are we waiting for?

But then, it’s always the way — the non-telecoms problems are the hardest to solve. The saga of the dispute between Altimo (Alfa Group’s telecoms business) and Telenor over various shareholdings in the former Soviet Union just took off again: Telenor won a case in an obscure Russian court, and Alfa (or what looks like a front for them) overturned it in another.

Telenor did much better with its investment in Pakistan. Now, facing falling profits at home, it’s trying to repeat this in India by buying 60% of Unitech.

We’ve often said that the industry needs a new business model. In Sri Lanka, fixed-line operator Lanka Bell’s got one: they’re paying customers to receive international calls, after connecting up to FLAG slashed their costs. How long before Sri Lanka develops a lot of call centres?

PCCW chairman and major shareholder Richard Li is planning to take the company private, thus reversing the massive and much hyped IPO of 1999. Does anyone remember when people actually called it “Pacific Century CyberWorks”? Me neither…

The Australian government is still dreaming of censoring the Internet, and the French Senate kept the three-strikes proposal despite its beating in the European Parliament.


Guest Post: In-Car Internet Radio - new opportunity for telcos?

As part of our regular coverage of innovative new products, here’s an introduction to miRoamer’s in-car Internet radio. Robert Demian, their head of global sales, is coming over from Australia to our event this week to discuss the opportunity. He describes it here:

miRoamer is an original developer of, and leader in, internet media technologies, specialising in cutting-edge media and content streaming. Over the past six years miRoamer has developed new opportunities for the general public to gain global access to digital media. The biggest challenge has been to create a fully portable solution.

The development and evolution of in-car Internet radio and IPTV brings new portability to digital media delivery for existing digital content suppliers and traditional terrestrial outlets. The delivery of mainstream and diverse programming is now cost-effective and available in-car! This has been achieved through co-development with the world’s leading car audio maker. With the Internet radio content market experiencing huge growth, it is ideally suited to the in-car environment.

The most noticeable long-term change in radio listening trends has been the rapid growth of Internet radio. Across a range of countries, Internet radio is increasingly the preferred platform for delivering radio content. As well as opening up a world of new broadcast streams, it provides worldwide access to favourite local stations, allowing people to stay tuned wherever they are. Even in their home city, streamed internet radio improves sound quality and provides interference-free reception, something not always available in high-rise offices and apartments.

In-Car Listening

In-car listening is a big chunk of total radio listening and has been growing over the last decade, both in the absolute volume of listening and in its share of total radio listening. Internet radio will soon be able to join the party when it comes to in-car listening: energetic internet media platform provider miRoamer has inked a deal with a world-leading multinational OEM to show off the “world’s first Internet car radio” at the upcoming CES in Las Vegas, Nevada (Jan 8-11th).

This means telcos have a big new market in prospect. Telcos are in a unique position to provide the necessary wireless connectivity to deliver internet radio to automobiles. And just how big is this market?

In the USA, radio is still the 300-pound media gorilla, with effectively everyone (94% of adults 18+) using the medium weekly. In-home listening is still where the majority of hours are clocked up, but in-car listening dominates the peak drive sessions - mornings and afternoons, Monday to Friday - taking a 43% share of average listening hours at those times.

Official radio ratings studies from Nielsen Media in Melbourne, Australia show that the number of people aged 18+ listening in-car during morning and afternoon drive time has risen over 10 years to more than two-thirds of that age group (68.3%), an incidence that has risen 24% since 1997. In addition, the volume of listening has increased by nearly half (49%) over the same period (2).

Qualitatively, time spent listening to in-car radio is regarded as the most intense time to enjoy radio content. A UK study noted: “Most journeys are routine and, particularly in congested traffic, pre-occupy only one part of the brain, allowing the driver to listen intently to radio output.” The report also notes a characteristic of in-car listening: “Radio meets much more intensive needs within the car than in any other situation, spanning the gamut of emotional, mood enhancing/changing, entertainment and information.”

The report sheds light on two issues which need to be addressed in the promotion and listener take-up of in-car Internet radio: what is station-switching behaviour like in in-car radio listening, and who is likely to take up in-car Internet radio first?

There are other issues that have yet to be researched, such as the use of user-generated audio entertainment in-car versus listening to broadcast and (when available) live internet radio. The study by Winstanley Research (UK) showed that, despite the ease of channel-surfing using push buttons - which in some cars can be thumb-operated from an up/down preset channel selector on the steering wheel - switching behaviour was much lower than anticipated and is related to the routine of the journey (commuting to and from work in this case).

The study showed that only a minority of respondents switched stations, despite most having their presets set up for easy access to alternative stations. The following top-line findings from the report show that:

  1. 34% of in-car listeners say they never change stations during journeys
  2. despite having nearly seven presets available on average, fewer than three are regularly used
  3. 82% of listeners agreed with the statement “I tend to listen to the same radio stations in the car that I listen to at home”
  4. 85% of drivers disagreed with the statement “I am the sort of person who switches stations a lot when driving”

The Winstanley Research study was conducted in 1997, in the South East of the UK (i.e. London and its surrounds), so the dominance of the market leaders at that time - Capital FM, BBC Radio 1 and Radio 2 - might exaggerate the paucity of switching. However, as in other radio listening environments, station switching is far less common than practitioners in advertising and marketing, with their innate professional curiosity about each and every media channel, would personally believe based on their own heightened experience.

“I’d always assumed that most listeners, like me, used car journeys as the ideal opportunity for surfing the airwaves …The figures show that I’m plain wrong”. Dominic Mills, Editorial Director, Haymarket Business Publishing

The market task…

From the point of view of in-car Internet radio, developing the habit of listening to stations online via Internet radio streams, especially Internet-only stations, will be the same marketing task, and address the same opportunity, as the one that currently faces content providers.

They will compete against the ease and accessibility of “always on” local broadcast radio stations, but the ease of tuning into Internet radio in the car will enhance its appeal, as it will be even easier than setting up a stream on a computer. If the right consumer benefits are promoted, including the variety of Internet radio, and the proposition is given exposure (especially at the CES launch as the world’s first Internet car radio), awareness of and demand for the service will increase.

It is similar to the short period in the 1970s when OEM car radios were often supplied without the FM option. Even later in the decade, they may have had an AM/FM radio but not yet the tape-player entertainment option. It is also worth noting that tape players only ever accounted for a subset of in-car listening and did not overwhelm or outpace the volume of listening devoted to in-car radio. The same observation can be made about in-car MP3 player adoption: good to have the option, but across the market not substantially taken up at the expense of radio in the car.

Lessons from the “8-track.”

You might argue that the same sideline effect could come to pass for in-car Internet radio listening. It is a potential threat, but here’s why Internet car radio won’t go the way of the 8-track player!

  • It’s the easiest access path to radio: over 10,000 stations and endless numbers of streams, specialising in the most popular audio entertainment medium.
  • In all cases, it provides superior sound quality: clear, full-fidelity stereo and interference-free talk. Internet radio can be delivered with less compression, allowing the in-car radio to reproduce to the best acoustic standard possible.
  • It will be free of the interference associated with broadcast reception (especially AM) and have worldwide coverage, as opposed to just local-market transmission over a radius of around 50 kilometres.
  • Internet radio will greatly arouse interest in the medium as it increases the range, variety and choice well beyond the typical radio station repertoire offered locally.

It seems likely that this wider range of content, together with the ease of pre-selecting it on your Internet car radio buttons (along with favourite local stations), will only increase choice and stimulate interest. We know this from online radio: there are over 48 million people in the USA alone already using the internet to deliver their own stream of favourite radio and music programming.

miRoamer …on the move

miRoamer was the first company in the world to develop and commercialise portable Internet radio. The move to develop an in-car radio solution is the natural progression to create a new paradigm and a real starting point for digital media. Now that Internet radio is portable and in-car, the mind bloggles!


