Monday, August 31, 2009

Mobile Payments Are Gaining Public Acceptance

The value of mobile payment transactions is forecast to expand 68% annually, reaching almost $250 billion in 2012 from $29 billion in 2008. By then, according to Arthur D. Little, proximity payments will represent 51% of total m-payment transactions.
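
As a quick back-of-the-envelope check of those figures (my own arithmetic, assuming simple annual compounding, not taken from the ADL report), the quoted growth rate and endpoints are roughly consistent with each other; a short Python sketch:

    # Rough sanity check of the forecast quoted above (illustrative, not from ADL).
    base_2008 = 29e9        # m-payment transaction value in 2008, USD
    growth = 0.68           # reported annual growth rate
    years = 4               # 2008 -> 2012

    forecast_2012 = base_2008 * (1 + growth) ** years
    print(f"Implied 2012 value: ${forecast_2012 / 1e9:.0f}B")   # ~$231B

    # Conversely, the compound annual growth rate implied by the $29B -> $250B endpoints:
    cagr = (250e9 / base_2008) ** (1 / years) - 1
    print(f"Implied CAGR: {cagr:.0%}")                          # ~71%

The small gap between the two readings presumably reflects rounding in the quoted figures.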

Based on ADL's Global M-Payment Report Update 2009, the company believes these figures will be realized as telecom companies have an incentive to launch m-payment services to take advantage of the current window of opportunity. Additionally, transaction volume is expected to keep rising as m-payments will take market share from banking transactions due to lower service costs, and from online-payment services due to increased mobility.

M-payment services are still largely focused on a business-to-person environment, and are equally balanced between remote and proximity services. Volume-wise, remittances will be the strongest growth contributor globally, rising 25% annually over the next two years, before retail purchases take the lead with 77% yearly growth until 2012.

A key factor influencing the potential for m-payments in any market - developed or emerging - is the banking infrastructure. M-payments have a greater opportunity in markets where the banking network is relatively less developed, acting as a competitive service channel.

ADL expects m-payment transactions in developed markets to grow 56% yearly, representing a little over a third of the total transaction value by 2012. Emerging markets, meanwhile, will grow 76% yearly and account for about two-thirds of the total by that year.

The biggest share in 2008 came from the cluster of developed countries comprising Japan, South Korea and Australia, with 24% of the global total. Western Europe is in second place with a 13% share, but will become the biggest contributor at 17% by 2012, followed closely by South America with 12% and North America with 11%.

In developed markets, ADL does not see m-payments substituting for existing payment systems - as mass adoption is limited to convenience-enhancing applications and niche segments - but they will put pressure on the margins of existing transaction channels. In the next two years, m-payments will remain a complementary transaction channel in developed markets.

For an m-payment solution to be adopted on a mass scale, it must fulfill the key success factors: unmatched mobility, a user-friendly interface, a high number of contacts with banks and operators, and a new level of convenience. Most current m-payment applications serve niche segments.

Despite the current hype, ADL does not expect mass adoption of near field communication (NFC) solutions in the majority of developed countries until 2011 at the earliest. Adoption is seen as still two years away due to delays in hardware standardization, the limited availability of NFC-enabled handsets and difficulties in developing a viable business case for all stakeholders.

Improved regulation and movement toward a more liberal ecosystem will also push market developments "cross-border". As seen in the EU, m-payment market development depends on the creation of a liberal regulatory framework that enables increased competition and streamlines cross-border m-payment transactions.

Concerns beyond convenience

As for emerging markets, m-payment services will become the first widespread cashless payment system in many countries, enabling cost-effective and secure transactions. As already seen from several successful service launches, emerging markets are a fertile ground for the development of m-payment solutions due to their limited banking infrastructure and growing mobile penetration.

End-users' benefits will mainly be created through low-value but high-frequency transaction services. Service providers will continue to focus on such transactions while leveraging their customer relationships. Regulators and governments are establishing a legal environment that encourages further development of m-payment services.

New know-your-customer (KYC) norms will be developed, forcing market players to find the balance between convenience of use and security concerns. Regulators, operators, banks and other players have to find the right balance between making the service very convenient and lowering barriers to adoption on one side, and appropriate security and anti-fraud controls on the other side. Hence, value chain players should take an active role in defining the KYC norms to secure their influence on the development.

Remittances will be a strong growth driver for m-payment transaction volume and cross-border cooperation. Global mobile remittances, growing 146% annually, will be a target area for m-payment services and will lead to the establishment of international strategic partnerships, which in turn will positively influence m-payment transaction volumes.

For financial institutions, m-banking and related m-payment services can be a differentiating factor and a chance to tap into the trillion-dollar market of micro cash payments.

For merchants, it is best to evaluate the m-payment channel as a means to increase consumer convenience, mobility and accessibility of their services and goods.

ADL recommends that independent payment service providers expand their partnerships to become better integrated and thus strengthen their bargaining power over how value chain margins are distributed. (source: telcomasia.net)

Wednesday, August 26, 2009

T-Mobile Android Phones are sold at RadioShack

In July, T-Mobile announced a new partnership with RadioShack aimed at bringing more of its cell phones and smartphones to more customers. These T-Mobile products are now available at RadioShack as of Aug. 19, including the HTC myTouch 3G, which is based on Google Android, as well as new Samsung models.

T-Mobile is bringing a new line of cell phones and smartphones, including the Android-based HTC myTouch 3G handset, to RadioShack starting Aug. 19.

In July, T-Mobile and RadioShack announced the new partnership, which looked to expand the audience for T-Mobile’s line of smartphones and other handset devices. In addition to the much-anticipated myTouch 3G, T-Mobile is bringing other devices to RadioShack, including ones from Samsung and Sony Ericsson, as well as Research In Motion’s BlackBerry.

The partnership also allows T-Mobile to bring more customers onto its new 3G network, which is facing competition from the likes of Verizon Wireless, AT&T and Sprint. In addition, it allows T-Mobile to dive deeper into retail with access to more than 4,000 RadioShack stores. This is an area where AT&T, which has more than 2,000 retail outlets, has had a distinct advantage.

At the same time, Google and its open-source Android operating system are given a much bigger stage and more access to potential customers.

“The Android platform, with its broad array of unique applications, offers our customers another exciting option for their mobile lifestyle,” Peter Whitsett, an executive vice president with RadioShack, said in an Aug. 19 statement.

Besides the myTouch 3G, RadioShack now has several other handsets on its shelves, including the BlackBerry Curve 8900, the Samsung T239, the Samsung Comeback, the Samsung Gravity 2, the Samsung Behold and the Sony Ericsson TM506.

In addition to the RadioShack deal, T-Mobile announced that the Samsung Gravity 2 is now available for $30 after rebate with a new service contract. The Gravity 2, a horizontal slider phone that runs on T-Mobile’s 3G network, offers a full QWERTY keypad, multiple messaging options, a 2-megapixel camera and camcorder, and support for up to 16GB of additional memory.

Tuesday, August 25, 2009

Malaysia's P1 WiMAX Operator Celebrates its One-Year Anniversary with 80,000 Subscribers

PETALING JAYA – Packet One Networks (Malaysia) Sdn. Bhd. (“P1”), Malaysia’s first and leading WiMAX telco, today announced that it has surpassed the 80,000-subscriber mark one year after rolling out its P1 W1MAX high-speed wireless broadband service.

The impressive subscriber milestone underlines P1’s commitment to challenge the Malaysian broadband scene through its innovative 4G (fourth generation) technology and compelling consumer and business high-speed wireless broadband packages. At the same time, the leading WiMAX player here and in the region is playing an active role in catering to the insatiable demand for high-speed broadband, in line with the Government’s goal of ensuring that Malaysia reaches 50% household broadband penetration rate by 2010.

P1’s CEO, Michael Lai said, “We are one year into our mission to provide broadband technology that reaches out to everyone in Malaysia. As one of the forerunners in the world for large scale 802.16e 2.3GHz commercial WiMAX deployment, we’re very proud of what we’ve achieved today from ground zero.”

P1 acquired one of Malaysia’s four WiMAX licenses in March 2007. Following this, P1 entered a series of strategic alliances with the aim of launching Malaysia’s, and one of the world’s, first WiMAX services. In May 2008, through its parent company, the KLSE Main Board-listed Green Packet Berhad, P1 secured a US$14.3mil (RM50mil) investment from Intel, making it the very first Intel-invested WiMAX telco in Asia Pacific.

Navin Shenoy, General Manager of Intel Asia Pacific, said: “Intel congratulates Packet One on a successful first year. The Internet is an essential, enriching and entertaining part of our lives, and the strong take-up of Packet One’s WiMAX service demonstrates Malaysia’s need for powerful, flexible, wireless broadband solutions. Intel is pleased to continue its work with Packet One to bring the benefits of WiMAX to more Malaysians.”

Barely 17 months after securing its license, P1 launched commercial services under the go-to-market brand P1 W1MAX in August 2008. The competition was nowhere in sight.

Lai attributes P1’s progress to a few key ingredients. He said, “We have an over-500-strong workforce that believes in our WiMAX vision, and everyone from our engineers, marketers and finance personnel to our help desk staff tackled the knowledge gap around the new WiMAX technology and the 2.3GHz spectrum band. Now we’re probably one of the most skilled workforces the world has to offer for WiMAX deployment and service provision.”

Talent aside, Lai advocates the importance of the WiMAX DNA (Device, Network and Applications).

Whilst WiMAX devices on the 2.3GHz spectrum band were practically non-existent, P1 had an unfair advantage: it could leverage the technology and R&D strength of its parent Green Packet, itself a provider of wireless connectivity products and solutions for the international market. The outcome is a host of stylish, user-friendly, plug-and-play fixed and mobile devices, which not only accelerated P1’s launch preparedness but continue to help it gain traction in the marketplace. P1 has also maintained a respectable average revenue per user (ARPU) of above US$24.30 (RM85), which it plans to sustain by continually introducing new and innovative connectivity and communications services to its consumers.

Services in the pipeline include P1 VoWiMAX, or Voice over WiMAX, which is expected to take digital voice to a whole new level thanks to WiMAX's high bandwidth and QoS capabilities. Within the next few years, P1 also expects to support WiMAX-enabled laptops and a range of new mobile internet devices (MIDs) such as mobile phones and PDAs.

Rapid is the key word, says Lai, when it comes to network deployment to expand coverage across the nation. To meet P1’s ambitious coverage goals, the company employs a multi-vendor strategy, with Alcatel-Lucent deploying the first phase and ZTE launching a concurrent second phase towards the tail end of that rollout. P1 has even built its own in-house deployment team to support the aggressive rollout target. P1 expects to deliver 35% population coverage by 2009, and 65% by 2012.

“We want to make broadband a right for all Malaysians and not a privilege”, said Lai when asked what the P1 W1MAX brand is all about. He added with zest, “We’re the challenger, unconventional, engaging, fun and out to make an impact. More importantly, we go out of our way to understand and respond to what our consumers want.”

P1 is definitely changing the Malaysian broadband industry landscape, posting promising subscriber numbers on the back of still-limited network coverage.

“More and more Malaysians are becoming convinced that P1 is capable of providing a compelling alternative, and we can’t wait for them to join the P1 W1MAX revolution, if they have not already,” concluded Lai.

Opposition to Google's Plan to Commercialize Digital Copies of Books

BERLIN — Opposition is mounting in Europe to a proposed class-action settlement giving Google the right to commercialize digital copies of millions of books.

The settlement would permit Americans to buy online access to millions of books by European authors whose works were scanned by Google at American libraries.

While some big European publishers, like Oxford University Press, Bertelsmann (which owns Random House) and Georg von Holtzbrinck (the owner of Macmillan), support the agreement, there is widespread opposition among French publishers. The German government, supported by national collection societies in Germany, Austria, Switzerland and Spain, plans to argue against it and encourage writers to pull out of the agreement.

A United States District Court has set a Sept. 4 deadline for submissions on the settlement and plans to hold a hearing Oct. 7.

Akash Sachdeva, an intellectual property lawyer with the law firm Allen & Overy in London, said that last-minute objections from Europe were unlikely to stop the settlement from going forward.

“I would imagine the court is going to say that because you have a significant amount of big players around the world who have opted into this, then it is worth proceeding with,” he said.

Google, which has been digitizing books since 2004 to make them available online, says the proposed settlement will benefit publishers, authors and consumers, making a vast reservoir of work available for easy access.

Around the world, 25,000 publishers, libraries and individuals are working with Google to digitize their archives and catalogues, including Oxford’s prestigious Bodleian Library and the Bavarian State Library. Even the French National Library, an outspoken opponent of the project, said last week that it was talking to Google about a deal to help digitize its archives.

“We believe that we are helping the industry tremendously by creating a way for authors and publishers to be found,” said Santiago de la Mora, Google’s head of printing partnerships in London. “Search is critical. If you are not found, the rest cannot follow.”

Mr. de la Mora also said that authors could always remove their works from Google’s scanning registry.

The European Commission is to hold a staff-level meeting on the proposed settlement on Sept. 7, but it has not directly involved itself in the case. The commission has supported homegrown digitization projects, including the Europeana online book and cultural library database.

In Britain, where many publishing houses have close ties to the United States, publishers have avoided open confrontation with Google.

But some British publishers have objections and are working with Google on issues like how to determine whether a book is out of print, which comes up when books are still widely available in Europe but no longer in the United States.

Some are also concerned about a lack of European representation on the Book Rights Registry, a panel that is supposed to collect and distribute revenue from Google’s book sales in the United States to authors and publishers.

In Germany, Austria, Switzerland and Spain, opposition to the settlement is more vocal.

The German government has hired an American law firm, Sheppard Mullin Richter & Hampton, to submit a friend-of-the-court brief opposing Google.

Copyright agencies in the opposing countries, which represent publishers and authors and generate revenue by levying fees on their book sales, view Google’s online sales platform as a direct threat.

Four European agencies, VG Wort of Germany, Literar Mechana of Austria, Pro Litteris of Switzerland and Cedro of Spain, are asking members to remove their books from Google’s online registry, should the settlement be approved.

In Germany, about 2,700 people, including the prominent authors Günter Grass and Daniel Kehlmann, have signed a petition asking the government to try to scuttle the settlement. Alain Kouck, the chief executive of Editis, the second-largest French book publisher, said he was talking to other publishers about developing a unified French digital sales platform.

(source: NYT)

Wednesday, August 19, 2009

ProtoStar I and ProtoStar II Satellites Are for Sale under US Bankruptcy Law

Yesterday, ProtoStar Ltd. filed two separate motions seeking approval of procedures to separately sell substantially all of the assets of two of its subsidiaries - ProtoStar I Ltd. and ProtoStar II Ltd. All three of the entities filed voluntary chapter 11 bankruptcy petitions in the United States Bankruptcy Court for the District of Delaware on July 29, 2009.

ProtoStar was formed in 2005 to "acquire, modify, launch and operate high-power geostationary (i.e., fixed with respect to a given point on Earth) communication satellites optimized for direct-to-home satellite television and broadband internet access across the Asia-Pacific region." ProtoStar operates two such satellites - the ProtoStar I Satellite (owned by ProtoStar I Ltd.) and the ProtoStar II Satellite (owned by ProtoStar II Ltd.). The ProtoStar I Satellite launched on July 7, 2008 and "provides Ku-band coverage for digital DTH services, high-definition television and broadband internet to under-served areas from Southeast Asia to the Middle East, while C-band transponders on the satellite enable ProtoStar to provide cellular backhaul, traditional last mile telecom and basic broadcasting services." The ProtoStar II Satellite launched on May 16, 2009 and became operational on June 17, 2009 following in-orbit testing. At that time, the ProtoStar II Satellite began providing services to PT Media Citra IndoStar and PT MNC Skyvision (the largest DTH satellite television service operator in Indonesia).

According to the motions, there is no stalking horse bidder for ProtoStar I's assets, but "potential acquirers have expressed an interest to purchase the assets." However, ProtoStar is seeking the authority to enter into a stalking horse agreement and provide a break-up fee and expense reimbursement in an amount of up to three percent of the cash purchase price (in aggregate). There is also no stalking horse bidder for the assets of ProtoStar II and the company is similarly seeking the ability to enter into a stalking horse agreement later. However, ProtoStar II seeks advance approval of a three percent break-up fee and a separate $500,000 expense reimbursement.

The bidding and auction procedures sought for both sets of assets include the following (differences between the two proposed sets of procedures are noted):
  • Bid Deadline: 4:00 p.m. (Eastern) on September 17, 2009
  • Good Faith Deposit: 10% of proposed purchase price
  • Minimum Overbid (in the event a stalking horse bidder is selected): $500,000
  • Credit Bidding: ProtoStar's lenders are entitled to submit a credit bid at any time before or at the auctions
  • Auction: September 23, 2009

In addition, each motion attaches a proposed form of asset purchase agreement for bidders to use in submitting their bids (a bidder must include a copy of the form agreement with its bid which is marked to show any requested changes to that form).

Tuesday, August 18, 2009

Would Anyone Send an SMS to the Universe?

From now until August 24, in honor of National Science Week in Australia, Hellofromearth.net is collecting goodwill messages to be sent to the nearest Earth-like planet outside our solar system likely to support life.

The planet – Gliese 581d – is eight times the size of Earth and some 20 light years away (194 trillion km). It was first discovered in April 2007. Due to its size, it is classified as a 'Super Earth'.

Those of you who have something to say to whoever might be living on Gliese 581d can submit your message – which must be less than 160 characters long – to the web site. The messages will be broadcast via the Canberra Deep Space Communication Complex at Tidbinbilla in cooperation with NASA.

The web site has been publishing the messages collected so far – and let’s just say that, for the most part, Earthlings probably won’t be making the greatest of first impressions when the messages arrive at Gliese 581d in December 2029.
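
Just to put those numbers together (my own illustration, not part of the project): the 160-character ceiling mirrors an SMS, and at roughly 20 light years the radio signal indeed arrives around 2029. A small Python sketch, with the distance figure assumed from the article above:

    # Illustrative only: checks a message against the site's 160-character limit
    # and estimates when a 2009 transmission would reach Gliese 581d.
    LIGHT_YEAR_KM = 9.46e12          # kilometres in one light year
    DISTANCE_LY = 20.5               # approximate distance to Gliese 581d

    def valid_message(msg: str) -> bool:
        """Hellofromearth.net accepted messages of fewer than 160 characters."""
        return len(msg) < 160

    print(valid_message("Hello from Earth!"))                   # True
    print(f"Distance: {DISTANCE_LY * LIGHT_YEAR_KM:.3g} km")    # ~1.94e+14 km, i.e. ~194 trillion km
    print(f"Arrival year: {2009 + int(DISTANCE_LY)}")           # 2029, matching the December 2029 date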

Still, there are some good ones. My personal favorite:

"Dear Alien, to unsubscribe from planet Earth SPAM please respond with *UNSUBSCRIBE*. Thank You."

Well done, David of Melbourne. Cheeky Earthling.

Friday, August 14, 2009

Twitter Used by Hackers as a Botnet Command Channel

For the past couple weeks, Twitter has come under attacks that besieged it with more traffic than it could handle. Now comes evidence that the microblogging website is being used to feed the very types of infected machines that took it out of commission.

That's the conclusion of Jose Nazario, the manager of security research at Arbor Networks. On Thursday, he stumbled upon a Twitter account that was being used as part of an improvised update server for computers that are part of a botnet.

The account, which Twitter promptly suspended, issued tweets containing a single line of text that looked indecipherable to the naked eye. Run through a base64 decoder, however, the dispatches pointed to links where infected computers could receive malware updates.
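
As a hedged illustration of that decoding step (the string below is invented, not one of the actual malicious tweets), base64 simply turns readable data such as a URL into an innocuous-looking blob of text, and back again:

    import base64

    # Hypothetical example only: not a real tweet from the suspended account.
    # A bot herder could post an encoded status like this:
    tweet_text = base64.b64encode(b"update http://malware-host.example/payload.bin").decode("ascii")
    print(tweet_text)    # dXBkYXRlIGh0dHA6... (indecipherable to the naked eye)

    # An infected machine reading the tweet simply reverses the encoding:
    print(base64.b64decode(tweet_text).decode("ascii"))    # update http://malware-host.example/payload.bin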

Master command channels used to herd large numbers of infected machines have long been one of the weak links in the botnet trade. Not only do they cost money to maintain, but they can provide tell-tale clues that help law enforcement agents to track down the miscreants running the rogue networks. Bot herders have used ICQ, internet relay chat, and other chat mediums to get around this limitation, but this appears to be the first time Twitter is known to have been employed.

Nazario said he has found at least two other Twitter accounts he suspects were being used in the same fashion, but needs to do additional analysis before he can be sure. The bots using the Twitter account connected via RSS feeds, a technique that allowed them to receive each tweet in real time without needing a Twitter account of their own. It was unclear how many bots connected to the account.
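
A minimal sketch of how such a bot client might poll the account's feed and decode each entry, assuming the third-party feedparser library and a placeholder feed URL (the real account was suspended):

    import base64
    import binascii

    import feedparser  # third-party: pip install feedparser

    # Placeholder URL; the real command account is long gone.
    FEED_URL = "https://example.com/statuses/user_timeline/botmaster.rss"

    def fetch_commands(feed_url: str) -> list[str]:
        """Pull the latest feed entries and base64-decode each one into a command or URL."""
        commands = []
        for entry in feedparser.parse(feed_url).entries:
            try:
                commands.append(base64.b64decode(entry.title).decode("utf-8"))
            except (binascii.Error, UnicodeDecodeError):
                continue  # ignore tweets that are not valid base64
        return commands

    for command in fetch_commands(FEED_URL):
        print(command)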

Up to now, the bot designers have done a good job of keeping their enterprise under wraps. The original bot software is detected by just 46 percent of the major anti-virus tools, according to a VirusTotal analysis. The updates, which appear to be affiliated with the Buzus trojan, are even stealthier, with only 22 percent of AV engines detecting them.

"This continues the themes that we've been seeing for years now," Nazario told El Reg. "Twitter is just a new mechanism for this, of once you're on the bot pushing out new stuff that's not detected, basically outrunning AV."

Nazario, who has published further details, says he discovered the Twitter command-and-control channel by accident while looking for evidence of the denial-of-service attacks that took the site out of service for millions of users last week. His finding suggests that in the world of internet crime, large sites can unwittingly be cast as both victim and enabler, perhaps simultaneously.

Thursday, August 13, 2009

Facebook to launch Facebook Lite for Developing Nations

Facebook today launched a low-bandwidth version of its site for users on slow connections.

Facebook Lite, currently on trial in Russia and India, is billed as a "faster, simpler version similar to the Facebook experience you get on a mobile phone". It includes the basic communication and photo-sharing functions of the usual site, but excludes bandwidth-hungry adornments such as apps and video.

"We are currently testing Facebook Lite in countries where we are seeing lots of new users coming to Facebook for the first time, and are looking to start off with a more simple experience," Facebook said.

As well as being more accessible where 2G mobile and dial-up internet dominate, the launch makes financial sense for the site, which remains Profit Lite. Users in developing countries draw even less per head in advertising revenue than their Western counterparts, who themselves do not cover Facebook's ever-rising bandwidth and storage bill.

Microsoft and Nokia Agree to Put MS Office on Mobile Phones

Microsoft and Nokia, long adversaries in mobile phone technology, have agreed to a partnership to equip many Nokia cellphones with the Microsoft Office software, according to a person with knowledge of the agreement.

Microsoft’s lucrative Office line faces an emerging competitive threat from free Web-based word processing, spreadsheet and other software, especially from Google. And consumers are increasingly using their smartphones to do tasks that once could be done only on personal computers, Microsoft’s stronghold.

“This appears to be a case of the enemy of my enemy is my friend,” said Rob Enderle, an independent technology analyst.

The alliance, expected to be announced Wednesday, seems to be a pragmatic step by both companies, as each tries to cope with growing competitive threats.

Nokia, the world’s largest cellphone maker, is struggling in the smartphone market against rivals led by the iPhone from Apple and the BlackBerry by Research in Motion. The competition is increasing with the recent entry of phones using the Android software from Google.

Neither company would comment. The two companies said in an advisory that they would hold a conference call on Wednesday.

Nokia and Microsoft have been rivals for years in cellphone operating systems, with Nokia adopting Symbian software and shunning Windows Mobile. Though details are scarce, the Microsoft-Nokia alliance apparently extends only to Office.

“This does seem to be a case of Microsoft Office business trumping Windows Mobile,” said Matt Rosoff, an analyst at Directions on Microsoft, a research firm. (Steve Lohr -NY Times)

Wednesday, August 12, 2009

Global Mobile Transformation in Emerging Markets

Emerging markets will lead the global economic recovery in 2010, with countries such as China and India likely to show the most obvious signs of upturn, but prospects for other emerging economies are also promising. Developed economies are expected to grow about 1.7 percent next year, while emerging markets will increase their GDP by 4.9 percent, according to a recent report by Bank of America Securities-Merrill Lynch Research.

The term “emerging economies” was first used in 1981 by Antoine W. Van Agtmael of the World Bank. There are currently 28 emerging markets in the world, which constitute approximately 80 percent of the global population and about 20 percent of the world's economies. With most consumers located in emerging markets, we simply can’t ignore this fact and the unique, innovative consumer trends we see coming out of these markets.

In the communications industry in particular, companies have lagged the market’s recovery since March 2009, according to Bank of America Securities-Merrill Lynch Research. But despite wide variations across markets, emerging economies will continue to develop at a pace that surpasses developed economies and will probably further influence the future of our industry, particularly in the mobile sector. China's communications market is already the second-largest telecommunications services market in the Asia-Pacific region after Japan, according to Pyramid Research, and it will surpass Japan by 2014.

As we move towards a more customer-centric business model, we can’t ignore the power of 80 percent of the world’s population. These consumers buy products and services, love or hate brands, influence their peers and demand services from their providers in their own particular way. And as globalization forces continue to expand, these consumers can influence the wider “global village”. Service providers in emerging markets are preparing for the future, which is why they have continued to see healthy growth in IT spending this year despite the global financial crisis.

The Mobile Revolution
As the world becomes more and more mobile, the global communications market is expected to recover in 2010, with mobile data being the major engine of growth. Global mobile penetration is estimated at 60 percent and will jump to 84 percent by 2013, led by growth in India, China and other emerging markets. China and India will add 829 million mobile subscribers in 2009-2013, which will represent 44 percent of the world's total net additions during that period (Pyramid Research). Given this data, it’s clear that organic growth will come mostly from emerging markets.
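
Reading those Pyramid Research numbers together (my own arithmetic, not from the report): if China and India's 829 million additions represent 44 percent of global net additions, the implied worldwide figure for 2009-2013 is roughly 1.9 billion new subscriptions:

    # Back-of-the-envelope reading of the figures quoted above.
    china_india_additions = 829e6      # net mobile additions in China and India, 2009-2013
    share_of_world = 0.44              # their quoted share of global net additions

    world_additions = china_india_additions / share_of_world
    rest_of_world = world_additions - china_india_additions

    print(f"Implied global net additions: {world_additions / 1e9:.2f}B")    # ~1.88B
    print(f"Implied rest-of-world additions: {rest_of_world / 1e9:.2f}B")   # ~1.06B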

Currently, China’s mobile penetration stands at just over 50 percent, Indonesia’s at 63 percent and India’s at 34.3 percent (expected to pass 54 percent by 2010). In other emerging markets in Asia, such as Malaysia and Taiwan, mobile penetration is close to or has exceeded 100 percent. Elsewhere, mobile penetration has reached 66 percent in Peru and almost 75 percent in Mexico, while Brazil and Colombia are already at 81 percent and 83 percent, respectively, and Argentina and Russia are well over 100 percent.

While there’s still room for organic subscriber growth in markets such as China and India, the future for service providers in emerging markets – which have so far focused on adding connections – will mostly be about layering on additional, innovative services and improving the quality of the customer experience. Saturation is around the corner, and the rise of triple- and quad-play offerings has made telco-cable competition increasingly fierce in many emerging markets.

In a recent column published in TM Forum’s Inside Latin America, Wally Swain, Senior Vice President Emerging Markets, Yankee Group, indicated that “churn management becomes even more critical of an issue, and keeping customers from churning is the best way to improve the bottom line. Prepaid churn management is all about using sophisticated data mining and CRM techniques to target offers that appeal to a particular client’s profile.”

Swain added, “This is the OSS challenge as penetration rises, growth slows and it is no longer sufficient to merely hang out a sign for clients to know where to sign up. Using advanced OSS tools to reduce prepaid churn and also the percentage of inactive customers is the route to an improved bottom line in these difficult economic times.” I would add that this is the right strategy even beyond this challenging economic stage (and beyond Latin America).
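
To make the "data mining and CRM techniques" Swain refers to slightly more concrete, here is a deliberately simplified, hypothetical sketch of prepaid churn scoring; every field name, threshold and weight below is invented for illustration and does not describe any real operator's model:

    from dataclasses import dataclass

    @dataclass
    class PrepaidSubscriber:
        # Hypothetical fields, for illustration only.
        days_since_last_topup: int
        avg_monthly_topup: float     # local currency
        calls_last_30_days: int

    def churn_score(sub: PrepaidSubscriber) -> float:
        """Toy rule-based propensity score in [0, 1]; higher means more likely to churn."""
        score = 0.0
        if sub.days_since_last_topup > 30:
            score += 0.5             # long silence on the prepaid account
        if sub.avg_monthly_topup < 10:
            score += 0.2             # low spend
        if sub.calls_last_30_days < 5:
            score += 0.3             # barely using the SIM
        return min(score, 1.0)

    at_risk = PrepaidSubscriber(days_since_last_topup=45, avg_monthly_topup=8.0, calls_last_30_days=2)
    if churn_score(at_risk) >= 0.7:
        print("Target with a retention offer, e.g. bonus airtime on the next top-up")

In practice operators would train statistical models on real usage data rather than hand-written rules, but the principle of scoring individual prepaid profiles and targeting offers accordingly is the same.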

So far, service innovation has mostly meant delivering an “integrated” offering that amounts to combining the bill with a discount for a package of services. That was a relatively easy move, but according to Ignacio Perrone, Industry Manager, ICT, at Frost & Sullivan, the next step will be a much more significant and qualitative effort, as service providers move from bundling to what he calls “blending” of services, in an even more complex mobile world defined not by a single device but by multiple mobile devices per consumer and an overload of content.

What is consumed, and how it is consumed, will change radically. How services are packaged, offered and paid for will have to change too. This new, more complex type of convergence will further affect the customer experience. It is a challenge that faces developed and emerging markets alike: a world with more of everything, everywhere - services, devices, content, you name it.

According to Perrone, whoever understands this first and develops the necessary capabilities and appropriate business model will be the clear winner.

We don’t really know where the global mobile market will be 10 years from now, or how OSS/BSS systems will continue to transform themselves to support our industry’s needs. But one thing I know for sure: I’d be looking to emerging markets for clues. The customer is king, and the greatest share of future global subscriber growth is in those markets. Watch closely: many of the most innovative solutions are flourishing in emerging markets these days. (source:


How would we live in a 4G World?

3G promised us the mobile Internet but, as ever, the hype got ahead of reality. To be fair, by the time 3G was finally agreed upon and rolled out, the speeds it offered seemed rather pedestrian, and most industry watchers are now waiting on a 4G world for a truly global, mobile Internet with access speeds capable of supporting the kinds of applications and services the iPhone is letting us glimpse. And that might happen faster than we think. The 1.5 billion app downloads on the iPhone – a device accounting for a tiny fraction of the mobile phones in the world – have shown service providers everywhere that there really is a market for mobile content and applications.

Now I don’t really want to step into the debate over which flavor of 4G will win – LTE or WiMAX – except to say that I think LTE will be a much simpler operational step for cellular service providers to roll out, letting them play the same kind of staged deployment game they played with 3G, where multi-protocol handsets ‘hide’ the initial patchiness of the network infrastructure.

The battle for 4G technology dominance aside, a 4G world will be very different from the 2G and 3G worlds because several key technologies and approaches are coming together: fast and ubiquitous mobile Internet, powerful smart devices and cloud-based services. Cloud-based services allow the edge of the network (the device) to be smaller, lighter and simpler yet still as powerful, because all of the heavy-duty application processing and storage is done inside the cloud. To be usable, this approach needs fast and reliable communications anywhere the user might be, and hence 4G is a crucial enabler.

Apart from a large screen and keyboard (and you could connect to those over Bluetooth), why would you need a PC when you have a smartphone and all of your information and applications available online anywhere you go? No wonder Google has focused on the handset (Android) and netbooks (Chrome OS). If this scenario came about, it would obviously have very big implications for the PC and software industries as we know them today.

But there are many new issues to worry about in a 4G world. It’s an all-IP network, so problems like VoIP security and service quality rear their heads. In such a radically altered world, what are the implications for charging, settlements and so on? The operational headaches for service providers are likely to grow, so we need to crack on and solve them before these networks become a high-volume reality, and that starts with the basic infrastructure being manageable in a sophisticated and common way, not having to build different systems to cope with different manufacturers, for example.

But the implications of a 4G world go way beyond this. We may see a complete revolution in the business models underpinning these networks and services. Already we are seeing cracks with so-called over-the-top services bypassing the communications provider’s billing system. Do we see a separation between companies that operate networks and players that operate services and market those to end customers? How will value chains evolve? Even the model for who pays for services may change – already we are seeing more and more ‘free’ applications and content on the iPhone supported by advertising.

Net Neutrality: A Spoiler in the Making?
If we can take one lesson from history to understand a 4G world, it’s that the success of regulation in communications markets has been patchy at best. To me, that’s why the whole net neutrality issue looks like one more step along that rocky road, where regulators generally regulate by looking back at the markets we have had rather than the markets that might exist in the future. Free markets usually work well, and ones that are distorted by ‘helpful’ regulators usually don’t.

If a 4G world is totally dependent on high-bandwidth, always-on IP connectivity anywhere on the planet, one of the inherent inadequacies at the heart of that structure will be the Internet itself. A 4G world will upgrade the fatness of the pipe, the device you’re viewing content on and the business models. But one thing that hasn’t actually changed is the design of the Internet itself, with all of its flaws around service quality and the unresolved debate about IPv4 versus IPv6.
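
On the IPv4-versus-IPv6 point, the gap is easy to quantify: IPv4's 32-bit address space tops out at about 4.3 billion addresses, fewer than there are people on the planet, while IPv6's 128-bit space is effectively unbounded. A quick illustration of the scale:

    # Address-space arithmetic behind the IPv4 vs IPv6 debate mentioned above.
    ipv4_addresses = 2 ** 32
    ipv6_addresses = 2 ** 128

    print(f"IPv4: {ipv4_addresses:,} addresses")      # 4,294,967,296 (about 4.3 billion)
    print(f"IPv6: {ipv6_addresses:.2e} addresses")    # ~3.40e+38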

There are solutions out there, but essentially we’ve got this ungroomed, unmanaged, uncoordinated world in the Internet and variable quality that will not lend itself well at all to tomorrow’s value-added services where people are depending on the availability of the network to support pretty much everything they do.

Our old friend net neutrality seems to rattle around and around. The bit I don’t understand, especially in such a free-market country as the U.S., is why anyone would want to enforce legislation preventing those who want better classes of service from paying for them. If this were healthcare and President Obama said everyone is going to get basic healthcare with no option for private plans, you can bet there would be rioting in the streets.

But that’s exactly what net neutrality proponents are saying about communications services: that everyone has to have the same thing. Is bringing everyone down to the level of the lowest common denominator really going to serve a world where people may well be prepared to pay for subscription services that give them sporting events on their phones in HD quality? So you have customers asking for this, but as a provider you have to say: sorry, we can’t give you that because the government passed a law that says we can’t.

I think the sheer possibility of a 4G world will make the current thinking we have about legislation and regulation seem pretty stupid, but then, that never stopped governments from interfering in the past!

We can only hope that legislators and regulatory bodies understand the potential of a 4G world and do their best to keep their hands off it. I think only then will real innovation happen and a real evolutionary leap take place. (source: Keith Willetts - TM Forum)

Monday, August 10, 2009

After Being Paralysed by a DDoS Attack, Twitter is Questioned about its Long-Term Stability

The paralysing effect of an internet attack against Twitter has raised questions about the site's apparent fragility.

Attacks against accounts maintained by pro-Georgian blogger Cyxymu at a number of social networking sites including Facebook, Blogger and LiveJournal as well as Twitter, and apparently aimed at silencing him, brought the micro-blogging site to its knees.

The attack caused intermittent difficulties accessing Facebook (see notice here (http://www.facebook.com/home.php#/facebook?ref=pf)) and other sites on Thursday, but it was over at Twitter where it really hit home, flooring the micro-blogging service for almost two hours and reducing service levels well into Friday.

Bill Woodcock, research director at Packet Clearing House, advanced (http://www.theregister.co.uk/2009/08/07/twitter_attack_theory) the theory on Thursday that the assault wasn't the result of a traditional distributed denial of service, but the effects of users clicking a link contained in spam messages ostensibly promoting Cyxymu's web presence.

The messages were designed to discredit Cyxymu by associating him with a spam run. Other security researchers, such as Patrik Runald at F-Secure (here (http://twitter.com/patrikrunald/status/3175741744)) and Graham Cluley at Sophos, are sceptical about this Joe Job-style theory for the attack.

The vast majority of recipients wouldn't have bothered clicking on such a link, but it is possible that the spam campaign was either run alongside a denial-of-service attack from a network of compromised PCs or inspired a Russian patriot with access to a botnet to attack Cyxymu's web presence and by extension the social networking sites he uses. The timing of the attack coincides with the first anniversary of the ground war between Russia and Georgia.

However the attack was caused, and whether or not there's any significance in its timing, there's little doubt that it succeeded in throttling Twitter. An analysis (http://asert.arbornetworks.com/2009/08/where-did-all-the-tweets-go) by Arbor Networks, experts in DDoS attack mitigation, explains that Twitter-related traffic slowed to a trickle.

"We generally don’t see a lot of data (i.e. it takes thousands of tweets to match the bandwidth of a single video), but 55 ISPs in the Internet Observatory were exchanging roughly 200 Mbps with Twitter before the DDoS. Then traffic dropped to a low of 60 Mbps around 10:40am and began climbing after that. As of 1pm EDT, Twitter traffic was still down by 50% at 150 Mbps (normally we see close to 300 Mbps for this time of day)."

Twitter’s two NTT-hosted address blocks were moved in response to the attack, Arbor adds. Twitter's reliance (http://blog.twitter.com/2008/02/twitter-chooses-ntt-america-enterprise.html) on just one service provider, and its apparent lack of backup and redundancy, much less a comprehensive disaster recovery plan, go a long way towards explaining why it was hit so badly.

Twitter's website was back up and running, albeit with minor latency issues, by Friday. The latest status update from Twitter states (http://status.twitter.com) that "site latency has continued to improve, however some web requests continue to fail". (source: The Register)

Friday, August 7, 2009

Google Buys Video Compression Firm On2 for $106 Million

Three years after buying the video-sharing Web site YouTube, Google is making a much smaller acquisition of a company that helps make online video files smaller.

Google said on Wednesday it agreed to buy On2 Technologies, which sells video compression software, in a stock deal valued at about $106 million. The per-share price was 57 percent above On2’s closing stock price on Tuesday, and On2’s shares soared on the news.

“We are committed to innovation in video quality on the Web, and we believe that On2’s team and technology will help us further that goal,” Sundar Pichai, Google’s vice president for product development, said in a statement announcing the deal.

Google paid about $1.6 billion in stock for YouTube in 2006. The deal gave Google a hugely popular destination for online videos, but it has yet to produce significant revenue for the company.