Tuesday, January 10, 2012

Android Google TV to be released by LG

With Google having confirmed LG as a major device partner for Google TV, the consumer electronics giant has revealed further details of the product it will be showcasing at CES in Las Vegas.

LG says the forthcoming LG Smart TV with Google TV combines the familiarity of Google’s Android OS with the convenience and comfort of LG’s 3D and Smart TV technologies, offering consumers a new and attractive home entertainment option.

“LG has constantly strived to provide consumers with wider choices in home entertainment that bring the highest level of sophistication and convenience,” said Havis Kwon, President and CEO of the LG Electronics Home Entertainment Company. “Through Google TV, LG has merged Google’s established Android operating system with LG’s proven 3D and Smart TV technologies, offering consumers a new and enthralling TV experience.”

LG says the LG Google TV’s most attractive feature is its ease of use, thanks to the combination of its Android-based user interface and the Magic Remote Qwerty designed by LG. LG Google TV’s user interface and main screen have been designed for convenient browsing and content selection. Multi-tasking is also possible, as the search, social networking and TV functions can be run simultaneously. The user interface can be accessed using the Magic Remote Qwerty which combines the user-friendly benefits of LG’s Magic Remote with a QWERTY keyboard.

Equipped with LG’s own CINEMA 3D technology, LG Google TV provides a home entertainment experience that is immersive, comfortable and convenient, says the company. Based on LG’s own Film Patterned Retarder (FPR) technology, CINEMA 3D glasses are battery-free and lightweight. “The glasses are also very affordable, making LG’s Google TV ideal for viewing by a large group of family and friends when used in 3D mode. And with a single click of the remote, any 2D content can be viewed in 3D, thanks to the built-in 2D to 3D conversion engine,” notes LG.

LG confirmed that alongside Google TV, the company will continue to advance its own Smart TV platform based on NetCast, using open web technologies such as the WebKit browser and Linux. LG Smart TV with NetCast will be available in more than 85 countries at launch.

LG Smart TV with Google TV will be available in two series at launch in the US in 2012. The first demonstration of LG’s Google TV will take place at CES, January 10-13. (Source: Telecomasia.net)

Sharpening data center due diligence: 6 questions for CIOs

Asking a board of directors for several hundred million dollars to obtain new data center capacity is one of the least popular requests a senior technology executive can make. As one CIO said, “I have to go to the executive committee and tell them that I need a billion dollars, and in return I’m going to give them exactly nothing in new functionality—I’m going to allow them to stay in business. I’m not looking forward to this.”

Investments in data center capacity are a fact of business life. Businesses require new applications to interact with customers, manage supply chains, process transactions, and analyze market trends. Those applications and the data they use must be hosted in secure, mission-critical facilities. To date, the largest enterprises have needed their own facilities for their most important applications and data.

How much data center capacity you need and when you need it, however, depends not only on the underlying growth of the business but also on a range of decisions about business projects, application architectures, and system designs spread out across many organizations that don’t always take data center capital into account. As a result, it’s easy to build too much or to build the wrong type of capacity. To avoid that, CIOs should ask a set of questions as part of their due diligence on data center investment programs before going to the executive committee and the board with a capital request.

1. How much impact do our facilities have on the availability of important business applications?

Resiliency is among the most common justifications for data center investments. Out-of-date, low-quality data centers are often an unacceptable business risk. Yet upgrading facilities isn’t always the most direct or even the most effective means of making applications more available. At the margin, investments in improved system designs and operations may yield better returns than investments in physical facilities.

Downtime overwhelmingly stems from application and system failures, not facility outages. An online service provider, for example, found that facility outages accounted for about 1 percent of total downtime. Even the most aggressive improvements in facility uptimes would have a marginal impact on application downtimes.
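
To see why facility investments offer limited leverage on availability, it helps to put rough numbers on the argument. The sketch below assumes a baseline of 100 hours of annual application downtime purely for illustration; only the roughly 1 percent facility share comes from the example above.

```python
# Illustrative sketch: how much total downtime improves when only the
# facility-caused share is reduced. The ~1% facility share is from the
# example above; the baseline hours and the 90% improvement are assumptions.

total_downtime_hours = 100.0   # assumed annual application downtime
facility_share = 0.01          # facilities cause ~1% of downtime (per the example)
facility_improvement = 0.9     # assume a 90% cut in facility-caused outages

facility_hours = total_downtime_hours * facility_share
saved_hours = facility_hours * facility_improvement
new_total = total_downtime_hours - saved_hours

print(f"Facility-caused downtime: {facility_hours:.1f} h/year")
print(f"Hours saved by the facility upgrade: {saved_hours:.2f} h/year")
print(f"Total downtime falls from {total_downtime_hours:.0f} to {new_total:.1f} h/year "
      f"(a {saved_hours / total_downtime_hours:.1%} overall improvement)")
```

Even a 90 percent improvement in facility reliability shaves less than 1 percent off total downtime in this scenario, which is why system design and operations usually offer better returns.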

Organizations with high-performing problem-management capabilities can achieve measurably better quality levels by identifying and eliminating the root causes of incidents across the technology stack. Yet many infrastructure organizations do not have integrated problem-management teams.

2. How much more capacity could we get from existing facilities?

In many cases, older data centers are constrained by cooling capacity, even more than by power capacity: insufficient air-conditioning infrastructure limits the amount of server, storage, and network equipment that can be placed in these sites. The data center team can often free up capacity by improving its cooling efficiency, sometimes through inexpensive and quick-to-implement moves.

A large European insurance company, for example, wanted to consolidate part of its data center portfolio in its largest, most resilient data center, which was cooling constrained. The company freed up one to two critical megawatts of capacity in this facility—with approximately $40 million in capital cost savings—by replacing worn floor tiles, cable brushes, and blanking plates (all of which improved air flow), and by increasing the operating-temperature range. As a result, the company consolidated facilities and provided capacity for business growth without having to build new capacity.
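
As a rough sanity check on that example, the avoided capital can be expressed per megawatt of capacity freed; the figures below are taken from the case above, and the per-megawatt range is simply derived from them.

```python
# Rough check of the implied value of freed-up capacity, using the figures
# from the example above. The per-megawatt range is a derived estimate only.

capital_avoided = 40e6           # ~$40 million in avoided capital cost
freed_mw_options = (1.0, 2.0)    # one to two critical megawatts freed

for mw in freed_mw_options:
    per_mw = capital_avoided / mw
    print(f"Freeing {mw:.0f} MW implies roughly ${per_mw / 1e6:.0f}M of avoided build cost per MW")
```

Either way, inexpensive airflow fixes substituted for tens of millions of dollars of new construction.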

3. What does future demand for data center capacity look like and how can virtualization affect it?

World-class data center organizations deeply understand potential demand scenarios. Rather than make straight-line estimates based on historical growth, they use input from business and application-development groups to approximate the likely demand for different types of workloads. They then model potential variations from expected demand, factoring in uncertainties in business growth, application-development decisions, and infrastructure platform choices.
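
A minimal sketch of what such scenario modeling might look like is shown below. The workload categories, baseline loads, and growth ranges are hypothetical assumptions for illustration; the point is the structure: demand is built up from business-driven workload forecasts and then stressed across scenarios rather than extrapolated in a straight line.

```python
# Minimal sketch of scenario-based demand modeling, per the approach described
# above. Workload names, baseline megawatts, and growth ranges are hypothetical
# assumptions for illustration, not figures from the article.

import random

WORKLOADS = {
    # name: (baseline critical MW, (low, high) annual growth range)
    "customer-facing apps":   (5.0, (0.05, 0.20)),
    "transaction processing": (8.0, (0.02, 0.08)),
    "analytics":              (3.0, (0.10, 0.35)),
    "dev/test":               (4.0, (0.00, 0.10)),
}

def simulate_demand(years=5, trials=10_000):
    """Monte Carlo estimate of total critical-MW demand after `years`."""
    totals = []
    for _ in range(trials):
        total = 0.0
        for baseline, (lo, hi) in WORKLOADS.values():
            growth = random.uniform(lo, hi)   # one growth-rate draw per workload per trial
            total += baseline * (1 + growth) ** years
        totals.append(total)
    return sorted(totals)

totals = simulate_demand()
p50 = totals[len(totals) // 2]
p90 = totals[int(len(totals) * 0.9)]
print(f"Median 5-year demand: {p50:.1f} MW;  90th-percentile demand: {p90:.1f} MW")
```

Planning to a chosen percentile rather than to a worst case is what lets organizations avoid the “just in case” capacity described below.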

Without a business-driven demand forecast, IT organizations tend to build “just in case” capacity because few data center managers want to be caught short. A large European enterprise, for instance, cut its expansion plans to 15 critical megawatts, from 30, after the data center team conducted a deeper dive with business “owners” to better understand demand growth.

Even after years of rationalization, consolidation, and virtualization, many technology assets run at very low utilization rates, and every incremental server, storage frame, and router takes up space in a data center. Mandating that applications be migrated onto virtualized platforms in the facility, rather than moved onto similarly configured infrastructure, can be a powerful lever not only for reducing IT capital spending broadly but also for limiting new data center capacity requirements. A global bank, for example, cut its six-year demand to nearly 40 megawatts, from 57—a reduction of more than 25 percent in new capacity build—by leveraging its data center build program to accelerate the use of virtual machines (Exhibit 1). That achievement helped create a political consensus for implementing virtualization technology more aggressively.
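
The bank example reduces to simple arithmetic; the megawatt figures below are those quoted in the text.

```python
# Arithmetic behind the bank example above (figures from the text).
baseline_mw = 57.0      # six-year demand before aggressive virtualization
virtualized_mw = 40.0   # six-year demand after accelerating virtual machines

reduction = (baseline_mw - virtualized_mw) / baseline_mw
print(f"Demand falls from {baseline_mw:.0f} MW to {virtualized_mw:.0f} MW, "
      f"about {reduction:.0%} less new capacity to build")
```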

4. How can we improve capacity allocation by tier?

Application owners often argue that their applications must run in Tier III or Tier IV data centers to meet business expectations for resiliency. Businesses can, however, put large and increasing parts of their application environments in lower-tier facilities, saving as much as 10 to 20 percent on capital costs by moving from Tier IV to Tier III capacity (Exhibit 2). By moving from Tier IV to Tier II, they can cut capital costs by as much as 50 percent.
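
A quick, illustrative comparison of what those savings mean per unit of capacity is sketched below; the Tier IV baseline cost per megawatt is an assumed figure, while the percentage savings come from the text.

```python
# Illustrative capital-cost comparison by tier. The $25M/MW Tier IV baseline
# is an assumption; the savings percentages are the ones cited in the text.

tier_iv_cost_per_mw = 25e6   # assumed baseline build cost per critical MW

scenarios = [
    ("Tier IV -> Tier III (10% saving)", 0.10),
    ("Tier IV -> Tier III (20% saving)", 0.20),
    ("Tier IV -> Tier II  (50% saving)", 0.50),
]

for label, saving in scenarios:
    new_cost = tier_iv_cost_per_mw * (1 - saving)
    print(f"{label}: cost per MW falls from ${tier_iv_cost_per_mw / 1e6:.0f}M "
          f"to ${new_cost / 1e6:.1f}M")
```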

Many types of existing workloads, such as development-and-testing environments and less critical applications, can be placed in lower-tier facilities with negligible business impact. Lower-tier facilities can host even production environments for critical applications if they use virtualized failover—where redundant capacity kicks in automatically—and the loss of session data is acceptable, as it is for internal e-mail platforms.

With appropriate maintenance, downtime for lower-tier facilities can be much less common than conventional wisdom would have it. One major online service provider, for instance, has hosted all its critical applications in Tier III facilities for 20 years, without a single facility outage. This level of performance far exceeds the conventional Tier III standard, which assumes 1.6 hours of unplanned downtime a year. The company achieved its remarkable record through the quarterly testing and repair of mechanical and electrical equipment, the preemptive replacement of aging components, and well-defined maintenance procedures to minimize outages that result from human error.
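
For reference, 1.6 hours of unplanned downtime a year corresponds to roughly 99.98 percent facility availability. The short calculation below converts the conventional downtime allowances into availability figures; the Tier III number is from the text, and the Tier II and Tier IV allowances are the commonly cited industry figures, included only for comparison.

```python
# Converting conventional annual downtime allowances into availability.
# Tier III (1.6 h/year) is from the text; Tier II and Tier IV are the
# commonly cited industry allowances, included for comparison.

HOURS_PER_YEAR = 8760

downtime_hours = {
    "Tier II":  22.0,
    "Tier III": 1.6,
    "Tier IV":  0.4,
}

for tier, hours in downtime_hours.items():
    availability = 1 - hours / HOURS_PER_YEAR
    print(f"{tier:8s}: {hours:5.1f} h/year downtime  ->  {availability:.4%} availability")
```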

It is inherently more efficient and effective to provide resiliency at the application level than at the infrastructure or facility level. Many institutions are rearchitecting applications over time to be “geo-resilient,” so that they run seamlessly across data center locations. In this case, two Tier II facilities can provide a higher level of resiliency at lower cost than a single Tier IV facility. This would allow even an enterprise’s most critical applications to be hosted in lower-tier facilities.
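
The arithmetic behind that claim can be sketched under two explicit assumptions: facility failures at the two sites are independent, and the application fails over seamlessly between them (the geo-resilient premise). Using the same conventional downtime allowances as above:

```python
# Rough illustration of geo-resiliency arithmetic: two independent Tier II
# sites vs. one Tier IV site. Assumes independent facility failures and a
# seamless application failover; downtime allowances as in the table above.

HOURS_PER_YEAR = 8760

tier_ii_unavailability = 22.0 / HOURS_PER_YEAR   # single Tier II site
tier_iv_unavailability = 0.4 / HOURS_PER_YEAR    # single Tier IV site

# The geo-resilient application is down only if both Tier II sites are down.
paired_tier_ii_unavailability = tier_ii_unavailability ** 2

print(f"One Tier IV site:  ~{tier_iv_unavailability * HOURS_PER_YEAR * 60:.0f} minutes/year of downtime")
print(f"Two Tier II sites: ~{paired_tier_ii_unavailability * HOURS_PER_YEAR * 60:.1f} minutes/year of downtime")
```

On these assumptions the paired lower-tier sites come out ahead, at a fraction of the capital cost of a single Tier IV facility.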

5. How can we incorporate modular designs into our data center footprint?

There is a traditional model for data center expansion: enterprises build monolithic structures in a highly customized way to accommodate demand that is five or sometimes ten years out. In addition, they design facilities to meet the resiliency requirements of the most critical loads. New modular construction techniques (see sidebar “Three modular construction approaches”), however, have these advantages:

  • shifting data center build programs from a craft process for custom-built capacity to an industrial process that allows companies to connect factory-built modules
  • building capacity in much smaller increments
  • making it easier to use lower-tier capacity
  • avoiding the construction of new facilities, by leveraging existing investments (see sidebar “Deploying modular capacity: Two case studies”)

6. What is the complete list of key design decisions and their financial impact?

Even after the company has established its capacity requirements, dozens of design choices could substantially affect the cost to build. They include the following:

  • redundancy level of electrical and mechanical equipment
  • floor structure (for instance, single- or multistory)
  • cooling technology (such as free-air cooling, evaporative chillers, and waterside economizers)
  • degree to which components are shared between or dedicated to modules
  • storm grade (for instance, the maximum wind speed a data center can withstand, as defined by regional or national standards, such as the Miami–Dade County building code and the Saffir–Simpson Hurricane Wind Scale)

Individual choices can have a disproportionate impact on costs per unit of capacity even after a company chooses its tier structure. A large global financial-services firm, for example, looked closely at its incident history and found that electrical failures—rather than mechanical ones—caused almost all of the issues at its facilities. Knowing this, the firm increased its electrical redundancy and decreased mechanical redundancy, shaving off several million dollars in construction costs per megawatt.

Given the scale of the investment required—billion-dollar data center programs are not unheard of—CIOs must undertake aggressive due diligence for their data center capital plans. They can often free up tens of millions of dollars from build programs by asking tough questions about resiliency, capacity, timing, tiering, and design. Perhaps more important, they can tell the executive committee and the board that they are using the company’s capital in the most judicious way possible.
(Source: James Kaplan - McKinsey)

How a grocery giant puts technology at the center of innovation

Cooperative Consumers Coop, better known as Coop, was Italy’s first retailer to embrace hypermarkets, in the 1980s, and then began opening even bigger superstore venues while expanding its offerings to include insurance and banking services, electricity, and prescription drugs. Throughout this expansion, Coop sought innovative ways to support its strategy with technology. Massimo Bongiovanni has helped the company realize that goal as president of Coop Centrale, which manages purchasing and distribution for the retailer’s cooperative network of stores, as well as the IT and services that support marketing, pricing, and other elements of Coop’s commercial policies.

Earlier this year, McKinsey’s Brad Brown, Lorenzo Forina, and Johnson Sikes spoke with Massimo Bongiovanni about technology’s role in fostering growth and innovation.

A rising role for IT: McKinsey Global Survey results

Aspirations—and current expectations—for IT have never been higher. Executives continue to set exacting demands for IT support of business processes, and they see an even larger role for IT in a competitive environment increasingly shaken up by technology disruptions. These are among the results of our sixth annual business technology survey, in which we asked executives across all functions, industries, and regions about their companies’ use of, expectations for, and spending on IT. Looking ahead, executives expect IT to create new platforms to support innovation and growth, help guide strategy with data and advanced analytics, and stay on top of possible new roles for mobile devices. For IT leaders, the good news is that along with these higher expectations, most respondents also see a greater willingness to spend more on IT.

Google TV Tries Again

LAS VEGAS — Manufacturers of televisions have been searching for something — anything! — to reverse a years-long slide in profits.

This week, several manufacturers plan to unveil their effort at the huge International Consumer Electronics Show here. It’s called Google TV.

If that sounds familiar, it’s because Google has been trying to crack the television market for some time, with rather tepid results. Google’s first foray into television, a partnership with Sony and Logitech in the fall of 2010, didn’t catch on because the remote was so big and complicated and because its software was confusing. And so far, Google’s success has been constrained by the limited availability of on-demand television from the major networks.

Several announcements are expected at the show. LG Electronics says the latest iteration of Google TV has merged Google’s Android operating system with LG’s 3-D and Smart TV technologies, offering consumers “a new and enthralling TV experience.”

Whether consumers will be enthralled remains to be seen, but Google and television manufacturers are wagering that it will be a hit. Besides LG, Sony, Samsung and Vizio will introduce Google TV-powered products.

The idea of Google TV, and more broadly of Internet-connected televisions, is that viewers would be able to watch television in much the same way they browse the Internet, allowing them to watch movies, television shows, concerts and sporting events whenever they wanted.

In the case of Google TV, viewers would use Google’s Android operating system and apps specifically developed for the television. Google says it is a simpler interface than the first version of Google TV and offers more “TV-like” viewing of YouTube.

Monday, December 5, 2011

IDC Predicts 2012 Surge for Mobile, Cloud, $1.8 Trillion in IT Spending

Most of the top 10 expectations for the coming year carried over from trends IDC anticipated for 2011. But its predictions report, “Competing for 2020,” takes a wide-angle view that extends from 2012 through the rest of the decade. The predictions were winnowed down to 10 from contributions by about 1,000 IDC analysts in fields covering IT, information management, and mobile communications, based on growth opportunity, industry-wide impact, and structural disruption.

Topping IDC’s predictions once again was the vast increase in worldwide IT spending, anticipated to rise 6.9 percent to $1.8 trillion in 2012. However, IDC warned of financial and disaster-related disruptions to this growth: in a worst-case scenario of a total unraveling of the euro, IDC said, the increase in IT spending could fall to 2 percent or less in 2012. Growth in emerging markets ranked second among the forces driving overall IT change, with IDC pegging emerging-market spending to more than triple current levels and reach 53 percent of all worldwide IT spending in 2012.

Mobile communications ranked twice on the list: in third place, with mobile growth expected to stem from an increase in devices, lower price points driven by emerging markets, and an explosion of apps for home and business; and in sixth, with expectations of mobile network growth and access worldwide. As part of mobile’s expected dominance in 2012, IDC stated that it will be a “make-or-break” year for mobile products from Microsoft, HP, and RIM.

Also continuing to make a big splash through 2012 is cloud computing. IDC expects cloud spending to top $36 billion next year, growing at four times the rate of the overall IT industry. It ranked expanding cloud adoption fourth on its 2012 list, along with expectations of a slew of cloud and as-a-service acquisitions and the development of many more cloud applications. Increased cloud services enablement came in at number five, with 2012 turning into a watershed year for the shift away from self-built systems and spending on cloud systems management software increasing by 62 percent next year, IDC stated.

During a Web seminar on the predictions, IDC chief analyst and senior vice president Frank Gens said his research and advisory firm drew the connection between its predictions for 2012 and its outlook for the end of the decade based on a generational shift in tech platform adoption and innovation that is now underway. With that in mind, Gens said it is reasonable to think IT spending could hit $5 trillion worldwide by 2020, led by investment and innovation in mobile tech and the cloud.

“It’s easy to see that these technologies will inevitably become the vast majority of all IT spending,” said Gens.

Rounding out the expectations for 2012 are the emergence of big data analytics and mash-ups (7); increasingly sophisticated social business capabilities (8); an increase in “intelligent” communications devices tapped into networks (9); and the development of smarter industry solutions in areas such as financial security apps and green tech (10). Gone from IDC’s 2011 list of predictions are additional broadband deployments and delivery, and the growth of consumer Web television.

Saturday, November 19, 2011

European Parliament pushes to maintain Open Internet and Net Neutrality

Yesterday the European Parliament (EP) adopted a clear-cut position on net neutrality, giving priority to maintaining an open Internet for all rather than increasing its use for commercial purposes.

A resolution passed by MEPs in Strasbourg calls on the European Commission to ensure that “Internet service providers do not block, discriminate against or impair the ability of any person to use or offer any service, content or application of their choice irrespective of source or target.”

As the Internet evolves into a crucial market for an ever-increasing number of services, many ISPs are stepping up their attempts to prioritize certain traffic in order to offer the best and quickest services to those who pay more.

Most controversial is the intentional slow-down of Internet connections – also referred to as ‘throttling’ – for clients who do not pay the full price. Some ISPs are even inclined to block specific services such as Skype, to avoid competition with their traditional telephony services.

In their resolution, MEPs did recognize the need for a “reasonable” management of data traffic to ensure that the Internet continues to run smoothly. However, the parliament also clearly underlined that anti-competitive practices should not be allowed.

MEPs asked the Commission to “closely monitor the development of traffic management practices and interconnection agreements, in particular in relation to blocking and throttling of, or excessive pricing for, VoIP and file sharing, as well as anticompetitive behaviour and excessive degradation of quality”.

The adopted text reiterated privacy and data protection concerns raised by the European Data Protection Supervisor (EDPS), who issued an opinion last October warning of “serious implications” for the security of personal data due to an excessively intrusive interpretation of traffic management.

EU Digital Agenda Commissioner Neelie Kroes stands accused by many MEPs of keeping an “ambiguous” approach to net neutrality.

The commission indeed refrained from taking a definitive position on traffic management in its communication on net neutrality published last April. However, it did make clear that further monitoring of dubious practices was required and could lead to regulatory measures in the future.

This analysis is still ongoing, explained Kroes’ spokesperson Ryan Heath.

“The Commission is monitoring the development of traffic management. To this end it has tasked BEREC (the Body of national telecoms regulators) to carry out investigations on net neutrality and traffic management, including instances of blocking and throttling. This work is currently ongoing.”

It remains unclear at this stage whether the commission will come up with new “guidance” for the sector or with binding legislation.

In December the EU telecoms ministers will discuss net neutrality. By the end of this year or the beginning of 2012 the commission is expected to conclude the analysis of traffic management practices.

Read the full article (EurActiv).

Also noteworthy: Traffic jams, ISPs and net neutrality (GigaOm).

Source: EurActiv

Related posts on intug.org:

  1. EC Committed to Open Internet Principles
  2. Europe’s NRAs Investigate Traffic Management Practices
  3. EC Launches Consultation on Net Neutrality