On March 30, 2010, Yale University put its planned migration to Google Apps for email on hold over privacy and security concerns, especially a lack of transparency about which countries its data would be stored in.
Michael Fisher, a computer science professor involved in the decision, said that “People were mainly interested in technical questions like the mechanics of moving, wondering ‘Could we do it?’, but nobody asked the question of ‘Should we do it?’” He went on to say that the migration would “also make the data subject to the vagaries of foreign laws and governments,” and that “Google was not willing to provide ITS with a list of countries to which the University’s data could be sent, but only a list of about 15 countries to which the data would not be sent.”
This closely aligns with our January report, “As IaaS Cloud Adoption Goes Global, Tech Vendors Must Address Local Concerns,” which examined the security and privacy issues involved in moving data to the cloud, especially when it’s no longer clear what country your data will reside in. In that report, we recommended that IaaS providers give “guidance on where data is located and location guarantees if necessary. Rather than merely claiming that data is in the cloud, tech vendors must be prepared to identify the location of data and provide location guarantees (at a premium) if required.”
My colleague, Mike Cansfield, just posted a blog on the new “scramble for Africa” among telecommunications companies. Bharti Airtel, an Indian company, just finalized a deal to take over most of the African assets of Zain, a Kuwaiti company. As Mike mentions, Bharti has been dogged in its efforts to enter the African market with two previous attempts to forge a deal with South Africa’s MTN Group.
Bharti sees significant opportunity on the continent, where just 36% of the population owns a mobile phone — yet many more use mobile phone services through resellers who offer use of a phone by the minute in local markets. On-demand phone services were originally part of the informal sector, but MTN has launched a program to legitimize their sale through its Y’ello Zone Payphone initiative. MTN originally pledged to install 7,500 community pay phones in underserved areas across its markets. To date, over 11,000 have been installed. As part of the program, MTN offers entrepreneurs the opportunity to operate these payphone kiosks and provides the skills training to run a successful phone shop. The program contributes to job creation, especially among women and youth, with more than 3,800 retailers already benefitting. But, I digress . . .
Consider the following: AT&T expects to save $12 million and 123,000 tons of carbon emissions per year by using 1E's PC power management software to turn off PCs at night. By turning up the temperature in the data center from 69°F to 74°F, KPMG realized a 12.7% reduction in cooling energy usage. And Citigroup expects to save $11 million and 3,000 tons of greenhouse gases annually simply by enabling duplex settings on printers and copiers.
How are they achieving this? Green IT. Even in the face of a weak economy, Green IT is on the rise with approximately 50% of organizations globally enacting or creating a green IT strategy plan. And don't be fooled: green IT is as much about the greenbacks as it is about reducing the environmental impact of operating IT and the business. In fact, financial motivation — not environmental motivation — is the driving force behind the pursuit of greener IT (see Forrester’s “Q&A: The Economics Of Green IT”).
But despite the optimism, IT “blowhards” across the globe are negating the carbon reduction benefits of green IT one breath at a time. While virtualizing servers or powering down your PCs will reduce energy spend and CO2 emissions, Forrester finds that these jabbermouths — speaking fast, loud, and out of turn using unnecessarily wordy vocabulary — are creating a zero-sum game.
It was quite a challenge to nail down all the detailed points ... and of course, the publishing process took a little getting used to. To be honest, I had most of it finalized over a month ago.
The next doc is just about to go into the editing queue; it will focus on the rationale behind Pega's acquisition of Chordiant, highlighting a major shift we see in the way that enterprise apps are developed.
Next week, I will present the first results of Forrester’s 2009 global banking platform deals survey. A total of 17 banking platform vendors submitted their 2009 deals for evaluation. One year ago, the same set of deals would have represented at least 19 vendors: In the 2009 survey, FIS’s deals include those of the acquired US-based Metavante, and Temenos’ deals include those of the acquired French vendor Viveo. Together, these (theoretically 19) vendors submitted a total of 1,068 banking platform deals for evaluation, a steep increase from the roughly 870 deals submitted for 2008.
We had to classify a large share of these 1,068 banking platform deals as extended business or even as simple renewed licenses — if the vendors had not already submitted them with the corresponding tag. Forrester’s “rules of the game” did not allow us to recognize further deals, for example, when a non-financial-services firm signed a deal. Overall, Forrester counted 269 of the submitted deals as 2009 new named customers, compared with 290 for 2008. In the past, Forrester sorted the vendors into four buckets: Global Power Sellers, Global Challengers, Pursuers, and Base Players. The Pursuers, and in particular the Global Challengers, saw only minor changes in previous years. 2009 has shaken this stable structure, and we will see many vendors in groups they haven’t been in before.
Over the past year, I have received numerous inquiries asking me whether third-party database tools that focus on performance and tuning, backup recovery, replication, upgrade, troubleshooting, and migration capabilities matter anymore now that leading DBMS providers such as Oracle, IBM, and Microsoft are offering improved automation and broader coverage.
I find that third-party tools complement native database tools well, assisting DBAs, developers, and operational staff in their day-to-day activities. Last year, I had the opportunity to speak to dozens of enterprises that support hundreds or thousands of databases across various DBMSes. Most enterprises reported at least a 20 percent improvement in IT staff productivity when using a third-party database tool.
Third-party vendor tools remain equally important because they support:
I was lucky enough last week [22 March 2010] to moderate a panel at EclipseCon on the future of application servers. The panelists did a great job, but I thought they were far too conservative in their views. I agree with them that many customers want evolutionary change from today's app servers to future ones, but I see requirements driving app servers toward radical change. Inevitably.
The changes I see:
Get more value from servers, get responsive, get agile and flexible
Last week I was once again hustling through a brutal travel week (10,000 miles in the air and two packed red-eyes) when I came across something really interesting. It was ~ 9 AM and I'd just gotten off AA flight 4389 from Toronto. I was a bit bleary eyed from a 4 AM call with a Finnish customer and was just trying to schlep my way to the Admiral's club for a cup of coffee when I stumbled across Accenture's Interactive Network display at the juncture of terminal H and K.
So what, you might ask: it's just a big screen, and we already know our future is Minority Report, right? Yes, those of us in the echo chamber might know that, but what really struck me was watching my fellow travelers and how they interacted with the display. I stood there for about 10 minutes (forgetting about the sorely needed cup of joe) and watched people start to walk past, then pause, then go up to the screen and start playing with it. On average, folks would stay for a few minutes and read some of the latest news feeds, then hurry on to their next stop. But what I really found intriguing was how they interacted with the system:
What if there was an easy way to increase employee productivity by 10% using the technology that’s already in place? What would that do to the bottom line? Even a 1% gain would be significant for most large organizations. In this day and age when CIOs are competing for budget and every dollar of technology investment must be justified, CIOs should not overlook training as a means to boost employee productivity and the ROI of existing technology investments.
Unfortunately, it seems that too few people really know how to use the applications available to them effectively. Take, for example, the proliferation of spreadsheets in the workplace. Tools like Microsoft Excel have amazing features that support some powerful analysis and reporting. Yet many people fail to utilize basic productivity features built into such applications. We have probably all observed people misusing tools and completing work the hard way simply because they don’t know any better. And Excel is just one tool that many of us use day in, day out. Outlook has some amazing features to boost productivity, but few people know how to take advantage of them.
Even where some level of training in core ERP applications is provided to new employees, we know that very little is actually absorbed in early training. And much of IT training is focused on what buttons to press in what sequence to get a job done; very little seems to focus on how to use all the technology together as part of a productive business process.
In-memory analytics is generating buzz for multiple reasons. Speed of querying, reporting, and analysis is just one; flexibility, agility, and rapid prototyping are others. But while there are many more reasons, not all in-memory approaches are created equal. Let’s look at the five options buyers have today:
1. In-memory OLAP. Classic MOLAP cube loaded entirely in memory
Vendors: IBM Cognos TM1, Actuate BIRT
Fast reporting, querying, and analysis, since the entire model and data are in memory.
Ability to write back.
Accessible by 3rd party MDX tools (IBM Cognos TM1 specifically)
Requires traditional multidimensional data modeling.
Limited to a single physical memory space (theoretical limit of 3TB, but we haven’t seen production implementations of more than 300GB — this applies to the other in-memory solutions as well)
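To make the in-memory cube idea concrete, here is a minimal Python sketch — purely illustrative, and not how TM1 or BIRT are actually implemented. Cells are keyed by tuples of dimension members (the product of a traditional multidimensional model), so any slice or rollup is a scan over an in-memory structure rather than a disk-bound query, and write-back is a simple cell assignment.

```python
from itertools import product

# Illustrative in-memory "cube": cells keyed by (region, product, month).
# The dimension members would come from a traditional multidimensional model.
regions = ["NA", "EMEA"]
products = ["widgets", "gadgets"]
months = ["Jan", "Feb"]

cube = {coords: 0.0 for coords in product(regions, products, months)}

# Write-back: in-memory OLAP tools typically let users update cells directly.
cube[("NA", "widgets", "Jan")] = 120.0
cube[("NA", "widgets", "Feb")] = 80.0
cube[("EMEA", "widgets", "Jan")] = 50.0

def slice_total(region=None, product_=None, month=None):
    """Aggregate over the cube, filtering on any subset of dimensions."""
    return sum(
        value for (r, p, m), value in cube.items()
        if (region is None or r == region)
        and (product_ is None or p == product_)
        and (month is None or m == month)
    )

print(slice_total(region="NA"))         # 200.0
print(slice_total(product_="widgets"))  # 250.0
```

Note the trade-off the sketch makes visible: every combination of members occupies memory whether or not it holds data, which is why physical memory limits matter for this class of tool.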
2. In-memory ROLAP. ROLAP metadata loaded entirely in memory.
Speeds up reporting, querying, and analysis, since the metadata is all in memory.
Not limited by physical memory
Only the metadata, not the entire data model, is in memory, although MicroStrategy can build complete cubes from a subset of data held entirely in memory.
Requires traditional multidimensional data modeling.
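The ROLAP variant can be sketched in the same spirit — again purely illustrative, not any vendor's actual engine. Only the model (fact table, measures, dimension mappings) is cached in memory; the data itself stays in the relational store, and the in-memory metadata is used to generate SQL on the fly.

```python
# Illustrative in-memory ROLAP metadata: only the model lives in memory;
# the data stays in the relational database. Names here are invented.
METADATA = {
    "fact_table": "sales",
    "measures": {"revenue": "SUM(sales.amount)"},
    "dimensions": {
        "region": "sales.region",
        "month": "sales.month",
    },
}

def generate_sql(measure, group_by):
    """Build a SQL query from cached metadata; no disk lookup is needed to plan it."""
    expr = METADATA["measures"][measure]
    col = METADATA["dimensions"][group_by]
    return (f"SELECT {col}, {expr} FROM {METADATA['fact_table']} "
            f"GROUP BY {col}")

print(generate_sql("revenue", "region"))
# SELECT sales.region, SUM(sales.amount) FROM sales GROUP BY sales.region
```

This is why the approach is not limited by physical memory: the metadata footprint is tiny, and the heavy lifting happens in the database.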
3. In-memory inverted index. Index (with data) loaded into memory
Vendors: SAP BusinessObjects (BI Accelerator), Endeca
Fast reporting, querying, and analysis, since the entire index is in memory
Less modeling required than with an OLAP-based solution
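An inverted index is easy to sketch as well — again an illustrative toy, not how BI Accelerator or Endeca work internally. Each term maps to the set of record IDs containing it, so queries are set intersections over in-memory structures, with no dimensional model required up front.

```python
from collections import defaultdict

# Illustrative in-memory inverted index: each term maps to the set of
# record IDs containing it. Records and terms here are invented examples.
records = {
    1: "red widget shipped to berlin",
    2: "blue gadget shipped to paris",
    3: "red gadget returned from berlin",
}

index = defaultdict(set)
for rec_id, text in records.items():
    for term in text.split():
        index[term].add(rec_id)

def search(*terms):
    """Return IDs of records containing all the given terms (AND query)."""
    result = None
    for term in terms:
        postings = index.get(term, set())
        result = postings if result is None else result & postings
    return sorted(result or [])

print(search("red"))               # [1, 3]
print(search("gadget", "berlin"))  # [3]
```

Because the index is built directly from the records, there is no cube or star schema to design first, which is the "less modeling" advantage called out above.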