Consider the following: AT&T expects to save $12 million per year and 123,000 tons of carbon emissions per year using 1E's PC power management software to turn off PCs at night. By turning up the temperature in the data center from 69°F to 74°F, KPMG realized a 12.7% reduction in cooling energy usage. And Citigroup expects to save $11 million and 3,000 tons of greenhouse gases annually by simply enabling duplex settings on printers and copiers.
How are they achieving this? Green IT. Even in the face of a weak economy, green IT is on the rise, with approximately 50% of organizations globally enacting or creating a green IT strategy. And don't be fooled: green IT is as much about the greenbacks as it is about reducing the environmental impact of operating IT and the business. In fact, financial motivation, not environmental motivation, is the driving force behind the pursuit of greener IT (see Forrester’s “Q&A: The Economics Of Green IT”).
But despite the optimism, IT “blowhards” across the globe are negating the carbon reduction benefits of green IT one breath at a time. While virtualizing servers or powering down your PCs will reduce energy spend and CO2 emissions, Forrester finds that these jabber mouths (speaking fast, loud, and out of turn, using unnecessarily wordy vocabulary) are creating a zero-sum game.
It was quite a challenge to nail down all the detailed points ... and of course, the publishing process took a little getting used to. To be honest, I had most of it finalized over a month ago.
The next doc is just about to go into the editing queue; it will focus on the rationale behind Pega's acquisition of Chordiant, highlighting a major shift we see in the way that enterprise apps are developed.
Next week, I will present the first results of Forrester’s 2009 global banking platform deals survey. A total of 17 banking platform vendors submitted their 2009 deals for evaluation. One year ago, the same set of deals would have represented at least 19 vendors: In the 2009 survey, FIS’s deals include those of acquired US-based Metavante, and Temenos’ deals include those of acquired French vendor Viveo. Together, these nominally 19 participating vendors submitted a total of 1,068 banking platform deals for evaluation, a steep increase compared with the roughly 870 deals submitted for 2008.
We had to classify a large share of these 1,068 banking platform deals as extended business or even as simple renewed licenses when the vendors had not already submitted them with the corresponding tag. Forrester’s “rules of the game” also did not allow us to recognize some further deals, for example, because a non-financial-services firm signed the deal. Overall, Forrester counted 269 of the submitted deals as 2009 new named customers, compared with 290 for 2008. In the past, Forrester sorted the vendors into four buckets: Global Power Sellers, Global Challengers, Pursuers, and Base Players. In previous years, the Pursuers and, in particular, the Global Challengers saw only minor changes. 2009 has shaken this stable structure, and we will see many vendors in groups they haven’t been in before.
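The counting rules described above amount to a simple classification filter. The sketch below is a hypothetical illustration of that logic only; the field names, tags, and rule details are my assumptions, not Forrester's actual survey methodology:

```python
# Hypothetical sketch of the deal-counting rules described above.
# All field names and rule details are illustrative assumptions,
# not Forrester's actual methodology.

def classify_deal(deal):
    """Return the bucket a submitted banking platform deal falls into."""
    # Deals signed by non-financial-services firms are not recognized.
    if not deal.get("buyer_is_financial_services", True):
        return "not counted"
    # Deals with existing customers are extended business or mere renewals.
    if deal.get("existing_customer", False):
        if deal.get("license_renewal_only", False):
            return "renewed license"
        return "extended business"
    # Everything else counts as a new named customer.
    return "new named customer"

deals = [
    {"buyer_is_financial_services": True, "existing_customer": False},
    {"buyer_is_financial_services": True, "existing_customer": True,
     "license_renewal_only": True},
    {"buyer_is_financial_services": False, "existing_customer": False},
]

counts = {}
for d in deals:
    bucket = classify_deal(d)
    counts[bucket] = counts.get(bucket, 0) + 1
print(counts)
```

Run over the full submission list, a filter like this is what shrinks 1,068 submitted deals down to the 269 counted as new named customers.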
Over the past year, I have received numerous inquiries asking me whether third-party database tools that focus on performance and tuning, backup recovery, replication, upgrade, troubleshooting, and migration capabilities matter anymore now that leading DBMS providers such as Oracle, IBM, and Microsoft are offering improved automation and broader coverage.
I find that third-party tools complement native database tools well in assisting DBAs, developers, and operational staff in their day-to-day activities. Last year, I had the opportunity to speak to dozens of enterprises that support hundreds or thousands of databases across various DBMSes. Most enterprises reported at least a 20% IT staff productivity gain when using a third-party database tool.
Third-party vendor tools remain equally important because they support:
I was lucky enough last week [22 March 2010] to moderate a panel at EclipseCon on the future of application servers. The panelists did a great job, but I thought they were far too conservative in their views. I agree with them that many customers want evolutionary change from today's app servers to tomorrow's, but I see requirements driving app servers toward radical change. Inevitably.
The changes I see:
Get more value from servers, get responsive, get agile and flexible
Last week I was once again hustling through a brutal travel week (10,000 miles in the air and two packed red-eyes) when I came across something really interesting. It was about 9 AM, and I'd just gotten off AA flight 4389 from Toronto. I was a bit bleary-eyed from a 4 AM call with a Finnish customer and was just trying to schlep my way to the Admirals Club for a cup of coffee when I stumbled across Accenture's Interactive Network display at the junction of terminals H and K.
So what, you might ask? It's just a big screen, and we already know our future is Minority Report, right? Yes, those of us in the echo chamber might know that, but what really struck me was watching my fellow travelers and how they interacted with the display. I sat there for about 10 minutes (forgetting all about the sorely needed cup of joe) and watched people as they started to walk past, then paused, then went up to the screen and started playing with it. On average, folks would stay for a few minutes and read some of the latest news feeds, then hurry on to their next stop. But what I really found intriguing was how they interacted with the system:
What if there were an easy way to increase employee productivity by 10% using the technology that’s already in place? What would that do to the bottom line? Even a 1% gain would be significant for most large organizations. In this day and age, when CIOs are competing for budget and every dollar of technology investment must be justified, CIOs should not overlook training as a means to boost employee productivity and the ROI of existing technology investments.
Unfortunately, it seems that too few people really know how to use the applications available to them effectively. Take, for example, the proliferation of spreadsheets in the workplace. Tools like Microsoft Excel have amazing features that support some powerful analysis and reporting, yet many people fail to utilize even the basic productivity features built into such applications. We have probably all observed people misusing tools and completing work the hard way simply because they don’t know any better. And Excel is just one tool that many of us use day in, day out; Outlook also has some amazing features to boost productivity, but few people know how to take advantage of them.
Even where some level of training in core ERP applications is provided to new employees, we know that very little is actually absorbed in early training. And much of IT training focuses on what buttons to press in what sequence to get a job done; very little seems to focus on how to use all the technology together as part of a productive business process.
In-memory analytics is generating buzz for multiple reasons. Speed of querying, reporting, and analysis is just one; flexibility, agility, and rapid prototyping are another. But while there are many more reasons, not all in-memory approaches are created equal. Let’s look at the five options buyers have today:
1. In-memory OLAP: a classic MOLAP cube loaded entirely into memory.
- Vendors: IBM Cognos TM1, Actuate BIRT
- Fast reporting, querying, and analysis, since the entire model and all data are in memory.
- Ability to write back.
- Accessible by third-party MDX tools (IBM Cognos TM1 specifically).
- Requires traditional multidimensional data modeling.
- Limited to a single physical memory space (theoretical limit of 3 TB, but we haven’t seen production implementations of more than 300 GB; this applies to the other in-memory solutions as well).
2. In-memory ROLAP: ROLAP metadata loaded entirely into memory.
- Vendors: MicroStrategy
- Speeds up reporting, querying, and analysis, since all the metadata is in memory.
- Not limited by physical memory.
- Only the metadata, not the entire data model, is held in memory, although MicroStrategy can build complete cubes from a subset of the data held entirely in memory.
- Requires traditional multidimensional data modeling.
3. In-memory inverted index: an index (with data) loaded into memory.
- Vendors: SAP BusinessObjects (BI Accelerator), Endeca
- Fast reporting, querying, and analysis, since the entire index is in memory.
- Less modeling required than for an OLAP-based solution.
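To make the architectural difference between the first and third options concrete, here is a deliberately tiny sketch in Python. It is didactic only, with made-up data, and is not how TM1, MicroStrategy, or BI Accelerator work internally: a MOLAP-style cube pre-aggregates every cell in memory so queries become lookups, while an inverted index maps attribute values back to record IDs so filters become set intersections:

```python
from collections import defaultdict

# Toy illustration of two of the in-memory approaches above; a didactic
# sketch with invented data, not any vendor's actual engine.

sales = [
    {"region": "EMEA", "product": "A", "revenue": 100},
    {"region": "EMEA", "product": "B", "revenue": 250},
    {"region": "APAC", "product": "A", "revenue": 175},
]

# Option 1 style: MOLAP cube. Pre-aggregate every (region, product) cell
# in memory, so a query is a dictionary lookup.
cube = defaultdict(float)
for row in sales:
    cube[(row["region"], row["product"])] += row["revenue"]

print(cube[("EMEA", "A")])  # 100.0

# Option 3 style: inverted index. Map each attribute value to the row IDs
# holding it, so a filter is a set intersection over in-memory postings.
index = defaultdict(set)
for i, row in enumerate(sales):
    index[("region", row["region"])].add(i)
    index[("product", row["product"])].add(i)

hits = index[("region", "EMEA")] & index[("product", "A")]
print(sum(sales[i]["revenue"] for i in hits))  # 100
```

The trade-off in the option list above falls out of this shape: the cube demands up-front multidimensional modeling but answers instantly, while the index needs less modeling and pays a small cost at query time.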
Smoke and fire are all around you, the sound of the alarm makes you dizzy, and people are running in panic to escape the inferno while you have to find your way to safety. This is not a scene from the latest video game but actual training for, say, field engineers in an exact virtual copy of a real-world environment such as an oil platform or manufacturing plant.
In a recent discussion with VRcontext, a Brussels-based company that has specialized in asset virtualization for 10 years, I was fascinated by the possibility of creating virtual copies of large, extremely complex real-world assets simply by scanning existing CAD plans or performing on-site laser scans. It’s not just the 3D virtualization but the integration of the virtual world with Enterprise Asset Management (EAM), ERP, LIMS, P&ID, and other systems that allows users to track, identify, and locate every single piece of equipment in both the real and the virtual world.
These solutions are used today for safety training simulations as well as to increase operational efficiency, for example in asset maintenance processes. There are still areas for further improvement, like the integration of RFID tags or sensor readings. However, as the technology matures, I can see future use cases all over the place, from any kind of location that is difficult or dangerous to enter down to simple office buildings for a ‘company campus tour’ or a ‘virtual meeting’. And it doesn’t require supercomputing power: it all runs on low-spec, ‘standard’ PCs, and the models take only a few gigabytes of storage.
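At its core, the virtual-world/EAM integration described above comes down to a shared equipment identifier that joins the 3D model to the asset records. The sketch below is a minimal, hypothetical illustration of that join; every name in it (the tags, fields, and function) is invented for the example and is not VRcontext's actual data model or API:

```python
# Hypothetical sketch of joining a 3D plant model to EAM records via a
# shared equipment tag. All names are illustrative assumptions, not
# VRcontext's actual data model.

model_objects = {          # equipment tag -> (x, y, z) position in the virtual plant
    "PUMP-101": (12.5, 3.0, 0.8),
    "VALVE-207": (40.1, 7.2, 1.5),
}

eam_records = {            # equipment tag -> maintenance data from the EAM/ERP side
    "PUMP-101": {"status": "overdue", "last_service": "2009-11-02"},
    "VALVE-207": {"status": "ok", "last_service": "2010-01-15"},
}

def locate_overdue_equipment():
    """Return (tag, position) for every asset whose maintenance is overdue."""
    return [(tag, model_objects[tag])
            for tag, rec in eam_records.items()
            if rec["status"] == "overdue" and tag in model_objects]

print(locate_overdue_equipment())  # [('PUMP-101', (12.5, 3.0, 0.8))]
```

A maintenance planner could use exactly this kind of lookup to highlight an overdue pump in the virtual plant and walk a technician to it, which is what makes the EAM integration more valuable than the 3D visualization alone.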
So if you are bored with running around in Second Life or World of Warcraft and you ever get the chance, exchange your virtual sword for a wrench and visit the ‘real’ virtual world of a fascinating oil rig or refinery.
I had an interesting discussion today with a new company called Cogniciti that is developing a platform for helping adults “extend their memory and cognitive abilities longer in the lifespan.” The company’s work is grounded in solid research from Baycrest, a health services center focused on aging and affiliated with the University of Toronto.
I think extending one’s memory to stay as sharp as possible in both professional and personal life is a hot topic, one that is fast becoming an essential component of general fitness. We spend hours at the gym maintaining our physical fitness, but in order to enjoy our healthy bodies, we also need to be mentally fit. In the last few months, I’ve seen a lot of emphasis on informing people about what they can do to maintain their memory. PBS had a special over the holidays, and a brain fitness package was one of the “thank you” gifts for pledging money to the TV station. I picked up an AARP magazine in a doctor’s office last week, and the lead article was on exercising the brain through challenging games. I felt quite satisfied when I completed the puzzles effortlessly (whew!).
I’m convinced we will see brain fitness treated as a soft skill for employees in the corporate world. Everyone can use memory strategies to improve their work performance, and I like the blend of research and technology. Self-paced online information and exercises that use simulations and other multimedia production techniques, combined with self-study and online discussions, give employees a complete brain-enhancing program. Employees can also access brain games and exercises from their mobile devices and get some brain stimulation on the way to work in the morning!