I'm pleased to announce that "The Forrester Wave™: UK Interactive Agencies — Web Design Capabilities, Q1 2010" is now available to Forrester clients on the Forrester Web site.
This report evaluates the Web design capabilities of leading UK design agencies: AKQA, Amaze, Detica, EMC Consulting, LBi, Reading Room, Sapient Interactive, VML London, and Wunderman. Putting it together took six months of effort by a hard-working team that included Harley Manning, Angela Beckers, Richard Gans, William Chu, and Shelby Catino.
In our research, we found that Detica and Sapient Interactive led the pack for transaction-led projects, due in large part to the high usability scores earned by the client reference sites they provided for evaluation. AKQA, EMC Consulting, LBi, Reading Room, and Wunderman were Strong Performers for transaction-led projects, with AKQA's exemplary Brand Image Review scores moving it into the Leaders' circle for image-led projects. Rounding out the field, Amaze showed strength in multilingual projects and image-led projects, while VML London earned top scores from both reference clients for the business results it produced. Both agencies came in as Contenders.
All nine vendors in this report have significant market presence and capabilities to service large clients. They are all ranked in the top 25 UK agencies by fee revenue (using data published by New Media Age).
What sets the Wave apart from other industry rankings and awards is the transparent, fact-based evaluation that underpins it. Forrester clients can examine detailed vendor scorecards and see each agency's strengths and weaknesses.
To gather information on the strength of each vendor's current offering (represented on the vertical axis) and strategy (represented on the horizontal axis), we used the following methods:
Next week, I will present the first results of Forrester's 2009 global banking platform deals survey. A total of 17 banking platform vendors submitted their 2009 deals for evaluation. One year ago, the same set of deals would have represented at least 19 vendors: In the 2009 survey, FIS's deals include those of the acquired US-based Metavante, and Temenos' deals include those of the acquired French vendor Viveo. These nominally 19 participating vendors submitted a total of 1,068 banking platform deals for evaluation, a steep increase from the roughly 870 deals submitted for 2008.
We had to classify a large share of these 1,068 banking platform deals as extended business or even as simple license renewals if the vendors had not already submitted them with the corresponding tag. Forrester's "rules of the game" did not allow us to count other deals at all, for example, because a non-financial-services firm signed the deal. Overall, Forrester counted 269 of the submitted deals as 2009 new named customers, compared with 290 for 2008. In the past, Forrester sorted the vendors into four buckets: Global Power Sellers, Global Challengers, Pursuers, and Base Players. In previous years, the Pursuers and in particular the Global Challengers saw only minor changes. The 2009 results have shaken this stable structure, and we will see many vendors in groups they haven't been in before.
Over the past year, I have received numerous inquiries asking whether third-party database tools that focus on performance and tuning, backup and recovery, replication, upgrade, troubleshooting, and migration still matter now that leading DBMS providers such as Oracle, IBM, and Microsoft offer improved automation and broader coverage.
I find that third-party tools complement native database tools well, assisting DBAs, developers, and operational staff in their day-to-day activities. Last year, I had the opportunity to speak with dozens of enterprises that support hundreds or even thousands of databases across various DBMSes. Most enterprises reported at least a 20 percent gain in IT staff productivity when using a third-party database tool.
Third-party vendor tools remain equally important because they support:
I was lucky enough last week [22 March 2010] to moderate a panel at EclipseCon on the future of application servers. The panelists did a great job, but I thought they were far too conservative in their views. I agree with them that many customers want evolutionary change from today's app servers to future ones, but I see requirements driving app servers toward radical change. Inevitably.
The changes I see:
Get more value from servers; get responsive; get agile and flexible.
Last week I was once again hustling through a brutal travel week (10,000 miles in the air and two packed red-eyes) when I came across something really interesting. It was about 9 AM, and I'd just gotten off AA flight 4389 from Toronto. I was a bit bleary-eyed from a 4 AM call with a Finnish customer and was just trying to schlep my way to the Admirals Club for a cup of coffee when I stumbled across Accenture's Interactive Network display at the juncture of terminals H and K.
So what, you might ask: it's just a big screen, and we already know our future is Minority Report, right? Yes, those of us in the echo chamber might know that, but what really struck me was how my fellow travelers interacted with the display. I stood there for about 10 minutes (forgetting all about the sorely needed cuppa joe) and just watched people start to walk past, then pause, then go up to the screen and start playing with it. On average, folks would stay for a few minutes and read some of the latest news feeds, then hurry on to their next stop. But what I really found intriguing was how they interacted with the system:
What if there were an easy way to increase employee productivity by 10% using the technology that's already in place? What would that do to the bottom line? Even a 1% gain would be significant for most large organizations. In this day and age, when CIOs are competing for budget and every dollar of technology investment must be justified, they should not overlook training as a means to boost employee productivity and the ROI of existing technology investments.
Unfortunately, it seems that too few people really know how to use the applications available to them effectively. Take, for example, the proliferation of spreadsheets in the workplace. Tools like Microsoft Excel have amazing features that support powerful analysis and reporting, yet many people fail to use even the basic productivity features built into such applications. We probably all observe people misusing tools and completing work the hard way simply because they don't know any better. And Excel is just one tool that many of us use day in, day out. Outlook has some amazing features to boost productivity, but few people know how to take advantage of them.
Even where new employees receive some level of training in core ERP applications, we know that very little is actually absorbed early on. And much of IT training focuses on what buttons to press in what sequence to get a job done; very little seems to address how to use all the technology together as part of a productive business process.
In-memory analytics is generating plenty of buzz, for multiple reasons. Speed of querying, reporting, and analysis is just one; flexibility, agility, and rapid prototyping are another. There are many more reasons, but not all in-memory approaches are created equal. Let's look at the five options buyers have today:
1. In-memory OLAP. Classic MOLAP cube loaded entirely in memory (sketched in code below).
Vendors: IBM Cognos TM1, Actuate BIRT
Fast reporting, querying, and analysis, since the entire model and data are all in memory.
Ability to write back.
Accessible by third-party MDX tools (IBM Cognos TM1 specifically).
Requires traditional multidimensional data modeling.
Limited to a single physical memory space (theoretical limit of 3TB, but we haven't seen production implementations of more than 300GB; this applies to the other in-memory solutions as well).
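To make the contrast between the options concrete, here is a minimal Python sketch of the in-memory OLAP idea: the whole cube, every cell, lives in RAM, and write-back is just a cell update. The dimensions and figures are invented for illustration; this is not how IBM Cognos TM1 or Actuate BIRT are actually implemented.

```python
# Minimal sketch of the in-memory MOLAP idea: the full cube lives in RAM
# as cells keyed by dimension-member tuples. Illustrative only.
from itertools import product

# The dimensions must be modeled up front (the "traditional
# multidimensional data modeling" requirement noted above).
regions = ["EMEA", "APAC"]
years = ["2008", "2009"]
measures = ["Licenses", "Services"]

# The entire cube: one cell per (region, year, measure) coordinate.
cube = {coord: 0.0 for coord in product(regions, years, measures)}

# Loading data just populates cells in memory.
cube[("EMEA", "2009", "Licenses")] = 120.0
cube[("APAC", "2009", "Services")] = 45.0

def slice_total(region=None, year=None, measure=None):
    """Aggregate over any subset of dimensions; fast because every
    cell is already in memory."""
    return sum(
        value for (r, y, m), value in cube.items()
        if (region is None or r == region)
        and (year is None or y == year)
        and (measure is None or m == measure)
    )

print(slice_total(year="2009"))  # 165.0: total across all 2009 cells

# Write-back: an analyst can update a cell directly, e.g. for planning.
cube[("EMEA", "2009", "Services")] = 30.0
```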
2. In-memory ROLAP. ROLAP metadata loaded entirely in memory (sketched in code below).
Vendors: MicroStrategy
Speeds up reporting, querying and analysis since metadata is all in memory.
Not limited by physical memory
Only the metadata, not the entire data model, is in memory, although MicroStrategy can also build complete cubes from a subset of the data and hold them entirely in memory.
Requires traditional multidimensional data modeling.
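By contrast, here is a minimal Python sketch of the in-memory ROLAP idea: only the metadata, i.e., how logical dimensions and measures map onto warehouse tables, is held in memory, and each request is answered by generating SQL against the database. The table and column names are hypothetical, and this is not MicroStrategy's actual engine.

```python
# Minimal sketch of the in-memory ROLAP idea: only the metadata is in
# RAM; the data stays in the relational warehouse. Illustrative only;
# the schema below is hypothetical.

METADATA = {
    "dimensions": {"region": "dim_region.region_name",
                   "year": "dim_date.calendar_year"},
    "measures": {"revenue": "SUM(fact_sales.revenue)"},
    "from": ("fact_sales JOIN dim_region USING (region_id) "
             "JOIN dim_date USING (date_id)"),
}

def generate_sql(dimension: str, measure: str) -> str:
    """Use the in-memory metadata to generate SQL on the fly; the heavy
    lifting happens in the warehouse, so RAM is not the limit."""
    dim_col = METADATA["dimensions"][dimension]
    measure_expr = METADATA["measures"][measure]
    return (f"SELECT {dim_col}, {measure_expr} "
            f"FROM {METADATA['from']} GROUP BY {dim_col}")

print(generate_sql("region", "revenue"))
```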
3. In-memory inverted index. Index (with data) loaded into memory (sketched in code below).
Vendors: SAP BusinessObjects (BI Accelerator), Endeca
Fast reporting, querying, and analysis, since the entire index is in memory.
Less modeling required than for an OLAP-based solution.
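Finally, a minimal Python sketch of the in-memory inverted index idea: each term points at the records that contain it, so lookups avoid scanning the raw data and no cube or star-schema modeling is required. The records are invented for illustration; this is not how BI Accelerator or Endeca work internally.

```python
# Minimal sketch of an in-memory inverted index: every term maps to the
# set of records containing it, and both index and data stay in RAM.
from collections import defaultdict

records = [
    {"id": 1, "text": "quarterly revenue report EMEA"},
    {"id": 2, "text": "EMEA pipeline analysis"},
    {"id": 3, "text": "quarterly pipeline forecast"},
]

# Build the index once; no dimensional modeling, just tokenization.
index = defaultdict(set)
for record in records:
    for term in record["text"].lower().split():
        index[term].add(record["id"])

def search(*terms: str) -> set:
    """Return the ids of records containing all of the given terms."""
    sets = [index.get(t.lower(), set()) for t in terms]
    return set.intersection(*sets) if sets else set()

print(search("quarterly", "pipeline"))  # {3}
```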
Smoke and fire are all around you, the sound of the alarm makes you dizzy, and people are running in panic to escape the inferno while you have to find your way to safety. This is not a scene from the latest video game but actual training for, say, field engineers in an exact virtual copy of a real-world environment such as an oil platform or a manufacturing plant.
In a recent discussion with VRcontext, a Brussels-based company that has specialized in asset virtualization for 10 years, I was fascinated by the possibility of creating virtual copies of large, extremely complex real-world assets simply from existing CAD plans or on-site laser scans. It's not just the 3D virtualization but the integration of the virtual world with Enterprise Asset Management (EAM), ERP, LIMS, P&ID, and other systems that allows users to track, identify, and locate every single piece of equipment in the real and virtual worlds.
These solutions are used today for safety training simulations as well as to increase operational efficiency, for example in asset maintenance processes. There are still areas for further improvement, like the integration of RFID tags or sensor readings. However, as the technology matures, I can see future use cases all over the place: from the virtualization of any kind of location that is difficult or dangerous to enter, to simple office buildings for a 'company campus tour' or a 'virtual meeting'. And it doesn't require supercomputing power: it all runs on low-spec, 'standard' PCs, and the models take only a few gigabytes of storage.
So if you are bored with running around in Second Life or World of Warcraft and you ever get the chance, exchange your virtual sword for a wrench and visit the 'real' virtual world of a fascinating oil rig or refinery.
I find it quite amazing to see the societal impact of mobile phones.
They have changed the way we communicate and live. There has been a drastic change in the way children and parents communicate, in our individual relationships with time and location, and in so many other parts of our daily lives. There are interesting books and theses on this topic. I recently came across an interesting viewpoint from Russell Buckley about the "Unintended Consequences and the Success of Blackberry in the Middle East", which is further proof of how disruptive mobile can be. As communication and creation/media tools, mobile phones offer new ways to upload and access information (remember the riots in Iran). As such, governments have to monitor and anticipate this impact.
Beyond this, public authorities can make the most of mobile services. Many local councils, regional and national governments, and transport authorities are launching mobile initiatives, creating new value-added services for citizens, and trying to use mobile to connect with the least connected. They need to anticipate the arrival of NFC technology and make the most of more mature mobile ecosystems. They should balance their mobile investments with the constant need to avoid discriminating against particular groups of citizens and to allocate funds to projects with critical mass. Governments in particular can play a key role in stimulating ideas for new services and in backing and funding the most relevant initiatives.
A few years ago, Procter & Gamble publicly stated that it had experienced inconsistent research results from successive online research projects. Other organizations shared similar experiences, and questions were raised about “professional respondents.” The trustworthiness of online research was in question, and multiple initiatives arose. In the past two years, we’ve seen a lot of debate around this topic, and associations such as ESOMAR and ARF have come up with protocols that all good panels should follow — and many have. But what does this mean from a client perspective? How have initiatives like ARF's Quality Enhancement Process, MarketTools' TrueSample, or processes like machine fingerprinting changed the industry?