Having temporarily relocated to Eastern France – far from Paris and closer to Switzerland and Italy – I recently had the pleasure of experiencing my closest Cisco TelePresence site in Rolle, Switzerland, on the north shore of Lake Geneva, for a discussion with Paul Mountford, President of Cisco’s Emerging Markets Theatre. Cisco’s Emerging Markets strategy has focused on what the company calls “country transformation,” which revolves primarily around increasing the penetration of broadband. For FY11, Cisco will shift the message from country-wide transformation to something that rings closer to home (literally and figuratively) with talk of “life-changing” stories through “life-changing” networks. While still a little lofty, the message resonates more deeply than country transformation, which speaks primarily, if not exclusively, to high-level government officials. More a topic for Davos than for a boardroom or a living room.
Despite the lack of a sustained, full-on recovery in the global economy, one gets the feeling that we're at the beginning of a period of tech expansion and growth. For many, 2011 budget planning is happening now, so it remains to be seen what your expansion and growth will be in the near term, but there's certainly no shortage of interesting new developments from technology vendors to whet your appetite.
While it's fun to look at emerging tech and imagine what impact it might have several years from now, it's a bit more pragmatic to focus on the technology trends that will be hitting the mainstream and making significant waves in the corporate world and in the public sector in the next few years.
In Q4 of last year Forrester published The Top 15 Technology Trends EA Should Watch. The author, analyst Alex Cullen, spoke with a few dozen analysts for input and then applied strict criteria for inclusion of a particular tech trend in the doc: 1) significant business or IT impact in the next 3 years; 2) newness, with implications not only for new business capabilities but also for the organization's understanding of the technology and how to manage it; and 3) complexity, especially regarding cross-functional impact to the organization.
As mentioned in some earlier posts, over the past few quarters I have been looking into the role that market research professionals play (and can play) in information management. I’ve had many enlightening conversations about this topic with both vendors and client-side market researchers.
Technology developments result in more and more information becoming available internally, and at different parts of the organization. Just think about all the data an average company collects or buys — media measurement data, advertising awareness, advertising spend, retail data, sales data, competitive intelligence, Web-tracking data (from listening tools), Web site tracking, marketing data (e.g., Nielsen Claritas), customer satisfaction surveys, brand trackers, and other primary research data, to name just a few. One vendor estimated that the average research department handles around 50 different research sources!
When I spoke with vendors about their relationship with clients, each and every one of them was looking for ways to increase the level of engagement. For one thing, they are working on best-in-class reporting tools to make it easier for clients to process their data and make it visually more interesting — and hopefully easier to use. However, not many vendors think further than their own set of data. When questioned, they mention that their systems don’t allow for third-party data. Yes, it’s possible to link to internal CRM systems, but that’s about as far as things go.
In a recent discussion with a group of infrastructure architects, power architecture, especially UPS engineering, was on the table as a topic. There was general agreement that UPS systems are a necessary evil, cumbersome and expensive beasts to put into a DC, and there was a lot of speculation about alternatives. The consensus was that the goal is a solution that is more granular to install and deploy, allowing easier, ad-hoc decisions about which resources to protect, and that battery technologies and current UPS architectures are not optimal for this kind of solution.
So what if someone were to suddenly expand battery technology R&D investment by a factor of maybe 100x, expand high-capacity battery production by a giant factor, and drive prices down precipitously? That’s a tall order for today’s UPS industry, but it’s happening now courtesy of the auto industry and the anticipated wave of plug-in hybrid cars. While batteries for cars and batteries for computers certainly have their differences in terms of depth and frequency of charge/discharge cycles, packaging, lifespan, etc., there is little doubt that investments in dense and powerful automotive batteries and power management technology will bleed through into the data center. Throw in recent developments in high-charge capacitors (referred to in the media as “supercapacitors”), which bridge the impedance mismatch between spike demands and a chemical battery’s dislike of sudden state changes, and you have all the foundational ingredients for a major transformation in the way we think about supplying backup power to our data center components.
During the past few months, telecom service providers including AT&T, Sprint and Verizon have highlighted their roadmaps and deployment plans for 4G network technologies. These 4G technologies include Long-Term Evolution (LTE) and WiMax networks. Enterprises in North America and Europe are in the early stages of 4G network adoption based on results from Forrester’s SMB and Enterprise Networks and Telecommunication survey. Approximately 4% of surveyed enterprises currently implement or are expanding their implementation of fixed or mobile WiMax networks, and 3% of firms are implementing or expanding their implementation of LTE networks. These implementation percentages are expected to increase as the service providers pursue their 4G deployment initiatives.
There has been an interesting PR battle in Washington over the last few weeks about the number of massive regulations still on the administration's agenda. House Minority Leader John Boehner wrote a memo to President Obama citing a list of 191 proposed rules expected to have a more than $100 million impact on the economy (each!) and asking for clarification on the number of these pending rules that would surpass the $1 billion mark. The acting head of the Office of Management and Budget responded, saying that the number of "economically significant bills" passed last year actually represented a downward trend, and the current number on the agenda is more like 13.
For those of you wanting a little more clarification, you can search through the OMB's Unified Agenda and Regulatory Plan by economic significance, key terms, entities affected, and other criteria. Making sense of all of these proposed rules will take time, but it will help you get an idea of issues that your organization may have to face in the near future.
Coincidentally, my latest report, The Regulatory Intelligence Battlefield Heats Up, went live yesterday. In this paper, I offer an overview of different available resources to keep up with new and changing regulations as well as relevant legal guidance.
But interest isn’t equally high across different consumer industries. Below, you’ll find a graphic showing the top five industries consumers are most interested in partnering with for co-creation efforts.
Household technology products like PCs and TVs top the list, but CPG, home entertainment (i.e., movies and music), household appliances (i.e., washing machines and refrigerators), and small kitchen appliances follow closely. As usual, men and women have different interests: While women account for 51% of all willing co-creators, they account for a much greater share of the audience interested in co-creating with CPG companies and clothing, footwear, and small kitchen appliance manufacturers.
Forrester's latest forecast for the technology economy is bullish, which by extension means good news for providers of software and services focused on improving corporate sustainability.
In our new outlook for IT spending by businesses and governments, we estimate that the market will hit $1.58 trillion in 2010, up almost 8 percent from the depressed 2009 level, and grow by a further 8.4 percent to $1.71 trillion in 2011 (global purchases expressed in U.S. dollars). U.S. government data about the overall economy, and tech vendors' Q1-Q2 financial reports, buttress our expectation that IT spending will grow at more than double the rate of the overall economy in 2010-11 and even beyond. See the details in Andrew Bartels's latest report here.
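As a quick sanity check on the forecast arithmetic, the figures above hang together: the 2011 level follows from the 2010 level and growth rate, and the implied 2009 base can be back-computed. This sketch uses only the numbers cited in this post:

```python
# Back-of-the-envelope check on the IT spending figures cited above
# (all amounts in trillions of U.S. dollars).
spend_2010 = 1.58
growth_2010 = 0.08    # "up almost 8 percent" from the 2009 level
growth_2011 = 0.084   # forecast growth rate into 2011

implied_2009 = spend_2010 / (1 + growth_2010)   # depressed 2009 base
forecast_2011 = spend_2010 * (1 + growth_2011)  # forecast 2011 level

print(f"Implied 2009 base: ${implied_2009:.2f}T")
print(f"Forecast 2011:     ${forecast_2011:.2f}T")  # ~$1.71T, matching the report
```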
We expect that some of the prime beneficiaries of this positive outlook for IT spending will be those services and software suppliers that are focused on helping clients improve their sustainability posture. In particular, we are very positive on the outlook for sustainability consulting, and for enterprise carbon and energy management (ECEM) software.
Our research team is working now on reports that will update our outlook and spending forecasts for these two exciting markets. As we work with clients in enterprise IT organizations, it's clear that the "green IT" of yesterday is becoming the "IT for green" of tomorrow; that is, IT organizations and infrastructure are increasingly being deployed to meet the corporatewide sustainability challenge, not just improving IT's own energy efficiency and CO2 footprint.
It’s probably fair to say that the computer community is obsessed with speed. After all, our people buy computers to solve problems, and generally the faster the computer, the faster the problem gets solved. The earliest benchmark that I have seen is published in “High-Speed Computing Devices” (Engineering Research Associates, McGraw-Hill, 1950). They cite the Marchant desktop calculator as achieving a best-in-class result of 1,350 digits per minute for addition, and the threshold problems then were figuring out how to break down Newton-Raphson equation solvers for maximum computational efficiency. And so the race begins…
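For readers who haven't met it, the Newton-Raphson iteration those 1950s engineers were decomposing is simple to state: repeatedly replace a guess x with x - f(x)/f'(x) until it stops moving. A minimal sketch (my own illustration, not drawn from the 1950 text):

```python
def newton_raphson(f, f_prime, x0, tol=1e-10, max_iter=50):
    """Find a root of f via the iteration x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / f_prime(x)
        x -= step
        if abs(step) < tol:  # converged: the update is negligibly small
            return x
    raise RuntimeError("did not converge")

# Example: the square root of 2 as the positive root of x^2 - 2 = 0.
root = newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
```

The method converges quadratically near a root, which is exactly why squeezing maximum throughput out of each iteration mattered so much on hardware counting digits per minute.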
Not much has changed since 1950. While our appetites are now expressed in GFLOPs per CPU and TFLOPS per system, users continue to push for escalation of performance in numerically intensive problems. Just as we settled down to a relatively predictable performance model with standard CPUs and cores glued into servers and aggregated into distributed computing architectures of various flavors, along came the notion of attached processors. First appearing in the 1960s and 1970s as attached mainframe vector processors and attached floating point array processors for minicomputers, attached processors have always had a devoted and vocal minority support within the industry. My own brush with them was as a developer using a Floating Point Systems array processor attached to a 32-bit minicomputer to speed up a nuclear reactor core power monitoring application. When all was said and done, the 50X performance advantage of the FPS box had decreased to about 3.5X for the total application. Not bad, but a defeat of expectations. Subsequent brushes with attempts to integrate DSPs with workstations left me a bit jaundiced about the future of attached processors as general purpose accelerators.
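The gap between the 50X device speedup and the 3.5X application speedup is classic Amdahl's Law: only the fraction of runtime actually offloaded to the attached processor benefits. A quick sketch, where the ~73% offloadable fraction is my inference from the two numbers above, not a measured figure:

```python
def amdahl_speedup(p, s):
    """Overall speedup when fraction p of the runtime is accelerated s times."""
    return 1.0 / ((1.0 - p) + p / s)

# If roughly 73% of the reactor-monitoring runtime ran on the 50X-faster
# FPS array processor, the whole application speeds up only ~3.5X,
# consistent with the result described above.
print(amdahl_speedup(0.73, 50))
```

The lesson generalizes to today's attached accelerators: the un-offloaded 27% dominates, so a 500X device would still deliver under 3.7X here.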
Forrester received more than 1,000 inquiries on SaaS and cloud services to date in 2010. With SaaS gaining maturity and even becoming the more common way to deploy software in some categories, firms are increasingly opting for SaaS solutions in place of packaged apps.
With the growing uptake of SaaS, Forrester has seen a change in the nature of questions about SaaS. Firms are not only asking the basics around the whens and whys of SaaS but also more strategic questions around SaaS sourcing and vendor management, as well as how to set up the organizational structure and hire the right skills to succeed with SaaS deployments.
Stay tuned for the full analysis of Forrester's SaaS inquiry data for the first half of 2010, to be published shortly.
Also, for anyone interested in a more in-depth analysis of SaaS and cloud services trends and best practices, we are hosting our first full-day workshop on the topic in Forrester’s Cambridge, Mass., headquarters on September 16. For more details about this event, please click here.
Please share your thoughts and connect with me on Twitter @lizherbert.