Oracle Rolls Out Private Cloud Architecture And World-Record Transaction Performance

Richard Fichera

On Dec. 2, Oracle announced the next move in its program to integrate its hardware and software assets: the Oracle Private Cloud Architecture, an integrated infrastructure stack combining InfiniBand and/or 10G Ethernet fabric, integrated virtualization, management, and servers, along with software content, both Oracle’s and customer-supplied. Oracle has rolled out the architecture as a general platform for a variety of cloud environments, along with three specific implementations (Exadata, Exalogic, and the new Sunrise Supercluster) as proof points for the architecture.

Exadata has been dealt with extensively in other venues, both inside Forrester and externally, and appears to deliver the goods for I&O groups who require efficient consolidation and maximum performance from an Oracle database environment.

Exalogic is a middleware-targeted companion to the Exadata hardware architecture (or another instantiation of Oracle’s private cloud architecture, depending on how you look at it), presenting an integrated infrastructure stack ready to run either Oracle or third-party apps, although Oracle is positioning it as a Java middleware platform. It consists of the following major components integrated into a single rack (a rough sketch of the stack follows the list):

  1. Oracle x86 or T3-based servers and storage.
  2. Oracle quad data rate (QDR) InfiniBand switches and the Oracle Solaris gateway, which makes the InfiniBand network look like an extension of the enterprise 10G Ethernet environment.
  3. Oracle Linux or Solaris.
  4. Oracle Enterprise Manager Ops Center for management.
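
Purely as an illustration of what "integrated stack" means here, the sketch below (a minimal, hypothetical Python model, not an Oracle API or product catalog) treats the rack's four layers as plain data and checks that none is missing:

```python
# Hypothetical, simplified model of the Exalogic-style rack described above.
# Layer names and descriptions are paraphrased from the list, not Oracle terminology.
EXALOGIC_RACK = {
    "compute_and_storage": "Oracle x86 or T3-based servers and storage",
    "fabric": "QDR InfiniBand switches plus a gateway to the enterprise 10G Ethernet network",
    "operating_system": "Oracle Linux or Solaris",
    "management": "Oracle Enterprise Manager Ops Center",
}

REQUIRED_LAYERS = ("compute_and_storage", "fabric", "operating_system", "management")


def missing_layers(rack):
    """Return any required integration layers that a rack description leaves out."""
    return [layer for layer in REQUIRED_LAYERS if layer not in rack]


print(missing_layers(EXALOGIC_RACK))  # [] -- all four layers are present in one rack
```

The point the sketch tries to capture is simply that compute, fabric, operating system, and management arrive pre-integrated in a single rack rather than being assembled by the customer.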
Read more

Service Oriented Organizations

Jean-Pierre Garbani

A few days ago I read an interesting article about how organizations need to adapt to virtualization to take full advantage of it.

If this is, in fact, the first step toward the industrialization of IT, we should look at how industrial organizations evolved over time, from their beginnings to the mass-production era. I think IT will reach the mass-production stage within a few years. If IT replicates this evolution, it will go through these phases:

  • The craftsperson era. At the early stage of any industry, we find a solitary figure in a shop, soon joined by like-minded associates (this is me, 43 years ago). They create valuable and innovative products, but productivity is low and the cost per unit of production is usually through the roof. This is where IT was at the end of the 1960s and the beginning of the 1970s. The organizational landscape was dominated by “gurus” who seemed to know everything and were loosely coupled within some kind of primitive structure.
  • The bureaucratic era. As IT grew more complex, an organizational structure started to appear that tended to “rationalize” IT into a formal, hierarchical structure. In concept, it is very similar to what Max Weber described in 1910: a structure that emphasizes specialization and standardization in pursuit of a common goal. Tasks are split into small increments, matched to skills, and coordinated through a strong hierarchical protocol. Coordination within the organization is achieved primarily through bureaucratic controls. This is the “silo” concept.
Read more

Global Competition For "Doing Business"

Jennifer Belissent, Ph.D.

I've written about the World Bank's Doing Business Index in several blogs and reports. One of my favorite graphics from my "Where In The World?" report on market opportunity assessment looks at the BRICs (Brazil, Russia, India, and China), relative to a selection of other emerging markets, in terms of population and then compares their rankings across three economic and political indicators: Doing Business, Economic Freedom, and eReadiness. The point is that "bigger is not always better" in terms of a potential market to enter.

Saudi Arabia has used the World Bank's Doing Business Index as a critical measure of its 10 x 10 initiative, a program of reforms launched with the objective of being among the top 10 countries for doing business by 2010. It missed that mark in 2010. But with the new 2011 rankings, we can congratulate Saudi Arabia's reformers for making it to 11 x 11.

Read more

How To Measure Global Success And Regional Relevance?

Jost Hoppermann

As in the past few years at this time, we are preparing a global banking platform deals report, this time for 2010. As we have done since 2005 to help application delivery teams make informed decisions, we will analyze deals’ structure, determine countable new named deals, and look at extended business as well as key functional areas and hosted deals — all to identify the level of global and regional success as well as functional hot spots for a large number of banking platform vendors.

In the past, some vendors have told us that they are not particularly fond of our counting new named deals while only mentioning extended business, renewed licenses, and the like. Why do we do this, and what is the background for this approach? First, extended business reflects good existing relationships between vendors and banks as often as it reflects the product’s capabilities themselves. Second, we have asked for average deal sizes and license fees for years, but typically only a minority of vendors discloses this information. Thus, we do not have a broad basis for dollar or euro market shares — and I personally shy away from playing the banking platform revenue estimates game.
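
To make the counting rule concrete, here is a minimal Python sketch of what a new-named-deal tally might look like; the vendors, banks, and deal-type labels are entirely hypothetical and do not reflect the report's actual data model.

```python
from collections import Counter

# Hypothetical deal records: (vendor, bank, deal_type). Only "new named" deals
# are tallied; extended business and renewals are mentioned but not counted.
deals = [
    ("Vendor A", "Bank 1", "new named"),
    ("Vendor A", "Bank 2", "extended business"),
    ("Vendor B", "Bank 3", "new named"),
    ("Vendor B", "Bank 4", "renewed license"),
    ("Vendor B", "Bank 5", "new named"),
]

new_named_counts = Counter(
    vendor for vendor, _bank, deal_type in deals if deal_type == "new named"
)
print(new_named_counts)  # Counter({'Vendor B': 2, 'Vendor A': 1})
```

The weighted variant that some vendors have suggested (see the next section) would simply multiply each deal by a bank-size or region weight before summing.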

An Alternative Counting Model Could Be Implemented Easily . . .

Consequently, the available data makes counting new named deals the only feasible way to represent an extending or shrinking footprint in the off-the-shelf banking platform market — and thus also to represent customer decisions in favor of one banking platform or another. Some vendors have suggested introducing weights for the size of the bank and the relevance of the seven world regions (for example, North America and Asia Pacific). We could easily do so, but there are problems with this approach:

Read more

GSA Picks Google Apps: What It Means

Ted Schadler

The General Services Administration made a bold decision to move its email and collaboration systems to the cloud. In the RFP it issued last June, its goals were easy to see in the statement of objectives:

This Statement of Objectives (SOO) describes the goals that GSA expects to achieve with regard to the

1. modernization of its e-mail system;

2. provision of an effective collaborative working environment;

3. reduction of the government’s in-house system maintenance burden by providing related business, technical, and management functions; and

4. application of appropriate security and privacy safeguards.

GSA announced yesterday that it chose Google Apps for email and collaboration and Unisys as the implementation partner.

So what does this mean?

What it means (WIM) #1: GSA employees will be using a next-generation information workplace. And that means mobile, device-agnostic, and location-agile. Gmail on an iPad? No problem. Email from a home computer? Yep. For GSA and for every other agency and most companies, it's important to give employees the tools to be productive and engage from every location on every device. "Work becomes a thing you do and not a place you go." [Thanks to Earl Newsome of Estee Lauder for that quote.]

Read more

It's Beyond The Basics For Education: Add T For Technology To The Three Rs

Jennifer Belissent, Ph.D.

“School Bond Measure Fails” seems a common headline these days. In fact, a quick Google search shows that school bond measures and tax levies failed all over the US just this fall, notably in Santa Clara County, which had been characterized as “tax friendly.” However, despite the hardship of raising money for schools, per-pupil spending continues to rise, climbing steadily from just over $500 per pupil in 1919-20 to $11,674 per pupil in 2006-07, according to the National Center for Education Statistics.

One place the spending has been going is toward technology investments. The number of computers in public elementary and secondary schools has increased: in 2005, the average public school had 154 instructional computers, compared with only 90 in 1998. More importantly, the percentage of instructional rooms with access to the Internet increased from 51 percent in 1998 to 94 percent in 2005.

Read more

Collaboration Will Become More People-Centric In 2011 And Will Challenge C&C Pros

Rob Koplowitz

For a number of years now, Forrester has used the following definition for Web 2.0:

A set of technologies and applications that enable efficient interaction among people, content and data in support of collectively fostering new businesses, technology offerings, and social structures.

For many Content and Collaboration Professionals (C&C Pros), the first half of this definition looks very familiar. Providing knowledge workers with better access to information, co-workers, and communication tools has been the primary goal since collaboration tools began to seriously penetrate the enterprise 20 years ago.

Now, the second half of the definition, "in support of collectively fostering new businesses, technology offerings and social structures," is a bit different. It maps to some potentially broad and strategic organizational goals. This is at the core of Enterprise Social Media. And Enterprise Social is here. Smart C&C Pros have already begun to take a leadership position in guiding their organizations down this path, which could be a game changer, albeit one fraught with challenges.

Here's the challenge: As collaboration moves from being document-centric to more people-centric, the rules change. "Need to know" becomes "need to share." This can be scary, particularly for folks in HR who are concerned with privacy, legal folks who are thinking about intellectual capital and compliance, and the list goes on. Let's not even bring up the word WikiLeaks, for heaven's sake. You get the picture.

Read more

Build Innovation Zones Into Your Architecture

Randy Heffner

Forrester’s recent book, Empowered, describes the type of technology-based innovation by frontline employees that can cause nightmares for enterprise architects. New tools for business innovation are readily available to anyone, ranging from cloud computing and mobile apps to social networks, scripting languages, and mashups. Faced with long IT backlogs and high IT costs, frontline employees are building their own solutions to push business forward.

What worries architects is that (1) solutions built with these new tools — with little or no vetting — are being hooked to enterprise systems and data, opening potentially big risks to reliability and security, and (2) the siloed, quick-hit nature of these solutions will drive up ongoing costs of maintenance and support. Traditionally, architects use enterprise standards as their primary tool to ensure the quality, efficiency, and security of their organization’s technology base. However, when applied in the typical “lockdown” fashion, standards can stifle innovation — often because vetting a new technology takes longer than the perceived window of business opportunity.

To deal with these conflicting pressures, architects must forge a new equation between responsiveness and technology control. The business value of responsiveness, combined with the typically limited size of enterprise architecture teams, means that most organizations cannot wait for architects to vet every possible new technology. Thus, you must find ways to use architecture to navigate the tension between the business value of responsiveness and the business value of a high-quality technology base. The key is to build innovation zones into your architecture; Forrester defines these as:

Read more

IPv6: Drive Innovation With Rewards, Not Fear

Andre Kindness

I’m a sucker for good, biting humor, and in the spirit of Stephen Colbert’s Medals of Fear that he gave to a few distinguished souls (the press, Mark Zuckerberg, Anderson Cooper) at the rally in Washington, D.C., I would like to hand a medal to the U.S. State Department for its 1999 publication of a country-by-country set of "Y2K" warnings — “End of Days” scenarios and solutions — for Americans doing business in 194 nations. I would give another medal to IPv6, the most drawn-out killer technology to date — and one that has had the longest run at trying to scare everyone about the end of IPv4.

At Forrester, we are starting to see the adoption freighter slowly turning: inquiries are rolling in; governments are accelerating adoption with new mandates; vendors are including IPv6 in their solutions; and the Number Resource Organization is escalating its announcements about the depletion of IPv4 addresses (only 5% left!). To add to the drama, vendors are creating IPv4 address countdown clocks to generate buzz and differentiation. These scare tactics haven’t worked because technology pundits haven’t spoken about IPv6 in business terms. There is enormous business value in IPv6; those who embrace it will be the new leaders in their space.
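
To put the depletion numbers in perspective, here is a minimal sketch of the raw address arithmetic; the "only 5% left" figure is the one cited above, and the rest is simple math on the 32-bit and 128-bit address widths, not NRO data.

```python
# IPv4 uses 32-bit addresses; IPv6 uses 128-bit addresses.
ipv4_total = 2 ** 32                     # ~4.3 billion addresses in total
ipv4_remaining = int(ipv4_total * 0.05)  # the roughly 5% the NRO was warning about

ipv6_total = 2 ** 128                    # ~3.4 x 10^38 addresses

print(f"IPv4 address space:         {ipv4_total:,}")
print(f"IPv4 addresses left (~5%):  {ipv4_remaining:,}")
print(f"IPv6 address space:         {ipv6_total:,}")
```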

Read more

Open Data Center Alliance – Lap Dog Or Watch Dog?

Richard Fichera

In October, with great fanfare, the Open Data Center Alliance unfurled its banners. The ODCA is a consortium of approximately 50 large IT consumers, including large manufacturing, hosting, and telecom firms, with the avowed intent of developing standards for interoperable cloud computing. In addition to the roster of users, the announcement highlighted Intel in an ambiguous role as technology advisor to the group. The ODCA believes it will carry weight in the industry thanks to its members’ estimated $50 billion per year of cumulative IT purchasing power, and the trade press was full of praise for influential users driving technology rather than allowing rapacious vendors such as HP and IBM to drive users down proprietary paths that lead to vendor lock-in.

Now that we’ve had a month or more to allow the purple prose to settle a bit, let’s look at the underlying claims, potential impact of the ODCA and the shifting roles of vendors and consumers of technology. And let’s not forget about the role of Intel.

First, let me state unambiguously that one of the ODCA’s core intentions is a good idea: developing common use case models that, backed by the economic clout of ODCA members, will in turn drive vendors to develop products that comply with those models (and hopefully ODCA member requirements will correlate with those of a wider set of consumers). Vendors spend a lot of time talking to users and trying to understand their requirements, and having the ODCA serve as a proxy for the requirements of a lot of very influential customers will be a benefit to all concerned.

Read more