Predicting that 2011 will be Wi-Fi’s second coming is hardly daring or cutting-edge. But you might be caught off guard when I tell you not to worry about a vendor’s WLAN architecture: your business needs will flush out the right one. Despite the initial hype seven years ago that Wi-Fi would become the new edge, it has remained most users’ second choice for connecting at work. That will change. A tidal wave of wireless devices will soon crash through the enterprise front door. Just look at the carriers scrambling to build out their infrastructure; there’s no shortage of stories about AT&T’s Wi-Fi build-out in metropolitan areas. And users, who have fused their work and personal phones, are looking to offload traffic from congested carrier data plans.
The time to start was yesterday, and you have a ton of work to do. Your edge will be servicing:
Employees with corporate netbooks and their own smartphones and/or tablets who watch training videos on YouTube from companies like VMware.
Devices like torque tools, temperature sensors in exothermic chambers, ambient light sensors, and myriad others.
Contractors with their own laptops, netbooks, tablets, and/or smartphones who need access to specific company applications.
Guests like account executives entering customer information into their CRM programs.
All the things being developed at venture-capital-backed incubators.
Cisco announced today its intent to acquire NewScale, a small but well-respected automation software vendor. The financial terms were not disclosed, but it is a small deal in terms of money spent. It is big in the sense that Cisco needed the kind of capabilities NewScale offers, and NewScale has proven to be one of the most innovative and visible players in its market segment.
The market segment in question is what has been described as “the tip of the iceberg” for the advanced automation suites needed to create and operate cloud computing services. The “tip” refers to the part of the overall suite that is exposed to customers, while the majority of the “magic” of cloud automation is hidden from view – as it should be. The main capabilities offered by NewScale deal with building and managing the service catalog and providing a self-service front end that allows cloud consumers to request their own services based on this catalog of available services. Forrester has been bullish on these capabilities because they are the customer-facing side of cloud – the most important aspect – whereas most of the cloud focus has been directed at the “back end” technologies such as virtual server deployment and workload migration. These are certainly important, but a cloud is not a cloud unless the consumers of those services can trigger their deployment on their own. This is the true power of NewScale, one of the best in this sub-segment.
Everyone understands that cloud computing provides pay-per-use access to resources and the ability to elastically scale an application as its traffic grows. Those are the values that cloud economics turn on, but how do you turn cloud economics to your advantage?
That was the topic of my keynote session at the Cloud Connect 2011 event in Santa Clara, Calif. earlier this month. The video of this keynote can now be viewed on the event website at http://tv.cloudconnectevent.com/. You will need to register (free) on the site. In this short -- six minute -- keynote you will get the answers to this question. I also encourage you to view many of the other keynotes from this same event, as this was the first cloud computing conference I have attended that finally moved beyond Cloud 101 content and provided a ton of great material on how to really take advantage of cloud computing. We still have a long way to go, but this is a great step forward for anyone still learning about the Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) solutions and how they can empower your organization.
If you still aren't experimenting with these platforms, get going. While they won't transform the world, they do give you new deployment options that can accelerate time-to-market, increase deployment flexibility, and prepare you for the new economic model they are bringing to many early adopters today.
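To make the pay-per-use argument above concrete, here is a toy sketch comparing the cost of provisioning for peak load up front against elastic pay-per-use capacity. Every number in it (the hourly rate, the traffic profile) is hypothetical and exists only to illustrate the shape of the comparison, not to model any real provider's pricing.

```python
# Toy comparison: fixed peak provisioning vs. pay-per-use elastic capacity.
# All figures are hypothetical, for illustration only.

HOURLY_RATE = 0.50          # assumed cost per server-hour (hypothetical)
PEAK_SERVERS = 20           # capacity bought up front to cover the daily peak

# Hypothetical number of servers actually needed in each hour of a day.
demand = [3] * 8 + [12] * 8 + [20] * 4 + [6] * 4   # 24 hourly samples

# Fixed model: pay for peak capacity around the clock.
fixed_cost = PEAK_SERVERS * HOURLY_RATE * len(demand)

# Elastic model: pay only for the server-hours actually consumed.
elastic_cost = sum(demand) * HOURLY_RATE

print(f"Fixed provisioning for peak: ${fixed_cost:.2f}/day")
print(f"Pay-per-use elastic scaling: ${elastic_cost:.2f}/day")
print(f"Savings: {100 * (1 - elastic_cost / fixed_cost):.0f}%")
```

The gap between the two numbers widens as traffic becomes spikier, which is exactly the workload profile where IaaS and PaaS deployment options pay off.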
Oracle announced today that it is going to cease development for Itanium across its product line, stating that it believed, after consultation with Intel management, that x86 was Intel’s strategic platform. Intel of course responded with a press release specifically stating that at least two additional Itanium products are in active development: Poulson (whose initial specifications, if not availability, have been announced) and Kittson, of which little is known.
This is a huge move, and one that seems like a kick carefully aimed at the you-know-whats of HP’s Itanium-based server business, which competes directly with Oracle’s SPARC-based Unix servers. If Oracle stays the course in the face of what will certainly be immense pressure from HP, mild censure from Intel, and consternation on the part of many large customers, the consequences are pretty obvious:
Intel loses prestige and credibility for Itanium, and faces a potential drop-off of business from its only large Itanium customer. Nonetheless, the majority of Intel’s server business is x86, and in the end it will suffer only a token loss of revenue. Intel’s response to this move by Oracle will be muted: public defense of Itanium, but no fireworks.
Companies of all industries and sizes are considering, planning for, and implementing cloud-based solutions in their infrastructure. One of the first questions that comes up is: “Where do we start?” Email is one of the first significant resources cited, if not the first. Why? It’s a relatively discrete piece of infrastructure where many companies can realize typical cloud benefits like upfront infrastructure cost avoidance, an easier route to staying on the latest platform, and the opportunity to offload responsibility to a domain specialist. And then there are ongoing operational costs: infrastructure and operations pros, along with their business peers, look at what it costs to run email themselves, compare that with the provider’s price, and struggle to see how they can match provider economics.
Another question that often comes up is: “How real is this trend?” When was the last time you saw vendors like Microsoft, IBM, Google, AT&T, Verizon, Cisco, and Oracle sink billions into the same market? Organizations large and small, like GlaxoSmithKline, Manpower, Panasonic, the US General Services Administration, and a host of others, have made moves to the cloud. That’s a lot of major players and customers, and that’s a good indicator that this isn’t a fad.
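The cost comparison I&O pros run in that first question can be sketched in a few lines: amortize the upfront infrastructure and ongoing operations over the user base and period, then set that against a flat per-mailbox subscription. Every figure below is hypothetical; plug in your own hardware, licensing, admin, and provider numbers.

```python
# Rough per-mailbox cost comparison: on-premises email vs. a cloud provider.
# All figures are hypothetical placeholders, for illustration only.

USERS = 1000
YEARS = 3

# On-premises: upfront infrastructure amortized over the period, plus ops.
upfront_infra = 90_000    # servers, storage, licenses (assumed)
annual_ops = 60_000       # admin time, power, maintenance (assumed)
on_prem_monthly = (upfront_infra + annual_ops * YEARS) / (USERS * YEARS * 12)

# Cloud provider: flat per-mailbox subscription (assumed rate).
provider_monthly = 5.00

print(f"On-premises: ${on_prem_monthly:.2f}/mailbox/month")
print(f"Provider:    ${provider_monthly:.2f}/mailbox/month")
```

The point of the exercise is not the specific numbers but the structure: the on-premises figure is dominated by fixed costs that the provider spreads across a far larger customer base.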
Enterprises of all sizes are challenged with supporting a wide range of mobile devices and applications. In addition, empowered employees are circumventing the IT organization by purchasing their own mobile devices and downloading applications used for work from mobile app stores sponsored by mobile device manufacturers. Adding to this complexity are the accelerated mobile device release and application update timelines, which often occur in a matter of months, not years. Mobile cloud services are emerging as a method for the corporate IT department to address these challenges, while still maintaining control of the firm’s mobile device and application environment.
Cloud services have been available in the traditional software and hardware arena for the past few years. However, vendors and service providers in the mobility ecosystem are now offering new types of mobile cloud services to help firms simplify and manage the complex mobility landscape. The key characteristics of today’s mobile cloud services include:
Standardized, on-demand mobile services delivered in a public or private cloud environment. Today’s mobile cloud services tend to focus on helping firms deliver mobile applications, but deployments will evolve over time to include other services such as storage, security, billing, governance, and reporting capabilities.
Mobile cloud services are delivered in an “as a service” manner and promise operational savings by requiring firms to pay only for the software, platform, and infrastructure resources they use. The software-as-a-service (SaaS) delivery model is commonly used to distribute mobile applications, while infrastructure-as-a-service components include mobile network and storage resources that can be incorporated into public or private clouds.
We're only a couple weeks away from Forrester's Marketing Forum 2011, April 5-6 in San Francisco, California. (You can view the event details and sign up to attend here). The theme is "Innovating Your Marketing For The Next Digital Decade," which will help attendees navigate the rapidly changing world of digital experiences. Rapid innovation is creating radical shifts in the methods and media that people use to engage with your company, brand, and products. From connected TVs to Microsoft's Xbox Kinect to tablet PCs to mobile-based location awareness, the panoply of emerging platforms and techniques gives powerful new means of creating rich product experiences and engaging with your customers.
For Consumer Product Strategy professionals, we've focused our sessions around a research theme called Total Product Experience. Developed by the amazing James McQuivey, the Total Product Experience thesis is that digital channels are no longer being used just to deliver marketing messages. Instead, they are swiftly being enlisted to simulate and stimulate product trial and use. Already, using Kinect for Xbox, marketers can enable you to kick the virtual tires of a car; tomorrow, with a tablet PC app, marketers will let you take pictures of yourself and dress your own body in virtual clothes. Welcome to the Era of Experience, a time in which product strategists and product marketers must collaborate to deepen the digital customer relationship and extend the total product experience to create value.
Calxeda, one of the most visible stealth mode startups in the industry, has finally given us an initial peek at the first iteration of its server plans, and they both meet our inflated expectations from this ARM server startup and validate some of the initial claims of ARM proponents.
While still holding its actual delivery dates and specification details close to the vest, Calxeda did reveal the following cards from its hand:
The first reference design, which will be provided to OEM partners as well as delivered directly to selected end users and developers, will be based on an ARM Cortex A9 quad-core SOC design.
The SOC, as Calxeda will demonstrate with one of its reference designs, will enable OEMs to design servers as dense as 120 ARM quad-core nodes (480 cores) in a 2U enclosure, with an average consumption of about 5 watts per node (1.25 watts per core) including DRAM.
While Calxeda was not forthcoming with details about performance, topology, or protocols, the SOC will contain an embedded fabric that allows the individual quad-core SOC servers to communicate with each other.
Most significantly for prospective users, Calxeda claims, and has some convincing models to back this up, that it will deliver 5X to 10X the performance per watt (and an even larger advantage in performance/watt/$ once price is factored in) of any competing products it expects to see when it brings its product to market.
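The density and power figures in the list above hang together arithmetically; a quick back-of-the-envelope check confirms the stated numbers. The node count, cores per node, and watts per node come from the article; the per-enclosure power total is my own derived figure, not a Calxeda claim.

```python
# Sanity-checking Calxeda's stated density and power figures:
# 120 quad-core ARM nodes per 2U enclosure at ~5 W per node (including DRAM).

nodes_per_2u = 120
cores_per_node = 4
watts_per_node = 5.0   # Calxeda's claimed average, including DRAM

cores_per_2u = nodes_per_2u * cores_per_node      # should match the 480 stated
watts_per_core = watts_per_node / cores_per_node  # should match the 1.25 stated
power_per_2u = nodes_per_2u * watts_per_node      # derived, not a stated claim

print(f"Cores per 2U enclosure: {cores_per_2u}")
print(f"Watts per core:         {watts_per_core}")
print(f"Power per 2U enclosure: {power_per_2u} W")
```

At roughly 600 W for 480 cores in 2U, the appeal to density- and power-constrained data centers is obvious, which is precisely the claim ARM proponents have been making.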
Yesterday, NetApp announced the acquisition of the Engenio storage division of LSI for $480 million. Engenio is mainly known as an OEM provider of storage systems with a broad partner list that includes IBM, Sun, BlueArc, Teradata, Panasas, RAID Inc., SGI, and Huawei.
This move follows a busy year of storage acquisitions with HP, EMC and Dell each spending more than a billion dollars on buys in the space. $480 million represents the biggest acquisition ever for NetApp, more than they spent this past year on Bycast and Akorri, or previously on Onaro, Topio, Decru or Spinnaker.
Most of the money in storage acquisitions has gone to software that can be mated to industry-standard (or nearly standard) hardware, but this deal goes in a different direction: LSI is a vendor known for meat-and-potatoes Fibre Channel storage hardware without a lot of frills. NetApp is largely a software company that sells OEM hardware running its Data ONTAP operating system, chock full of some of the richest feature sets and protocol options in the industry. Focusing on differentiated software has allowed NetApp to enjoy high margins and a consistent, unified family of products from entry level to enterprise class. So why would NetApp want to jump into the fairly commoditized storage hardware business?
In another sign that the movement toward converged infrastructures and vertically integrated solutions is becoming ever more mainstream, HP and Microsoft recently announced a line of specialized appliances that combine integrated hardware and pre-packaged software targeting Exchange email, business analytics with Microsoft SharePoint and PowerPivot, and data warehousing with SQL Server. The offerings include:
HP E5000 Messaging System – Microsoft Exchange in standard sizes of 500 to 3,000 mailboxes. The product incorporates a pair of servers derived from HP's blade family in a new 3U rack enclosure, plus storage and Microsoft Exchange software, and is installed as a turnkey system by HP.
HP Business Decision Appliance – Integrated servers and SQL Server PowerPivot software targeting analytics in midmarket and enterprise groups, tuned for 80 concurrent users. This offering is based on standard HP rack servers and integrated Microsoft software.
HP Enterprise Data Warehouse Appliance – Intended to compete with Oracle Exadata, at least for data warehouse applications, this is targeted at enterprise data warehouses in the hundreds-of-terabytes range. Like Exadata, it is a massive stack of integrated servers and software, including 13 HP rack servers, 10 of HP's MSA storage units, and integrated Ethernet, InfiniBand, and FC networking, along with Microsoft SQL Server 2008 R2 Parallel Data Warehouse software.