Extremes of x86 Servers Illustrate the Depth of the Ecosystem and the Diversity of Workloads

Richard Fichera

I’ve recently been thinking a lot about application-specific workloads and architectures (Optimize Scalable Workload-Specific Infrastructure for Customer Experiences), and it got me thinking about the extremes of the server spectrum – the very small and the very large as they apply to x86 servers. The range, and the variation in intended workloads, is pretty spectacular as we diverge from the mean, which for the enterprise means a 2-socket Xeon server, usually in a 1U or 2U form factor.

At the bottom, we find really tiny embedded servers, some with very non-traditional packaging. My favorite is probably the technology from Arnouse Digital Technology, a small boutique that produces computers primarily for ruggedized military and industrial environments.

Slightly bigger than a credit card, their BioDigital server is a rugged embedded server with up to 8 GB of RAM, a 128 GB SSD, and a very low power footprint. Based on an Atom-class CPU, it is clearly not the choice for most workloads, but it is an exemplar of what happens when the workload lives in a hostile environment and the computer may need to be part of a man-carried or vehicle-mounted portable tactical or field system. While its creators are testing the waters for acceptance as a compute cluster, with up to 4,000 of them mounted in a standard rack, these will likely remain a niche product for applications requiring the intersection of small size, extreme ruggedness, and complete x86 compatibility – a range that spans military systems to portable desktop modules.

Read more

Nginx And Mobile Are Marching Together Into Internet History

Ted Schadler

[This is an update from a June 2013 post. Also see the new book I wrote with Julie Ask and Josh Bernoff, The Mobile Mind Shift.]

The technologist in me (still) loves getting the monthly Web server report from Netcraft.com. Astounding statistics like the number of registered public Web sites (998 million in August, up from 23,000 in 1995) and active Web sites (179 million), put into the context of history, show simply and directly just how deeply the Internet has penetrated our lives over the last 19 years.

Read more

Digital Disruption To The Ultimate - I Should Have Taken The Michael Connelly Novel

Craig Le Clair

Vacation is a good time to read things that you can never get to while working. My list is quite long, but I scanned it and took a copy of “The Zero Marginal Cost Society” by Jeremy Rifkin to the beach. Now, Forrester has a lot of focus on digital disruption, helping enterprises avoid being disrupted by new digitally based business models. We write about business agility and how to drive better customer experiences through mobile, social, and cloud. But we pretty much stop at what disruption means to an enterprise, as these are our clients.

Jeremy Rifkin takes the digital disruption concept to its ultimate end state and projects the effect on the entire economic system. He paints a somewhat murky but thought-provoking picture of where this all leads. The basic idea? Digital alternatives, fueled by the Internet of Things, big data, the sharing economy, 3D printing, AI, and analytics, will drive the marginal cost of producing a product or service to near zero, and this disrupts the entire capitalist system. Established companies can't generate profit, emerging companies can only maintain temporary advantage, and people don’t have “real jobs” anymore. Instead, they ride the wave of what he calls “the democratization of innovation,” which works outside of traditional business and government.

Read more

Digitize Your Business Today Or Prepare To Become Obsolete

Clement Teo

In a previous blog entry, I argued that everyone needs to digitize their business, but not every business knows what to do. Transforming into a digital business, especially if you’re a traditional enterprise, is hard work. However, we believe that Asia Pacific is already primed for digital disruption.

In my report, The State Of Digital Business In Asia Pacific In 2014, we found that, while the highest-profile digital business pioneers are headquartered in North America, market demand in Asia Pacific is more conducive to long-term digital disruption. Asia Pacific has five times as many Internet users and smartphone subscribers as the US and almost as much online retail spending as the US and Europe combined. You just need to look at regional powerhouses like Alibaba.com and Commonwealth Bank of Australia and their multibillion-dollar businesses to grasp the rewards of digital business success in Asia Pacific.

However, knowing what these firms have accomplished is insufficient; knowing how to get there is more critical. You should:

Read more

Wearables Shouldn’t Be An Exercise In Screen Miniaturization

JP Gownder

Too many wearables today have screens that look like miniaturized smartphones.

Just as smartphones shouldn’t be PC screens shrunk down to 4-5”, smartwatches shouldn’t look like smartphones shrunk to 1”. Nor is this a matter of responsive web design (RWD), which merely resizes web content to fit the screen.

Samsung's Gear 2 looks like a tiny smartphone screen.

Instead, wearables call for a different type of design philosophy – one with its DNA in the mobile revolution, one that extends mobile thinking even further.

Let’s start with the concept of mobile moments. As my colleagues write in The Mobile Mind Shift, mobile moments are those points in time and space when someone pulls out a mobile device to get what he or she wants immediately, in context. In the case of wearables, the wearer often won’t need to pull out a device – it’s affixed to her wrist, clothing, or eyeglasses. But she might need to lift her wrist, as a visitor to Disney World must do with MagicBand.

Now we’re getting closer to what wearables should be. But there are additional dimensions to wearables that obviate the need for pixel-dense screens:

Read more

My Three Assumptions For Why The Next Generation Of SW Innovation Will Be Cognitive!

Diego Lo Giudice

I am just back from the first-ever Cognitive Computing Forum, organized by DATAVERSITY in San Jose, California. I am not new to artificial intelligence (AI); I was a software developer in the early days of AI, just out of university. Back then, if you worked in AI, you were called a software knowledge engineer, and you used symbolic programming (LISP) and first-order logic programming (Prolog) or predicate calculus (MRS) to develop “intelligent” programs. Lots of research was done on knowledge representation and on tools to support knowledge engineers in developing applications that by nature required heuristic problem solving. Heuristics are necessary when problems are ill-defined, non-linear, and complex. Deciding which financial product you should buy based on your risk tolerance, the amount you are willing to invest, and your personal objectives is a typical problem we used to solve with AI.
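To make the flavor of that era concrete, here is a minimal sketch of the rule-based, knowledge-engineering style of system described above, written in Python for readability rather than LISP or Prolog. The rules, thresholds, and product names are hypothetical illustrations, not drawn from any real advisory system.

    # Illustrative rule-based recommender in the spirit of early knowledge-based
    # systems; all rules, thresholds, and product names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class InvestorProfile:
        risk_tolerance: str   # "low", "medium", or "high"
        amount: float         # amount the investor is willing to invest
        objective: str        # e.g. "preservation", "income", "growth"

    # Each rule pairs a condition with a recommendation -- the heuristic
    # knowledge a knowledge engineer would have elicited from a domain expert.
    RULES = [
        (lambda p: p.risk_tolerance == "low" and p.objective == "preservation",
         "money-market fund"),
        (lambda p: p.risk_tolerance == "low" and p.objective == "income",
         "government bond fund"),
        (lambda p: p.risk_tolerance == "medium" and p.amount >= 10_000,
         "balanced mutual fund"),
        (lambda p: p.risk_tolerance == "high" and p.objective == "growth",
         "equity growth fund"),
    ]

    def recommend(profile: InvestorProfile) -> str:
        # Fire the first rule whose condition matches, mimicking a single
        # forward-chaining pass over a small rule base.
        for condition, product in RULES:
            if condition(profile):
                return product
        return "refer to a human advisor"  # fallback when no heuristic applies

    print(recommend(InvestorProfile("medium", 25_000, "growth")))  # balanced mutual fund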

Fast-forward 25 years, and AI is back under a new name: cognitive computing. An old friend of mine, who’s never left the field, says, “AI has never really gone away, but it has undergone some major fundamental changes.” Perhaps it never really went away from labs, research, and very niche business areas. The change, however, is heavily about the context: hardware and software scale-related constraints are gone, and there are tons of data and knowledge digitally available (ironically, AI missed big data 25 years ago!). But this is not what I want to focus on.

Read more

The Good, The Bad, And The Ugly Of Enterprise BI

Boris Evelson

Unified information architecture, data governance, and standard enterprise BI platforms are all stops on a journey down a long and winding road. Even if one deploys the "latest and greatest" BI tools and best practices, the organization may not be getting any closer to the light at the end of the tunnel because:
  • Technology-driven enterprise BI is scalable but not agile. For the last decade, top-down data governance, centralization of BI support on standardized infrastructure, scalability, robustness, support for mission-critical applications, minimizing operational risk, and the drive toward an absolute single version of the truth — the good of enterprise BI — were the strategies that allowed organizations to reap multiple business benefits. However, today's business outlook is much different, and one cannot pretend to put new wine into old wineskins. If these were the only best practices, why does Forrester research constantly find that homegrown or shadow BI applications far outstrip applications created on enterprise BI platforms? Our research often uncovers that — here's where the bad part comes in — enterprise BI environments are complex, inflexible, and slow to react and are, therefore, largely ineffective in the age of the customer. More specifically, our clients cite that their enterprise BI applications do not have all of the data they need, do not have the right data models to support all of the latest use cases, take too long, and are too complex to use. These are just some of the reasons Forrester's latest survey indicated that approximately 63% of business decision-makers use an equal amount or more of homegrown versus enterprise BI applications. And an astonishingly minuscule 2% of business decision-makers reported using solely enterprise BI applications.
Read more

Creating A Software Engineering Group Becomes Key To Closing The Experience Gaps

John McCarthy

In our recent report Closing The Experience Gaps, Ted Schadler and I talked about two key elements of meeting customers' rising expectations: 1) creating an architecture for cross-channel experience delivery and 2) developing a philosophy and culture of business agility. Given that it builds on many of the concepts we outlined in Software Must Enhance Your Brand, I wanted to highlight the key aspects of the second element -- developing a philosophy and culture of business agility.

Read more

Expect 3.5 Billion Global Smartphone Subscribers By 2019

In 2012, the number of smartphone subscribers worldwide passed the 1 billion mark, primarily due to adoption in North America and Europe. But the focus of the smartphone market is now shifting toward Asia Pacific, the Middle East and Africa (MEA), and Latin America. These three regions, which are home to 84% of the world’s population, will contribute a significant proportion of the next 2.5 billion subscribers, a milestone Forrester believes will be reached by 2019. According to our recently published Forrester Research World Mobile and Smartphone Adoption Forecast, 2014 to 2019 (Global), Asia Pacific is the fastest-growing region in terms of subscribers, with a CAGR of 14%, followed by MEA and Latin America. Some of the findings from the forecast:

  • Low-cost smartphones are turning feature phone subscribers into smartphone subscribers. Chinese companies such as iocean, JiaYu, Meizu, Xiaomi, and Zopo and Indian players like Karbon and Micromax are flooding the market with sub-$200 Android-based smartphones. Declining smartphone prices and shrinking feature phone product lines have contributed to a steep rise in smartphone subscriptions: More than 46% of mobile subscribers owned a smartphone in 2013, compared with 9% in 2009. By 2019, we expect that 85% of all mobile subscribers will have smartphones.
  • The focus is shifting to India. India is the fastest-growing market for smartphones; as such, it’s attracting most of the focus from vendors. Gionee, Huawei, Konka, Lenovo, Xiaomi, and ZTE have recently entered the market, and Google launched its Android One program in partnership with Indian companies to provide sub-$100 Android phones.
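To put the growth figures in perspective, here is a quick back-of-the-envelope sketch of how a 14% compound annual growth rate plays out over a forecast window. The starting value and horizon are illustrative assumptions, not numbers from the Forrester forecast itself.

    # Compound a starting subscriber base forward at a constant CAGR.
    # The 1.0 billion base and five-year horizon are illustrative only.
    def project(base: float, cagr: float, years: int) -> float:
        return base * (1 + cagr) ** years

    for year in range(6):
        print(f"year {year}: {project(1.0, 0.14, year):.2f} billion")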
Read more

Digital Disruption And The Electronic Medical Record

Skip Snow

For those of us who write and think about the future of healthcare, the story of rapid and systemic change rocking the healthcare system is a recurrent theme. We usually point to the regulatory environment as the source of change. Laws like the Affordable Care Act and the HITECH Act are glaring disruptive forces, but what empowers these regulations to succeed? Perhaps the deepest cause of change affecting healthcare, and the most disruptive force, is the digitalization of our clinical records. As we continue to switch to electronic charts, the force of the vast data being collected becomes increasingly obvious. One-fifth of the world’s data is purported to be administrative and clinical medical records. Recording medical observations, lab results, diagnoses, and the orders that care professionals make in digital form is a game-changer.

Workflows are dramatically altered because caregivers spend so much of their time using the system to record clinical facts and must balance these record-keeping responsibilities with more traditional bedside skills. They have access to more facts, more easily, than ever before, which allows them to make better judgments. The increasing ability of caregivers to see what their colleagues are doing, or have done, across institutional boundaries is allowing for better coordination of care. The use of clinical data for research into what works and what is efficient is becoming pervasive. This research combines records from several institutions, and individual institutions' quality committees examine their own history of care to refine their institutional standards of care. The data represents a vast resource of evidence that enables great innovation.

Read more