My Three Assumptions For Why The Next Generation Of SW Innovation Will Be Cognitive!

Diego Lo Giudice

I am just back from the first-ever Cognitive Computing Forum, organized by DATAVERSITY in San Jose, California. I am not new to artificial intelligence (AI); I was a software developer in the early days of AI, when I was just out of university. Back then, if you worked in AI, you were called a software knowledge engineer, and you used symbolic programming (LISP) and first-order logic programming (Prolog) or predicate calculus (MRS) to develop “intelligent” programs. Lots of research went into knowledge representation and into tools that helped knowledge engineers build applications that by nature required heuristic problem solving. Heuristics are necessary when problems are ill-defined, nonlinear, and complex. Deciding which financial product you should buy based on your risk tolerance, the amount you are willing to invest, and your personal objectives is a typical problem we used to solve with AI.
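To give a flavor of that era, here is a minimal sketch of the kind of if-then heuristics a knowledge engineer would encode. It is written in Python for readability rather than the LISP or Prolog of the time, and the rules, product names, and thresholds are invented for illustration, not drawn from any real system.

    # A hypothetical, minimal sketch of rule-based heuristic problem solving,
    # written in Python rather than the LISP/Prolog of the era. The rules,
    # product names, and thresholds are illustrative assumptions only.
    def recommend_product(risk_tolerance: str, amount: float, objective: str) -> str:
        """Apply simple if-then heuristics, the way a knowledge engineer
        would encode an advisor's rules of thumb."""
        if risk_tolerance == "low" or objective == "capital preservation":
            return "government bond fund"
        if risk_tolerance == "medium" and amount < 50_000:
            return "balanced mutual fund"
        if risk_tolerance == "high" and objective == "growth":
            return "equity fund"
        return "refer to a human advisor"  # no rule fired: fall back

    print(recommend_product("medium", 20_000, "income"))  # -> balanced mutual fund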

Fast-forward 25 years, and AI is back with a new name: cognitive computing. An old friend of mine, who has never left the field, says, “AI has never really gone away, but it has undergone some major fundamental changes.” Perhaps it never really left the labs, the research community, and a few very niche business areas. The change, however, is largely about context: the hardware and software scale constraints are gone, and there are tons of data and knowledge available digitally (ironically, AI missed big data 25 years ago!). But this is not what I want to focus on.

Read more

The Good The Bad And The Ugly Of Enterprise BI

Boris Evelson
Unified information architecture, data governance, and standard enterprise BI platforms are a journey down a long and winding road. Even if one deploys the "latest and greatest" BI tools and best practices, the organization may not be getting any closer to the light at the end of the tunnel because:
  • Technology-driven enterprise BI is scalable but not agile. For the last decade, top-down data governance, centralization of BI support on standardized infrastructure, scalability, robustness, support for mission-critical applications, minimizing operational risk, and the drive toward an absolute single version of the truth — the good of enterprise BI — were the strategies that allowed organizations to reap multiple business benefits. However, today's business outlook is much different, and one cannot pretend to put new wine into old wineskins. If these were the only best practices, why does Forrester research constantly find that homegrown or shadow BI applications far outstrip applications created on enterprise BI platforms? Our research often uncovers that — here's where the bad part comes in — enterprise BI environments are complex, inflexible, and slow to react and, therefore, largely ineffective in the age of the customer. More specifically, our clients cite that their enterprise BI applications do not have all of the data they need, do not have the right data models to support the latest use cases, take too long, and are too complex to use. These are just some of the reasons Forrester's latest survey indicated that approximately 63% of business decision-makers use an equal amount or more of homegrown versus enterprise BI applications. And an astonishingly minuscule 2% of business decision-makers reported using solely enterprise BI applications.
Read more

Creating A Software Engineering Group Becomes Key To Closing The Experience Gaps

John McCarthy

In our recent report, Closing The Experience Gaps, Ted Schadler and I talked about two key elements of meeting customers’ rising expectations: creating an architecture for cross-channel experience delivery and developing a philosophy and culture of business agility. Given that it builds on many of the concepts we outlined in Software Must Enhance Your Brand, I wanted to highlight the key aspects of the second element: developing a philosophy and culture of business agility.

Closing the experience gaps — performance, convenience, personalization, and trust — requires a different mindset. The shift in customer expectations, fueled by an increasing rate of technology change, means that firms need to act more like cloud-based ISVs than traditional IT shops. This requires an agile process and continuous development from small teams spanning business, design, and technology competencies. Part of this makeover includes improving technical and design competencies; companies like GE and Wal-Mart have dramatically upskilled their technology teams.

At the core of this new mindset are five cultural, process, and skill imperatives:

  • Align business and technology executives. Successful customer experience transformation efforts at Delta Air Lines and The Home Depot have at their core an accommodation between the CEO, business executives, and the CIO.
  • Embrace an agile, sense-and-respond continuous delivery process. Great customer experiences today are table stakes tomorrow. To continuously improve experiences, companies must work differently, in small agile teams that span business, design, and technology — what we call IDEA teams.
Read more

Expect 3.5 Billion Global Smartphone Subscribers By 2019

In 2012, the number of smartphone subscribers worldwide passed the 1 billion mark, primarily due to adoption in North America and Europe. But the focus of the smartphone market is now shifting toward Asia Pacific, the Middle East and Africa (MEA), and Latin America. These three regions, home to 84% of the world’s population, will contribute a significant proportion of the next 2.5 billion subscribers, taking the global total to 3.5 billion by 2019 (a rough arithmetic check follows the list below). According to our recently published Forrester Research World Mobile and Smartphone Adoption Forecast, 2014 to 2019 (Global), Asia Pacific is the fastest-growing region in terms of subscribers, with a CAGR of 14%, followed by MEA and Latin America. Some of the findings from the forecast:

  • Low-cost smartphones are turning feature phone subscribers into smartphone subscribers. Chinese companies such as iocean, JiaYu, Meizu, Xiaomi, and Zopo and Indian players like Karbon and Micromax are flooding the market with sub-$200 Android-based smartphones. Declining smartphone prices and shrinking feature phone product lines have contributed to a steep rise in smartphone subscriptions: More than 46% of mobile subscribers owned a smartphone in 2013, compared with 9% in 2009. By 2019, we expect that 85% of all mobile subscribers will have smartphones.
  • The focus is shifting to India. India is the fastest-growing market for smartphones; as such, it’s attracting most of the focus from vendors. Gionee, Huawei, Konka, Lenovo, Xiaomi, and ZTE have recently entered the market, and Google launched its Android One program in partnership with Indian companies to provide sub-$100 Android phones.
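As a rough, illustrative check of the arithmetic behind these figures (using the rounded numbers cited above, not Forrester's underlying model): roughly 1 billion subscribers at the end of 2012 growing to about 3.5 billion by 2019 implies close to 20% compound annual growth globally, while the 14% CAGR cited for Asia Pacific compounds to nearly a doubling over 2014 to 2019.

    # Back-of-the-envelope check of the forecast arithmetic, using the rounded
    # figures quoted above; purely illustrative, not the Forrester model itself.
    start, end, years = 1.0e9, 3.5e9, 7          # ~1B (2012) -> ~3.5B (2019)
    implied_cagr = (end / start) ** (1 / years) - 1
    print(f"Implied global CAGR: {implied_cagr:.1%}")            # ~19.6% per year

    apac_growth = 1.14 ** 5                      # 14% CAGR over 2014-2019
    print(f"Asia Pacific growth at 14% CAGR: x{apac_growth:.2f}")  # ~1.93x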
Read more

Digital Disruption And The Electronic Medical Record

Skip Snow

For those of us who write and think about the future of healthcare, the story of rapid and systemic change rocking the healthcare system is a recurrent theme. We usually point to the regulatory environment as the source of change. Laws like the Affordable Care Act and the HITECH Act are glaring disruptive forces, but what empowers these regulations to succeed? Perhaps the deepest cause of change affecting healthcare, and the most disruptive force, is the digitalization of our clinical records. As we continue to switch to electronic charts, the disruptive force of the vast data being collected becomes increasingly obvious. One-fifth of the world’s data is purported to be administrative and clinical medical records. Recording medical observations, lab results, diagnoses, and the orders that care professionals make in binary form is a game-changer.

Workflows are dramatically altered because caregivers spend so much of their time using the system to record clinical facts and must balance these record-keeping responsibilities with more traditional bedside skills. They have access to more facts, more easily, than before, which allows them to make better judgments. The increasing ability of caregivers to see what their colleagues are doing, or have done, across institutional boundaries allows for better coordination of care. The use of clinical data for research into what works and what is efficient is becoming pervasive. This research combines records from several institutions, while quality committees examine the history of care within their own institutions to enhance the way they set institutional standards of care. The data represents a vast resource of evidence that enables great innovation.

Read more

As You Prepare for VMworld 2014, Educate Yourself On Digital Workspace Technologies

David Johnson
Bill Gates said, “People everywhere love Windows.” Whether or not you agree, the fact that Microsoft Windows remains the de facto standard for business productivity after nearly three decades suggests that many still do. But as the sales figures of Microsoft’s competitors suggest, people everywhere love lots of other things too. And one of the reasons they love them so much is that they like to get things done, and sometimes that means getting away from the office to a quiet place, or using a technology that isn’t constrained by corporate policies and controls, so they can be freer to experiment, grow their skills, and develop their ideas uninhibited.
 
Technology managers I speak with are aware of this, but they’re justifiably paranoid about security, costs, and complexity. The result of these conflicting forces coming together is rapid innovation in a mosaic of technologies that Forrester collectively calls digital workspace delivery systems. It involves many vendors, including Microsoft, Citrix, VMware, Dell, nComputing, Amazon Web Services, Fujitsu, AppSense, Moka5, and more. The goal of our work is to help companies develop their capabilities for delivering satisfying Microsoft Windows desktop and application experiences to a wide range of users, devices, and locations.
 
Read more

Check Out These Tools Linking Content To Sales Conversations

Peter O'Neill

I hear so much about how modern marketers are now content publishers, getting better and better at engaging with buyers much earlier in their buyer journeys – but what about your poor sales people? My experience from almost all of my client engagements is that many content marketers forget about them and end up producing yet more “random acts of marketing” that ignore the sales enablement imperative. I remember asking, when I presented “A Valuable Message Framework” at our Sales Enablement Forum back in March:

  • “Do you let your sales people know what content is out there so that they can leverage it and distribute it for you?”
  • “Do you want them meeting a customer, hearing what content they have already seen, and being surprised?”
  • “How do you get feedback on your content?”

Not a great contribution from marketing to the total customer experience - which definitely involves a sales conversation for some type of product or service (see last week’s blog). 

Read more

Make Your Next Storage Array An App

Henry Baltazar

Storage has been confined to hardware appliance form factors for far too long. Over the past two decades, innovation in the storage space has transitioned from proprietary hardware controllers and processors to proprietary software running on commodity x86 hardware. The hardware driving backup appliances, NAS systems, iSCSI arrays, and object storage systems is often quite similar in terms of processors and components, yet despite this fact I&O professionals are still used to purchasing single-purpose systems that lock customers into a technology stack.

Over the past few years, companies such as HP (StoreVirtual VSA), Nexenta, Sanbolic, and Maxta have released software-only storage offerings to compete head to head with proprietary hardware appliances, and they have found some success with cost-conscious enterprises and service providers. The software-only storage revolution is now ready for prime time, with startup offerings reaching maturity and established players such as IBM, EMC, and NetApp jumping into the market.

I&O professionals should consider software-only storage because:

  • The storage technology acquisition process is broken. Any storage purchase you complete today will be bound to your data center for the next three to five years. When business stakeholders and clients need storage resources for emerging use cases such as object storage and flash storage, they often do not have the luxury of waiting for storage teams to complete RFPs and product evaluations. With software-only storage, access to new technology can be accelerated to meet customers' provisioning velocity needs.
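As a purely hypothetical illustration of that provisioning-velocity point, the sketch below shows what requesting new capacity from a software-only storage layer could look like as an API call rather than a procurement cycle. The endpoint, fields, and workflow are invented for illustration and do not represent any specific vendor's product.

    # Hypothetical sketch: provisioning a volume from a software-only storage
    # service via a REST call. The endpoint and fields are invented for
    # illustration; real VSAs and object stores expose their own APIs.
    import requests

    def provision_volume(api_base: str, token: str, size_gb: int, tier: str) -> dict:
        """Request a new virtual volume from a (fictional) software-defined storage service."""
        resp = requests.post(
            f"{api_base}/v1/volumes",
            headers={"Authorization": f"Bearer {token}"},
            json={"size_gb": size_gb, "tier": tier, "replicas": 2},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()  # e.g. {"id": "...", "state": "creating"}

    # Minutes instead of an RFP cycle, assuming the software layer is already deployed:
    # volume = provision_volume("https://storage.example.internal", "API_TOKEN", 500, "flash")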

Read more

Cognitive Computing Forum: 7 Things You Need To Know

Michele Goetz

Day one of the first Cognitive Computing Forum in San Jose, hosted by DATAVERSITY, gave a great perspective on the state of cognitive computing: promising, but early. I am here this week with my research director, Leslie Owens, and analyst colleague Diego Lo Giudice. Gathering research for a series of reports for our cognitive engagement coverage, we were able to debrief tonight on what we heard and the questions these insights raise. Here are some key takeaways:

1)  The big data mind shift toward exploring and accepting failure is a heightened principle. Chris Welty, formerly at IBM and a key developer of Watson and its Jeopardy-winning solution, preached restraint. The analytic pursuit of perfect answers delivers no business value. Keep your eye on the prize and move the needle on what matters, even if your batting average is only .300 (30%). The objective is a holistic pursuit of optimization.

2)  The algorithms aren't new; platform capabilities and greater access to data now allow us to put cognitive computing to work in production. Every speaker, whether academic, vendor, or independent expert, agreed that the algorithms created decades ago are the same. Hardware and the volume of available data have made neural networks and other machine learning algorithms both practical and more effective.
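To illustrate that point, here is a minimal sketch (mine, not one of the speakers') of a tiny two-layer neural network trained with backpropagation using nothing but NumPy; the mathematics dates back decades, and it is the scale of data and hardware, not the algorithm, that has changed.

    # Minimal sketch: a tiny two-layer neural network learning XOR with
    # classic backpropagation and plain NumPy. The algorithm is old; what is
    # new is the data and hardware we can now throw at it.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden
    W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for _ in range(10000):
        hidden = sigmoid(X @ W1 + b1)            # forward pass
        out = sigmoid(hidden @ W2 + b2)
        d_out = (out - y) * out * (1 - out)      # backprop of squared error
        d_hidden = (d_out @ W2.T) * hidden * (1 - hidden)
        W2 -= 0.5 * hidden.T @ d_out             # gradient descent updates
        b2 -= 0.5 * d_out.sum(axis=0)
        W1 -= 0.5 * X.T @ d_hidden
        b1 -= 0.5 * d_hidden.sum(axis=0)

    print(np.round(out, 2))  # should approach [[0], [1], [1], [0]]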

Read more

The Beginning Of The End For The "Programmatic" Ad Network

The acquisition of [X+1] by Rocketfuel signals the beginning of the end for “programmatic” ad networks. Since the industry’s shift to programmatic, countless ad networks have changed how they market themselves, adjusting their sales language to mimic legitimate programmatic platforms. The insertion-order-based, flat-rate business model of the “programmatic” ad network has prolonged the black-box opacity that spurred the need for demand-side platforms and exchange-based media buying. It’s only fitting that one of the industry’s most successful “programmatic” ad networks — Rocketfuel — is addressing client demand by making a move that launches it into the digital marketing SaaS market.

There is a lot to be said about the success that Rocketfuel has had in the industry; they have done great things for marketers looking to automate audience prospecting and retargeting. They have certainly done an amazing job marketing their programmatic chops, through the success of their AI product and their work with agencies running performance-based campaigns. Their recent revenue growth and the fact that Rocketfuel had the capital to acquire a DSP/DMP in [X+1] are testaments to the success they have had in the industry.

Despite their success, prolonging opacity for marketers in this market is a short-term strategy, and Rocketfuel is positioning itself for long-term success.

Coming from the agency trading desk world, I did not partner with Rocketfuel for several reasons:

  • Rocketfuel works with marketers and agencies on a flat-rate business model, which is aligned with traditional ad network buying.
Read more