So you need some work done that you’ve never had done before or you need to buy something you’ve never bought before. What should you pay? That can be a tough question. What seems reasonable? Sometimes we set arbitrary rules. It’s OK if it’s under $50 or under $100. But that’s just a reassurance that you’re not getting ripped off too badly. Certainly the best way to avoid that outcome is to know how much that service or thing is worth, or at least know what others have paid for the same thing.
Fortunately now, in the age of the customer, that’s easier to find out. Price information for most consumer goods is easier to come by, making the buying process more efficient. But what about governments? We’ve all heard about the $600 toilet seat or the $400 hammer. Stories of government spending excess and mismanagement abound. Some are urban legends or misrepresentations. Others have legs — such as the recent reports of Boeing overcharging the US Army. While these incidents are likely not things of the past, open data initiatives have made significant progress in exposing spending data and improving transparency. Citizens can visit sites such as USAspending.gov for US federal government spending or "Where Does My Money Go?" for details on UK national government spending, and most large cities publish spending as well.
To jump on this R feeding frenzy, most leading BI vendors claim that they “integrate with R”, but what does that claim really mean? Our take: not all BI/R integration is created equal. When evaluating BI platforms for R integration, Forrester recommends considering the following integration capabilities:
Usually when a product or service shouts about its low pricing, that’s a bad thing. But in Google’s case, there’s unique value in its Sustained-use Discounts program, which just might make it worth your consideration.
A journalist called and asked me today about the market size for wearables. I replied, “That’s not the big story.”
So what is? It's data, and what you can do with it.
First you have to collect the data and have the permission to do so. Most of these relationships are one-to-one. I have these relationships with Nike, Jawbone, Basis, RunKeeper, MyFitnessPal and a few others. I have an app for each on my phone that harvests the data and shows it to me in a way I can understand. Many of these devices have open APIs, so I can import my Fitbit or Jawbone data into MyFitnessPal, for example.
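As a sketch of what that cross-device aggregation looks like in practice, here is a minimal Python example that merges daily step counts exported from two devices into a single log. The record layouts and the max-per-day heuristic are illustrative assumptions, not any vendor’s actual API format:

```python
from collections import defaultdict

# Hypothetical daily step-count exports from two different wearables.
# Real vendor APIs return richer JSON, but the aggregation idea is the
# same: key every record by date and combine across sources.
device_a = [
    {"date": "2014-02-10", "steps": 6200},
    {"date": "2014-02-11", "steps": 8100},
]
device_b = [
    {"date": "2014-02-11", "steps": 7900},
    {"date": "2014-02-12", "steps": 5400},
]

def merge_daily_steps(*sources):
    """Combine per-device daily records, keeping the highest reading per
    day (devices often double-count, so taking the max is one simple
    heuristic)."""
    merged = defaultdict(int)
    for source in sources:
        for record in source:
            day = record["date"]
            merged[day] = max(merged[day], record["steps"])
    return dict(sorted(merged.items()))

combined = merge_daily_steps(device_a, device_b)
print(combined)
```

The interesting design question is the merge rule: max avoids double-counting when two devices record the same walk, while summing would be right if the devices cover disjoint activities.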
From the story on 9to5mac.com, it is clear that Apple (as with Passbook) is creating a single place for consumers to store a wide range of healthcare and fitness information. From the screenshots, it also appears that one can trend this information over time. The phone is capable of collecting some of this information, and is increasingly doing so with less battery burn thanks to efficiencies in how the sensor data is crunched. Wearables — perhaps one from Apple — will collect more information. Other data will certainly come from third-party wearables, such as fitness trackers, patches, bandages, socks, and shirts, and from attachments such as the Smartphone Physical. There will always be tradeoffs between the amount of information you collect and the form factor. While I don’t want to wear a chubby, clunky device 24x7, the hardware gets better every day.
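Trending this kind of data over time is mostly a matter of smoothing noisy daily readings. A minimal Python sketch, using invented resting-heart-rate numbers, computes a simple trailing moving average:

```python
def moving_average(values, window=3):
    """Trailing moving average; emits one smoothed value per position
    once a full window of readings is available."""
    if window <= 0 or window > len(values):
        return []
    return [
        sum(values[i - window + 1 : i + 1]) / window
        for i in range(window - 1, len(values))
    ]

# Hypothetical resting-heart-rate readings for one week.
readings = [62, 64, 61, 65, 63, 60, 62]
trend = moving_average(readings, window=3)
print([round(v, 1) for v in trend])  # → [62.3, 63.3, 63.0, 62.7, 61.7]
```

A widening window trades responsiveness for smoothness; a health app would likely also handle missing days rather than assume one reading per day.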
IBM recently kicked off its big data market planning for 2014 and released a white paper discussing how analytics creates new business value for end-user organizations. The major differences compared with last year’s event:
Organizational change. IBM has assigned a new big data practice leader for China, similar to what it’s done for other new technologies, including mobile, social, and cloud. IBM can integrate resources from its infrastructure (IBM STG), software (IBM SWG), and services (IBM GBS/GTS) teams, although those team members do not report directly to the new practice leader.
A new analytics platform powered by Watson technology. The Watson Foundation platform has three new functions. It can be deployed on SoftLayer; it extends IBM’s big data analysis capabilities to social, mobile, and cloud; and it offers enterprises the power and ease of use of Watson analysis.
Measurable benefits from customer insights analysis. Chinese organizations have started to buy into the value of analytics and would like to invest in technology tools to optimize customer insights. AmorePacific, a Hong Kong-based skin care and cosmetics company, is using IBM’s SPSS predictive analytics solution to craft tailored messages to its customers and has improved its response rate by more than 30%. It primarily analyzes point-of-sale data, demographic information from its loyalty program, and market data such as property values in the neighborhoods where customers live.
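The “more than 30%” figure is a response-rate lift, which is straightforward to compute once you compare a targeted campaign against a baseline. A hedged Python sketch with invented numbers (not AmorePacific’s actual data):

```python
def response_rate(responders, contacted):
    """Fraction of contacted customers who responded."""
    return responders / contacted

def lift(targeted_rate, baseline_rate):
    """Relative improvement of a targeted campaign over the baseline."""
    return (targeted_rate - baseline_rate) / baseline_rate

# Illustrative campaign numbers, chosen to show a 30% lift.
baseline = response_rate(responders=200, contacted=10_000)  # 2.0%
targeted = response_rate(responders=65, contacted=2_500)    # 2.6%

improvement = lift(targeted, baseline)
print(f"{improvement:.0%}")  # → 30%
```

The predictive model (SPSS in IBM’s case) does the hard part of deciding whom to target; the lift calculation is just how the benefit gets reported.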
It’s been a long wait, about four years if memory serves, since Intel introduced the Xeon E7, a high-end server CPU targeting the highest per-socket x86 performance in everything from high-end two-socket servers to eight-socket servers with tons of memory and lots of I/O. In the ensuing four years (an eternity in a world where annual product cycles are the norm), subsequent generations of lesser Xeons, most recently culminating in the 22 nm Xeon E5 V2 Ivy Bridge server CPUs, have somewhat diluted the value proposition of the original E7.
So what is the poor high-end server user with really demanding single-image workloads to do? The answer was to wait for the Xeon E7 V2, and at first glance it appears that the wait was worth it. High-end CPUs take longer to develop than lower-end products, and in my opinion Intel made the right decision to skip the previous-generation 32 nm Sandy Bridge architecture and go straight to Ivy Bridge, its architectural successor in the Intel “Tick-Tock” cycle of new process, then new architecture.
What was announced?
The announcement was the formal unveiling of the Xeon E7 V2 CPU, available in multiple performance bins with anywhere from 8 to 15 cores per socket. Critical specifications include:
Up to 15 cores per socket
24 DIMM slots, allowing up to 1.5 TB of memory with 64 GB DIMMs
Approximately 4X I/O bandwidth improvement
New RAS features, including low-level memory controller modes optimized for either high-availability or performance mode (BIOS option), enhanced error recovery and soft-error reporting
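The memory ceiling in the list above follows directly from the DIMM count. A quick sanity check of the arithmetic, assuming all 24 slots are populated with 64 GB DIMMs and using binary units (1 TB = 1024 GB):

```python
dimm_slots = 24
dimm_size_gb = 64

total_gb = dimm_slots * dimm_size_gb
total_tb = total_gb / 1024  # binary units: 1 TB = 1024 GB

print(total_gb, "GB =", total_tb, "TB")  # → 1536 GB = 1.5 TB
```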
It looks like the beginning of a new technology hype cycle for artificial intelligence (AI). The media has started flooding the news with product announcements, acquisitions, and investments. The story is how AI is capturing the attention of tech-firm and investor giants such as Google, Microsoft, and IBM. Add to that the release of the movie ‘Her’, about a man falling for a virtual assistant modeled after Apple’s Siri (I think they got the idea from The Big Bang Theory, when Raj falls in love with Siri), and you know that geek-dom has begun the journey to mainstream and cool. The buzzwords are great, too: cognitive computing, deep learning, AI2.
For those who started their careers in AI and left in disillusionment (Andrew Ng has confessed to this, yet jumped back in), and for many data scientists today, the consensus is often that artificial intelligence is just a fancy new marketing term for good old predictive analytics. They point to Apple’s Siri, which listens and responds to requests adequately at best and more often frustratingly. Or to IBM Watson’s win on Jeopardy as mere data loading and brute-force programming. From their perspective, the real value lies in the pragmatic logic of the predictive analytics we already have.
But, is this fair? No.
First, let’s set aside what you heard about financial puts and takes. Don’t try to decipher the geek speak of what new AI is compared to old AI. Let’s talk about what is on the horizon that will impact your business.
New AI breaks the current rule that machines must be better than humans: smarter, faster analysts, or machines that manufacture things better and cheaper.
Many of us have spent the past 10 years focusing on business intelligence solutions in order to help our businesses make better fact-based decisions. In fact, BI has been among CIOs’ top 10 priorities for more than a decade. These solutions have, for the most part, been successful — and we continue to improve our BI capabilities as the demand for fact-based decision-making goes deeper, wider, and further into the business.
This whole time, we’ve also been aware of the significant amount of unstructured data that resides within our business, and the fact that we struggle to use it to make better decisions. To begin to get value from this data, we have made our organizations more collaborative and implemented tools and platforms to support that collaboration — with varying degrees of success.
The fact remains that there’s a huge amount of unstructured information and data that we do not get value from. However, a growing number of solutions are beginning to mine elements of this data: product information, software code, legal case files, medical literature, messaging data, and other unstructured business data.
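At its simplest, mining unstructured text starts with tokenizing and counting terms. A minimal Python sketch with invented messages; real messaging intelligence works on far richer signals (senders, recipients, timestamps, threads):

```python
import re
from collections import Counter

# Invented sample messages for illustration only.
messages = [
    "Can we move the contract review to Thursday?",
    "Contract review moved; legal will join Thursday.",
    "Reminder: review the contract terms before Thursday.",
]

# A tiny stopword list; production systems use much larger ones.
STOPWORDS = {"can", "we", "the", "to", "will", "before"}

def term_frequencies(texts):
    """Lowercase, tokenize on letter runs, drop stopwords, count terms."""
    counts = Counter()
    for text in texts:
        for token in re.findall(r"[a-z]+", text.lower()):
            if token not in STOPWORDS:
                counts[token] += 1
    return counts

freqs = term_frequencies(messages)
print(freqs.most_common(3))
```

Even this toy version surfaces the dominant topic ("contract review", "Thursday") from a handful of messages; the commercial solutions layer relationship graphs and trends on top of this kind of extraction.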
I’ve recently been working with TrustSphere, a messaging intelligence provider. TrustSphere has an interesting solution that mines your messaging data to extract real insights from the mountains of emails and messages that bounce into, out of, and around your organization every day. This is an interesting concept, and TrustSphere has developed a number of use cases for its solution. I’ll be presenting at a webinar hosted by TrustSphere on February 25; feel free to register here.
During 2014, we’ll pass a key milestone: an installed base of 2 billion smartphones globally. Mobile is becoming not only the new digital hub but also the bridge to the physical world. That’s why mobile will affect more than just your digital operations — it will transform your entire business. 2014 will be the year that companies increase investments to transform their businesses, with mobile as a focal point.
Let’s highlight a few of the mobile trends that we predict for 2014:
Competitive advantage in mobile will shift from experience design to big data and analytics. Mobile is transformative but only if you can engage your consumers in their exact moment of need with the right services, content, or information. Not only do you need to understand their context in that moment but you also need insights gleaned from data over time to know how to best serve them in that moment.
Mobile contextual data will offer deep customer insights — beyond mobile. Mobile is a key driver of big data. The most advanced marketers will recognize that mobile’s value as a marketing tool is measured by more than just the effectiveness of marketing to people on mobile websites or apps. They will start evaluating mobile’s impact on other channels.
On January 9, 2014, IBM launched its first new business unit in 19 years to bring Watson, the machine that beat two Jeopardy champions in 2011, to the rest of us. IBM posits that Watson marks the start of a third era in computing: one that began with manual tabulation, progressed to programmable systems, and has now become cognitive. Cognitive computing listens, learns, converses, and makes recommendations based on evidence.
IBM is placing big bets and big money, $1 billion, on transforming computer interaction from tabulation and programming to deep engagement. If it succeeds, our interaction with technology will become truly personal, through natural conversations that are suggestive, supportive, and, as Terry Jones of Kayak explained, “make you feel good” about the experience.
There are still hurdles for IBM and for organizations: expense, complexity, information access, coping with ambiguity and context, supervising the learning process, and implications of Watson’s suggestions that are unrecognized today. To work, the ecosystem has to be open and communal. Investment is needed beyond the platform, in applications and devices, to deliver on Watson’s value. IBM’s commitment and leadership are in place. The question is whether IBM and its partners can scale Watson beyond a complex custom solution into a truly transformative approach to business and our way of life.
Forrester believes that cognitive computing has the potential to address important problems that today’s advanced analytics solutions leave unmet. Though the road ahead is unmapped, IBM has now elevated its commitment to bringing cognitive computing to life through this new business unit, with the help of one-third of its research organization, an ecosystem of partners, and pioneer companies willing to teach their private Watsons.