Developers: Will AI Run You Out Of Your Job?

Diego Lo Giudice

Much has been written about how artificial intelligence (AI) will eventually put white-collar workers out of a job. Will robots soon be able to do what programmers do best — i.e., write software programs? Actually, if you are or were a developer, you’ve probably already written or used software programs that can generate other software programs. That’s called code generation; in the past, it was done with “next”-generation programming languages (second-, third-, fourth-, or even fifth-generation languages), whose tools today are called low-code IDEs. Java, C, and C++ geeks have also been turning high-level graphical models like UML or BPML into code. But that’s not what I am talking about: I am talking about a robot (or bot) or AI software system that, given a business requirement in natural language, can write the code to implement it — or even come up with its own idea and write a program for it.

Read more

Search Can Build The Foundation For Cognitive Experiences In The Enterprise

Rowan Curran

Knowledge is power. And in a time when insights drive business differentiation, knowledge is also the origin of power. In our daily routines as consumers, search is probably the most common application we use to find knowledge, and it forms the basis of our personal systems of insight. But at long last, search in the enterprise is catching up. A new wave of search-based applications and search-driven experiences is now being delivered by companies that understand the need to empower their employees and customers with immediate, contextual knowledge in an easily consumable format.

These applications are not just for search and results; they also enable knowledge discovery. And increasingly, they are a foundational component of cognitive application experiences. Building cognitive experiences can seem arcane and mysterious, but by taking advantage of familiar search technologies at the foundation, enterprise developers can start on the cognitive journey.
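To make that foundation concrete, here is a minimal, illustrative sketch (mine, not from the post) of the relevance-ranked keyword search that such experiences build on; the documents, the query, and the use of scikit-learn's TfidfVectorizer are assumptions chosen for illustration, not a description of any vendor's product.

```python
# A minimal sketch of relevance-ranked search: index a few documents with
# TF-IDF and rank them against a query. Documents and query are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Quarterly churn report for retail banking customers",
    "Wealth management onboarding checklist and KYC requirements",
    "Fraud detection model validation notes, Q3",
]

vectorizer = TfidfVectorizer(stop_words="english")
doc_vectors = vectorizer.fit_transform(documents)   # index the corpus

query = "customer churn in retail banking"
query_vector = vectorizer.transform([query])        # vectorize the query

# Rank documents by cosine similarity to the query, best match first.
scores = cosine_similarity(query_vector, doc_vectors).ravel()
for idx in scores.argsort()[::-1]:
    print(f"{scores[idx]:.2f}  {documents[idx]}")
```

Cognitive search layers NLP and machine learning on top of exactly this kind of relevance scoring, which is why familiar search technology is a reasonable starting point.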

In our new research, Mike Gualtieri and I look at how the emerging landscape of cognitive search experiences is incorporating advanced analytics, natural language processing (NLP), and machine learning to enable organizations to see across wide arrays of enterprise data and stitch together the insights hidden among them.

Cognitive Search Is Ready To Rev Up Your Enterprise's IQ

This Time, AI Is Truly Here To Help Build Intelligent Applications

Diego Lo Giudice

What’s taken artificial intelligence (AI) so long? We invented AI capabilities like first-order logical reasoning, natural-language processing, speech/voice/vision recognition, neural networks, machine-learning algorithms, and expert systems more than 30 years ago, but aside from a few marginal applications in business systems, AI hasn’t made much of a difference. The business doesn’t understand how or why it could make a difference; it thinks we can program anything, which is almost true. But there’s one thing we fail at programming: our own brain — we simply don’t know how it works.

What’s changed now? While some AI research still tries to simulate our brain or certain regions of it — and is frankly unlikely to deliver concrete results anytime soon — most of it now leverages a less human, but more effective, approach revolving around machine learning and smart integration with other AI capabilities.

What is machine learning? Simply put, sophisticated software algorithms that learn to do something on their own by repeated training using big data. In fact, big data is what’s making the difference in machine learning, along with great improvements in many of the above AI disciplines (see the AI market overview that I coauthored with Mike Gualtieri and Michele Goetz on why AI is better and consumable today). As a result, AI is undergoing a renaissance, developing new “cognitive” capabilities to help in our daily lives.
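As a rough illustration of "learning by repeated training on data" (my sketch, not the author's), the snippet below fits a simple classifier on labeled examples and then predicts on input it has never seen; the toy data and the choice of scikit-learn's LogisticRegression are assumptions made purely for illustration.

```python
# A toy illustration of machine learning: the algorithm is not told the rule;
# it infers one from labeled training data and applies it to new cases.
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: [years_as_customer, support_tickets_last_year]
# with labels 1 = churned, 0 = stayed.
X_train = [[1, 8], [2, 7], [6, 1], [7, 0], [3, 5], [8, 1]]
y_train = [1, 1, 0, 0, 1, 0]

model = LogisticRegression()
model.fit(X_train, y_train)            # training on the labeled examples

# Predict for a customer the model has never seen.
print(model.predict([[5, 2]]))         # e.g. [0] -> likely to stay
print(model.predict_proba([[5, 2]]))   # class probabilities
```

Scale the toy data up to millions of rows and hundreds of features and you have the big-data-driven machine learning that is powering the current AI renaissance.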

Read more

My Three Assumptions For Why The Next Generation Of SW Innovation Will Be Cognitive!

Diego Lo Giudice

I am just back from the first-ever Cognitive Computing Forum, organized by DATAVERSITY in San Jose, California. I am not new to artificial intelligence (AI): I was a software developer in the early days of AI, when I was just out of university. Back then, if you worked in AI, you would be called a SW Knowledge Engineer, and you would use symbolic programming (LISP) and first-order logic programming (Prolog) or predicate calculus (MRS) to develop “intelligent” programs. Lots of research was done on knowledge representation and on tools to support knowledge engineers in developing applications that by nature required heuristic problem solving. Heuristics are necessary when problems are undefined, nonlinear, and complex. Deciding which financial product you should buy based on your risk tolerance, the amount you are willing to invest, and your personal objectives is a typical problem we used to solve with AI.
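For flavor, here is a tiny rule-based sketch in the spirit of those early expert systems, written in Python rather than LISP or Prolog; the thresholds, product names, and rules are invented for illustration, not taken from any real system.

```python
# A toy, rule-based "expert system" for the financial-product example above.
# All thresholds and product names are hypothetical.
def recommend_product(risk_tolerance: str, amount: float, objective: str) -> str:
    """Map a client's profile to a product using simple heuristic rules."""
    if risk_tolerance == "low" or objective == "capital preservation":
        return "term deposit"
    if risk_tolerance == "medium" and amount < 50_000:
        return "balanced mutual fund"
    if risk_tolerance == "high" and objective == "growth":
        return "equity portfolio"
    return "refer to a human advisor"   # fall back when no rule fires

print(recommend_product("medium", 20_000, "retirement income"))
```

The knowledge engineer's job was to elicit rules like these from domain experts and encode them, which is exactly where the approach strained as problems grew larger and fuzzier.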

Fast-forward 25 years, and AI is back with a new name: it is now called cognitive computing. An old friend of mine, who’s never left the field, says, “AI has never really gone away, but has undergone some major fundamental changes.” Perhaps it never really went away from labs, research, and very niche business areas. The change, however, is heavily about the context: the hardware and software scale constraints are gone, and there are tons of data/knowledge digitally available (ironically, AI missed big data 25 years ago!). But this is not what I want to focus on.

Read more

Make no mistake - IBM’s Watson (and others) provide the *illusion* of cognitive computing

IBM has just announced that one of Australia’s “big four” banks, the ANZ, will adopt the IBM Watson technology in its wealth management division for customer service and engagement. Australia has always been an early adopter of new technologies, but I’d also like to think that we’re a little smarter and savvier than your average geek back in high school in 1982.

IBM’s Watson announcement is significant, not necessarily because of the sophistication of the Watson technology, but because of IBM's ability to successfully market the Watson concept.   

To take us all back a little, the term ‘cognitive computing’ emerged in response to the failings of what was once termed ‘artificial intelligence’. Though the underlying concepts have been around for 50 years or more, AI remains a niche and specialist market with limited applications and a significant trail of failed or aborted projects. That’s not to say that we haven’t seen some sophisticated algorithm-based systems evolve. There’s already a good portfolio of large-scale, deep analytic systems developed in the areas of fraud, risk, forensics, medicine, physics, and more.

Read more