As many of my readers know, for years I’ve been quite skeptical about non-mainstream BI solutions, such as BI SaaS. Security, control, operational risk, and data, metadata, and application integration are just some of the enterprise BI requirements still on my watch list as potential reasons to be wary of BI SaaS. However, I am also a very pragmatic analyst and truly believe that nothing but supply and demand drives the markets. And I am now, slowly but surely, beginning to believe there couldn’t be a better case for BI SaaS demand, especially after findings from a project I am currently conducting.
I recently talked to a few dozen non-IT professionals (specifically in front-office roles, such as sales and marketing) across multiple industries, regions, and company sizes. Guess how many of them fully or partially relied on IT for their day-to-day operational and strategic information needs? BIG FAT ZERO!!! This finding was a huge surprise to me. Yes, I did expect to find something like less than 50% reliance on IT, but I certainly did not expect to find 0%.
As teams become more agile, or add more agile-like practices to traditional
development processes, I’m seeing increasing frustration on the part of test
managers. Rapid development cycles and scrutinized bottom lines are putting
more pressure on them to deliver comprehensive testing in tighter time
frames. More and more testing is being taken on by development teams, and
that is a positive trend, to be sure. More stringent testing performed by
development is a good thing; as a longtime QA manager myself, I used to pray
for consistent unit and integration testing. Ultimately, though, developers are
not trained to think in the same way that QA does. Development testing is meant to
ensure that the code, service, or integration performs the way it was conceived;
it doesn’t always verify that the business process requirements are being met,
and it doesn’t replace the end-to-end perspective an organization needs to
ensure that the highest-quality software is being delivered. Development testing
is faster, to be sure. End-to-end testing takes more time, but it’s necessary.
Test managers must do something to prevent testing from being co-opted by development
at the expense of business value.
Clay Richardson interviews Tom Higgins, CIO of Territory Insurance Office, a commercial insurance and financial services firm based in Darwin, Australia. The discussion covers how TIO was able to deliver value to the business by delivering business process management in a cost-effective way — without the usual bloat and excessive overhead associated with enterprise BPM initiatives.
During the past two weeks, I received two client Inquiries about specialized Java hardware and Larry Ellison announced v2 of Oracle's "database machine."
These two seemingly disconnected events made me ask: Is specialized hardware for software inevitable? Last year, we saw TIBCO announce a messaging appliance too. And IBM has a robust business in XML and security appliances. Will growing volumes of data, messages, and logical operations force us to adopt specialized hardware, abandoning the unbundled software model IBM introduced back when real Hippies roamed the Earth?
The client Inquiries were from firms having to invest heavily in infrastructure and still struggling to keep up with their Java processing loads. Both had seen Azul Systems, found its touted performance numbers compelling, but wondered: "Where are the other competitors?" Answer: I don't see any others doing what Azul does.
Why? My answer: Too few customers buy this way -- particularly from a startup with a proprietary software machine. IBM, HP, Intel, and the other big vendors don't see enough of a market yet to take the plunge.
With its database machine, Oracle claims impressive advances in I/O and query speeds, as well as in disk compression. But the company says it has 20 customers for this product. For a new model and a new product, that’s not bad. But I think it helps answer my question: Only the tippy top of the enterprise food chain really needs software machines for general-purpose products like databases, Java application servers, and message processing. The majority of customers can use other software techniques to keep growing without being locked into proprietary software machines. Virtualization, distributed caching, in-memory databases, optimized garbage collection, and alternative database structures come to mind.
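To make one of those software techniques concrete, here is a minimal sketch of in-memory caching using only the Python standard library. The query function, its cost, and all names here are hypothetical stand-ins, not any vendor's API; the point is simply that repeated identical requests can be absorbed in software rather than by a specialized appliance.

```python
# Minimal sketch: in-memory caching of an expensive call, using only
# the Python standard library. run_query() is a hypothetical stand-in
# for a costly database or Java processing request.
from functools import lru_cache

CALL_COUNT = {"queries": 0}

@lru_cache(maxsize=1024)
def run_query(sql: str) -> str:
    """Pretend this hits a backend; count how often it actually runs."""
    CALL_COUNT["queries"] += 1
    return f"result-for:{sql}"

# A thousand identical requests arrive...
for _ in range(1000):
    run_query("SELECT * FROM orders WHERE id = 42")

# ...but the backend is hit only once; the cache serves the rest.
print(CALL_COUNT["queries"])
```

The same load-shedding idea scales out with distributed caches (memcached-style tiers) or in-memory databases, which is why many shops can defer proprietary hardware for quite a while.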
There is a lot of noise lately from two camps. One swears that the availability of people with mainframe skills is drying up rapidly; they either forecast dire shortages or note problems hiring for certain positions internally. Most of the trade press articles are firmly in this camp.
CA is a vendor that already enjoys a leading position in overall network management. Its 2005 acquisition of Concord, which brought along the assets of the previously acquired Aprisma, instantly moved CA from an also-ran to one of the clear leaders. Concord was good, and CA has an impressive track record of growing that business since the acquisition. Still, there were some weaknesses with regard to more advanced performance analysis.
On September 14, 2009, CA finally addressed these performance gaps by announcing its intent to acquire NetQoS for $200 million. Based in Austin, TX, NetQoS is one of those exciting small companies that proved there is a better approach to many of the challenges we face. It is one of the true innovators in performance management of both infrastructure and applications.
Our last post on Gen X using Web 2.0 at work generated a lot of buzz in other blog posts, particularly at ReadWriteWeb.
One of the biggest comments had to do with how generations are defined.
At Forrester, we spent about a month looking at this question back in 2006 when we did our first generational analysis of the use of technology. (We've since updated that work in "The State Of Consumers And Technology: Benchmark 2008" -- available to Forrester clients.)
The more we looked, the more we realized that nobody frickin' knows. So we did what comes naturally: we researched it. We canvassed the universe. We read all the books and talked to a bunch of experts, mostly from the agency world, where they know a thing or two about generations.
Since nobody had a definitive set, we created them based on a blended average of the best references out there. Then we added a Forrester twist: What technology era does each correspond to? (Meaning: when they were teenagers, what technology was new to them?)
Gen Y: 1980-1991. This group started spending money in the mobile era, and that is still their defining characteristic. Mobile texting, making party plans on the fly while out, carrying their identity around in their phones: that's Gen Y. They don't think twice, they just do it. This group would love to use social media at work, but mobile's good enough for now, thank you very much. They are 50% more likely to text while at work than Gen X (51% versus 36%). As for showing the rest of us the path forward, it's probably leading by example at this point in their careers.
Once upon a time ... the definition of an application was easy: firms built accounts payable, general ledger, purchasing, order entry, and other applications to meet the automation needs of the business. The applications were written as monolithic collections of functionality that were dedicated to accomplishing key business functions and had relatively clear boundaries.
However, over the years, technology shifts have resulted in "applications" that don't fit the earlier simplistic definitions. DLLs, Services / SOA, ASPs and Software-as-a-Service (SaaS) all bent the definition of an "application" from its previous form. Looking forward, Cloud computing promises to alter the definition once again.