Overhauling Battle Cards (And Transforming Other Sales Tools)

Dean Davison

As part of Forrester’s research into sales enablement, I recently took a journey to “plumb the depths” of sales battle cards. Why?

Sales reps at technology companies tell Forrester that they must understand their competitors so that they can outmaneuver them during the sales cycle. Yet these same sales professionals also tell Forrester that, despite the best efforts of product managers, competitive teams, and sales operations, current battle cards are not consistent, instrumental tools that help win more deals.

And thus, my journey into battle cards begins.

During my career, I’ve worked in competitive intelligence at two technology companies, so I already had some strong opinions about battle cards. I tried to set my own views aside, though, and adopted Forrester’s methods of developing a hypothesis and interviewing professionals in the industry.

My initial research looked at the “thing” called a battle card: the layout, structure, and content. The goal was to build battle cards that help sales reps address competitive issues during customer conversations. While testing some really good ideas that came out of the interviews, I could see that the improved battle cards still weren’t enough to meet our objective – routinely helping reps win more deals.

I turned my attention to the “process” of building battle cards – specifically, how sales enablement professionals identify the competitive issues that merit battle cards, how they work with product managers and marketing teams to create the content for battle cards, and how they deliver battle cards to sales reps. While testing some really good process ideas that came out of the interviews, I could see that even when the groups creating battle cards actively work with sales, their points of view and professional skills are so different that they miss important details.

Read more

One Code To Rule Them All: Reflections On Oracle Fusion Applications From Oracle OpenWorld 2010

Holger Kisker

With about 41,000 attendees, 1,800 sessions, and a whopping 63,000-plus slides, Oracle OpenWorld 2010 (September 19-23) in San Francisco was certainly a mega event with more information than one could possibly digest or even collect in a week. While the main takeaway for every attendee depends, of course, on the individual’s area of interest, there was a strong focus this year on hardware due to the Sun Microsystems acquisition. I’m a strong believer in the integration story of “Hardware and Software. Engineered to Work Together.” and really liked the Iron Man 2 show-off all around the event. But because I’m an application guy, the biggest part of the story, including the launch of Oracle Exalogic Elastic Cloud, was a bit lost on me. And the fact that Larry Ellison basically repeated the same story in his two keynotes didn’t really resonate with me — until he came to what I was most interested in: Oracle Fusion Applications!

Read more

Part 2: Three FUD Statements I&O Managers Use Not To Implement Standards-Based Networking

Andre Kindness

Carrying on from my thoughts in Part 1: It’s time to start deploying purely standards-based infrastructure outside the data center; data center protocols are just starting to be created for a converged and virtualized world. With the number of tested and deployed standards-based protocols available, there’s no excuse for networks to be locked into a particular vendor’s proprietary protocols when standards-based network solutions provide access to compelling volume economics, the flexibility to adopt a much wider array of solutions, and relief from hiring specialized talent to run a science project. Although many organizations understand that standards-based networking gives them the flexibility to choose from the best available solutions at a lower cost of ownership, they still feel trapped. Listed below are the three top shackles and the keys to open them:

Read more

Part 1: Standards And Proprietary Technology: A Time And Place For Both

Andre Kindness

I was listening to a briefing the other day and got swept up in a Western melodrama, set against the backdrop of Calamity Jane’s saloon in Deadwood Gulch, South Dakota, revolving around three major characters: the helpless heroine (the customer); the valiant hero (vendor A, riding a standards-based white horse); and the scoundrel villain (a competitor, riding the proprietary black stallion) (insert boo and hiss). Vendor A tries to evoke sympathy for his plight of not offering the latest features because he doesn’t have the same powers as the villain and has chosen to follow the morally correct path, one filled with prolonged and undeserved suffering in support of standards-based functions. What poppycock! There is no such thing as good and evil in networking. If the vendors’ positions were reversed, vendor A would be doing the same thing as its competitors. Every vendor has some type of special sauce to differentiate itself. Anyway, it’s business, plain and simple; networking fundamentally needs both proprietary and standards-based features. However, there’s a time and place for each.

With that in mind, I want to let you know that I’m a big proponent of standards-based networking. The use of open standards expands your choices, helping you reduce risk, implement durable solutions, gain flexibility, and benefit from higher quality. Ninety-plus percent of networking should leverage standard protocols, but to get to that point, features need to go through three stages:

Read more