According to Gartner’s latest blog post, the super secret formula to gaining information about a vendor’s product is the “scripted demo”.
Here’s the Big Idea: you ask the vendor how to accomplish the “applicable tasks you do most often”, and then you ask the vendor to demonstrate, step by step, how to do them.
Well, that’s a terrific idea if the vendor has nothing new or innovative to show you, and if both of you are as dull as dishwater. If all you want the vendor to do is show you how to accomplish some mundane business task that you already do every day, then you’ll never see anything the least bit interesting or innovative. Plus, given time, the vendor will “polish” that part of the demo so it looks much more impressive than it really is.
Here’s some free advice to business and technology analysts who have to evaluate products:
- DON’T ask the vendor to do ordinary things.
  Ask the vendor to show you extraordinary things. Sometimes those extraordinary things can be quite ordinary on the surface, like making it possible for ordinary users to employ highly complex technology successfully and usefully. Problem is, you have to understand what’s going on under the covers to figure out whether it’s extraordinary, or just something that’s actually quite simple, or even something that’s just plain technically impossible, and therefore a load of bull (see the third point, below). If all you have is an MBA, chances are that you can’t make that evaluation.
- DON’T count mouse clicks.
  Count innovations that could really make a difference for your clients and their businesses.
- DON’T assume that you know anything about technology.
  Instead, hire (or rent) someone who does, to save you from drawing uninformed conclusions about stuff you don’t (and might never) understand. If you bring someone to the party who can’t be buffaloed by technobabble and a pretty UI, then you don’t have to worry about “scripted” or “unscripted” demos.
- DON’T test for “100 features and often many more — up to 500 features” to finalize a rating.
  That’s just “checklist testing”, and it’s not very useful. Fact is, only a few features matter; the rest are bells and whistles that nobody cares about. You need to do a deep dive on what’s important, not worry about who has the longest checklist. It’s exactly this “please the reviewer” attitude that contributed to bloatware like Microsoft Word — a product that, after years of introducing new whiz-bang features that only ever half-worked, still couldn’t get basic bulleted lists to function properly as late as Word 2003.
The other Big Idea in the Gartner article is References. Well, references are a slippery slope. Does the reference customer understand the technology? Probably not. Does he understand what else is out there? Almost certainly not. Does he think his (pick one: RFx engine, spend analysis tool, contracts management system, etc.) is the greatest thing since sliced bread? Maybe, but who cares? He could be using a crappy tool of marginal value compared to the alternatives, and have no clue. You could interview him until the cows come home, and not learn a thing, except that he’s a happy lemming with no idea he’s about to run off a cliff.
Fact is, the old model of analysts running around interviewing and briefing the vendors that pay them, and then running off to interview the customers that the paying vendor has teed up, and therefore thinking that they’re somehow “getting a feel for what’s out there and what’s good”, is so … over. All you end up creating for your final report is an unappetizing mashup of the marketing nonsense of all the big vendors.
No wonder everyone is running to the web for guidance. Or running for the hills.