Take a look at the notes we took from our webinar, “SAM Tool Selection for the Modern Age”.
State of the SAM Tool Industry
When it comes to the state of the SAM tool industry, it’s clear that most tools are trending toward hosted platforms. Many of the legacy platform “dinosaurs” are becoming extinct – some vendors have completely pulled the plug on these old platforms; the rest just aren’t making significant investments into them.
The SAM tool industry is seeing a lot of growth, which is accompanied by some issues. Among vendors, there has been some acquisition binging, as we like to call it, in order to gain breadth and stay relevant. Many of the tools have poor integration of their acquired functionality, meaning that the data integration is not as tight as end users assume. The other major issue we're seeing is that some SAM tool vendors ignore known shortcomings to chase buzzwords. Certain feature enhancements aren't prioritized because the vendor wants to release products and features that generate buzz, e.g., SaaS integration, AI, and the like. These are not bad things in themselves, but they're being prioritized at the expense of fixing known issues.
Lastly, there has been a significant amount of consolidation in the SAM tool industry’s second tier while the vendors in the third tier are struggling to hold on to their niches.
A 2017 Gartner survey found that only 23% of respondents said that the capabilities of the SAM tools they were using aligned “extremely well” with their pre-purchase expectations.*
When it comes to SAM tools, buyer’s remorse is not uncommon. Sometimes, pre-purchase expectations are not met because a vendor misrepresented the tool’s capability. Often, however, the pre-purchase expectations were either not realistic or not representative of what the end user truly needed the tool to accomplish. That leads us to the Information Gap.
The Information Gap
What causes the Information Gap?
The end user organization doesn't truly understand its requirements. Requirements are often labeled "must have," but what the organization actually needs turns out to be something very different.
Tool vendor salespeople understand the tool well enough to sell it, but don’t have the technical knowledge of “how the tool does what it does”. That knowledge isn’t found in the marketing documents either.
So how do we bridge the Information Gap? The traditional RFP process doesn’t quite do it.
What’s Wrong with the Traditional RFP
The first few steps of the RFP are crucial, but this is where we see significant problems.
The notification package typically either contains too little detail, or way too many "must have" requirements that don't reflect the buyer's real needs (because the buyer doesn't know what they really need). Some notification packages even show clear evidence of multiple uncoordinated groups heaping on their "must have" requirements. What you then end up with is an impossible set of "must haves" that no tool can deliver on. And ultimately, very few of the "must haves" turn out to be actual must-haves.
The tool vendor is now in a precarious position. Do they accurately answer the question as written? Not usually. Instead, they do one of two things: provide a vague answer that obfuscates details or answers a slightly different question, or simply say "yes."
Now, to be fair, the vendor is simply trying to compete on a level playing field, because they know their competitors are doing the same thing just to get to the next round. Additionally, the person responding to the RFP often doesn't have the best product knowledge, so when in doubt, they just say yes.
The Q&A also has its issues. During the Q&A, tool vendors make an earnest attempt to better understand the context behind the "must have" requirements and the vital scoping information that the RFP package glossed over. The buyer then often gives short, curt answers, either because they're just trying to move things along or because they don't have the necessary information.
Another problem with the traditional RFP is that the oral presentations and demos use sanitized, hand-picked data that doesn’t exist in the real world. Unrealistic demos easily create unrealistic expectations.
If you know where and why the most common pitfalls happen, you can better protect yourself from making the same mistakes.
Scoring tools artificially low due to extraneous factors such as internal rules, likability, or responsiveness. - Scoring a tool is about aligning your requirements with the tool's capabilities, not how much you like the sales rep. In that case, don't take the tool out of the running; take the salesperson out of the running.
Not allocating adequate time for a full Proof of Concept (POC).
Perceiving a sanitized demo with curated data to be reflective of reality.
Not understanding the true realities of what actual implementation in your environment requires.
Believing marketing spin. - If you can’t verify it for yourself, don’t believe that it exists.
Not documenting verbal promises in the agreement.
Failing to get to a true apples-to-apples comparison.
Not being transparent. - Buyer transparency is just as important as vendor transparency.
Keys to More Effective Tool Selection
Understanding what your rationalized requirements truly are. - This is one of the hardest things to get right from a buyer standpoint. Make sure you put in all the work and effort necessary to figure this out; it will make a huge difference.
Understanding differentiators among the tools. - Many tools have similar capabilities; make sure you know what makes each tool unique.
Understanding what the tools do NOT do.
Understanding how the tools actually do what they do.
Use your time wisely. - Proper time allocation is important if you want to select the best tool for your organization. Don't waste time on misaligned tools; it's not worth it. Conversely, don't spend too little time with the tools that do align with your rationalized requirements.
Eviscerate your down-selected tools (Capability Verification Exercise – read more below).
Narrowing the Field
It's important to develop a method by which you can efficiently narrow the field, so that you don't waste time on tools that align poorly with your must-have requirements. This requires a clear understanding of your needs. Then evaluate each of the tools individually. Below is an example of how this can be done:
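One way to make the narrowing step concrete is a simple weighted scoring of each tool against your rationalized requirements, keeping only the tools that clear a minimum threshold. The sketch below is purely illustrative: the tool names, requirements, weights, and threshold are hypothetical placeholders, not part of any specific methodology.

```python
# Hypothetical sketch: weight each rationalized requirement, rate each
# tool's capability 0-3 (from demos or CVE sessions), then keep only
# the tools whose normalized score clears a cut-off.

REQUIREMENTS = {                 # requirement -> weight (higher = more critical)
    "accurate_discovery": 5,
    "license_reconciliation": 5,
    "saas_visibility": 3,
    "reporting_flexibility": 2,
}

# Illustrative 0-3 capability ratings for two made-up tools.
TOOL_RATINGS = {
    "Tool A": {"accurate_discovery": 3, "license_reconciliation": 2,
               "saas_visibility": 1, "reporting_flexibility": 3},
    "Tool B": {"accurate_discovery": 1, "license_reconciliation": 1,
               "saas_visibility": 3, "reporting_flexibility": 2},
}

def weighted_score(ratings):
    """Weighted sum of a tool's ratings, normalized to a 0-100 scale."""
    max_total = sum(3 * w for w in REQUIREMENTS.values())
    total = sum(ratings.get(req, 0) * w for req, w in REQUIREMENTS.items())
    return round(100 * total / max_total)

def narrow_field(tool_ratings, threshold=60):
    """Return (score, tool) pairs meeting the threshold, best score first."""
    scored = {tool: weighted_score(r) for tool, r in tool_ratings.items()}
    return sorted(((s, t) for t, s in scored.items() if s >= threshold),
                  reverse=True)

print(narrow_field(TOOL_RATINGS))
```

The point of a structure like this isn't precision; it's forcing the buying team to agree on weights up front, so that tools are dropped for documented reasons rather than gut feel.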
Capability Verification Exercise
What’s missing in the traditional RFP is what we call a Capability Verification Exercise (CVE). This is done face-to-face and is intended to cut through any marketing/sales misdirection by getting to the heart of actual technical capabilities. It’s during the CVE that the tools get eviscerated to lay bare shortcomings, limitations, implementation complications, and the like. To do this requires a considerable amount of licensing expertise, tool implementation experience, forensic interviewing skills, and healthy doses of professional skepticism, gumption, and persistence. We have a tried and true methodology that has proven to efficiently get to the heart of what differentiates tools, which we use on a regular basis with our clients. But here are a few things we think should always happen during a CVE:
Use precise terms and enforce the use of definitions. - Each vendor has its own language: what one vendor calls inventory, another calls discovery, and definitions of terms like "asset" vary from tool to tool. Buyers should define their own terms and make vendors adhere to those definitions.
Only allow technical product specialists to talk/present. - At this point, salespeople should take a back seat to the actual technical product experts.
Solicit differentiating capabilities and associated questions. - Find out what really makes each tool different. You could ask each vendor, "What questions should we be asking of your competition?" or similar questions.
Focus on differentiating capabilities aligned to your true & actual requirements and key use cases.
Provide a writeup of capabilities back to the tool vendor for verification. - This provides ultimate transparency. Vendors can correct buyers who misunderstood, but once the writeup is verified, the vendor has to stake its name to it and deliver.
Dive into detail on every must-have capability, how it works, what it takes to get it to work, limitations, etc. - Focus on use cases that you care about the most and will show differentiation.
Make the POC the actual test environment. - This is a true try-before-you-buy and it’s the only way to truly understand what it’s going to take to implement the tool in your environment.
Demand utter and complete transparency.
To conclude, when selecting a SAM tool there are a few essentials that must happen:
- Understand what your differentiating requirements truly are and make sure that they are realistic.
- Be thorough in your evisceration of the tools to understand how they meet or do not meet your must-have requirements, how they do it, and what the tool does not do.
- Be transparent throughout the process; this fosters trust and results in better information exchange.
- Ensure that vendors verify your understanding of the tool, and then hold them accountable to that mutual understanding.