Analyst Decision Tools: Accuracy Requires Due Diligence and Real-time Updates
IT purchasers rely heavily on analyst decision tools to make choices. To keep these tools credible and reliable, their creators must maintain real-time accuracy so buyers have the most trustworthy information available.
Many IT organizations rely on analyst decision tools to evaluate products and streamline decision-making. Given the significant impact of these decisions, the creators of these tools must ensure the accuracy and timeliness of the data and analysis they provide. That requires thorough due diligence and real-time updates so a company's market position is reflected accurately. Such precision is vital for defining competitive differentiation and properly informing potential IT buyers.
Recently, I wrote a post critiquing Cisco's placement in the Forrester XDR Wave. Based on my knowledge of the Cisco product, I felt the company was grossly misrepresented. That made me question whether it was an isolated case or indicative of broader issues in Forrester's research process and methodology. To explore this further, I closely examined the recently released Forrester Wave for Mobile Threat Defense Solutions. In that report, I found yet another instance where I believe the analyst's due diligence was noticeably insufficient, raising further concerns about the reliability of these vital decision tools.
What immediately stood out to me was Lookout's "Market Presence." As I reviewed the report, most vendors' positions seemed reasonably accurate. While one could argue that Check Point and Lookout might be better off switching places, the leaders were generally where they should be. However, what truly caught my attention was the unexpectedly small market presence “dot” assigned to Lookout.
My initial concern was with the size of the dot representing Lookout. The dot's size reflects the vendor's revenue and its number of enterprise customers (those with 1,000+ employees). However, the dot seemed surprisingly small, given my familiarity with Lookout's business, products, and market presence. To clarify, I contacted Lookout to get their perspective and see if they believed the dot size accurately represented their market position. During our conversation, they revealed they had misunderstood the initial question and submitted an incorrect enterprise customer count. When they brought the mistake to Forrester's lead analyst, they were told it was too late to correct the figure in the published report.
This unfortunate situation highlighted a more significant concern for me: How does Forrester validate the vendor-submitted data used to build the report? Based on Lookout's feedback, there appears to be no clear process to ensure that the data vendors submit is accurate. When Lookout spoke to the lead analyst, they were surprised to learn that validation relied more on a "gut check" than on a rigorous verification procedure. Three of the four vendors in the leader position are private companies not required to disclose revenue and customer numbers publicly, yet each vendor should still be held accountable in some way for the data it submits. It's troubling that the research firm might take these figures at face value without proper verification.
If a research firm like Forrester cannot effectively validate the data submitted by each vendor, it naturally raises questions about the overall validity and credibility of the entire report. When new information comes to light, the research firm must update its data to ensure the tool remains accurate and reliable for buyers' decisions. Lookout told me that since the report's release, it has been inundated with calls from key partners, customers, and investors, all expressing confusion over the market presence of vendors in the leader position. These discussions have cast doubt on the integrity and validity of the report itself. Several customers with deep market knowledge have even reached out directly to Forrester's lead analyst to provide clarification, yet the report remains unchanged.
Real-time info is critical
This raises questions about the value of a decision tool that is published once and then left unchanged until the following year. I believe that to maximize effectiveness, a research firm should update the document when new information comes to light, ensuring that a tool buyers rely on for decisions stays accurate.
CrowdStrike's placement in the XDR Wave I referred to earlier is an excellent example. While I agree the vendor was a leader at the time of publishing, a software error has since taken down several major companies and crippled the airline industry for the better part of a week. I reached out to a couple of Forrester analysts I know and asked whether they would move CrowdStrike down; they told me the document is updated annually and there was nothing that could be done.
IT purchasers rely heavily on analyst decision tools like Forrester Waves and Gartner Magic Quadrants to minimize uncertainty and make well-informed choices. Maintaining real-time accuracy is vital to keeping these tools credible, giving buyers the most trustworthy information available and upholding the integrity of the source.
Zeus Kerravala is the founder and principal analyst with ZK Research.