This is part four of a four-part series!
I can’t overstate the need to do the necessary groundwork before proceeding! Please make sure you have read the previous posts in this series.
Every ounce of due diligence you invest in this program will pay dividends. Trying to cut corners could put your whole program at risk.
Do You Need Both?
Now that we’ve clarified the differences between these two solicitation processes, you may ask whether both are necessary for predictive analytics success.
The short answer is “no.”
However, I’d recommend that you use the RFI process to give your shortlisted providers the opportunity to share with you their full range of offerings. This may be your last chance to consider alternative approaches, techniques, or technologies, so it works to both your advantage and that of your vendors.
Of course, your RFI process may uncover a deal-breaker issue that takes a solution provider out of further consideration; again, this serves both you and the provider.
Once you’ve done this, I recommend that you advance your most promising vendors to the RFP stage. For predictive analytics solutions, the RFP will let you and your prospective providers get into the details. How will their systems talk to your systems? How many user seats are included and/or required? What kind of tech support comes standard, and what premium support is available?
Now to the Finalists
Quick recap: working with a consensus view of your solution needs, you and your team have systematically narrowed a universe of candidates down to a small set of finalists. How small a set? Ideally, two. Two finalists ensure that you still have a choice between equally viable candidates. If you and your team have the bandwidth, time, and resources, you might wish to extend your finalist list to three.
But since the next step is to conduct a small-scale pilot of each finalist’s solution, the downside of a longer finalist list is the additional time and expense involved. And time and expense aren’t trivial at this stage. You and other stakeholders will want to sit down with your finalists to define the scope, goals, parameters, and timeframes for your pilots.
While it’s impossible to formulate identical pilot implementations for each finalist, the more closely they resemble each other, the easier it is to compare experiences and results, and therefore the easier it is to make your ultimate selection.
As with previous steps, your team can simplify the pilot evaluation process through the use of scorecards. Define one scorecard for use with all pilots and share scoring criteria with your finalists in advance.
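To make the scorecard idea concrete, here is a minimal sketch of how a shared, weighted scorecard might be tallied across pilots. The criteria, weights, finalist names, and scores below are purely hypothetical examples, not a prescribed rubric:

```python
# Hypothetical evaluation criteria and weights -- replace with the
# consensus criteria your team shares with finalists in advance.
CRITERIA_WEIGHTS = {
    "integration_ease": 0.30,
    "prediction_accuracy": 0.40,
    "support_quality": 0.15,
    "total_cost": 0.15,
}

def weighted_score(scores):
    """Combine per-criterion scores (e.g., 1-5) into one weighted total."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

# One scorecard per finalist pilot, using the same criteria so the
# results are directly comparable (example scores only).
pilots = {
    "Finalist A": {"integration_ease": 4, "prediction_accuracy": 5,
                   "support_quality": 3, "total_cost": 4},
    "Finalist B": {"integration_ease": 5, "prediction_accuracy": 4,
                   "support_quality": 4, "total_cost": 3},
}

for name, scores in pilots.items():
    print(f"{name}: {weighted_score(scores):.2f}")
```

Using one scoring function and one set of weights for every pilot is what keeps the comparison fair; the judgment calls all live in the agreed-upon criteria and weights, not in the arithmetic.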
We at Atonix Digital hope you’ll share your insights and experiences with us in the near future. In the meantime, feel free to get started by reviewing our Monitoring and Diagnostics SaaS offering.