By Terri Mock - March 22, 2021
Due to the many different types of campus security software, there is no one-size-fits-all procedure for evaluating each option. However, regardless of whether you are evaluating campus security software to secure real estate, keep populations safe, or protect data from loss or theft, there are three key considerations to take into account.
By the time you get to the evaluation stage of the procurement process, it is likely you have already identified what vulnerability the software is supposed to address and how the software is supposed to address it. It is also likely you will know who will be responsible for implementing the software, monitoring its effectiveness, and developing/executing back-up plans if anything goes wrong.
Therefore, in theory, an evaluation of campus security software should mostly consist of confirming the software does what it claims to do and proving the software's value proposition. It should also give end users the opportunity to become familiar with the software to compare the capabilities of one potential solution against others being considered in the procurement process.
The involvement of end users is critical at the evaluation stage because it can generate “what if” scenarios that may not previously have been considered. End users may also uncover logistical or practical issues that could affect the operation of the campus security software or its effectiveness. In addition, end users can help assess three key considerations when evaluating campus security software - complexity, compatibility, and continuity.
The most common types of campus security software fall into three categories - deterrents (access controls, metal detectors, etc.), monitoring solutions (CCTV, gunshot detection systems, etc.), and threat alert communications systems (mass notifications, panic buttons, etc.). In some cases, campuses might also install physical security information management software or video analytics software such as AI-driven surveillance software with facial recognition capabilities.
Whatever type of software is being evaluated, it has to be easy to use. Anything that requires complex configuration, advanced operational skills, or a deep understanding of how the solution works is unlikely to fulfill its full potential and may fail at the very task for which it is being evaluated. The complexity of the software needs to be determined at the evaluation stage because, once the software has been approved and implemented, it might be too late to change course.
In recent years, there has been significant growth in smart technologies that unite disparate campus security software into one unified system. While these technologies can be valuable assets in the right circumstances, incompatibility issues can create gaps in coverage - resulting in delayed or inappropriate emergency responses - and may force campuses to replace existing solutions to ensure the smart technology achieves total coverage.
A further issue with off-the-shelf unified systems is that they can duplicate the capabilities of existing campus security software or include capabilities that will never be used. To avoid paying twice for the same capability, or paying for capabilities that are not required, campuses should evaluate campus security software alongside existing systems to confirm compatibility and, if necessary, build a unified security system from individual software components.
While it is important not to pay for duplicated or unnecessary capabilities, those responsible for security should also be aware that campus security software is constantly evolving. Software being evaluated during the current procurement process may need to be upgraded at a later date to address new threats. This is particularly relevant to data protection software, but it applies to other categories of security software as well.
To determine whether the solution(s) being evaluated today will be equally effective in the future, campuses should review vendors' track records. The reviews should identify how frequently new products or upgrades are released, whether new releases make existing software more complicated to use, and whether they create more work for the end user - an indirect cost not often considered when evaluating campus security software.
If you would like further advice about evaluating software for campus security, or would like to see a demo of an easy-to-use, fully compatible, future-proof security system, do not hesitate to get in touch. Our team of safety experts will be happy to organize a free demo of Rave Alert in action and answer any questions you have about ensuring you select the right campus security software for your needs.
Terri Mock is Rave's Chief Strategy & Marketing Officer, overseeing strategy, product, and marketing. She is an executive leader with achievements in delivering revenue growth, driving go-to-market, innovating products, and scaling operations from high-tech startups to global companies.