Survey Shows Most Schools Regularly Test Emergency Notification System

Published: April 13, 2010

Omnilert LLC, maker of e2Campus, the leading unified emergency notification system for education, announced the results of a customer survey that measured clients’ practices for testing their emergency notification systems (ENS).

A brief survey was sent to a random sample of e2Campus clients asking about their basic ENS testing practices. The first 100 schools to respond provided the results, and several trends were easy to spot.

The vast majority (82 percent) of schools polled do test their ENS regularly. Of the 18 percent of schools that do not test, half of them expressed an interest in a testing procedure, but simply have not had the time or resources to put a plan together.

Forty-four percent of schools polled test once per semester, with a smaller share (14 percent) testing once per month. Some schools (16 percent) have a testing plan in place but test “as needed” rather than on a schedule.

Sixty-two percent of participating schools perform a “full test” during the year, sending SMS messages and emails to all users rather than to a limited “test list.” Of those schools, 8 percent perform additional testing at other times during the year, sending messages to a limited testing group instead.

The criteria for designating a test a success differed from school to school, as did how schools respond to their testing results. Many administrators stated that they do not review the results in any way and simply leave it to staff and students to verify that they received the message. Several of those schools also stated that the email and SMS tools “simply work,” so there is no need to review the results.

Other universities simply look for a successful send as their testing goal. If the message reaches all of their endpoints, posting to Twitter or Facebook, appearing on the website or digital signage, or sounding the loudspeakers in the student union, they consider the test a success.

At schools that do use delivery statistics to analyze the results, testing is usually followed by a questionnaire or a mass email requesting that “failed” users modify their account settings or contact a support department for assistance.

“Too many ‘full tests’ could lead people to ignore real threats or view your test alerts as spam and unsubscribe altogether,” explained Ara Bagdasarian, CEO of Omnilert. “It’s important to balance the benefit of your system tests against the possible backlash. But keep in mind: it’s not just about testing the ENS service. It’s also about testing your own infrastructure, your emergency response plan and policy, as well as managing your staff’s and students’ expectations of your public safety efforts.”

Mr. Bagdasarian concluded, “Testing an ENS regularly helps keep skills sharp and allows schools to be prepared when an actual emergency occurs.”

The key to interpreting test results is to look for trends. If the same mobile carrier always seems to reject texts, it may indicate a problem in the school’s area that can be addressed. If a large number of emails on the school’s domain show up as rejected, the school may need to double-check its local firewall or spam filter. If a large number of voice calls show up as “busy,” the school may be overwhelming its local phone lines. Combining user feedback with e2Campus delivery confirmation reports, which provide specific delivery data, can help schools troubleshoot all of these situations.
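For schools that do review delivery statistics, this kind of trend-spotting is straightforward to automate. The sketch below is illustrative only: it assumes a hypothetical per-recipient CSV export with channel, endpoint (mobile carrier, email domain, or phone exchange), and status columns. The column names, status values, and thresholds are invented for the example and do not reflect e2Campus’s actual report format.

```python
import csv
from collections import Counter

def failure_trends(report_path, threshold=0.10, min_sent=20):
    """Flag endpoints (carriers, domains, exchanges) whose failure
    rate exceeds `threshold`, ignoring low-volume endpoints."""
    sent = Counter()
    failed = Counter()
    with open(report_path, newline="") as f:
        for row in csv.DictReader(f):
            # Hypothetical columns: channel, endpoint, status.
            key = (row["channel"], row["endpoint"])
            sent[key] += 1
            # Treat rejected texts/emails and busy voice calls as failures.
            if row["status"] in {"rejected", "busy", "undeliverable"}:
                failed[key] += 1
    return {
        key: failed[key] / sent[key]
        for key in sent
        if sent[key] >= min_sent and failed[key] / sent[key] > threshold
    }

# Example: surface a carrier that keeps rejecting texts or a mail
# domain that bounces campus email after a full test.
for (channel, endpoint), rate in sorted(failure_trends("delivery_report.csv").items()):
    print(f"{channel}/{endpoint}: {rate:.0%} of messages failed -- investigate")
```

The 10 percent failure threshold and 20-message minimum are arbitrary starting points; the goal is simply to surface the same-carrier or same-domain patterns described above rather than to chase individual delivery failures.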


Omnilert April 14, 2010 press release
