
Operational analytics in electronic security environments

 

 

A.      INTRODUCTION

 

The science of predictive analytics is an exciting and enticing field, filled with vaguely understood concepts and urban legends about future conditions supposedly dictated by the esoteric interaction of thousands of present-day factors. It is the old Sherlock Holmes premise: notice the correct clues, apply the correct deductive process, and you will arrive at the absolute and final truth. This is, of course, rubbish, usually dished up in convoluted mathematics or analytics speak to hide the fact behind the science: what we really have is an intricate network of best guesses built on assumptions and tentative truths. The house of cards rests on the fervent hope that we know everything we should know at the moment we attempt the prediction, and that the things we do know are indeed accurate. If any of these base factors are off or wrong, we are in trouble, and the predictions we have formulated so carefully will be worth as little as those of the Mayan calendar.

 

The key, then, must lie in making sure that we have trusted information sources, and that we know everything we should know when we should know it. This is quite difficult to achieve, as it would probably require intrusive information gathering systems mandated to extract accurate information about every aspect of the subject being studied, with powerful data repositories acting as retention banks for the derived intelligence. In the real world, this kind of big brother system does not exist, despite the gospel preached by conspiracy theorists. Public domain data sources offer a viable alternative, but they are diffuse, require tremendous work to mine, and need specialist algorithms to transform them into predictive systems. This has, of course, been achieved, but the point still stands.


It is much easier to cheat: find an environment that is, by definition, a data source in its own right, a sensory mine rich in real-time streams and trusted information vaults. Several such environments do exist, and are easily identified by their high incidence of electronic or digital sensing and control systems, long-life databases, and intelligence-oriented data consumers; in short, a veritable analytics playground. One such environment is the electronic security industry, which deploys digital and electronic technology to reduce both physical and virtual risks in client environments. The resulting streams of information about shifting environmental conditions, personnel movements, dynamic threat conditions and more lend themselves to truly brilliant analytical exploitation, yielding real-time, real-world results that impact the environment itself: the client does not just receive an analytical product that is tangible and immediate, but one whose results can be experienced firsthand.

 

 

B.      SECURITY SYSTEMS AS CLOSED SYSTEMS

It is often difficult to illustrate analytical concepts through case studies, as a complex environment is usually required to accommodate whatever collection of algorithms is relevant to the argument at hand. In the case of security systems, this challenge is overcome by the very nature of the environment itself: security systems are designed to combat the laws of entropy, in that they impose strict and predictable dictates on chaotic systems with the intention of turning them into closed systems. A closed system, in the mathematical sense used here, is a system in which all variables are known and all outcomes are predictable.

As an example, consider a high rise building populated by the staff of multiple employers. The movement of these people needs to be controlled as a means of aligning it with business processes, policies and procedures (it does not make sense, for instance, to have staff wandering the offices at night while no one is present during business hours). Controlling the movement of people is a synthetic and unnatural act, and falls to, for instance, a security access control system. The expectation is that the access control system will impart a measure of predictability and measurability to the otherwise chaotic movement of staff in the building by granting access at the appropriate times, denying access when applicable, and providing a sense of measurement that supports the atmosphere of discipline which assists in the enforcement of employer policies.

 

It is of course ridiculous to reason that the access control system is solely responsible for staff adhering to business hours and attendance policies, but the illustration suffices for the purposes of this publication. The same premise can be posited for other security technologies, such as alarm systems, surveillance systems and intruder detection systems, as all of these are installed to reduce or control risk, a statement that, in mathematical terms, defies logic, since the devolution of a closed system back into chaos is inevitable and a foregone conclusion. At best, a security system can aid in detecting the instances where the closed system becomes chaotic, and assist in reverting the system to its base, or neutral and controlled, state. An example would be a fire system detecting a fire, setting off a siren and alerting system, and deploying fire suppression to douse the flames: detection and assistance in one simple system, with the end result an environment in its neutral state, with no fire.

 

The application of analytics in such an environment would, at first, seem exceedingly easy: if all the variables are easily measured and all the conditions are known, then recognising patterned behaviour, prescriptive conditions and predictable indicators must be simple. If the correct classification techniques are applied to the elements being studied, and the populations in question are correctly grouped, this is true, and the applied analytic solution will yield analytical products very quickly. As an example, if the high rise discussed earlier uses an access control system that incorporates a grouping model to manage the movement privileges of large collections of staff, then these access privilege groups may very well constitute the first scalar in a population definition. Add the department of the staff members in question, and a full population vector is created. If a movement algorithm is now applied to this population, isolating it from the main dataset and studying how its members enter and exit the building, the hours they are present, the areas of the building they frequent most and more, a definitive behaviour can be defined and associated with the population.

The concept of inheritance associates all attributes of the population with each member of the population in turn, while the rule of compliance makes it easy to identify typical behaviour traits: if most members of the population exhibit a specific behaviour, then that behaviour must be typical of the population. It now becomes exceedingly easy to identify atypical behaviour, namely any behaviour in which a member breaches the typical behaviour of its population. If a population consists of the typists of the law firm leasing space in the high rise, and typical behaviour for these typists is to adhere to office hours, entering the building at 07:00 and leaving at 17:00, then a lone typist entering the building at 06:00 on a specific day is a clear and evident breach of the behaviour of the population 'law firm typists'.
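
To make the idea concrete, here is a minimal sketch, in Python, of population-based anomaly detection over access-control entry events. The record layout, the population key of access group plus department, and the padded "typical window" are all illustrative assumptions, not the data model of any particular product.

from collections import defaultdict
from datetime import datetime

# Hypothetical historical entry events: (person, access group, department, entry time)
events = [
    ("T01", "typists", "law_firm", datetime(2021, 3, 1, 7, 2)),
    ("T02", "typists", "law_firm", datetime(2021, 3, 1, 7, 10)),
    ("T01", "typists", "law_firm", datetime(2021, 3, 2, 6, 58)),
    ("T02", "typists", "law_firm", datetime(2021, 3, 2, 7, 5)),
]

# 1. Group historical entry hours by the (access group, department) population vector.
population_entry_hours = defaultdict(list)
for person, group, dept, ts in events:
    population_entry_hours[(group, dept)].append(ts.hour + ts.minute / 60.0)

# 2. Typical behaviour: the observed range of entry times, padded slightly.
def typical_window(hours, pad=0.5):
    return min(hours) - pad, max(hours) + pad

# 3. By inheritance, every member is judged against its population's window.
def is_anomalous(group, dept, ts):
    earliest, latest = typical_window(population_entry_hours[(group, dept)])
    hour = ts.hour + ts.minute / 60.0
    return not (earliest <= hour <= latest)

# A lone typist entering at 06:00 breaches the behaviour of 'law firm typists'.
print(is_anomalous("typists", "law_firm", datetime(2021, 3, 3, 6, 0)))   # True
print(is_anomalous("typists", "law_firm", datetime(2021, 3, 3, 7, 5)))   # False

In a production system the crude min-max window would be replaced by a far more robust statistical or learned model; the sketch only shows how inheritance and the rule of compliance reduce to a simple membership test.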

 

It must be understood that population behaviour, in the detection of both compliant and anomalous behaviour, is not limited in application to the movement of people. Another example may be a collection of secured doors in a bank on the first floor of the high rise. The doors are all members of the population 'bank doors', and have a specific behaviour with regard to locking and unlocking times, established over a period of time through the application of neural networks and organic matrices. If a specific door unlocks at a time that does not adhere to the 'normal' behaviour of the population, this constitutes anomalous behaviour and warrants attention.
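
The same inheritance idea can be sketched for a population of devices rather than people. The unlock times below are invented, and the simple mean-and-deviation test merely stands in for whatever learned model the production system would apply.

from statistics import mean, stdev

# Hypothetical historical unlock times for the 'bank doors' population,
# expressed as hours since midnight.
historical_unlocks = [8.0, 8.1, 7.9, 8.05, 8.0, 7.95, 8.1]

mu, sigma = mean(historical_unlocks), stdev(historical_unlocks)

def door_unlock_is_anomalous(unlock_hour, tolerance=3.0):
    # Flag an unlock whose time lies more than `tolerance` standard
    # deviations from the learned population norm.
    return abs(unlock_hour - mu) > tolerance * max(sigma, 0.01)

print(door_unlock_is_anomalous(8.05))   # False: consistent with the population
print(door_unlock_is_anomalous(2.30))   # True: a 02:18 unlock warrants attention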

 

The use of populations is one of the simplest applications of operational analytics in an electronic security environment, and yields definite and powerful results. An even more powerful concept may be the application of convergence: if x and y always happen in the proximity of, or shortly after, z, then this must be a determined behaviour. Two types of convergence exist, namely proximate convergence (where the physical proximity of the events in question is the determining rule) and temporal convergence (where the difference in time of occurrence between the events is the determining rule), although it is possible to splice the two algorithms to derive exquisitely accurate and powerful rule sets. The premise of proximate convergence is that if element x is in location y, or in close proximity to it, then event z occurs with sufficient regularity to constitute an enforceable behaviour. Simply put, if the manager of the bank on the first floor of the high rise enters the foyer of the bank (that is, his presence is established through the bank's access control system), and the security alarm deactivates every time he does so in the morning, then this constitutes a valid behaviour that can be attributed to the bank manager. Temporal convergence is the rule where event y always follows event x within a defined temporal window z (a time difference). In the case of the bank manager opening the bank in the morning, this would be the lights in the bank switching on within 5 minutes of the manager switching off the security alarm: the lights may not all be in physical proximity to the alarm panel, but the 5 minute time window grants the temporal convergence rule its validity.
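
A rough sketch of the temporal convergence check follows; the event names, log layout and the 0.8 regularity threshold are assumptions chosen purely for illustration. Proximate convergence would follow the same shape, with the time window replaced, or supplemented, by a location test.

from datetime import datetime, timedelta

# Hypothetical event log: (timestamp, event name, location)
log = [
    (datetime(2021, 3, 1, 7, 30), "manager_entry", "bank_foyer"),
    (datetime(2021, 3, 1, 7, 31), "alarm_disarm",  "bank_alarm_panel"),
    (datetime(2021, 3, 2, 7, 45), "manager_entry", "bank_foyer"),
    (datetime(2021, 3, 2, 7, 46), "alarm_disarm",  "bank_alarm_panel"),
    (datetime(2021, 3, 3, 7, 20), "manager_entry", "bank_foyer"),
]

def temporal_convergence(log, cause, effect, window, threshold=0.8):
    # True if `effect` follows `cause` within `window` in at least
    # `threshold` of the observed cause instances.
    causes = [ts for ts, name, _ in log if name == cause]
    effects = [ts for ts, name, _ in log if name == effect]
    hits = sum(
        any(timedelta(0) <= e - c <= window for e in effects) for c in causes
    )
    return bool(causes) and hits / len(causes) >= threshold

# Only two of the three manager entries were followed by a disarm within
# 5 minutes (0.67 < 0.8), so the rule is not yet treated as a behaviour.
print(temporal_convergence(log, "manager_entry", "alarm_disarm", timedelta(minutes=5)))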

 

The true power of convergence algorithms becomes apparent when the factors related to the rule set of a particular convergence behaviour lend themselves to automation: the deployed solution can then not only learn convergent behaviours, but also assist the subjects of those behaviours in achieving their results. The presence of the bank manager in the foyer prompts the automation system to disarm the security alarm, which in turn, 5 minutes later, activates the second rule, switching on all of the bank's lights; the sole trigger was the bank manager's presence in the foyer. It should also be evident that the learned behaviour negates the need for reconfiguration and adaptation, as the nature of the impulse is, by definition, dynamic: if the bank manager is late on a given morning, the complete process adapts and executes in line with that deviation.
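
As a toy illustration of this chaining, the sketch below propagates a single trigger through a small table of learned rules; the rule table, event names and actions are hypothetical stand-ins for whatever the deployed automation layer actually exposes.

import heapq
from datetime import datetime, timedelta

# Learned rules: trigger event -> (resulting event, delay after the trigger)
rules = {
    "manager_in_foyer": ("disarm_alarm", timedelta(seconds=0)),
    "disarm_alarm":     ("lights_on",    timedelta(minutes=5)),
}

def run(initial_event, start):
    # Propagate an initial event through the rule chain, printing each
    # resulting action at its scheduled time.
    queue = [(start, initial_event)]
    while queue:
        when, event = heapq.heappop(queue)
        print(f"{when:%H:%M} {event}")
        if event in rules:
            nxt, delay = rules[event]
            heapq.heappush(queue, (when + delay, nxt))

# The sole trigger is the manager's presence; a late arrival simply shifts
# the whole chain, with no reconfiguration required.
run("manager_in_foyer", datetime(2021, 3, 1, 7, 42))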

 

Perhaps the most useful aspect of an electronic security system is the clear and precise manner in which events are detected, categorised and recorded. This supports the application of numerous event-based algorithms, of which the disciplines based on the identification of cause and effect are arguably the most powerful. The premise of cause and effect is that it is possible, in a closed system, to identify the series of causes that will always produce a specific effect within a specific following temporal window. More simply put, if causes, or events, x, y and z always precede effect, or event, a, then it follows that x, y and z must be causing a. Extrapolations of this premise yield other exciting algorithms, such as the possibility of predicting the occurrence of a through the early detection of any combination of x, y and z. Such a derived postulate may be reinforced through an evolving, self-affirming trust association that gains a specified trust increment for each successful prediction confirmed by an instance of a proving event.

As an example, if the law firm occupying floors in the high rise has a financial month closing on the 25th, with a billing target of a specific number of hours, the related behaviour may be that the firm holds an office party on the last Friday of the month to celebrate. Add to the argument a member of the firm who behaves badly under the influence of alcohol, and who has, in four separate incidents in the past, damaged building property while inebriated, and a very simple prediction matrix becomes evident: a met billing target, plus an office party, plus attendance by the deviant employee, will result in damage to building property. It should be possible to act on this prediction matrix before its conclusion and prevent the negative outcome, thus yielding a real-world result in line with a real-world prediction.
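
A small sketch of the self-affirming trust idea is given below. The causes, the effect and the trust increment are illustrative assumptions; the point is simply that each confirmed prediction raises the rule's trust, and each disproved one lowers it.

class CausalRule:
    def __init__(self, causes, effect, trust=0.5, increment=0.1):
        self.causes = set(causes)      # events that must all be observed
        self.effect = effect           # predicted following event
        self.trust = trust             # current confidence in the rule
        self.increment = increment

    def predict(self, observed_events):
        # Predict the effect once every cause has been observed.
        return self.effect if self.causes <= set(observed_events) else None

    def confirm(self, occurred):
        # Reinforce or weaken the rule once the proving event does (or does not) occur.
        if occurred:
            self.trust = min(1.0, self.trust + self.increment)
        else:
            self.trust = max(0.0, self.trust - self.increment)

rule = CausalRule(
    causes={"billing_target_met", "office_party", "deviant_employee_present"},
    effect="property_damage",
)

observed = {"billing_target_met", "office_party", "deviant_employee_present"}
if rule.predict(observed):
    print(f"Predicted: {rule.effect} (trust {rule.trust:.1f}) - time to intervene")
rule.confirm(occurred=True)   # the proving event nudges trust from 0.5 to 0.6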

 

 

C.      CONCLUSION

If electronic security systems are considered as closed systems, their depth as analytical sources becomes evident. The algorithms that may be applied are numerous, and the manner in which the results may be applied is limited only by the budget of the client. The merger of electronic security systems and advanced operational analytics is a logical and necessary evolution of the field, and will yield spectacular advances in the science for many years to come.

The concepts discussed here, as well as the examples used, are all based on PBS Technologies' Cengence range of products: www.cengence.com
