Abstract
To use Machine Learning (ML) effectively in cybersecurity, automation is required. However, there cannot be true Intelligent Automation in cybersecurity while anomalies are managed manually, as is currently the case. As exponential technologies advance towards the widespread adoption of the Internet of Things, Cyber-Physical Systems, Internet+, Industry 4.0, and Society 5.0, there will be an influx of new cyber anomalies triggered by cyber attacks.
Detecting and managing these anomalies in complex Information and Operational Technology environments will require advanced autonomous anomaly management. This research defines cybersecurity preliminaries, stimulates questions to expose the complexities and identifies twelve cybersecurity automation challenges. One of these is that cybersecurity controls rely heavily on databases that require manual management.
Any anomaly detection research must first define a normal system state; anything outside this state is considered an anomaly. Unfortunately, the normal system state, or baseline, is defined during the creation of these databases and datasets. Subject matter experts attempt to keep these baselines operationally relevant with ad-hoc updates as part of their maintenance.
Anomaly detection is traditionally performed by querying, comparing, and identifying an anomalous string in a database of human-readable entries, such as Indicators of Compromise, firewall rules, Intrusion Detection System signatures, application whitelists, vulnerability management data, and Anti-Virus rules.
Cybersecurity analysts manage anomalies using traditional as well as ML-infused security solutions, through database updates and manual data entry.
In this research, an anomaly is identified not as an anomalous entry in a database, but as an “unknown change” in the concept of a host or system, combined with the unique and incoherent life-cycles of its applications. The application's integrity is monitored with a graph-based, real-time measurable solution. A flexible and dynamic application behavioural profile is developed as the basis for system normality. This is done by measuring change using entropy-based anomaly detection.
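As a minimal illustration of entropy-based change measurement (the feature set, window contents, and threshold below are assumptions for exposition, not the thesis's implementation), the Shannon entropy of a system-call frequency distribution can be compared across consecutive sampling windows:

```python
import math
from collections import Counter

def shannon_entropy(events):
    """Shannon entropy (in bits) of the frequency distribution of observed events."""
    counts = Counter(events)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical sliding-window comparison: a large jump in entropy between
# consecutive windows of system calls is flagged as an "unknown change".
baseline_window = ["read", "write", "read", "open", "close", "read"]
current_window = ["read", "mmap", "execve", "socket", "connect", "write"]

delta = abs(shannon_entropy(current_window) - shannon_entropy(baseline_window))
THRESHOLD = 0.5  # assumed tolerance; in practice tuned per application profile
if delta > THRESHOLD:
    print(f"entropy shift of {delta:.2f} bits exceeds threshold - possible anomaly")
```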
The Biodiversity Index, which compares diverse relationships between species and communities in ecology, is used as the research contribution to compare diverse, dynamic relationships between the business applications that drive emerging technologies.
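For orientation only, two common forms of the biodiversity index in ecology are the Shannon diversity index and Simpson's index; which specific index the research adapts is not stated in this abstract:

```latex
% p_i is the proportion of observations belonging to type i
% (in ecology a species; here, by analogy, an application or system-call type)
% among S distinct types.
H' = -\sum_{i=1}^{S} p_i \ln p_i
\qquad
D = 1 - \sum_{i=1}^{S} p_i^{2}
```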
Continuous sampling is used to track changes in application diversity, which contributes both to the applications' unique behavioural profiles for the novel Cyber Diversity Index (CDI) and to dynamic normal baselines for the proposed theoretical autonomous anomaly management.
The CDI is used to measure application system calls in real time and to monitor and detect unknown diversity changes, both in a normal operational environment and when placed under zero-day cyber attacks. The purpose of the CDI is to improve automation in cybersecurity by providing a signature-less, self-managed, real-time solution. In addition, the CDI could provide a measurable indication of the level of cyber integration in a synthesised cyber ecological environment.
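A hypothetical sketch of how per-application diversity could be sampled continuously and compared against a dynamic baseline is given below; the class, parameter names, and thresholds are illustrative assumptions, not the CDI implementation itself.

```python
from collections import Counter, deque

def simpson_diversity(syscalls):
    """Simpson-style diversity (1 - sum of p_i^2) of one window of system calls."""
    counts = Counter(syscalls)
    total = sum(counts.values())
    return 1.0 - sum((c / total) ** 2 for c in counts.values())

class DiversityMonitor:
    """Keeps a rolling baseline of diversity scores for one application and
    flags windows whose diversity deviates beyond an assumed tolerance."""

    def __init__(self, history=50, tolerance=0.15):
        self.history = deque(maxlen=history)  # dynamic normal baseline
        self.tolerance = tolerance            # assumed deviation threshold

    def observe(self, syscall_window):
        score = simpson_diversity(syscall_window)
        if self.history:
            baseline = sum(self.history) / len(self.history)
            if abs(score - baseline) > self.tolerance:
                return "anomaly", score  # do not fold anomalous windows into the baseline
        self.history.append(score)
        return "normal", score

# Usage: feed each continuously sampled window of an application's system calls.
monitor = DiversityMonitor()
status, score = monitor.observe(["read", "write", "open", "read", "close"])
```

Keeping anomalous windows out of the rolling history is one simple way to let the baseline drift with legitimate behaviour while resisting contamination by suspected attacks.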
The research reports modest results from the experiments; however, additional data points, different diversity indices, and different types of entropy could contribute towards future research.
| Original language | English |
| --- | --- |
| Qualification | Doctor of Information Technology |
| Awarding Institution | |
| Supervisors/Advisors | |
| Place of Publication | Australia |
| Publisher | |
| Publication status | Published - 2020 |