anomaly detection

[ə′näm·ə·lē di‚tek·shən] (computer science) A technology that seeks to identify an attack on a computer system by looking for behavior that is outside the norm.

anomaly detection

(1) An approach to intrusion detection that establishes a baseline model of behavior for users and components in a computer system or network. Deviations from the baseline cause alerts that direct the attention of human operators to the anomalies. See IDS and anomaly.
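The baseline-and-deviation approach in sense (1) can be sketched in a few lines. This is an illustrative example, not a real IDS: the login counts, the mean/standard-deviation baseline, and the 3-sigma alert threshold are all assumptions chosen for clarity.

```python
# Sketch of baseline-based anomaly detection: learn a statistical baseline
# from historical observations, then alert on large deviations.
# Data and threshold are hypothetical.
from statistics import mean, stdev

def build_baseline(history):
    """Learn a (mean, stdev) baseline from past observations."""
    return mean(history), stdev(history)

def is_anomalous(value, baseline, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the baseline."""
    mu, sigma = baseline
    if sigma == 0:
        return value != mu
    return abs(value - mu) / sigma > threshold

# Hypothetical per-day login counts for one user.
logins = [12, 10, 11, 13, 12, 9, 11, 10, 12, 11]
baseline = build_baseline(logins)
print(is_anomalous(11, baseline))   # a typical day: no alert
print(is_anomalous(95, baseline))   # sudden burst of logins: alert
```

In a real system the baseline would cover many features (processes run, bytes transferred, hours of activity), and alerts would be routed to a human operator rather than printed.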

(2) Detecting data that lie outside the normal range. Also called "outlier detection."
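Sense (2), outlier detection, is often done with a simple rule such as the interquartile-range (IQR) test: values far outside the middle 50% of the data are flagged. The data below and the crude quartile indexing are illustrative assumptions.

```python
# Sketch of outlier detection using the IQR rule: values more than
# k * IQR below Q1 or above Q3 are flagged as outliers.
# Quartiles are approximated by simple index selection for brevity.
def iqr_outliers(data, k=1.5):
    """Return values lying more than k*IQR outside the middle 50% of data."""
    s = sorted(data)
    n = len(s)
    q1 = s[n // 4]          # approximate first quartile
    q3 = s[(3 * n) // 4]    # approximate third quartile
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [x for x in data if x < lo or x > hi]

# Hypothetical sensor readings with two obvious outliers.
readings = [10, 12, 11, 13, 12, 10, 11, 98, 12, 11, 13, -40]
print(iqr_outliers(readings))   # flags 98 and -40
```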