Abstract: Recently, with the prevalence of various sensing devices and numerical simulation software, large amounts of data are being generated in the form of two-dimensional (2D) arrays. An important task in analyzing such arrays is finding anomalous or outlier regions. In this article, we propose an effective method for detecting outlier regions in an arbitrary 2D array, i.e., regions whose patterns differ significantly from those of their surrounding regions. Unlike most existing methods, which determine the outlierness of a region based on how different its average is from that of its neighboring elements, our method exploits regression models of regions to determine their outlierness. More specifically, the method first divides the array into a number of small subarrays and builds a regression model for each subarray. It then iteratively merges adjacent subarrays with similar regression models into larger clusters. After the clustering, the method reports very small clusters as outlier regions. Finally, we demonstrate the effectiveness of the proposed method in experiments on synthetic and real datasets.
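To make the divide-fit-merge-report pipeline concrete, the following is a minimal sketch, not the authors' implementation: it assumes a least-squares plane fit (z = a·x + b·y + c) as the per-subarray regression model, Euclidean distance between coefficient vectors as the model-similarity measure, and a single merge pass over 4-adjacent subarrays rather than the article's iterative cluster-level merging. The subarray size, merge threshold, and small-cluster cutoff are illustrative parameters, not values from the article.

```python
import numpy as np

def fit_plane(block, x0, y0):
    """Least-squares plane fit z = a*x + b*y + c over one subarray,
    using global coordinates so identical planes in adjacent blocks
    yield identical coefficients."""
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.column_stack([(xs + x0).ravel(), (ys + y0).ravel(),
                         np.ones(h * w)])
    coef, *_ = np.linalg.lstsq(A, block.ravel(), rcond=None)
    return coef  # (a, b, c)

def detect_outlier_regions(arr, block=8, merge_tol=0.5, min_blocks=2):
    """Report block indices (i, j) of subarrays that end up in very
    small clusters; trailing rows/columns not filling a block are
    ignored in this sketch."""
    H, W = arr.shape
    nby, nbx = H // block, W // block

    # Step 1: one regression model per subarray.
    models = np.empty((nby, nbx, 3))
    for i in range(nby):
        for j in range(nbx):
            sub = arr[i*block:(i+1)*block, j*block:(j+1)*block]
            models[i, j] = fit_plane(sub, j * block, i * block)

    # Step 2: merge 4-adjacent subarrays whose models are similar;
    # union-find tracks the growing clusters.
    parent = list(range(nby * nbx))
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]  # path halving
            u = parent[u]
        return u
    for i in range(nby):
        for j in range(nbx):
            for di, dj in ((0, 1), (1, 0)):
                ni, nj = i + di, j + dj
                if ni < nby and nj < nbx:
                    d = np.linalg.norm(models[i, j] - models[ni, nj])
                    if d < merge_tol:
                        parent[find(i*nbx + j)] = find(ni*nbx + nj)

    # Step 3: subarrays in very small clusters are reported as outliers.
    roots = [find(k) for k in range(nby * nbx)]
    sizes = np.bincount(roots, minlength=nby * nbx)
    return [divmod(k, nbx) for k in range(nby * nbx)
            if sizes[roots[k]] < min_blocks]
```

In this sketch, a cluster covering fewer than min_blocks subarrays is flagged as an outlier region; in practice, the regression model, distance measure, and thresholds would be chosen to match the data at hand.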