Search results

Number of results: 1

Abstract

Acquiring labels in anomaly detection tasks is expensive and challenging. Pretraining is therefore widely used in anomaly detection models as an effective way to improve efficiency: it enriches the model's representation capabilities, thereby enhancing both performance and efficiency in anomaly detection. In most pretraining methods, the decoder is randomly initialized. Drawing inspiration from diffusion models, this paper proposes denoising as a pretraining task for the decoder in anomaly detection, training it to reconstruct the original noise-free input. Denoising requires the model to learn the structure, patterns, and related features of the data, particularly when training samples are limited. The paper explores two approaches to anomaly detection: simultaneous denoising pretraining of the encoder and decoder, and denoising pretraining of the decoder only. Experimental results demonstrate the effectiveness of this method in improving model performance; in particular, the improvement is more pronounced when the number of samples is limited.
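
A minimal sketch of the denoising-pretraining idea summarized above, assuming a PyTorch encoder-decoder; the layer sizes, noise level, optimizer, and the use of reconstruction error afterwards are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

# Toy encoder/decoder; the paper's actual architectures are not specified here.
encoder = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 16))
decoder = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 64))
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
loss_fn = nn.MSELoss()

def denoising_pretrain_step(x_clean, noise_std=0.1):
    # Corrupt the input with Gaussian noise and train the model to
    # reconstruct the original, noise-free input (the denoising task).
    x_noisy = x_clean + noise_std * torch.randn_like(x_clean)
    x_rec = decoder(encoder(x_noisy))
    loss = loss_fn(x_rec, x_clean)  # target is the clean input
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Example usage with random data standing in for normal (non-anomalous) samples.
x = torch.randn(128, 64)
for _ in range(10):
    denoising_pretrain_step(x)

# After pretraining, the decoder weights (or both encoder and decoder,
# depending on the variant) can initialize the anomaly-detection model,
# where reconstruction error may serve as the anomaly score.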

Authors and Affiliations

Xianlei Ge 1,2
Xiaoyan Li 3
Zhipeng Zhang 1

  1. School of Electronic Engineering, Huainan Normal University, China
  2. College of Computing and Information Technologies, National University, Philippines
  3. School of Computer, Huainan Normal University, China
