Search results

Number of results: 3

Abstract

Perception takes into account the costs and benefits of possible interpretations of incoming sensory data. This should be especially pertinent for threat recognition, where minimising the costs associated with missing a real threat is of primary importance. We tested whether threat recognition has special characteristics that adapt it to the task it fulfils. Participants were presented with images of threats and visually matched neutral stimuli, distorted by varying levels of noise. We found a threat superiority effect and a liberal response bias. Moreover, increasing the level of noise degraded recognition of the neutral images to a greater extent than of the threatening images. In summary, recognising threats is special in that it is more resistant to noise and to decline in stimulus quality, suggesting that threat recognition is a fast ‘all or nothing’ process in which threat presence is either confirmed or negated.
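The reported threat superiority effect and liberal response bias are standard signal detection theory quantities. As a minimal sketch (using illustrative response counts, not the study's data), sensitivity d′ and the criterion c can be computed from hit and false-alarm rates; a negative c corresponds to the liberal bias the abstract describes:

```python
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Compute signal detection sensitivity (d') and criterion (c).

    A negative criterion indicates a liberal response bias:
    the observer tends to report "threat present".
    """
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Illustrative counts only (not from the study): 80 hits / 20 misses,
# 40 false alarms / 60 correct rejections.
d, c = sdt_measures(80, 20, 40, 60)
print(f"d' = {d:.2f}, c = {c:.2f}")  # c < 0 => liberal bias
```

With these toy counts the hit rate (0.8) exceeds the false-alarm rate (0.4), giving d′ ≈ 1.09 and a negative criterion, i.e. a liberal bias.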


Authors and Affiliations

Ewa Magdalena Król

Abstract

Keypoint detection is a basic step in many computer vision algorithms aimed at object recognition, automatic navigation and analysis of biomedical images. Successful implementation of higher-level image analysis tasks, however, is conditioned on reliable detection of characteristic local image regions termed keypoints. A large number of keypoint detection algorithms have been proposed and verified. In this paper we discuss the most important keypoint detection algorithms. The main part of this work is devoted to the description of a keypoint detection algorithm we propose, which incorporates depth information computed from stereovision cameras or other depth-sensing devices. It is shown that filtering out keypoints that are context dependent, e.g. those located at the boundaries of objects, can improve the matching performance of the keypoints, which is the basis for object recognition tasks. This improvement is demonstrated quantitatively by comparing the proposed algorithm to the widely accepted SIFT keypoint detector. Our study is motivated by the development of a system aimed at aiding the visually impaired in space perception and object identification.
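The abstract does not give the paper's exact rejection criterion, but the idea of discarding keypoints at object boundaries using depth data can be sketched as follows (an assumption for illustration: a keypoint is rejected when the depth range within a small window around it exceeds a threshold, which flags depth discontinuities):

```python
def filter_keypoints_by_depth(keypoints, depth, window=1, max_range=0.5):
    """Reject keypoints lying on strong depth discontinuities.

    keypoints : list of (row, col) pixel coordinates
    depth     : 2D list of depth values (e.g. metres) from a stereo rig
    window    : half-size of the square neighbourhood inspected
    max_range : keep a keypoint only if (max - min) depth in the
                neighbourhood stays below this threshold
    """
    rows, cols = len(depth), len(depth[0])
    kept = []
    for r, c in keypoints:
        r0, r1 = max(0, r - window), min(rows, r + window + 1)
        c0, c1 = max(0, c - window), min(cols, c + window + 1)
        patch = [depth[i][j] for i in range(r0, r1) for j in range(c0, c1)]
        if max(patch) - min(patch) < max_range:
            kept.append((r, c))
    return kept

# Toy depth map: a near object (1 m) against a far background (3 m).
depth = [[1.0] * 3 + [3.0] * 3 for _ in range(6)]
kps = [(2, 1), (2, 3), (4, 5)]   # (2, 3) straddles the object boundary
print(filter_keypoints_by_depth(kps, depth))  # -> [(2, 1), (4, 5)]
```

In a real pipeline the keypoints would come from a detector such as SIFT and the depth map from the stereo rig; the boundary keypoint is dropped because its neighbourhood spans both depth layers.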

Authors and Affiliations

Paweł Strumiłło
Karol Matusiak
Piotr Skulimowski

Abstract

This paper proposes a method for offline, accurate ball tracking for short volleyball actions in sport halls. Our aim is to detect block touches on the ball and to determine accurate trajectories and impact positions of the ball to support referees. The proposed method is divided into two stages, namely training and ball tracking, and is based on background subtraction. A Gaussian mixture model is used to estimate the background, and a high-speed camera with a capture rate of 180 frames per second and a resolution of 1920 × 1080 is used for motion capture. In sport halls, significant differences in light intensity occur between consecutive frames. To minimize the influence of these light changes, an additional model is created, and template matching is used to determine ball positions accurately when the ball contour in the foreground image is distorted. We show that this algorithm is more accurate than other methods used in similar systems. Our light intensity change model eliminates almost all pixels added to images of moving objects owing to sudden changes in intensity. The average accuracy achieved in the validation process is 0.57 pixel. Our algorithm accurately determined 99.8% of all ball positions in 2000 test frames, with an average analysis time of 25.4 ms per frame. The algorithm presented in this paper is the first stage of referee support using a multi-camera system and 3D trajectories.
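The background subtraction described above can be illustrated with a simplified, single-Gaussian-per-pixel model (a stand-in for the full Gaussian mixture model the paper uses; all parameter values here are illustrative assumptions). Each pixel keeps a running mean and variance; a pixel deviating by more than k standard deviations is flagged as foreground, and only background pixels update the model:

```python
class GaussianBackground:
    """Single-Gaussian-per-pixel background model (a simplified
    stand-in for the mixture model used in the paper)."""

    def __init__(self, first_frame, alpha=0.05, k=2.5, init_var=100.0):
        self.alpha = alpha  # learning rate for the running statistics
        self.k = k          # foreground threshold in std deviations
        self.mean = [[float(v) for v in row] for row in first_frame]
        self.var = [[init_var] * len(row) for row in first_frame]

    def apply(self, frame):
        """Return a binary foreground mask and update the model."""
        a, k2 = self.alpha, self.k * self.k
        mask = []
        for i, row in enumerate(frame):
            mask_row = []
            for j, v in enumerate(row):
                diff = v - self.mean[i][j]
                fg = diff * diff > k2 * self.var[i][j]
                mask_row.append(1 if fg else 0)
                if not fg:  # only absorb background pixels into the model
                    self.mean[i][j] += a * diff
                    self.var[i][j] += a * (diff * diff - self.var[i][j])
            mask.append(mask_row)
        return mask

# Static grey background, then a bright "ball" appears at pixel (1, 1).
bg = [[50] * 4 for _ in range(3)]
model = GaussianBackground(bg)
frame = [row[:] for row in bg]
frame[1][1] = 255
print(model.apply(frame)[1])  # -> [0, 1, 0, 0]
```

Because unchanged pixels keep updating the model while foreground pixels do not, a gradual lighting drift is absorbed into the background, while a fast-moving bright ball stays in the foreground mask; the paper's additional light-intensity-change model and template matching then refine the ball position.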


Authors and Affiliations

P. Kurowski
K. Szelag
W. Zaluski
R. Sitnik
