Fig. 5. Validation stage for pedestrian detection. The training phase uses positive and negative images to extract features and train a classifier; the testing phase applies the feature extractor and classifier to candidate regions of interest in the images.

3.2 Candidate Validation

The candidate generation stage produces regions of interest (ROIs) that are likely to contain a pedestrian. Characteristic features are extracted from these ROIs, and a trained classifier is used to separate pedestrians from the background and other objects. The input to the classifier is a vector of raw pixel values or of characteristic features extracted from them, and the output is a decision indicating whether a pedestrian has been detected. In many cases a probability or confidence value for the match is also returned. Figure 5 shows the flow diagram of the validation stage.

Feature Extraction

The features used for classification should be insensitive to noise and to individual variations in appearance, while at the same time being able to discriminate pedestrians from other objects and background clutter. For pedestrian detection, features such as Haar wavelets [28], histograms of oriented gradients [13], and Gabor filter outputs [12] are used.

Haar Wavelets

An object detection system needs a representation with high inter-class variability and low intra-class variability [28]. For this purpose, features must be identified at resolutions where there is some consistency throughout the object class, while noise is ignored. Haar wavelets extract local intensity gradient features at multiple resolution scales in the horizontal, vertical, and diagonal directions, and are particularly useful for efficiently representing the discriminative structure of the object. This is achieved by sliding the wavelet functions in Fig. 6 over the image and taking inner products:

w_k(m, n) = \sum_{m' \ge 0} \sum_{n' \ge 0} \psi_k(m', n') \, f(2^{k-j} m + m', \; 2^{k-j} n + n')    (8)

where f is the original image ...
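To make the computation in (8) concrete, the following is a minimal sketch, assuming a grayscale ROI stored as a NumPy array. The 2x2 Haar templates, the choice of scale k and shift parameter j (giving a sliding step of 2^(k-j) pixels), and all function names are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the chapter's code): multi-scale Haar wavelet
# features computed as sliding inner products in the spirit of Eq. (8).
import numpy as np

# 2x2 Haar templates for vertical, horizontal and diagonal intensity changes
BASE_WAVELETS = {
    "vertical":   np.array([[ 1, -1],
                            [ 1, -1]], dtype=float),
    "horizontal": np.array([[ 1,  1],
                            [-1, -1]], dtype=float),
    "diagonal":   np.array([[ 1, -1],
                            [-1,  1]], dtype=float),
}

def haar_kernel(base, k):
    """Expand a 2x2 template to a (2**k x 2**k) wavelet psi_k."""
    return np.kron(base, np.ones((2 ** (k - 1), 2 ** (k - 1))))

def haar_features(image, k, j):
    """w_k(m,n) = sum_{m',n'} psi_k(m',n') f(2**(k-j) m + m', 2**(k-j) n + n').

    image : 2-D grayscale array f
    k     : wavelet scale (support is 2**k pixels)
    j     : controls the sliding step 2**(k-j); j > 0 gives overlapping windows
    """
    step = 2 ** (k - j)
    size = 2 ** k
    rows = (image.shape[0] - size) // step + 1
    cols = (image.shape[1] - size) // step + 1
    feats = {}
    for name, base in BASE_WAVELETS.items():
        psi = haar_kernel(base, k)
        w = np.empty((rows, cols))
        for m in range(rows):
            for n in range(cols):
                patch = image[m * step:m * step + size,
                              n * step:n * step + size]
                w[m, n] = np.sum(psi * patch)   # inner product of Eq. (8)
        feats[name] = w
    return feats

# Example: 16x16 wavelets shifted by 4 pixels over a 128x64 (rows x cols) ROI
roi = np.random.rand(128, 64)
features = haar_features(roi, k=4, j=2)
feature_vector = np.concatenate([w.ravel() for w in features.values()])
```

The resulting feature vector (coefficients from all three orientations, possibly at several scales k) would then be fed to the trained classifier of the validation stage in Fig. 5.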