Project/Area Number | 09650445 |
Research Category | Grant-in-Aid for Scientific Research (C) |
Allocation Type | Single-year Grants |
Section | General |
Research Field | System engineering |
Research Institution | Tokyo Denki University |
Principal Investigator | MURAKAMI Shin-ichi, Professor, Faculty of Engineering, Tokyo Denki University (30219894) |
Project Period (FY) | 1997 – 1999 |
Project Status | Completed (Fiscal Year 1999) |
Budget Amount | ¥3,400,000 (Direct Cost: ¥3,400,000)
Fiscal Year 1999: ¥1,300,000 (Direct Cost: ¥1,300,000)
Fiscal Year 1998: ¥700,000 (Direct Cost: ¥700,000)
Fiscal Year 1997: ¥1,400,000 (Direct Cost: ¥1,400,000) |
Keywords | Camera work / Motion extraction / Human image extraction / Motion recognition / Motion description / Camera work detection |
Research Abstract |
An efficient motion-image database, in terms of both fast image retrieval and high data compression rates, can be constructed if every motion in an image sequence can be recognized and described by motion commands. To attain this goal, the fundamental techniques are studied in the following two steps. (1) Extraction of camera work: Camera work such as zooming, panning, and tilting is often used in news films, sports programs, dramas, and so on, but it often makes object recognition difficult because it changes the apparent sizes and positions of the objects. It is therefore preferable to remove camera work from image sequences before the object recognition process. For this purpose, a camera-work extraction method is proposed that partitions every image frame into 4-by-4 blocks and estimates the motion vector of each block. It is shown that almost all camera work can be extracted by this method. (2) Recognition of human behaviors: For this purpose, an object extraction method that utilizes an edge detection technique is proposed. Through a simple experiment, it is shown that the walking pitch of a strolling person and the circulating pitch of an athlete on a horizontal bar can be extracted efficiently by this method.
|
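The abstract does not specify how the per-block motion vectors are obtained or how camera work is classified, so the following is only a minimal sketch under assumed choices: dense Farnebäck optical flow (OpenCV) stands in for the motion-vector estimation of step (1), and the classification rules and thresholds (still_thresh, the ±0.6 radial-score cut-offs) are illustrative values, not the authors'.

```python
import cv2
import numpy as np

def block_motion_vectors(prev_gray, curr_gray, grid=4):
    """Average dense optical flow over a grid x grid partition of the frame."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    h, w = prev_gray.shape
    vectors = np.zeros((grid, grid, 2), dtype=np.float32)
    for i in range(grid):
        for j in range(grid):
            block = flow[i * h // grid:(i + 1) * h // grid,
                         j * w // grid:(j + 1) * w // grid]
            vectors[i, j] = block.reshape(-1, 2).mean(axis=0)
    return vectors

def classify_camera_work(vectors, still_thresh=0.5):
    """Crude camera-work label from the 4x4 block vectors (illustrative rules)."""
    grid = vectors.shape[0]
    mags = np.linalg.norm(vectors, axis=2)
    if mags.mean() < still_thresh:
        return "static"
    # Outward-pointing unit direction from the frame centre, one per block
    offsets = np.arange(grid) - (grid - 1) / 2.0
    cy, cx = np.meshgrid(offsets, offsets, indexing="ij")
    radial = np.stack([cx, cy], axis=2)
    radial /= np.linalg.norm(radial, axis=2, keepdims=True) + 1e-9
    unit = vectors / (mags[..., None] + 1e-9)
    radial_score = float((unit * radial).sum(axis=2).mean())
    if radial_score > 0.6:
        return "zoom in"      # block vectors flow away from the centre
    if radial_score < -0.6:
        return "zoom out"     # block vectors flow toward the centre
    mean_dx, mean_dy = vectors.reshape(-1, 2).mean(axis=0)
    return "pan" if abs(mean_dx) > abs(mean_dy) else "tilt"
```

The rationale for the radial score is that zooming makes the block vectors point toward or away from the frame centre, while panning and tilting produce roughly uniform horizontal or vertical vectors across all blocks.

For step (2), the abstract states only that edge detection is used and that a walking or circulating pitch can be extracted. The sketch below assumes a simple proxy that is not described in the source: the per-frame count of Canny edge pixels is treated as a periodic signal whose dominant period, found by autocorrelation, approximates the pitch. All function names and thresholds here are hypothetical.

```python
import cv2
import numpy as np

def edge_activity_signal(frames, low=50, high=150):
    """Per-frame Canny edge-pixel count: a simple proxy for body configuration."""
    signal = []
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, low, high)
        signal.append(np.count_nonzero(edges))
    return np.asarray(signal, dtype=np.float32)

def estimate_pitch(signal, fps):
    """Dominant period of the signal in seconds, via autocorrelation."""
    x = signal - signal.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]  # lags 0 .. len(x)-1
    # The first local maximum after lag 0 marks the repetition period
    for lag in range(1, len(ac) - 1):
        if ac[lag] > ac[lag - 1] and ac[lag] >= ac[lag + 1]:
            return lag / fps
    return None
```

For example, estimate_pitch(edge_activity_signal(frames), fps=30) would return the walking (or circulating) period in seconds for a sequence sampled at 30 frames per second.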