Project/Area Number | 25730126 |
Research Category | Grant-in-Aid for Young Scientists (B) |
Allocation Type | Multi-year Fund |
Research Field | Human interface and interaction |
Research Institution | Institute of Systems, Information Technologies and Nanotechnologies |
Principal Investigator | Yoshinaga Takashi, Institute of Systems, Information Technologies and Nanotechnologies, Researcher (10598098) |
Project Period (FY) | 2013-04-01 – 2017-03-31 |
Project Status | Completed (Fiscal Year 2016) |
Budget Amount | ¥4,160,000 (Direct Cost: ¥3,200,000, Indirect Cost: ¥960,000) |
Fiscal Year 2015: ¥780,000 (Direct Cost: ¥600,000, Indirect Cost: ¥180,000)
Fiscal Year 2014: ¥780,000 (Direct Cost: ¥600,000, Indirect Cost: ¥180,000)
Fiscal Year 2013: ¥2,600,000 (Direct Cost: ¥2,000,000, Indirect Cost: ¥600,000)
Keywords | Measurement / Visualization / Ultrasound diagnosis / Medical support / Computer vision |
Outline of Final Research Achievements |
The recent miniaturization of ultrasound diagnostic equipment has made image diagnosis possible regardless of time and location. However, because image acquisition demands sophisticated skills, not everyone can make use of the equipment. Our previous studies produced an imaging support system that uses AR technology to visually display the positional relation between the ultrasound probe and the internal organs. The probe position, however, was measured by an optical technique requiring a camera and a marker, which made system setup complicated and kept the system from practical use. Against this backdrop, this research used RGB-D data obtained by the Kinect sensor to realize a simple, markerless system for measuring the probe's position and angle. The research confirmed that this system can be applied to assist in ultrasound image acquisition.
|
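The report does not describe the pose-estimation algorithm itself. As an illustrative sketch only (not the authors' implementation), markerless position-and-angle measurement from RGB-D data typically reduces to rigidly aligning a set of 3D points observed by the depth sensor against a reference point set. A minimal version of that core step is the Kabsch/SVD method; the function name and synthetic data below are hypothetical:

```python
import numpy as np

def estimate_pose(src, dst):
    """Estimate the rigid rotation R (3x3) and translation t (3,) such that
    dst ~= src @ R.T + t, via the Kabsch/SVD least-squares alignment.
    src, dst: (N, 3) arrays of corresponding 3D points."""
    src_c = src.mean(axis=0)          # centroids of each point set
    dst_c = dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)   # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation (det = +1)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Demo on synthetic data: rotate points 30 degrees about z and translate them,
# then recover the transform from the correspondences alone.
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.1, -0.2, 0.3])
src = np.random.default_rng(0).standard_normal((50, 3))
dst = src @ R_true.T + t_true
R_est, t_est = estimate_pose(src, dst)
```

In a depth-sensor setting the correspondences are not given in advance, so this step is usually wrapped in an iterative scheme such as ICP that alternates nearest-neighbor matching with the alignment above.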