
1992 Fiscal Year Final Research Report Summary

Visual Recognition of Objects and Control of Hand Shaping in Grasping Movements.

Research Project

Project/Area Number 03650338
Research Category

Grant-in-Aid for General Scientific Research (C)

Allocation Type Single-year Grants
Research Field Measurement and Control Engineering
Research Institution The University of Tokyo

Principal Investigator

SUZUKI Ryoji  Univ. of Tokyo, Faculty of Engineering, Dept. of M.E.I.P., Professor (80013811)

Co-Investigator (Kenkyū-buntansha) UNO Yoji  ATR Human Information Processing Research Laboratories, Senior Researcher (10203572)
NISHII Jun  Univ. of Tokyo, Faculty of Engineering, Dept. of M.E.I.P., Research Associate (00242040)
Project Period (FY) 1991 – 1992
Keywords Grasping Movement / Neural Network / Sensory Integration / Data Compression / Data Glove
Research Abstract

The brain must solve two important problems in grasping movements. The first problem concerns the recognition of grasped objects: specifically, how does the brain integrate visual and motor information about a grasped object? The second problem concerns hand-shape planning: in other words, how does the brain determine a suitable hand posture according to the shape of the object and the task?
A neural network model which solves these problems has been developed. The network consists of multiple layers of neurons with forward connections, and its operation is divided into a learning phase and a pattern-generating phase. In the learning phase, internal representations of grasped objects are formed in the middle layer of the network by integrating visual and somatosensory information. In the pattern-generating phase, the finger configuration for grasping an object is determined by a relaxation computation of the network.
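
The pattern-generating phase can be pictured as an energy-minimizing relaxation over the finger joints. The following is a minimal sketch under assumed conditions, not the authors' implementation: a small autoencoder-style forward network with hypothetical layer sizes, where the names forward, energy, and relax_hand_shape are illustrative only.

    import numpy as np

    rng = np.random.default_rng(0)

    # Frozen weights of a small forward network assumed to have been trained
    # (learning phase) to reconstruct concatenated visual features v and
    # finger-joint angles q through a middle layer (hypothetical sizes).
    W1 = rng.normal(scale=0.1, size=(20, 12))   # input (10 visual + 10 joint) -> middle
    W2 = rng.normal(scale=0.1, size=(12, 20))   # middle layer -> reconstruction

    def forward(v, q):
        # Middle-layer activity h plays the role of the integrated
        # internal representation of the grasped object.
        x = np.concatenate([v, q])
        h = np.tanh(x @ W1)
        return h @ W2

    def energy(v, q):
        # Reconstruction error used as the energy minimized during relaxation.
        x = np.concatenate([v, q])
        return np.sum((forward(v, q) - x) ** 2)

    def relax_hand_shape(v, steps=200, lr=0.05, eps=1e-4):
        # Pattern-generating phase: clamp the visual input v and relax the
        # finger configuration q by (numerical) gradient descent on the energy.
        q = np.zeros(10)                      # e.g. an open-hand initial posture
        for _ in range(steps):
            grad = np.zeros_like(q)
            for i in range(q.size):
                dq = np.zeros_like(q)
                dq[i] = eps
                grad[i] = (energy(v, q + dq) - energy(v, q - dq)) / (2 * eps)
            q -= lr * grad
        return q

    q_star = relax_hand_shape(rng.normal(size=10))   # posture for a given object
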
A neural network model for acquiring an internal representation of the weight of grasped objects was also developed. This model consists of two networks: one calculates the weight of the object from joint angles and torques, and the other calculates torques from the weight and joint angles. The multilayered network combining these two can be used to learn arm movements without knowing the weight of the grasped object.
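
A rough sketch of how the two sub-networks compose is given below. The linear maps, the two-joint arm shapes, and the names estimate_weight and predict_torques are assumptions for illustration; the published model uses multilayer networks.

    import numpy as np

    rng = np.random.default_rng(1)

    # Two sub-networks (linear here for brevity; hypothetical sizes).
    A = rng.normal(scale=0.1, size=(4, 1))   # (2 joint angles + 2 torques) -> weight
    B = rng.normal(scale=0.1, size=(3, 2))   # (weight + 2 joint angles) -> torques

    def estimate_weight(angles, torques):
        # First network: infer the object's weight from angles and torques.
        return np.concatenate([angles, torques]) @ A

    def predict_torques(weight, angles):
        # Second network: predict the torques required given weight and angles.
        return np.concatenate([weight, angles]) @ B

    def combined(angles, torques):
        # Stacking the two networks yields a torque -> torque mapping whose
        # bottleneck is the unobserved weight, so the pair can be trained as
        # an identity map on movement data alone, without weight labels.
        return predict_torques(estimate_weight(angles, torques), angles)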

  • Research Products

    (4 results)

  • [Publications] UNO, Y.: "A Neural Network Model which Acquires the Internal Representation of Grasped Objects." Proc. International Symposium on Neural Information Processing, July 12-15, 1992, Iizuka. 118-121 (1992)
  • [Publications] TONOUCHI, Yojiro: "A Neural Network Model for Realizing Many-to-Many Transformation." Proc. SICE '92, July 22-24, 1992, Kumamoto. 751-752 (1992)
  • [Publications] UNO, Y.: "A Neural Network Model for Acquiring an Internal Representation of the Weight of Grasped Objects." Transactions of IEICE, D-II. (1993)
  • [Publications] UNO, Y.: "Integration of Visual and Somatosensory Information for Preshaping Hand in Grasping Movements." In Advances in Neural Information Processing Systems 5. Morgan Kaufmann Publishers.

Published: 1994-03-24  
