
2010 Fiscal Year Final Research Report

Extraction and Sharing of Metadata for Spectator Sports with Personal Digital Assistant and its Application to Video Annotation

Research Project

Project/Area Number 21700104
Research Category

Grant-in-Aid for Young Scientists (B)

Allocation Type Single-year Grants
Research Field Media informatics/Database
Research Institution Nagoya University

Principal Investigator

OHIRA Shigeki  Nagoya University, Information Technology Center, Assistant Professor (60339695)

Project Period (FY) 2009 – 2010
Keywords Mobile systems / Content / Video annotation
Research Abstract

We propose a method for extracting metadata from the spontaneous eye and body movements of spectators at sporting events and combining that metadata with video shot while the event is being watched. Our system extracts a spectator's body movement and gaze direction in the stadium using multi-sensor equipment consisting of a personal digital assistant and a digital camera, and it creates spectator sports content by associating this metadata with the still images and videos shot during the game. We developed a method for detecting the area of the field a spectator is viewing by using a direction sensor, and we built a prototype sports video viewer that uses the spectator's content, including gaze direction and body movement.
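The geometric step described above, recovering which part of the field a spectator is viewing from a direction-sensor reading and a known seat position, can be sketched as a ray-plane intersection. The following Python sketch is only an illustration under assumed names, a simplified stadium coordinate system, and a standard 105 m x 68 m pitch; it is not the project's actual implementation.

# Hypothetical sketch: estimate the field point a spectator is looking at
# from a direction-sensor reading (yaw/pitch) and a known seat position.
# All names and parameters below are assumptions for illustration only.
import math
from dataclasses import dataclass

@dataclass
class SensorReading:
    yaw_deg: float    # compass heading of the device; 0 = +y (toward the far goal)
    pitch_deg: float  # downward tilt in degrees (positive = looking down)

def view_point_on_field(seat_xyz, reading, field_size=(105.0, 68.0)):
    """Intersect the gaze ray with the field plane (z = 0).

    seat_xyz   : (x, y, z) seat position in metres, origin at the field centre.
    reading    : SensorReading from the handheld device.
    field_size : (length, width) of the pitch in metres.

    Returns the (x, y) point on the field, or None if the gaze does not
    descend toward the field or the estimated point lies outside the pitch.
    """
    x0, y0, z0 = seat_xyz
    pitch = math.radians(reading.pitch_deg)
    if pitch <= 0:
        return None  # looking level or upward: no intersection with the field
    yaw = math.radians(reading.yaw_deg)
    # Horizontal distance from the seat to where the gaze ray meets z = 0.
    dist = z0 / math.tan(pitch)
    x = x0 + dist * math.sin(yaw)
    y = y0 + dist * math.cos(yaw)
    half_len, half_wid = field_size[0] / 2, field_size[1] / 2
    if abs(x) > half_wid or abs(y) > half_len:
        return None  # estimated view point is outside the pitch
    return (x, y)

# Example: a seat 20 m above the field, 40 m behind the goal line,
# looking 25 degrees downward toward the centre of the field.
print(view_point_on_field((0.0, -92.5, 20.0), SensorReading(yaw_deg=0.0, pitch_deg=25.0)))

In practice the detected view area would be associated, via timestamps, with the still images and videos shot at the same moment, which is the annotation step the abstract describes.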

  • Research Products (2 results)


  • [Presentation] Sports Video Annotation and Information Presentation Using Spectator Sensor Information (2011)

    • Author(s)
      OHIRA Shigeki
    • Organizer
      IPSJ Special Interest Group on Computer Vision and Image Media (SIG-CVIM)
    • Place of Presentation
      Tokyo Institute of Technology
    • Year and Date
      2011-03-17
  • [Presentation] Information Extraction During Sports Spectating and Its Application to Video Annotation (2011)

    • Author(s)
      OHIRA Shigeki
    • Organizer
      The 73rd National Convention of IPSJ
    • Place of Presentation
      Tokyo Institute of Technology
    • Year and Date
      2011-03-03


Published: 2012-02-13   Modified: 2016-04-21  
