J-GLOBAL ID:200901098164001804   Update date: Jan. 23, 2025

Kaneko Hirohiko

カネコ ヒロヒコ | Kaneko Hirohiko
Affiliation and department:
Job title: Professor
Research field  (1): Perceptual information processing
Research theme for competitive and other funds  (11):
  • 2024 - 2029 Non-verbal communication support system based on identification of attentional target by pupillary response
  • 2021 - 2024 Attention estimation methods based on pupillary response and an information input system based on the methods
  • 2015 - 2018 Classification and mechanism for individual difference in stereopsis
  • 2012 - 2016 Effect of binocular disparity on unconscious body actions
  • 2009 - 2014 Development of various kinds of three-dimensional information processing in infants
Papers (46):
  • 内田 陸太, 久方 瑠美, 金子 寛彦. Effect of stimulus memory on the maximum temporal offset between left- and right-eye stimuli that permits depth perception from binocular disparity. Vision. 2024. 36. 4. 182-182
  • Li Guanhua, 久方 瑠美, 金子 寛彦. Assessing Motion Sickness during Walking in Virtual Reality Environment Based on Pupil Response. Vision. 2024. 36. 4. 195-196
  • 高 奥, 久方 瑠美, 金子 寛彦. Estimation of social anxiety state using pupillary responses during observation of eye-contact videos. Vision. 2024. 36. 4. 199-199
  • 前田 吏功, 久方 瑠美, 金子 寛彦. Changes in walking, manual reaching, and binocular rivalry following adaptation to an expanded visual field. Vision. 2024. 36. 4. 199-200
  • 梁 徳法, 久方 瑠美, 金子 寛彦. Effect of vertical size disparity on the perceived depth of the monocular image. Vision. 2024. 36. 1. 53-53
MISC (237):
  • LV Yuhong, 久方瑠美, 金子寛彦. Properties of Pupillary Variability Arising from Object-based Visual Attention with Dynamic Stimuli and the Feasibility of a Communication Support System Based on These Properties. IEICE Technical Report (Web). 2024. 124. 19(HCS2024 1-33)
  • GAO Ao, LV Yuhong, 土井理美, 伊角彩, 久方瑠美, 金子寛彦. Estimation of mental state using pupillary response during eye contact. IEICE Technical Report (Web). 2024. 124. 19(HCS2024 1-33)
  • 鈴木真冬, 久方瑠美, 金子寛彦. Developing an interface for controlling camera and visual direction in a VR environment using eye movement. IPSJ SIG Technical Report (Web). 2024. 2024. EC-72
  • SU Yi, 金子寛彦, 久方瑠美. Gravity direction perception based on parabolic motion cues. Vision. 2023. 35. 1
  • 川野智希, 久方瑠美, 金子寛彦. Development of an information input interface using eye movement and the pupillary light reflex. IEICE Technical Report (Web). 2023. 123. 24(HCS2023 1-41)
Books (4):
  • 4.1 Fundamentals of Spatial Perception
    Ohmsha, 2009
  • Vision II: Cues for Depth and Stereoscopic Perception
    Asakura Shobo, 2007
  • Vision II: Properties and Models of Binocular Stereopsis
    Asakura Shobo, 2007
  • Vision II: Integration of Binocular Information and Depth Information; Integration and Interaction of Depth Cues
    Asakura Shobo, 2007
Lectures and oral presentations  (159):
  • Optimal frequency band for the estimation of emotional state based on the frequency analysis of pupil diameter variability using Wavelet Transformation
    (IEICE Technical Report 2010)
  • Effect of the magnitude and extent of attention on saccadic eye movement
    (IEICE Technical Report 2010)
  • Change of the eye movement in the learning process of face discrimination
    (IEICE Technical Report 2010)
  • Effect of perceptual entropy on the perception of order and disorder
    (IEICE Technical Report 2010)
  • The luminance distribution and the perception of gravitational vertical in pictures
    (IEICE Technical Report 2010)
Education (3):
  • - 1992 Tokyo Institute of Technology Graduate School, Division of Integrated Science
  • - 1992 Tokyo Institute of Technology
  • - 1987 Tokyo Institute of Technology School of Science
Professional career (2):
  • Doctor of Engineering (Tokyo Institute of Technology)
  • Master of Engineering (Tokyo Institute of Technology)
Work history (3):
  • 1992 - 1995 York University, Postdoctoral Researcher
  • 1995 - 2000 Advanced Telecommunications Research Institute International
  • 2000 - Tokyo Institute of Technology
Committee career (1):
  • 1999 - Vision Society of Japan, Steering Committee Member
Awards (1):
  • 1998 - Optics Paper Award, The Japan Society of Applied Physics
Association Membership(s) (4):
Vision Society of Japan ,  The Optical Society of Japan ,  The Association for Research in Vision and Ophthalmology ,  The Japanese Psychological Association
※ Researcher information displayed in J-GLOBAL is based on the information registered in researchmap.
