
2022 Fiscal Year Final Research Report

Dynamic observation of dislocation glide by machine-learning-assisted in-situ electron microscopy

Research Project

Project/Area Number 20K21093
Research Category

Grant-in-Aid for Challenging Research (Exploratory)

Allocation Type Multi-year Fund
Review Section Medium-sized Section 26: Materials engineering and related fields
Research Institution Kyushu University

Principal Investigator

Saito Hikaru  Kyushu University, Institute for Materials Chemistry and Engineering, Associate Professor (50735587)

Co-Investigator (Kenkyū-buntansha) Hata Satoshi  Kyushu University, Faculty of Engineering Sciences, Professor (60264107)
Project Period (FY) 2020-07-30 – 2023-03-31
Keywords Transmission electron microscopy / Dislocation / In-situ observation / Machine learning / Noise filtering
Outline of Final Research Achievements

The current temporal resolution of STEM is far below that of conventional TEM. Rapid image acquisition at millisecond-per-frame rates or faster generally causes image distortion, poor electron signal, and unidirectional blurring, all of which are obstacles to realizing video-rate STEM observation. In this study, we developed a deep learning (DL)-based denoising and image-distortion correction method for rapid STEM image acquisition, which removes not only statistical noise but also the unidirectional blurring. Using this DL-based denoising method, we achieved rapid STEM tomography that visualizes the 3D dislocation arrangement within only a five-second acquisition of the entire tilt series, even in a 300-nm-thick steel specimen. Video-rate STEM observation of thermally activated dislocation glide was also realized.
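To make the degradation the denoiser must undo concrete, here is a minimal NumPy sketch of the two effects named above: unidirectional blurring along the fast-scan axis and statistical (Poisson shot) noise at low electron dose. All function and parameter names are illustrative assumptions, not part of the reported method; the actual correction is learned by a deep network.

```python
import numpy as np

def degrade_stem_frame(clean, dose=5.0, blur_len=4, seed=None):
    """Illustrative fast-scan STEM degradation model (not the authors' code):
    unidirectional blur along the fast-scan axis, then Poisson shot noise."""
    rng = np.random.default_rng(seed)
    # Unidirectional blur: moving average along axis 1 (fast-scan direction)
    kernel = np.ones(blur_len) / blur_len
    blurred = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, clean)
    # Statistical noise: few electrons per pixel at millisecond frame times
    noisy = rng.poisson(np.clip(blurred, 0.0, None) * dose) / dose
    return noisy

# Toy "clean" frame: a bright diagonal line as a stand-in for a dislocation
clean = np.zeros((64, 64))
np.fill_diagonal(clean, 1.0)
noisy = degrade_stem_frame(clean, dose=5.0, blur_len=4, seed=0)
```

A DL denoiser for this task is trained on pairs like `(noisy, clean)` so that it learns to invert both the blur and the shot noise jointly, which is why it can outperform filters that address only one of the two.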

Free Research Field

Electron microscopy

Academic Significance and Societal Importance of the Research Achievements

Materials development is indispensable to the pursuit of a prosperous and safe life and to the continuous maintenance of social infrastructure. If the deformation processes of material microstructures can be observed directly at the nanoscale, our understanding that links macroscale material functions and properties to the underlying microstructure will advance further, and dramatic improvements in development speed and hints toward new development approaches can be expected. The developed method is broadly applicable not only to dislocation motion but also to nano-imaging analysis of dynamic phenomena in a wide range of materials. It also enables a substantial reduction in the electron dose delivered to the specimen, making it useful for electron microscopy of beam-sensitive soft materials, reactive substances, and biological specimens.

Published: 2024-01-30  
