2022 Fiscal Year Annual Research Report
A scalable privacy-preserving information retrieval system based on federated optimization, on-device intelligence and semantic matching
Project/Area Number | 19H04215 |
Research Institution | University of Tsukuba |
Principal Investigator |
于 海涛, University of Tsukuba, Faculty of Library, Information and Media Science, Associate Professor (30751052)
|
Co-Investigator (Kenkyū-buntansha) |
吉川 正俊, Kyoto University, Graduate School of Informatics, Professor (30182736)
康 シン, Tokushima University, Graduate School of Technology, Industrial and Social Sciences (Science and Technology), Assistant Professor (80777350)
|
Project Period (FY) | 2019-04-01 – 2024-03-31 |
Keywords | federated learning / online learning-to-rank / interactive search |
Outline of Annual Research Achievements |
In this fiscal year, our main task was to develop a federated information retrieval model based on a large-scale dataset. To this end, we first view information retrieval as an interactive process between users and the search engine, rather than as an independent ranking for each query. Second, we explore how to effectively integrate online learning-to-rank with federated learning: online learning-to-rank allows us to model the aforementioned interactive process, while federated learning addresses the privacy issue by training the ranking model in an on-device manner. A series of experiments on several benchmark ranking datasets shows that this research direction is feasible. An illustrative sketch of the integration is given after this section.
|
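For illustration only, the following is a minimal sketch of how on-device online learning-to-rank updates can be combined with federated averaging. The linear ranking model, the pairwise hinge-style update, and all function names below are simplifying assumptions made for this sketch, not the system actually developed in the project.

import numpy as np

def client_update(weights, features, clicks, lr=0.1):
    # One on-device pass: pairwise updates from a single user's click feedback.
    # features: (n_docs, n_features) feature matrix of one ranked list
    # clicks:   (n_docs,) binary click labels observed on the device
    w = weights.copy()
    clicked = np.where(clicks == 1)[0]
    skipped = np.where(clicks == 0)[0]
    for i in clicked:                        # prefer clicked documents over skipped ones
        for j in skipped:
            margin = (features[i] - features[j]) @ w
            if margin < 1.0:                 # hinge-style pairwise update
                w += lr * (features[i] - features[j])
    return w

def federated_average(weights, client_data):
    # Server step: average the locally updated models (FedAvg-style aggregation);
    # only model parameters leave the devices, never the raw interaction data.
    local_models = [client_update(weights, f, c) for f, c in client_data]
    return np.mean(local_models, axis=0)

# Toy usage: three simulated devices, ten ranking features, twenty interaction rounds.
rng = np.random.default_rng(0)
w = np.zeros(10)
clients = [(rng.normal(size=(5, 10)), rng.integers(0, 2, size=5)) for _ in range(3)]
for _ in range(20):
    w = federated_average(w, clients)

In this sketch each simulated device keeps its click data locally and only communicates its updated model to the server, which mirrors the on-device, privacy-preserving setting described above.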
Current Status of Research Progress |
3: Progress in research has been slightly delayed.
Reason
Due to the impact of COVID-19, the number of students allowed in the research room at any one time had to be limited in order to maintain a healthy working environment, and some of the research work had to be conducted online or from home. As a result, research efficiency was affected to some extent.
|
Strategy for Future Research Activity |
The major research objective of the next fiscal year is to develop the federated information retrieval model. Large-scale language models have had a revolutionary effect on many fields, including but not limited to natural language processing and information retrieval, so in future work we will explore whether these newly proposed large-scale language models can be fully exploited to enhance our research. Finally, now that the COVID-19 pandemic has subsided, we plan to have more on-site discussions and attend more top international conferences in order to better carry out the planned research.
|
Research Products (9 results)