Yoichi Ishibashi, Sho Yokoi, Katsuhito Sudoh, Satoshi Nakamura. Subspace Representations for Soft Set Operations and Sentence Similarities. Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers). 2024. 3512-3524
Goro Kobayashi, Tatsuki Kuribayashi, Sho Yokoi, Kentaro Inui. Transformer Language Models Handle Word Frequency in Prediction Head. Findings of the Association for Computational Linguistics: ACL 2023. 2023. 4523-4535
Momose Oyama, Sho Yokoi, Hidetoshi Shimodaira. Norm of Word Embedding Encodes Information Gain. 2022
Hiroaki Yamagiwa, Sho Yokoi, Hidetoshi Shimodaira. Improving Word Mover's Distance by Leveraging Self-Attention Matrix. CoRR. 2022. abs/2211.06229
Yoichi Ishibashi, Sho Yokoi, Katsuhito Sudoh, Satoshi Nakamura. Subspace-based Set Operations on a Pre-trained Word Embedding Space. CoRR. 2022. abs/2210.13034