Y. Takeishi, M. Iida, & J. Takeuchi:
“Approximate Spectral Decomposition of Fisher Information Matrix for Simple ReLU Networks,” Neural Networks, Vol. 164, pp. 691-706, July 2023. (Link)
Y. Takeishi & J. Takeuchi:
“An Improved Analysis of Least Squares Superposition Codes with Bernoulli Dictionary,” Japanese Journal of Statistics and Data Science, Vol. 2, pp. 591-613, September 2019.
Y. Takeishi, M. Kawakita, & J. Takeuchi:
“Least Squares Superposition Codes with Bernoulli Dictionary are Still Reliable at Rates up to Capacity,” IEEE Transactions on Information Theory, Vol. 60, No. 5, pp. 2737-2750, May 2014.
Conference Papers (peer reviewed)
M. Iida, Y. Takeishi, & J. Takeuchi:
“On Fisher Information Matrix for Simple Neural Networks With Softplus Activation,” Proc. of 2022 IEEE International Symposium on Information Theory, pp. 3014-3019, Espoo, Finland, June 26-July 1, 2022.
Y. Takeishi & J. Takeuchi:
“An Improved Upper Bound on Block Error Probability of Least Squares Superposition Codes with Unbiased Bernoulli Dictionary,” Proc. of 2016 IEEE International Symposium on Information Theory, pp. 1168-1172, Barcelona, Spain, July 10-15, 2016.
Y. Takeishi, M. Kawakita, & J. Takeuchi:
“Least Squares Superposition Codes with Bernoulli Dictionary are Still Reliable at Rates up to Capacity,” Proc. of 2013 IEEE International Symposium on Information Theory, pp. 1396-1400, Istanbul, Turkey, July 7-12, 2013.
Preprint
Y. Takeishi, M. Iida, & J. Takeuchi:
“Approximate Spectral Decomposition of Fisher Information Matrix for Simple ReLU Networks,” arXiv:2111.15256, 2021.
Materials
Eigenvectors of Fisher Information Matrix for Simple ReLU Networks (IBIS2021, 2021/11/10~2021/11/13) (Link)
On Sparse Superposition Codes and Discretization of Their Dictionaries (Mathematics for Innovation in Telecommunications Technology, 2022/09/15~2022/09/16) (Link)
A novel method to stabilize gradient descent learning (MIRAI2.0 Research & Innovation Week 2022, 2022/11/16) (Link)
Research Interest
Information theory (Coding theory, MDL principle)
I wrote an expository article (in Japanese) on sparse superposition codes (July 2023).
Machine learning theory (Neural networks)
Curriculum Vitae
Education
2022 Dr.Eng.
Graduate School of Information Science and Electrical Engineering, Kyushu University, Japan.
Thesis: Some Studies on Linear Regression Problems in Sparse Superposition Codes and Neural Networks.
Advisor: Prof. Jun'ichi Takeuchi.
2013 M.Eng.
Graduate School of Information Science and Electrical Engineering, Kyushu University, Japan.
Thesis: A Study on Discrete Sparse Superposition Codes for Additive White Gaussian Noise Channel.
Advisor: Prof. Jun'ichi Takeuchi.
2011 B.Eng.
School of Engineering, Kyushu University, Japan.
Job Experience
April 2022--present.
Assistant Professor, Faculty of Information Science and Electrical Engineering, Kyushu University.
April 2013--February 2021.
Server Engineer at Mitsubishi Electric Information Network Corporation.