Chebyshev polynomial acceleration of stochastic Newton method for machine learning

2025.10.16

Submitted by: Shao Fenfen   Department: School of Science

Event Information

Title: Chebyshev polynomial acceleration of stochastic Newton method for machine learning

Speaker: Prof. Jianyu Pan (East China Normal University)

Time: 15:00, Tuesday, October 14, 2025

Place: Room GJ303, Main Campus

Inviter: Qiaohua Liu

Organizer: Department of Mathematics, School of Science

Abstract:

In this talk, we consider the acceleration of the stochastic Newton method for large-scale optimization problems arising in machine learning. To reduce the cost of computing the Hessian and its inverse, we propose using Chebyshev polynomials to approximate the Hessian inverse. We show that, by exploiting the short-term recurrence formula, the Chebyshev polynomial approximation can effectively reduce the computational cost. A convergence analysis is given, and experiments on multiple benchmarks are carried out to illustrate the performance of the proposed acceleration method.
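For readers unfamiliar with the technique, the sketch below shows the classical Chebyshev (semi-)iteration for applying an approximate Hessian inverse to a gradient, which is the standard way a Chebyshev polynomial in H can stand in for H^{-1} using only Hessian-vector products and a three-term recurrence. This is a minimal illustration of the general idea, not the speaker's algorithm; the function name chebyshev_inverse_apply, the assumed spectral bounds lam_min and lam_max, and the toy ridge-type Hessian are all illustrative assumptions.

import numpy as np

def chebyshev_inverse_apply(hess_vec, b, lam_min, lam_max, num_iters=20):
    """Approximate H^{-1} b via Chebyshev iteration.

    hess_vec : callable, v -> H @ v (H symmetric positive definite)
    lam_min, lam_max : assumed bounds on the spectrum of H
    """
    theta = 0.5 * (lam_max + lam_min)   # center of the spectral interval
    delta = 0.5 * (lam_max - lam_min)   # half-width of the spectral interval
    sigma = theta / delta

    x = np.zeros_like(b)
    r = b.copy()              # residual b - H x, with x = 0 initially
    rho = 1.0 / sigma
    d = r / theta             # first Chebyshev direction

    for _ in range(num_iters):
        x = x + d
        r = r - hess_vec(d)   # one Hessian-vector product per iteration
        rho_next = 1.0 / (2.0 * sigma - rho)
        # short-term (three-term) recurrence: only the previous direction is kept
        d = rho_next * rho * d + (2.0 * rho_next / delta) * r
        rho = rho_next
    return x

# Toy usage: a Newton-type step direction H^{-1} g for a ridge-style Hessian
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5))
H = A.T @ A / 50 + 0.1 * np.eye(5)     # SPD Hessian (hypothetical example)
g = rng.standard_normal(5)             # placeholder gradient
eigs = np.linalg.eigvalsh(H)
step = chebyshev_inverse_apply(lambda v: H @ v, g, eigs[0], eigs[-1])
print(np.linalg.norm(H @ step - g))    # small residual: step is close to H^{-1} g

Each iteration costs one Hessian-vector product plus O(n) vector work, and only the previous direction is stored; this is the practical payoff of the short-term recurrence the abstract refers to, since neither the Hessian nor its inverse ever needs to be formed explicitly.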
