Lecture title: Neural Systems Integrating and Segregating Information from Different Channels
Speaker: K. Y. Michael Wong
Host: Liu Zonghua (刘宗华)
Time: 10:00 AM, December 19, 2019
Venue: Conference Room 314, Physics Building, Minhang Campus
About the speaker:
K.Y. Michael Wong is a Professor in the Department of Physics at the Hong Kong University of Science and Technology (HKUST). He received his BSc in Physics from the University of Hong Kong in 1978, and his MSc and PhD in Physics from UCLA in 1981 and 1986, respectively. He was a postdoctoral researcher at Imperial College from 1986 to 1989 and at the University of Oxford from 1989 to 1991. He joined the Department of Physics at HKUST as a Lecturer (1992-1997), served as Associate Professor (1998-2006), and has been a Professor since 2006. He was President of the Physical Society of Hong Kong from 2002 to 2006. He has been an Editor of Journal of Statistical Mechanics: Theory and Experiment since 2007, served as Guest Editor of the Special Issue of Philosophical Magazine on David Sherrington's 70th Birthday (2011) and of the Research Topic on Neural Information Processing with Dynamical Synapses in Frontiers in Computational Neuroscience (2012-2013), and was named an EPL Distinguished Referee in 2014. He has published 184 papers to date, including in PNAS and in leading physics journals such as PRL. His research interests include statistical physics, phase transition theory, brain and cognitive science, and complex networks.
Abstract:
Neural systems gather information from different channels, resulting in enhanced reliability. The optimal estimate is given by Bayes' rule, and remarkably, experiments on macaques showed that the brain can achieve this optimum. It is therefore interesting to consider the neural architecture and mechanism underlying this feat. This, however, raises several interesting information issues. For channel inputs that are correlated in their prior distribution, where and how is the information stored? Furthermore, when the channel inputs are partially correlated, how can the system perform partial integration? Besides performing integration when the channel inputs are correlated, how can the system perform segregation when they are not? Finally, since input integration implies that the system extracts the information common to the channels, can the information of the individual channels be recovered? To answer these questions, I will discuss the results of our study on a decentralized network architecture in which same-channel and cross-channel information are processed in parallel. I will explain the role played by opposite neurons in the VIP and MSTd areas of the brain, and report a striking discovery that the direct and indirect cross-channel pathways are opposite to each other, an apparently redundant architecture.
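To make the Bayes-optimal integration concrete (an illustrative textbook formulation, not taken from the talk itself): if two channels provide independent Gaussian estimates $\hat{s}_1$ and $\hat{s}_2$ of the same stimulus with variances $\sigma_1^2$ and $\sigma_2^2$, Bayes' rule combines them as a reliability-weighted average,
\[
\hat{s} \;=\; \frac{\sigma_2^2\,\hat{s}_1 + \sigma_1^2\,\hat{s}_2}{\sigma_1^2 + \sigma_2^2},
\qquad
\frac{1}{\sigma^2} \;=\; \frac{1}{\sigma_1^2} + \frac{1}{\sigma_2^2},
\]
so the integrated estimate is at least as reliable as either channel alone; this is the optimum that the macaque experiments showed the brain can achieve.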