Learning Deep Second-order Visual Representations via Symmetric Positive Definite Matrices

Posted by: 曹玲玲    Date: 2025-01-11

Speaker: Prof. Lei Wang, University of Wollongong

Host: 魏通

Time: 10:00 AM, Thursday, January 16, 2025

Venue: Lecture Hall 513, Computer Building, Jiulonghu Campus, Southeast University

Abstract: Fine-grained image recognition requires robust feature representations to capture the subtle differences between image classes. In recent years, the covariance matrix has emerged as a powerful feature representation for this task. In this talk, I will present our work on learning generic symmetric positive definite matrices to improve image recognition performance. The first part of the talk explores how kernel-matrix-based representations can be integrated into deep neural networks, allowing for end-to-end joint learning. The second part focuses on learning sparse inverse covariance matrices to develop deep visual representations. I will present experimental results on various fine-grained image recognition tasks to highlight the effectiveness and advantages of the proposed methods.
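For readers unfamiliar with second-order representations, the sketch below illustrates the general idea behind covariance-based (SPD) pooling: the channel-wise covariance of a CNN feature map is computed, regularized to stay positive definite, and mapped through a matrix logarithm before classification. This is a minimal, generic sketch assuming PyTorch; the class name SecondOrderPooling and all parameter choices are illustrative and not the speaker's exact method.

```python
# Minimal sketch (not the speaker's exact method): covariance pooling of CNN
# feature maps into a symmetric positive definite (SPD) descriptor.
import torch
import torch.nn as nn

class SecondOrderPooling(nn.Module):
    """Pools a B x C x H x W feature map into a B x (C*C) log-covariance descriptor."""
    def __init__(self, eps: float = 1e-5):
        super().__init__()
        self.eps = eps  # small ridge to keep the matrix positive definite

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        feats = x.reshape(b, c, h * w)                    # each spatial location is a sample
        feats = feats - feats.mean(dim=2, keepdim=True)   # center the features
        cov = feats @ feats.transpose(1, 2) / (h * w - 1) # sample covariance, B x C x C
        cov = cov + self.eps * torch.eye(c, device=x.device, dtype=x.dtype)  # ensure SPD
        # Matrix logarithm (via eigendecomposition) maps SPD matrices to a
        # Euclidean space where an ordinary linear classifier can be applied.
        eigval, eigvec = torch.linalg.eigh(cov)
        log_cov = eigvec @ torch.diag_embed(eigval.clamp_min(self.eps).log()) @ eigvec.transpose(1, 2)
        return log_cov.flatten(1)                         # B x (C*C) descriptor

# Usage: plug after the last feature map of any convolutional backbone.
if __name__ == "__main__":
    feature_map = torch.randn(2, 64, 7, 7)                # e.g. output of a small CNN
    descriptor = SecondOrderPooling()(feature_map)
    print(descriptor.shape)                               # torch.Size([2, 4096])
```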

Speaker Bio: Lei Wang received his PhD degree from Nanyang Technological University, Singapore. He is now a Professor and the Director of the Centre for Artificial Intelligence at the School of Computing and Information Technology, University of Wollongong, Australia. His research interests include machine learning, pattern recognition, and computer vision. Lei Wang has published over 200 peer-reviewed papers, including in highly regarded journals and conferences such as IEEE TPAMI, IJCV, CVPR, ICCV, and ECCV. He was awarded the Early Career Researcher Award by the Australian Academy of Science and the Australian Research Council. He served as Program Co-Chair of ACCV 2022, Senior Area Chair of NeurIPS 2024 and ACMMM 2024, Area Chair of NeurIPS 2023, CVPR 2024-25, and ICLR 2024-25, and Action Editor for Transactions on Machine Learning Research. Lei Wang is a senior member of the IEEE.
