Shida Wang
Verified email at u.nus.edu - Homepage
Title · Cited by · Year
State-space models with layer-wise nonlinearity are universal approximators with exponential decaying memory
S Wang, B Xue
Advances in Neural Information Processing Systems 36, 2024
Cited by 12 · 2024
A brief survey on the approximation theory for sequence modelling
H Jiang, Q Li, Z Li, S Wang
Journal of Machine Learning (JML) 2 (1), 1-30, 2023
Cited by 7 · 2023
Efficient hyperdimensional computing
Z Yan, S Wang, K Tang, WF Wong
Joint European Conference on Machine Learning and Knowledge Discovery in …, 2023
Cited by 4 · 2023
Inverse approximation theory for nonlinear recurrent neural networks
S Wang, Z Li, Q Li
The 12th International Conference on Learning Representations (Spotlight …, 2024
Cited by 3 · 2024
StableSSM: Alleviating the Curse of Memory in State-space Models through Stable Reparameterization
S Wang, Q Li
Proceedings of the 41st International Conference on Machine Learning, 2023
Cited by 2 · 2023
HyperSNN: A new efficient and robust deep learning model for resource constrained control applications
Z Yan, S Wang, K Tang, WF Wong
arXiv preprint arXiv:2308.08222, 2023
Cited by 1 · 2023
The Effects of Nonlinearity on Approximation Capacity of Recurrent Neural Networks
S Wang, Z Li, Q Li
Cited by 1 · 2022
LongSSM: On the Length Extension of State-space Models in Language Modelling
S Wang
arXiv preprint arXiv:2406.02080, 2024
2024
Integrating Deep Learning and Synthetic Biology: A Co-Design Approach for Enhancing Gene Expression via N-terminal Coding Sequences
Z Yan, W Chu, Y Sheng, K Tang, S Wang, Y Liu, WF Wong
arXiv preprint arXiv:2402.13297, 2024
2024
Improve Long-term Memory Learning Through Rescaling the Error Temporally
S Wang, Z Yan
arXiv preprint arXiv:2307.11462, 2023
2023
Articles 1–10