[2020-2024, continuously updated] A roundup of Echo State Network and reservoir computing papers, covering the classic ESN, DeepESN, hybrid ESNs, and surveys

Keywords: ESN, Echo State Network, Reservoir Computing
Last updated: 2024

Contents

  • 1 Surveys
  • 2 ESN Model Categories
    • 2.1 ESN
    • 2.2 DeepESN
    • 2.3 Hybrid ESN
  • 3 Papers with Open-Source Code
  • 4 Reservoir Computing Research
  • 5 Applications

1 Surveys

  1. Gallicchio, Claudio and Alessio Micheli. “Deep Echo State Network (DeepESN): A Brief Survey.” ArXiv abs/1712.04323 (2017): n. pag.
  2. Sun, Chenxi et al. “A Systematic Review of Echo State Networks From Design to Application.” IEEE Transactions on Artificial Intelligence 5 (2024): 23-37.
  3. Soltani, Rebh et al. “Echo State Network Optimization: A Systematic Literature Review.” Neural Processing Letters 55 (2023): 10251-10285.
  4. Xu Y. A review of machine learning with echo state networks[J]. Proj. Rep, 2020.
  5. Margin D A, Dobrota V. Overview of Echo State Networks using Different Reservoirs and Activation Functions[C]//2021 20th RoEduNet Conference: Networking in Education and Research (RoEduNet). IEEE, 2021: 1-6.
  6. Sun, Chenxi et al. “A Review of Designs and Applications of Echo State Networks.” ArXiv abs/2012.02974 (2020): n. pag.

2 ESN Model Categories

2.1 ESN

A typical ESN consists of an input layer, a recurrent layer (the reservoir, a large pool of sparsely connected neurons), and an output layer. This subsection collects papers on the classic ESN as well as work on structural improvements to the ESN.
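
For orientation, here is a minimal NumPy sketch of the architecture just described: a fixed, sparsely connected leaky reservoir driven by the input, with only a linear readout trained by ridge regression. It is an illustrative toy, not the implementation of any paper below, and every hyperparameter value (reservoir size, spectral radius, leak rate, ridge coefficient) is an arbitrary placeholder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Reservoir setup (all values are placeholders).
n_in, n_res = 1, 200
spectral_radius, leak, ridge = 0.9, 0.3, 1e-6

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= rng.random((n_res, n_res)) < 0.1                   # sparse connectivity
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))   # rescale spectral radius

def run_reservoir(u_seq):
    """Collect reservoir states for an input sequence of shape (T, n_in)."""
    x, states = np.zeros(n_res), []
    for u in u_seq:
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)                              # shape (T, n_res)

# Train the linear readout with ridge regression on a toy one-step-ahead task.
T = 1000
u = np.sin(0.1 * np.arange(T + 1)).reshape(-1, 1)
X, y = run_reservoir(u[:-1]), u[1:]
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
print("train MSE:", float(np.mean((X @ W_out - y) ** 2)))
```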

  1. Manneschi, Luca et al. “Exploiting Multiple Timescales in Hierarchical Echo State Networks.” Frontiers in Applied Mathematics and Statistics (2021).
  2. Fourati R, Ammar B, Jin Y, et al. EEG feature learning with intrinsic plasticity based deep echo state network[C]//2020 international joint conference on neural networks (IJCNN). IEEE, 2020: 1-8.
  3. Liu, Qianwen et al. “Memory augmented echo state network for time series prediction.” Neural Computing and Applications (2023): 1-16.
  4. Akrami, Abbas et al. “Design of a reservoir for cloud-enabled echo state network with high clustering coefficient.” EURASIP Journal on Wireless Communications and Networking 2020 (2020): 1-14.
  5. Arroyo, Diana Carolina Roca. “A Modified Echo State Network Model Using Non-Random Topology.” (2023).
  6. Fu, Jun et al. “A double-cycle echo state network topology for time series prediction.” Chaos 33 9 (2023): n. pag.
  7. Yang, Cuili and Zhanhong Wu. “Multi-objective sparse echo state network.” Neural Computing and Applications 35 (2022): 2867-2882.
  8. Tortorella, Domenico et al. “Spectral Bounds for Graph Echo State Network Stability.” 2022 International Joint Conference on Neural Networks (IJCNN) (2022): 1-8.
  9. Zheng, Shoujing et al. “Improved Echo State Network With Multiple Activation Functions.” 2022 China Automation Congress (CAC) (2022): 346-350.
  10. Morra, Jacob and Mark Daley. “Imposing Connectome-Derived Topology on an Echo State Network.” 2022 International Joint Conference on Neural Networks (IJCNN) (2022): 1-6.
  11. McDaniel, Shane et al. “Investigating Echo State Network Performance with Biologically-Inspired Hierarchical Network Structure.” 2022 International Joint Conference on Neural Networks (IJCNN) (2022): 01-08.
  12. Yao, Xianshuang et al. “A stability criterion for discrete-time fractional-order echo state network and its application.” Soft Computing 25 (2021): 4823 - 4831.
  13. Mu, Xiaohui and Lixiang Li. “Memristor-based Echo State Network and Prediction for Time Series.” 2021 International Conference on Neuromorphic Computing (ICNC) (2021): 153-158.
  14. Mahmoud, Tarek A. and Lamiaa M. Elshenawy. “TSK fuzzy echo state neural network: a hybrid structure for black-box nonlinear systems identification.” Neural Computing and Applications 34 (2022): 7033 - 7051.
  15. Maksymov, Ivan S. et al. “Neural Echo State Network using oscillations of gas bubbles in water: Computational validation by Mackey-Glass time series forecasting.” Physical review. E 105 4-1 (2021): 044206.
  16. Wang, Lei et al. “Design of sparse Bayesian echo state network for time series prediction.” Neural Computing and Applications 33 (2020): 7089 - 7102.
  17. Gong, Shangfu et al. “An Improved Small-World Topology for Optimizing the Performance of Echo State Network.” 2020 IEEE Intl Conf on Parallel & Distributed Processing with Applications, Big Data & Cloud Computing, Sustainable Computing & Communications, Social Computing & Networking (ISPA/BDCloud/SocialCom/SustainCom) (2020): 1413-1419.
  18. Iacob, Stefan et al. “Delay-Sensitive Local Plasticity in Echo State Networks.” 2023 International Joint Conference on Neural Networks (IJCNN) (2023): 1-8.
  19. Jordanou, Jean P. et al. “Investigation of Proper Orthogonal Decomposition for Echo State Networks.” Neurocomputing 548 (2022): 126395.
  20. Paassen, Benjamin et al. “Tree Echo State Autoencoders with Grammars.” 2020 International Joint Conference on Neural Networks (IJCNN) (2020): 1-8. (source code available)
  21. Liu, Junxiu, et al. “Echo state network optimization using binary grey wolf algorithm.” Neurocomputing 385 (2020): 310-318.
  22. Trouvain, Nathan, et al. “Reservoirpy: an efficient and user-friendly library to design echo state networks.” International Conference on Artificial Neural Networks. Cham: Springer International Publishing, 2020. (source code available)
  23. Hart, Allen, James Hook, and Jonathan Dawes. “Embedding and approximation theorems for echo state networks.” Neural Networks 128 (2020): 234-247.
  24. Na, Xiaodong, Weijie Ren, and Xinghan Xu. “Hierarchical delay-memory echo state network: A model designed for multi-step chaotic time series prediction.” Engineering Applications of Artificial Intelligence 102 (2021): 104229.

2.2 DeepESN

Deep Echo State Network
A DeepESN is built by stacking multiple ESN reservoirs in the spirit of deep learning (DL): it consists of an input layer, a hierarchy of stacked reservoir components whose dynamics feed from one layer into the next, and an output layer.
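
As a rough sketch of this stacking idea (not the exact formulation of any paper below), the snippet chains several independent leaky reservoirs so that the state sequence of one layer becomes the input of the next, and trains a single ridge readout on the concatenation of all layer states. Layer sizes and all other hyperparameters are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_layer(n_in, n_res, sr=0.9, density=0.1):
    """Random input and recurrent weights for one reservoir layer (placeholder values)."""
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res)) * (rng.random((n_res, n_res)) < density)
    W *= sr / max(abs(np.linalg.eigvals(W)))
    return W_in, W

def run_layer(W_in, W, inputs, leak=0.3):
    """Run one leaky reservoir over a (T, n_in) input sequence and return its states."""
    x, states = np.zeros(W.shape[0]), []
    for u in inputs:
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Stack three reservoirs: the states of layer i are the inputs of layer i+1.
T, layer_sizes = 500, [100, 100, 100]
u = np.sin(0.1 * np.arange(T + 1)).reshape(-1, 1)

signal, all_states, n_in = u[:-1], [], 1
for n_res in layer_sizes:
    W_in, W = make_layer(n_in, n_res)
    signal = run_layer(W_in, W, signal)
    all_states.append(signal)
    n_in = n_res

# A single readout reads the concatenated states of every layer (ridge regression).
X, y, ridge = np.hstack(all_states), u[1:], 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ y)
print("train MSE:", float(np.mean((X @ W_out - y) ** 2)))
```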

  1. Bouazizi, Samar et al. “Enhancing EEG-based emotion recognition using PSD-Grouped Deep Echo State Network.” JUCS - Journal of Universal Computer Science (2023): n. pag.
  2. Margin, Dan-Andrei et al. “Deep Reservoir Computing using Echo State Networks and Liquid State Machine.” 2022 IEEE International Black Sea Conference on Communications and Networking (BlackSeaCom) (2022): 208-213.
  3. Wang, Yuanhui et al. “A Weight Optimization Method of Deep Echo State Network Based on Improved Knowledge Evolution.” 2022 China Automation Congress (CAC) (2022): 395-400.
  4. Yang, Xiaojian et al. “An improved deep echo state network inspired by tissue-like P system forecasting for non-stationary time series.” Journal of Membrane Computing 4 (2022): 222 - 231.
  5. Kanda, Keiko and Sou Nobukawa. “Feature Extraction Mechanism for Each Layer of Deep Echo State Network.” 2022 International Conference on Emerging Techniques in Computational Intelligence (ICETCI) (2022): 65-70.
  6. Kim, Taehwan and Brian R. King. “Time series prediction using deep echo state networks.” Neural Computing and Applications (2020): 1-19.
  7. Hu, Ruihan et al. “Ensemble echo network with deep architecture for time-series modeling.” Neural Computing and Applications 33 (2020): 4997 - 5010.
  8. Ma, Qianli, Lifeng Shen, and Garrison W. Cottrell. “DeePr-ESN: A deep projection-encoding echo-state network.” Information Sciences 511 (2020): 152-171.
  9. Song, Zuohua, Keyu Wu, and Jie Shao. “Destination prediction using deep echo state network.” Neurocomputing 406 (2020): 343-353.
  10. Barredo Arrieta, Alejandro, et al. “On the post-hoc explainability of deep echo state networks for time series forecasting, image and video classification.” Neural Computing and Applications (2022): 1-21. (source code available)

2.3 Hybrid ESN

Papers that combine ESNs with deep learning, other machine-learning models, or special data structures.
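
One common combination pattern in this category keeps the random reservoir as a fixed feature extractor and replaces the usual linear readout with a trainable machine-learning model. The sketch below illustrates that pattern with scikit-learn's MLPRegressor; the choice of model and all hyperparameters are illustrative assumptions, not taken from any specific paper listed here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)

# A fixed random reservoir used purely as a feature extractor (placeholder sizes).
n_in, n_res, leak = 1, 200, 0.3
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res)) * (rng.random((n_res, n_res)) < 0.1)
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def reservoir_features(u_seq):
    """Map an input sequence of shape (T, n_in) to reservoir states of shape (T, n_res)."""
    x, states = np.zeros(n_res), []
    for u in u_seq:
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy one-step-ahead prediction task on a noisy sine wave.
T = 1000
u = (np.sin(0.07 * np.arange(T + 1)) + 0.1 * rng.standard_normal(T + 1)).reshape(-1, 1)
X, y = reservoir_features(u[:-1]), u[1:].ravel()

# A small neural network replaces the linear readout: only the readout is trained.
readout = MLPRegressor(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
readout.fit(X, y)
print("train MSE:", float(np.mean((readout.predict(X) - y) ** 2)))
```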

  1. Lien, Justin. “Hypergraph Echo State Network.” ArXiv abs/2310.10177 (2023): n. pag.
  2. Deng, Lichi and Yuewei Pan. “Machine Learning Assisted Closed-Loop Reservoir Management using Echo State Network.” (2020).
  3. Trierweiler Ribeiro, Gabriel, et al. “Bayesian optimized echo state network applied to short-term load forecasting.” Energies 13.9 (2020): 2390.

3 Papers with Open-Source Code

ESN and reservoir-computing studies with publicly available code; this list is not limited to the 2020-2024 window.
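
Several entries below ship reference implementations; ReservoirPy (Trouvain et al.) and PyRCN (Steiner et al.) are the two general-purpose libraries in this list. As a quick, hedged illustration of the former, the sketch below follows what I understand to be the node-based API of ReservoirPy (version 0.3 or later); treat the function names and parameters as assumptions to verify against the project's documentation.

```python
# A minimal ReservoirPy usage sketch (assumed API of reservoirpy >= 0.3;
# check the official documentation for exact signatures and defaults).
import numpy as np
from reservoirpy.nodes import Reservoir, Ridge
from reservoirpy.datasets import mackey_glass

X = mackey_glass(2000)                       # Mackey-Glass benchmark series, shape (2000, 1)
X_train, y_train = X[:1500], X[1:1501]       # one-step-ahead targets
X_test, y_test = X[1500:-1], X[1501:]

# Reservoir node (leak rate lr, spectral radius sr) chained into a ridge readout.
esn = Reservoir(units=300, lr=0.3, sr=0.9) >> Ridge(ridge=1e-6)
esn = esn.fit(X_train, y_train, warmup=100)

y_pred = esn.run(X_test)
print("test MSE:", float(np.mean((y_pred - y_test) ** 2)))
```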

  1. Cerina L, Santambrogio M D, Franco G, et al. EchoBay: design and optimization of echo state networks under memory and time constraints[J]. ACM Transactions on Architecture and Code Optimization (TACO), 2020, 17(3): 1-24.
  2. Lukoševičius M, Uselis A. Efficient implementations of echo state network cross-validation[J]. Cognitive computation, 2021: 1-15.
  3. Sun C, Hong S, Song M, et al. Te-esn: Time encoding echo state network for prediction based on irregularly sampled time series data[J]. arXiv preprint arXiv:2105.00412, 2021.
  4. Özdemir A, Scerri M, Barron A B, et al. EchoVPR: Echo state networks for visual place recognition[J]. IEEE Robotics and Automation Letters, 2022, 7(2): 4520-4527.
  5. Li Z, Liu Y, Tanaka G. Multi-Reservoir Echo State Networks with Hodrick–Prescott Filter for nonlinear time-series prediction[J]. Applied Soft Computing, 2023, 135: 110021.
  6. Barredo Arrieta A, Gil-Lopez S, Laña I, et al. On the post-hoc explainability of deep echo state networks for time series forecasting, image and video classification[J]. Neural Computing and Applications, 2022: 1-21.
  7. Robust optimization and validation of echo state networks for learning chaotic dynamics
  8. Gallicchio, Claudio and Alessio Micheli. “Deep Echo State Network (DeepESN): A Brief Survey.” ArXiv abs/1712.04323 (2017): n. pag.
  9. Steiner, Peter, Azarakhsh Jalalvand, and Peter Birkholz. “Cluster-based input weight initialization for echo state networks.” IEEE Transactions on Neural Networks and Learning Systems (2022).
  10. Bianchi, Filippo Maria et al. “Bidirectional deep-readout echo state networks.” The European Symposium on Artificial Neural Networks (2017).
  11. Maat, Jacob Reinier et al. “Efficient Optimization of Echo State Networks for Time Series Datasets.” 2018 International Joint Conference on Neural Networks (IJCNN) (2018): 1-7.
  12. Heim, Niklas and James E. Avery. “Adaptive Anomaly Detection in Chaotic Time Series with a Spatially Aware Echo State Network.” ArXiv abs/1909.01709 (2019): n. pag.
  13. Bianchi, Filippo Maria et al. “Reservoir Computing Approaches for Representation and Classification of Multivariate Time Series.” IEEE Transactions on Neural Networks and Learning Systems 32 (2018): 2169-2179.
  14. Lukoševičius, Mantas and Arnas Uselis. “Efficient Cross-Validation of Echo State Networks.” International Conference on Artificial Neural Networks (2019).
  15. Verzelli, Pietro et al. “Echo State Networks with Self-Normalizing Activations on the Hyper-Sphere.” Scientific Reports 9 (2019): n. pag.
  16. Rodriguez, Nathaniel et al. “Optimal modularity and memory capacity of neural reservoirs.” Network Neuroscience 3 (2017): 551 - 566.
  17. Chattopadhyay, Ashesh et al. “Data-driven prediction of a multi-scale Lorenz 96 chaotic system using deep learning methods: Reservoir computing, ANN, and RNN-LSTM.” (2019).
  18. Steiner, Peter, et al. “PyRCN: A toolbox for exploration and application of Reservoir Computing Networks.” Engineering Applications of Artificial Intelligence 113 (2022): 104964.
  19. Strock, Anthony et al. “A Simple Reservoir Model of Working Memory with Real Values.” 2018 International Joint Conference on Neural Networks (IJCNN) (2018): 1-8.
  20. Zhang, Yuanzhao and Sean P. Cornelius. “Catch-22s of reservoir computing.” Physical Review Research (2022): n. pag.
  21. Gao, Ruobin et al. “Dynamic ensemble deep echo state network for significant wave height forecasting.” Applied Energy (2023): n. pag.
  22. Gallicchio, Claudio and Alessio Micheli. “Reservoir Topology in Deep Echo State Networks.” International Conference on Artificial Neural Networks (2019).
  23. Mattheakis, Marios et al. “Unsupervised Reservoir Computing for Solving Ordinary Differential Equations.” ArXiv abs/2108.11417 (2021): n. pag.
  24. Paassen, Benjamin et al. “Tree Echo State Autoencoders with Grammars.” 2020 International Joint Conference on Neural Networks (IJCNN) (2020): 1-8.
  25. Evanusa, Matthew et al. “Hybrid Backpropagation Parallel Reservoir Networks.” ArXiv abs/2010.14611 (2020): n. pag.
  26. Trouvain, Nathan, et al. “Reservoirpy: an efficient and user-friendly library to design echo state networks.” International Conference on Artificial Neural Networks. Cham: Springer International Publishing, 2020.
  27. Cossu, Andrea, et al. “Continual learning with echo state networks.” arXiv preprint arXiv:2105.07674 (2021).
  28. Gauthier, Daniel J., et al. “Next generation reservoir computing.” Nature communications 12.1 (2021): 5564.
  29. Vlachas, Pantelis R., et al. “Backpropagation algorithms and reservoir computing in recurrent neural networks for the forecasting of complex spatiotemporal dynamics.” Neural Networks 126 (2020): 191-217.
  30. Cucchi, Matteo, et al. “Hands-on reservoir computing: a tutorial for practical implementation.” Neuromorphic Computing and Engineering 2.3 (2022): 032002. (practical reservoir-computing tutorial)

4 Reservoir Computing Research

  1. Margin D A, Ivanciu I A, Dobrota V. Deep Reservoir Computing using Echo State Networks and Liquid State Machine[C]//2022 IEEE International Black Sea Conference on Communications and Networking (BlackSeaCom). IEEE, 2022: 208-213.
  2. Bianchi, Filippo Maria et al. “Reservoir Computing Approaches for Representation and Classification of Multivariate Time Series.” IEEE Transactions on Neural Networks and Learning Systems 32 (2018): 2169-2179.
  3. Chattopadhyay, Ashesh et al. “Data-driven prediction of a multi-scale Lorenz 96 chaotic system using deep learning methods: Reservoir computing, ANN, and RNN-LSTM.” (2019).
  4. Steiner, Peter, et al. “PyRCN: A toolbox for exploration and application of Reservoir Computing Networks.” Engineering Applications of Artificial Intelligence 113 (2022): 104964.
  5. Zhang, Yuanzhao and Sean P. Cornelius. “Catch-22s of reservoir computing.” Physical Review Research (2022): n. pag.
  6. Gallicchio, Claudio and Alessio Micheli. “Reservoir Topology in Deep Echo State Networks.” International Conference on Artificial Neural Networks (2019).
  7. Manjunath, G. “Memory-Loss is Fundamental for Stability and Distinguishes the Echo State Property Threshold in Reservoir Computing & Beyond.” ArXiv abs/2001.00766 (2020): n. pag.
  8. Gonon, Lukas et al. “Infinite-dimensional reservoir computing.” ArXiv abs/2304.00490 (2023): n. pag.
  9. Sun, Xiaochuan et al. “Towards Fault Tolerance of Reservoir Computing in Time Series Prediction.” Inf. 14 (2023): 266.
  10. Lee, Kundo and Tomoki Hamagami. “Reservoir Computing for Scalable Hardware with Block‐Based Neural Network.” IEEJ Transactions on Electrical and Electronic Engineering 16 (2021): n. pag.
  11. Ren, Bin and Huanfei Ma. “Global optimization of hyper-parameters in reservoir computing.” Electronic Research Archive (2022): n. pag.
  12. Storm, Lance et al. “Constraints on parameter choices for successful reservoir computing.” ArXiv abs/2206.02575 (2022): n. pag.
  13. Bendali, Wadie et al. “Optimization of Deep Reservoir Computing with Binary Genetic Algorithm for Multi-Time Horizon Forecasting of Power Consumption.” Journal Européen des Systèmes Automatisés (2022): n. pag.
  14. Bacciu, Davide et al. “Federated Reservoir Computing Neural Networks.” 2021 International Joint Conference on Neural Networks (IJCNN) (2021): 1-7.
  15. Mattheakis, Marios et al. “Unsupervised Reservoir Computing for Solving Ordinary Differential Equations.” ArXiv abs/2108.11417 (2021): n. pag. (source code available)
  16. Love, Jake et al. “Task Agnostic Metrics for Reservoir Computing.” ArXiv abs/2108.01512 (2021): n. pag.
  17. Heyder, Florian et al. “Generalizability of reservoir computing for flux-driven two-dimensional convection.” Physical review. E 106 5-2 (2021): 055303.
  18. Honda, Hirotada. “A novel framework for reservoir computing with inertial manifolds.” 2021 International Conference on Artificial Intelligence in Information and Communication (ICAIIC) (2021): 347-352.
  19. Hart, Allen G. “(Thesis) Reservoir Computing With Dynamical Systems.” (2021). (nice visualizations)
  20. Doan, Nguyen Anh Khoa et al. “Auto-Encoded Reservoir Computing for Turbulence Learning.” ArXiv abs/2012.10968 (2020): n. pag.
  21. Gallicchio, Claudio et al. “Frontiers in Reservoir Computing.” The European Symposium on Artificial Neural Networks (2020).
  22. Evanusa, Matthew et al. “Hybrid Backpropagation Parallel Reservoir Networks.” ArXiv abs/2010.14611 (2020): n. pag. (source code available)
  23. Kleyko, Denis, et al. “Integer echo state networks: Efficient reservoir computing for digital hardware.” IEEE Transactions on Neural Networks and Learning Systems 33.4 (2020): 1688-1701.
  24. Huhn, Francisco, and Luca Magri. “Gradient-free optimization of chaotic acoustics with reservoir computing.” Physical Review Fluids 7.1 (2022): 014402.
  25. Alomar, Miquel L., et al. “Efficient parallel implementation of reservoir computing systems.” Neural Computing and Applications 32 (2020): 2299-2313.
  26. Manneschi, Luca, Andrew C. Lin, and Eleni Vasilaki. “SpaRCe: Improved learning of reservoir computing systems through sparse representations.” IEEE Transactions on Neural Networks and Learning Systems (2021).
  27. Damicelli, Fabrizio, Claus C. Hilgetag, and Alexandros Goulas. “Brain connectivity meets reservoir computing.” PLoS Computational Biology 18.11 (2022): e1010639.
  28. Gauthier, Daniel J., et al. “Next generation reservoir computing.” Nature communications 12.1 (2021): 5564. (source code available)
  29. Gallicchio, Claudio. “Sparsity in reservoir computing neural networks.” 2020 International Conference on INnovations in Intelligent SysTems and Applications (INISTA). IEEE, 2020.
  30. Vlachas, Pantelis R., et al. “Backpropagation algorithms and reservoir computing in recurrent neural networks for the forecasting of complex spatiotemporal dynamics.” Neural Networks 126 (2020): 191-217.
  31. Cucchi, Matteo, et al. “Hands-on reservoir computing: a tutorial for practical implementation.” Neuromorphic Computing and Engineering 2.3 (2022): 032002. (source code available; practical reservoir-computing tutorial)
  32. Lim, Soon Hoe, et al. “Predicting critical transitions in multiscale dynamical systems using reservoir computing.” Chaos: An Interdisciplinary Journal of Nonlinear Science 30.12 (2020).

5 Applications

  1. Bouazizi S, Benmohamed E, Ltifi H. Enhancing EEG-based emotion recognition using PSD-Grouped Deep Echo State Network[J]. JUCS: Journal of Universal Computer Science, 2023, 29(10).
  2. Valencia C H, Vellasco M M B R, Figueiredo K. Echo State Networks: Novel reservoir selection and hyperparameter optimization model for time series forecasting[J]. Neurocomputing, 2023, 545: 126317.
  3. Viehweg J, Worthmann K, Mäder P. Parameterizing echo state networks for multi-step time series prediction[J]. Neurocomputing, 2023, 522: 214-228.
  4. Bai, Yu-ting et al. “Nonstationary Time Series Prediction Based on Deep Echo State Network Tuned by Bayesian Optimization.” Mathematics (2023): n. pag.
  5. Bianchi, Filippo Maria et al. “Reservoir Computing Approaches for Representation and Classification of Multivariate Time Series.” IEEE Transactions on Neural Networks and Learning Systems 32 (2018): 2169-2179.
  6. Özdemir, Anil et al. “EchoVPR: Echo State Networks for Visual Place Recognition.” IEEE Robotics and Automation Letters PP (2021): 1-1.
  7. Gao, Ruobin et al. “Dynamic ensemble deep echo state network for significant wave height forecasting.” Applied Energy (2023): n. pag.
  8. Liu, Qianwen et al. “Memory augmented echo state network for time series prediction.” Neural Computing and Applications (2023): 1-16.
  9. Deng, Lichi and Yuewei Pan. “Machine-Learning-Assisted Closed-Loop Reservoir Management Using Echo State Network for Mature Fields under Waterflood.” Spe Reservoir Evaluation & Engineering 23 (2020): n. pag.
  10. Mandal, Swarnendu and Manish Dev Shrimali. “Learning unidirectional coupling using echo-state network.” Physical review. E 107 6-1 (2023): 064205 .
  11. Koprinkova-Hristova, Petia D. et al. “Echo state network for features extraction and segmentation of tomography images.” Computer Science and Information Systems (2023): n. pag.
  12. Soltani, Rebh et al. “Optimized Echo State Network based on PSO and Gradient Descent for Choatic Time Series Prediction.” 2022 IEEE 34th International Conference on Tools with Artificial Intelligence (ICTAI) (2022): 747-754.
  13. Caremel, Cedric et al. “Echo State Network for Soft Actuator Control.” J. Robotics Mechatronics 34 (2022): 413-421.
  14. Ren, Weijie et al. “Time series prediction based on echo state network tuned by divided adaptive multi-objective differential evolution algorithm.” Soft Computing 25 (2021): 4489 - 4502.
  15. Na, Yongsu et al. “Near real-time predictions of tropical cyclone trajectory and intensity in the northwestern Pacific Ocean using echo state network.” Climate Dynamics 58 (2021): 651 - 667.
  16. Gandhi, Manjunath. “An Echo State Network Imparts a Curve Fitting.” IEEE Transactions on Neural Networks and Learning Systems 33 (2021): 2596-2604.
  17. Jere, Shashank et al. “Channel Equalization Through Reservoir Computing: A Theoretical Perspective.” IEEE Wireless Communications Letters 12 (2023): 774-778.
  18. Jordanou, Jean P. et al. “Echo State Networks for Practical Nonlinear Model Predictive Control of Unknown Dynamic Systems.” IEEE Transactions on Neural Networks and Learning Systems 33 (2021): 2615-2629.
  19. Kim, Taehwan and Brian R. King. “Time series prediction using deep echo state networks.” Neural Computing and Applications (2020): 1-19.
  20. Simov, Kiril Ivanov et al. “A Reservoir Computing Approach to Word Sense Disambiguation.” Cognitive Computation 15 (2020): 1409 - 1418.
  21. Cossu, Andrea, et al. “Continual learning with echo state networks.” arXiv preprint arXiv:2105.07674 (2021). (source code available)
  22. Fourati, Rahma, et al. “EEG feature learning with intrinsic plasticity based deep echo state network.” 2020 international joint conference on neural networks (IJCNN). IEEE, 2020.
  23. Fourati, Rahma, et al. “Unsupervised learning in reservoir computing for eeg-based emotion recognition.” IEEE Transactions on Affective Computing 13.2 (2020): 972-984.



