LSTM-SAC reinforcement learning based resilient energy trading for networked microgrid system
dc.contributor.author | Sharma, Desh Deepak | |
dc.contributor.author | Bansal, Ramesh C. | |
dc.date.accessioned | 2025-04-16T12:19:57Z | |
dc.date.available | 2025-04-16T12:19:57Z | |
dc.date.issued | 2025-03 | |
dc.description.abstract | Modern microgrids involve numerous actors operating in highly decentralized environments and liberalized electricity markets. A networked microgrid system must be capable of detecting electricity price changes and unknown variations in the presence of rare and extreme events. The networked microgrid system, comprising interconnected microgrids, must be adaptive and resilient to adverse conditions such as different kinds of faults and interruptions in the main grid supply. Uncertainty and stochasticity in load and distributed generation are considered. In this study, we propose resilient energy trading incorporating DC optimal power flow (DC-OPF), which accounts for generator failures and line outages (topology changes). This paper proposes a Long Short-Term Memory (LSTM) - soft actor-critic (SAC) reinforcement learning design for a platform that achieves resilient peer-to-peer energy trading in networked microgrid systems during extreme events. A Markov Decision Process (MDP) is used to develop the reinforcement learning-based resilient energy trading process, which includes the state transition probability and a grid resilience factor for networked microgrid systems. LSTM-SAC continuously refines policies in real time, ensuring optimal trading strategies in rapidly changing energy markets. LSTM networks are used to estimate the optimal Q-values in soft actor-critic reinforcement learning. This learning mechanism handles out-of-range Q-value estimates while mitigating gradient problems. The optimal actions are chosen to maximize rewards for peer-to-peer resilient energy trading. The networked microgrid system is trained with the proposed learning mechanism for resilient energy trading. The proposed LSTM-SAC reinforcement learning is tested on a networked microgrid system composed of IEEE 14-bus systems. | en_US |
dc.description.department | Electrical, Electronic and Computer Engineering | en_US |
dc.description.librarian | hj2024 | en_US |
dc.description.sdg | SDG-07: Affordable and clean energy | en_US |
dc.description.uri | https://www.aimspress.com/journal/electreng | en_US |
dc.identifier.citation | Sharma, D.D. & Bansal, R.C. 2025, 'LSTM-SAC reinforcement learning based resilient energy trading for networked microgrid system', AIMS Electronics and Electrical Engineering, vol. 9, no. 2, pp. 165-191, doi: 10.3934/electreng.2025009. | en_US |
dc.identifier.issn | 2578-1588 (online) | |
dc.identifier.other | 10.3934/electreng.2025009 | |
dc.identifier.uri | http://hdl.handle.net/2263/102140 | |
dc.language.iso | en | en_US |
dc.publisher | AIMS Press | en_US |
dc.rights | © 2025 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0). | en_US |
dc.subject | Long short-term memory (LSTM) | en_US |
dc.subject | Networked microgrids | en_US |
dc.subject | Reinforcement learning (RL) | en_US |
dc.subject | Resilient energy trading | en_US |
dc.subject | Soft actor-critic (SAC) | en_US |
dc.subject | Renewable energy | en_US |
dc.subject | Distributed energy resources (DER) | en_US |
dc.subject | DC optimal power flow (DC-OPF) | en_US |
dc.subject | SDG-07: Affordable and clean energy | en_US |
dc.title | LSTM-SAC reinforcement learning based resilient energy trading for networked microgrid system | en_US |
dc.type | Article | en_US |