LSTM-SAC reinforcement learning based resilient energy trading for networked microgrid system

dc.contributor.author: Sharma, Desh Deepak
dc.contributor.author: Bansal, Ramesh C.
dc.date.accessioned: 2025-04-16T12:19:57Z
dc.date.available: 2025-04-16T12:19:57Z
dc.date.issued: 2025-03
dc.description.abstract: Modern microgrids comprise numerous actors operating in highly decentralized environments and liberalized electricity markets. A networked microgrid system must be able to detect electricity price changes and unknown variations in the presence of rare and extreme events, and the interconnected microgrids it comprises must remain adaptive and resilient under undesirable conditions such as faults of various kinds and interruptions in the main grid supply. Uncertainty and stochasticity in load and distributed generation are considered. In this study, we propose resilient energy trading incorporating DC optimal power flow (DC-OPF), which accounts for generator failures and line outages (topology changes). The paper designs a Long Short-Term Memory (LSTM) soft actor-critic (SAC) reinforcement learning scheme as a platform for resilient peer-to-peer energy trading in networked microgrid systems during extreme events. A Markov Decision Process (MDP) formulates the reinforcement-learning-based resilient energy trading process, including the state transition probability and a grid resilience factor for the networked microgrid system. LSTM-SAC continuously refines policies in real time, ensuring optimal trading strategies in rapidly changing energy markets. LSTM networks are used to estimate the optimal Q-values in soft actor-critic reinforcement learning; this learning mechanism handles out-of-range Q-value estimates while mitigating gradient problems. Optimal actions are selected to maximize the reward for peer-to-peer resilient energy trading. The networked microgrid system is trained with the proposed learning mechanism, and the proposed LSTM-SAC reinforcement learning is tested on a networked microgrid system composed of IEEE 14-bus systems.
dc.description.department: Electrical, Electronic and Computer Engineering
dc.description.librarian: hj2024
dc.description.sdg: SDG-07: Affordable and clean energy
dc.description.uri: https://www.aimspress.com/journal/electreng
dc.identifier.citation: Sharma, D.D. & Bansal, R.C. 2025, 'LSTM-SAC reinforcement learning based resilient energy trading for networked microgrid system', AIMS Electronics and Electrical Engineering, vol. 9, no. 2, pp. 165-191, doi: 10.3934/electreng.2025009.
dc.identifier.issn: 2578-1588 (online)
dc.identifier.other: 10.3934/electreng.2025009
dc.identifier.uri: http://hdl.handle.net/2263/102140
dc.language.iso: en
dc.publisher: AIMS Press
dc.rights: © 2025 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0).
dc.subject: Long short-term memory (LSTM)
dc.subject: Networked microgrids
dc.subject: Reinforcement learning (RL)
dc.subject: Resilient energy trading
dc.subject: Soft actor-critic (SAC)
dc.subject: Renewable energy
dc.subject: Distributed energy resources (DER)
dc.subject: DC optimal power flow (DC-OPF)
dc.subject: SDG-07: Affordable and clean energy
dc.title: LSTM-SAC reinforcement learning based resilient energy trading for networked microgrid system
dc.type: Article
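The abstract describes critics that estimate Q-values within soft actor-critic learning. As background, SAC updates its critics toward a soft Bellman target that adds an entropy bonus to the reward and takes the minimum of two critic estimates to curb overestimation. The sketch below shows only that standard target computation in plain Python; the function name and all numeric values are illustrative, and the paper's LSTM critic networks and grid-resilience reward shaping are not reproduced here.

```python
def soft_bellman_target(reward, gamma, q1_next, q2_next, log_prob_next, alpha):
    """Standard SAC soft Bellman backup (illustrative, not the paper's code).

    target = r + gamma * (min(Q1', Q2') - alpha * log pi(a'|s'))

    The min over two critics (clipped double-Q) damps overestimated
    Q-values; -alpha * log_prob is the entropy bonus that keeps the
    policy exploratory.
    """
    soft_value = min(q1_next, q2_next) - alpha * log_prob_next
    return reward + gamma * soft_value

# Hypothetical numbers for illustration only:
target = soft_bellman_target(reward=1.0, gamma=0.99,
                             q1_next=5.0, q2_next=4.5,
                             log_prob_next=-1.2, alpha=0.2)
# → 5.6926  (1.0 + 0.99 * (4.5 + 0.24))
```

In an LSTM-based variant such as the one the abstract describes, `q1_next` and `q2_next` would come from recurrent critic networks evaluated on a history of states rather than from a single feed-forward pass.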

Files

Original bundle

Name: Sharma_LSTM_2025.pdf
Size: 1.5 MB
Format: Adobe Portable Document Format
Description: Article

License bundle

Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed upon to submission