LSTM-SAC reinforcement learning based resilient energy trading for networked microgrid system


Authors

Sharma, Desh Deepak
Bansal, Ramesh C.

Publisher

AIMS Press

Abstract

Modern microgrids involve numerous actors operating in highly decentralized environments and liberalized electricity markets. A networked microgrid system must be capable of detecting electricity price changes and unknown variations in the presence of rare and extreme events. Composed of interconnected microgrids, it must also be adaptive and resilient to undesirable conditions such as faults of different kinds and interruptions in the main grid supply. Uncertainties and stochasticity in the load and distributed generation are considered. In this study, we propose resilient energy trading that incorporates DC optimal power flow (DC-OPF) and accounts for generator failures and line outages (topology changes). The paper proposes a Long Short-Term Memory (LSTM)-soft actor-critic (SAC) reinforcement learning design as a platform for resilient peer-to-peer energy trading in networked microgrid systems during extreme events. A Markov Decision Process (MDP) is used to formulate the reinforcement learning-based resilient energy trading process, including the state transition probability and a grid resilience factor for the networked microgrid system. LSTM-SAC continuously refines policies in real time, ensuring optimal trading strategies in rapidly changing energy markets. LSTM networks are used to estimate the optimal Q-values in the soft actor-critic algorithm; this learning mechanism handles out-of-range Q-value estimates while reducing gradient problems. Optimal actions are selected to maximize the reward for peer-to-peer resilient energy trading. The networked microgrid system is trained with the proposed learning mechanism, and the resulting LSTM-SAC reinforcement learning is tested on a networked microgrid system composed of IEEE 14-bus systems.
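
For orientation, the following is a minimal, illustrative sketch (not the authors' implementation) of the two ingredients named in the abstract: an LSTM network used as the critic that estimates Q-values, and the SAC-style soft target it would be trained against. All dimensions, the hidden size, the discount factor gamma, and the entropy coefficient alpha are illustrative assumptions, not values from the paper.

import torch
import torch.nn as nn

class LSTMQNetwork(nn.Module):
    """Q(s_1..s_T, a): encode a short history of microgrid states with an LSTM,
    then score the proposed trading action with a small MLP head."""
    def __init__(self, state_dim: int, action_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(state_dim, hidden_dim, batch_first=True)
        self.head = nn.Sequential(
            nn.Linear(hidden_dim + action_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, state_seq: torch.Tensor, action: torch.Tensor) -> torch.Tensor:
        # state_seq: (batch, seq_len, state_dim); action: (batch, action_dim)
        _, (h_n, _) = self.lstm(state_seq)   # final hidden state summarizes the sequence
        return self.head(torch.cat([h_n[-1], action], dim=-1))

def soft_q_target(reward, next_q1, next_q2, next_log_prob,
                  gamma=0.99, alpha=0.2, done=0.0):
    # Standard SAC soft target: r + gamma * (min_i Q_i(s', a') - alpha * log pi(a'|s'))
    next_v = torch.min(next_q1, next_q2) - alpha * next_log_prob
    return reward + gamma * (1.0 - done) * next_v

# Example usage with illustrative dimensions (batch of 8, 24-step state history):
q_net = LSTMQNetwork(state_dim=10, action_dim=3)
q_value = q_net(torch.randn(8, 24, 10), torch.randn(8, 3))   # shape: (8, 1)

In the paper's setting, the state history would carry quantities such as prices, loads, and generation of the networked microgrids, and the action would encode the peer-to-peer trading decision; the exact state, action, and reward definitions are given in the article itself.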

Keywords

Long short-term memory (LSTM), Networked microgrids, Reinforcement learning (RL), Resilient energy trading, Soft actor-critic (SAC), Renewable energy, Distributed energy resources (DER), DC optimal power flow (DC-OPF), SDG-07: Affordable and clean energy

Sustainable Development Goals

SDG-07: Affordable and clean energy

Citation

Sharma, D.D. & Bansal, R.C. 2025, 'LSTM-SAC reinforcement learning based resilient energy trading for networked microgrid system', AIMS Electronics and Electrical Engineering, vol. 9, no. 2, pp. 165-191, doi: 10.3934/electreng.2025009.