TY - GEN
T1 - Multi-Objective Battery Dispatching using an Enhanced SAC Algorithm
AU - Zendehdel, Danial
AU - De Santis, Enrico
AU - Capillo, Antonino
AU - Odonkor, Philip
AU - Rizzi, Antonello
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Renewable Energy Communities (RECs) have emerged as a promising solution to the intermittent nature of Renewable Energy Sources (RES). The stochastic nature of RES demands the presence of Energy Storage Systems (ESS) to increase self-consumption. Such systems, however, must be properly managed to enhance battery life, minimize wear costs, and ensure safe operating conditions. RECs face challenges such as low self-consumption and limited battery lifespan. This paper proposes an enhanced Soft Actor-Critic (SAC) algorithm tailored to optimize battery dispatch in RECs. Unlike standard SAC, which primarily relies on action clipping, our approach directly penalizes constraint violations within the agent's objective, guiding it toward more feasible and profitable dispatch strategies. The method maintains optimal battery State of Charge (SoC), extends battery life, and maximizes economic returns from solar energy usage. Compared to the standard SAC model, our Lagrange-SAC approach achieves an 18.2% improvement in the mean Self-Sufficiency Ratio (SSR), significantly increasing the efficiency of solar energy utilization. These findings highlight the potential of advanced reinforcement learning techniques to enhance real-world energy systems, promote sustainable practices, and improve the resilience of energy infrastructures.
KW - Battery Management System
KW - Microgrids
KW - Reinforcement Learning
KW - Renewable Energy
KW - State of Charge
KW - State of Health
UR - https://www.scopus.com/pages/publications/105023965628
UR - https://www.scopus.com/pages/publications/105023965628#tab=citedBy
U2 - 10.1109/IJCNN64981.2025.11227829
DO - 10.1109/IJCNN64981.2025.11227829
M3 - Conference contribution
AN - SCOPUS:105023965628
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - International Joint Conference on Neural Networks, IJCNN 2025 - Proceedings
T2 - 2025 International Joint Conference on Neural Networks, IJCNN 2025
Y2 - 30 June 2025 through 5 July 2025
ER -