Dynamic Flexible Job Shop Scheduling (DFJSS) is a critical combinatorial optimisation problem, characterised by its dynamic nature and the flexibility of machines. Traditional scheduling methods struggle to adapt to such dynamic and flexible environments. Recently, there has been a trend of employing reinforcement learning (RL) to train scheduling agents that select manual scheduling heuristics at decision points in DFJSS. However, the effectiveness of RL is constrained by the limited efficacy of the manually designed scheduling heuristics. In addition, manually designing a diverse set of scheduling heuristics to serve as actions demands significant expert knowledge. In response, this paper proposes a niching genetic programming (GP)-assisted RL method that leverages the evolutionary capabilities of GP to help RL solve the DFJSS problem effectively. Specifically, instead of using manual scheduling heuristics, the RL actions are scheduling heuristics evolved by the niching GP, which optimises and adapts these heuristics based on real-time feedback from the environment. Experimental results demonstrate the effectiveness of the proposed method in comparison with widely used manual scheduling heuristics and a baseline deep RL method. Further analyses reveal that the effectiveness of the proposed method stems from the behavioural differences among the heuristics learned by the niching GP, which serve as the RL actions. The proposed algorithm also benefits from the comparable contribution percentages of these learned heuristics throughout the long-term scheduling process.
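To illustrate the core idea, the following minimal Python sketch shows how GP-evolved priority functions could serve as the discrete action set of an RL agent at a scheduling decision point. The feature names (`proc_time`, `machine_queue`, `due_date`) and the two example rules are purely illustrative assumptions, not the heuristics learned in the paper.

```python
# Hypothetical sketch: GP-evolved priority functions as discrete RL actions.
# Each heuristic scores candidate operations; the agent picks which rule to
# apply at each decision point. Rules and features here are illustrative only.

def heuristic_a(op):
    # An SPT-flavoured rule: shortest processing time plus queue pressure.
    return op["proc_time"] + 0.5 * op["machine_queue"]

def heuristic_b(op):
    # An EDD-flavoured rule: slack between due date and processing time.
    return op["due_date"] - op["proc_time"]

# In the paper's method, this action set is evolved by niching GP rather
# than hand-designed as it is here.
ACTIONS = [heuristic_a, heuristic_b]

def dispatch(candidates, action_index):
    """Apply the RL-chosen heuristic: pick the operation with lowest priority value."""
    rule = ACTIONS[action_index]
    return min(candidates, key=rule)

# At a decision point, the RL policy would output an action index; mocked here.
candidates = [
    {"proc_time": 5, "machine_queue": 2, "due_date": 20},
    {"proc_time": 3, "machine_queue": 4, "due_date": 15},
]
chosen = dispatch(candidates, action_index=0)
print(chosen["proc_time"])  # → 3 (heuristic_a scores 6.0 vs 5.0)
```

Replacing the hand-written `ACTIONS` list with GP-evolved trees is what removes the need for expert-designed heuristics while keeping the RL agent's discrete action interface unchanged.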
Preferred citation
Xu, M., Mei, Y., Zhang, F. & Zhang, M. (2024). Niching Genetic Programming to Learn Actions for Deep Reinforcement Learning in Dynamic Flexible Scheduling. IEEE Transactions on Evolutionary Computation, PP(99), 1-1. https://doi.org/10.1109/TEVC.2024.3395699