Abstract:
When navigating unknown environments, mobile robots are constrained by perceptual and localization uncertainty, making robust path planning in dynamic, uncertain settings crucial. This paper proposes a path planning framework that integrates probabilistic state modeling based on Partially Observable Markov Decision Processes (POMDPs) with the heuristic search of the A* algorithm. Probabilistic modeling is used to capture the positional uncertainty of dynamic obstacles. To improve the algorithm's adaptability across varying environmental states, the heuristic function is refined and a multi-condition-triggered replanning mechanism is introduced. A KD-Tree for nearest-neighbor queries reduces collision-detection complexity from O(n) to O(log n). Simulation results show that the proposed algorithm significantly outperforms the comparative methods in median cumulative reward, achieves an average single-step decision time of 0.01 s (an order of magnitude faster than the POMCP algorithm), and maintains a 65% task success rate even at observation noise levels as high as 0.45, demonstrating clear advantages in path quality, planning efficiency, and robustness.
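The KD-Tree-based collision check summarized above can be sketched as follows. This is a minimal, self-contained 2D illustration, not the paper's implementation: the obstacle coordinates, safety radius, and helper names (`build_kdtree`, `nearest`, `collides`) are hypothetical, and the tree stores obstacle positions so the nearest-neighbor query replaces an O(n) scan over all obstacles.

```python
# Minimal 2D KD-tree sketch for nearest-obstacle collision checks.
# Assumption: obstacles are point positions; a robot pose "collides"
# if its nearest obstacle lies within a safety radius.

def dist2(a, b):
    """Squared Euclidean distance between 2D points."""
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def build_kdtree(points, depth=0):
    """Recursively build a KD-tree, splitting on x/y alternately."""
    if not points:
        return None
    axis = depth % 2
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {
        "point": points[mid],
        "axis": axis,
        "left": build_kdtree(points[:mid], depth + 1),
        "right": build_kdtree(points[mid + 1:], depth + 1),
    }

def nearest(node, target, best=None):
    """Return the stored point closest to `target` (O(log n) on average)."""
    if node is None:
        return best
    point, axis = node["point"], node["axis"]
    if best is None or dist2(target, point) < dist2(target, best):
        best = point
    diff = target[axis] - point[axis]
    near, far = ((node["left"], node["right"]) if diff < 0
                 else (node["right"], node["left"]))
    best = nearest(near, target, best)
    # Only descend into the far subtree if the splitting plane is
    # closer to the target than the best candidate found so far.
    if diff * diff < dist2(target, best):
        best = nearest(far, target, best)
    return best

def collides(tree, pos, safety_radius):
    """True if the nearest obstacle is within the safety radius of `pos`."""
    nb = nearest(tree, pos)
    return nb is not None and dist2(pos, nb) <= safety_radius ** 2

# Hypothetical obstacle layout for illustration only.
obstacles = [(2, 3), (5, 1), (9, 6), (4, 7), (8, 2)]
tree = build_kdtree(obstacles)
print(collides(tree, (4.5, 6.5), 1.0))  # True: (4, 7) is within radius
print(collides(tree, (0.0, 0.0), 1.0))  # False: nearest obstacle is (2, 3)
```

In a planner loop, the tree would be rebuilt (or updated) whenever the estimated obstacle positions change, and each expanded A* node would be checked with a single `collides` call instead of a full scan over all obstacles.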