An approximate dynamic programming approach to project scheduling with uncertain resource availabilities

Abstract
We study the stochastic resource-constrained project scheduling problem with uncertain resource availability (SRCPSP-URA) and model it as a sequential decision problem. We develop a new Markov decision process (MDP) model for the SRCPSP-URA that dynamically and adaptively determines not only which activity to start at each stage, but also which activities to interrupt and delay when resource capacity is insufficient. To tackle the curse of dimensionality that makes an exact solution approach intractable, we devise and implement a rollout-based approximate dynamic programming (ADP) algorithm with a priority-rule heuristic as the base policy, and prove its sequential improvement property. Computational results show that, at a moderate increase in computation time, our ADP algorithm significantly outperforms the priority-rule heuristics on test instances with up to 120 activities.
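The rollout scheme summarized in the abstract can be sketched generically: at each stage, try every feasible action, complete the remaining trajectory with the base policy, and commit to the action with the best simulated cost. The toy model, function names, and costs below are illustrative assumptions only, not the paper's SRCPSP-URA formulation or its priority rules.

```python
import random

def rollout_action(state, actions, step, base_policy, is_terminal,
                   n_sims=10, rng=None):
    """One-step lookahead with rollout: try each feasible action, then follow
    the base policy to completion; return the action with the lowest average
    simulated cost. For a deterministic model, n_sims=1 suffices."""
    rng = rng or random.Random(0)
    best_action, best_cost = None, float("inf")
    for a in actions(state):
        total = 0.0
        for _ in range(n_sims):
            s, cost = step(state, a, rng)
            while not is_terminal(s):
                s, c = step(s, base_policy(s), rng)
                cost += c
            total += cost
        if total / n_sims < best_cost:
            best_action, best_cost = a, total / n_sims
    return best_action

# Hypothetical deterministic toy model: `state` units of work remain; each
# stage we finish 1 unit (cost 2) or, if enough work remains, 2 units
# (cost 3). The myopic base policy always takes 1 unit, so rollout can
# only match or improve on it (the sequential improvement property).
def actions(s):      return [1, 2] if s >= 2 else [1]
def step(s, a, rng): return s - a, 2.0 if a == 1 else 3.0
def base_policy(s):  return 1
def is_terminal(s):  return s == 0

def run(policy, s=10):
    """Control loop: apply `policy` until the work is done; return total cost."""
    cost, rng = 0.0, random.Random(0)
    while not is_terminal(s):
        s, c = step(s, policy(s), rng)
        cost += c
    return cost

base_cost = run(base_policy)     # 10 one-unit steps -> 20.0
rollout_cost = run(lambda s: rollout_action(
    s, actions, step, base_policy, is_terminal, n_sims=1))  # -> 15.0
```

In this toy instance the rollout policy always prefers the 2-unit action (simulated cost 3 + 2(n-2) beats 2 + 2(n-1)), illustrating how one-step lookahead over a simple base policy already yields a strictly better schedule.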
| Original language | English |
|---|---|
| Pages (from-to) | 226-243 |
| Number of pages | 18 |
| Journal | Applied Mathematical Modelling |
| Volume | 97 |
| DOIs | |
| State | Published - Sep 2021 |
Keywords
- Approximate dynamic programming
- Markov decision process
- Rollout policy
- Stochastic resource-constrained project scheduling
- Uncertain resource availability