Maintenance strategy optimization using a continuous-state partially observable semi-Markov decision process

Zhou, Yifan, Ma, Lin, Mathew, Joseph, Sun, Yong and Wolff, Rodney (2011). Maintenance strategy optimization using a continuous-state partially observable semi-Markov decision process. In: Prognostics and Health Management, 2010 Reliability of Compound Semiconductors (ROCS) Workshop, Portland, OR, USA, 17 May 2010, pp. 300-309. doi:10.1016/j.microrel.2010.09.023


Author Zhou, Yifan
Ma, Lin
Mathew, Joseph
Sun, Yong
Wolff, Rodney
Title of paper Maintenance strategy optimization using a continuous-state partially observable semi-Markov decision process
Conference name 2010 Reliability of Compound Semiconductors (ROCS) Workshop
Conference location Portland, OR, USA
Conference dates 17 May 2010
Proceedings title Prognostics and Health Management
Journal name Microelectronics Reliability
Place of Publication Kidlington, Oxford, United Kingdom
Publisher Pergamon
Publication Year 2011
Sub-type Fully published paper
DOI 10.1016/j.microrel.2010.09.023
ISSN 0026-2714
1872-941X
Volume 51
Issue 2
Start page 300
End page 309
Total pages 10
Language eng
Abstract/Summary Due to the limitations of current condition monitoring technologies, estimates of asset health states may contain uncertainties. A maintenance strategy that ignores this uncertainty in the asset health state can incur additional cost or downtime. The partially observable Markov decision process (POMDP) is a commonly used approach to deriving optimal maintenance strategies when asset health inspections are imperfect. However, existing applications of the POMDP to maintenance decision-making largely adopt discrete-time and discrete-state assumptions. The discrete-time assumption requires that health state transitions and maintenance activities happen only at discrete epochs, which cannot model failure times accurately and is not cost-effective. The discrete health state assumption, on the other hand, may not be elaborate enough to improve the effectiveness of maintenance. To address these limitations, this paper proposes a continuous-state partially observable semi-Markov decision process (POSMDP). An algorithm that combines a Monte Carlo-based density projection method with policy iteration is developed to solve the POSMDP. Different types of maintenance activities (i.e., inspections, replacement, and imperfect maintenance) are considered. The next maintenance action and the corresponding waiting duration are optimized jointly with respect to the long-run expected cost per unit time and availability. Simulation studies show that the proposed maintenance optimization approach is more cost-effective than maintenance strategies derived by two other approximate methods when regular inspection intervals are adopted. The simulation studies also show that the maintenance cost can be further reduced by developing maintenance strategies with state-dependent maintenance intervals using the POSMDP. In addition, the proposed POSMDP demonstrates the ability to adopt a cost-effective strategy structure when multiple types of maintenance activities are involved.
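
The following is a minimal illustrative sketch, not the authors' POSMDP algorithm: it simulates a belief-threshold replacement policy for a continuously degrading asset observed through noisy periodic inspections, using a particle-based (Monte Carlo) belief over the hidden health state. The density-projection and policy-iteration machinery of the paper, state-dependent waiting times, and imperfect maintenance are all omitted, and every parameter value and name below is hypothetical.

import numpy as np

rng = np.random.default_rng(0)

DRIFT, VOL = 0.10, 0.05      # hypothetical degradation drift and diffusion per unit time
OBS_NOISE = 0.10             # standard deviation of inspection error
FAIL_LEVEL = 1.0             # degradation level at which the asset fails
C_INSPECT, C_REPLACE, C_FAIL = 1.0, 50.0, 200.0
INSPECT_DT = 1.0             # fixed inspection interval (time units)
N_PART = 500                 # number of particles in the belief

def step_state(x, dt):
    """Advance the hidden degradation state over a duration dt."""
    return x + DRIFT * dt + VOL * np.sqrt(dt) * rng.standard_normal(np.shape(x))

def long_run_cost(replace_threshold, horizon=5000.0):
    """Estimate the long-run expected cost per unit time of a belief-threshold policy."""
    t, cost, x = 0.0, 0.0, 0.0
    particles = np.zeros(N_PART)              # Monte Carlo belief over the hidden state
    while t < horizon:
        x = step_state(x, INSPECT_DT)         # true degradation until the next inspection
        particles = step_state(particles, INSPECT_DT)
        t += INSPECT_DT
        if x >= FAIL_LEVEL:                   # failure occurred before the inspection
            cost += C_FAIL
            x, particles = 0.0, np.zeros(N_PART)
            continue
        y = x + OBS_NOISE * rng.standard_normal()           # imperfect inspection
        w = np.exp(-0.5 * ((y - particles) / OBS_NOISE) ** 2)
        particles = rng.choice(particles, size=N_PART, p=w / w.sum())
        cost += C_INSPECT
        if particles.mean() >= replace_threshold:            # act on the belief, not the true state
            cost += C_REPLACE
            x, particles = 0.0, np.zeros(N_PART)
    return cost / t

# Crude grid search over the replacement threshold; the paper instead optimizes the
# policy (including state-dependent waiting times) via density projection and policy iteration.
best = min(np.linspace(0.4, 0.9, 6), key=long_run_cost)
print("Best belief threshold (illustrative):", best)
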
Keyword Asset health state
Markov decision processes
Discrete-time assumption
Q-Index Code C1
Q-Index Status Provisional Code
Institutional Status Non-UQ

Document type: Conference Paper
Collection: W.H. Bryan Mining Geology Research Centre
 
Citation counts: Cited 9 times in Thomson Reuters Web of Science. Cited 13 times in Scopus.
Created: Wed, 18 Jun 2014, 01:01:09 EST by Rodney Wolff on behalf of WH Bryan Mining and Geology Centre