In this study, we investigate optimal transmission policies in an energy harvesting status update system in which the demand for status updates depends on the state of the source. The system monitors a two-state Markovian source, i.e., a stochastic process that can be in either a normal state or an alarm state, with a higher demand for fresh updates when the source is in the alarm state. To capture the freshness of status updates in each state of the process, we introduce two Age of Information (AoI) variables, extending the definition of AoI to account for the state changes of the observed process. We formulate the problem as a Markov Decision Process (MDP) with a transition cost function that applies linear and non-linear penalties depending on the AoI and the state of the process. We analytically characterize the structure of the optimal transmission policy of the resulting MDP. Finally, we evaluate the derived policies through numerical results and demonstrate their effectiveness in conserving energy in anticipation of forthcoming alarm states.
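To make the formulation concrete, the following is a minimal value-iteration sketch of an MDP of this flavor. It is not the paper's exact model: the state (two capped AoI counters, the binary source state, a battery level), the Bernoulli energy-harvesting and channel-success probabilities, the unit transmission cost, and the linear-versus-quadratic penalty split are all illustrative assumptions.

```python
import itertools

# Illustrative parameters (assumptions, not taken from the paper).
A_MAX, B_MAX = 10, 5      # AoI cap and battery capacity
P_N2A, P_A2N = 0.1, 0.3   # source transitions: normal->alarm, alarm->normal
P_HARVEST = 0.4           # probability of harvesting one energy unit per slot
P_SUCCESS = 0.9           # probability a transmission is delivered
GAMMA = 0.95              # discount factor

def cost(a_n, a_a, s):
    # Linear AoI penalty in the normal state (s == 0), quadratic in the
    # alarm state (s == 1): alarms demand fresher updates.
    return a_n if s == 0 else a_a ** 2

def step(a_n, a_a, s, b, transmit):
    """Enumerate (probability, next_state) pairs for one slot."""
    tx = transmit and b > 0  # transmitting spends one energy unit
    branches = [(P_SUCCESS, True), (1 - P_SUCCESS, False)] if tx else [(1.0, False)]
    p_flip = P_N2A if s == 0 else P_A2N
    out = []
    for p_ok, ok in branches:
        # A delivered update resets the AoI variable of the current source
        # state; both ages otherwise grow, capped at A_MAX.
        n_n = 1 if (ok and s == 0) else min(a_n + 1, A_MAX)
        n_a = 1 if (ok and s == 1) else min(a_a + 1, A_MAX)
        for p_s, s2 in ((1 - p_flip, s), (p_flip, 1 - s)):
            for p_h, h in ((P_HARVEST, 1), (1 - P_HARVEST, 0)):
                b2 = min(b - (1 if tx else 0) + h, B_MAX)
                out.append((p_ok * p_s * p_h, (n_n, n_a, s2, b2)))
    return out

states = list(itertools.product(range(1, A_MAX + 1), range(1, A_MAX + 1),
                                (0, 1), range(B_MAX + 1)))
V = {x: 0.0 for x in states}

# Value iteration on the discounted-cost Bellman equation.
for _ in range(200):
    V = {x: min(cost(*x[:3]) + GAMMA * sum(p * V[y] for p, y in step(*x, a))
                for a in (False, True))
         for x in V}

# Greedy stationary policy: transmit iff it minimizes the expected cost.
policy = {x: min((False, True),
                 key=lambda a: cost(*x[:3]) +
                     GAMMA * sum(p * V[y] for p, y in step(*x, a)))
          for x in V}

# Example query: alarm state, stale alarm-AoI, one energy unit in the
# battery; under these parameters the policy typically transmits here.
print(policy[(1, A_MAX, 1, 1)])
```

Under assumptions like these, the resulting policy tends to hold back transmissions at low battery levels while the source is in the normal state, which is one way to illustrate the energy-conserving behavior described above.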