From States to Transitions: Discrete-Time Markov Chains in Affect Dynamics Psychometric Models
Affect dynamics, the study of how patterns of emotional responses change over time, has emerged as a key field of research in mathematical psychology. Traditionally, affect dynamics research has relied on the Experience Sampling Method (ESM), a data-gathering technique in which participants report their feelings, thoughts, and behaviors at various times throughout the day. The resulting Intensive Longitudinal Data (ILD) are typically analyzed with mixed linear or nonlinear models (MLM) or vector autoregressive models (VAR). These models characterize emotion in terms of time and complexity, but they overlook a fundamental unit of affect dynamics: the transition between affective states. Although emotions unfold sequentially, each transition relates the previous state to the current one. Individuals can experience and report many emotions at the same time, but one feeling often takes precedence, influencing or being compared to the one before it. In this paper, we show how to use and implement discrete-time Markov chains to model each transition between past and current emotional states while, as the Markov property requires, disregarding earlier transitions. With Markov chains, researchers can quantify the probabilities of transitioning between distinct emotional states over time, allowing for a better understanding of affect dynamics. This approach not only overcomes the constraints of traditional data-gathering and analysis methods but also allows for a more fine-grained investigation of the processes driving emotional fluctuations.
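As an illustrative sketch of the core idea, the transition probabilities of a first-order discrete-time Markov chain can be estimated by maximum likelihood: count each observed transition between consecutive sampling moments and normalize each row of counts. The emotion labels and the example sequence below are hypothetical, not drawn from the paper's data.

```python
def transition_matrix(sequence, states):
    """Maximum-likelihood estimate of a first-order DTMC transition matrix.

    Counts transitions between consecutive observations and normalizes
    each row so that outgoing probabilities from a state sum to one.
    """
    counts = {s: {t: 0 for t in states} for s in states}
    for prev, curr in zip(sequence, sequence[1:]):
        counts[prev][curr] += 1
    matrix = {}
    for s in states:
        total = sum(counts[s].values())
        # States never observed as a starting point keep an all-zero row
        matrix[s] = {t: (counts[s][t] / total if total else 0.0)
                     for t in states}
    return matrix

# Hypothetical ESM sequence: one dominant emotion per sampling moment
states = ["happy", "sad", "anxious"]
seq = ["happy", "happy", "sad", "anxious", "sad", "happy", "happy", "anxious"]
P = transition_matrix(seq, states)
# e.g., P["happy"]["happy"] is the estimated probability of staying happy
```

In this toy sequence, "happy" is followed by "happy" in 2 of its 4 observed transitions, so `P["happy"]["happy"]` is 0.5; each row of `P` with any observed transitions sums to one, as a stochastic matrix requires.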