3. Vector Autoregressive (VAR) Model
The construction of a VAR model starts with a time series vector of \(K\) observables, \(y_t = (y_{1t}, \dots, y_{Kt})'\).
The DGP consists of a deterministic part \(\mu_t\) and a stochastic part \(x_t\), i.e. \(y_t = \mu_t + x_t\), with \(E[y_t] = \mu_t\) as the expected value. The deterministic term \(\mu_t\) can be a constant, polynomial trend terms, seasonal dummies, or more.
The stochastic part \(x_t\) follows a linear VAR process of order \(p\), \(x_t = A_1 x_{t-1} + \dots + A_p x_{t-p} + u_t\),
where \(u_t\) is white noise with \(E[u_t] = 0\), \(E[u_t u_s'] = 0 \; \forall s \neq t\) and \(E[u_t u_t'] = \Sigma_u\), such that \(u_t \sim (0, \Sigma_u)\).
It is convenient to rewrite the former expression for the stochastic part in lag-operator notation as \(A(L) x_t = u_t\),
with \(A(L) = I_K - A_1 L - A_2 L^2 - \dots - A_p L^p\). Inserting this equation into the DGP equation yields \(A(L)(y_t - \mu_t) = u_t\).
If the deterministic term is just a constant, i.e. \(\mu_t = \mu_0\), the process can be written as \(y_t = \nu + A_1 y_{t-1} + \dots + A_p y_{t-p} + u_t\), where \(\nu = A(L) \mu_0 = A(1) \mu_0 = (I_K - \sum^p_{j=1} A_j) \mu_0\).
The process is stable if all roots of the polynomial \(\det(I_K - A_1 z - \dots - A_p z^p)\) lie outside the unit circle; a numerical check of this condition is sketched below. Under the common assumptions of
- a constant mean term
- white noise with a time-invariant covariance matrix

a stable process is also stationary.
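One way to check this condition numerically is to stack the coefficient matrices into the companion form and inspect the eigenvalue moduli, which must all be smaller than one. The following is a minimal sketch assuming NumPy; the function name `is_stable` and the coefficient matrices are purely illustrative.

```python
import numpy as np

def is_stable(coef_matrices):
    """Stability check for a VAR(p): all eigenvalues of the companion
    matrix must lie strictly inside the unit circle (equivalent to all
    roots of det(I_K - A_1 z - ... - A_p z^p) lying outside it)."""
    p = len(coef_matrices)
    K = coef_matrices[0].shape[0]
    companion = np.zeros((K * p, K * p))
    companion[:K, :] = np.hstack(coef_matrices)   # top block row: A_1 ... A_p
    companion[K:, :-K] = np.eye(K * (p - 1))      # shifted identities below
    return bool(np.all(np.abs(np.linalg.eigvals(companion)) < 1))

# Illustrative VAR(2) with K = 2 observables.
A1 = np.array([[0.5, 0.1], [0.0, 0.4]])
A2 = np.array([[0.2, 0.0], [0.0, 0.2]])
print(is_stable([A1, A2]))  # True for this choice
```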
A stable \(VAR(p)\) process can be represented as a moving average by successive substitution. Consider the example of the following \(VAR(1)\): \(y_t = \nu + A_1 y_{t-1} + u_t\). Substituting repeatedly gives \(y_t = (I_K + A_1 + \dots + A_1^{j-1}) \nu + A_1^j y_{t-j} + \sum^{j-1}_{i=0} A_1^i u_{t-i}\).
If all eigenvalues of \(A_1\) have modulus < 1, the sequence \(A_1^i, i = 0, 1, \dots\) is absolutely summable, the term \(A_1^j y_{t-j}\) vanishes and the deterministic part converges to \((I_K - A_1)^{-1} \nu\) for \(j \to \infty\), yielding the moving average representation \(y_t = (I_K - A_1)^{-1} \nu + \sum^\infty_{i=0} A_1^i u_{t-i}\).
We can also obtain the Wold representation of the process. Rewrite the model as \(A(L) y_t = \nu + u_t\).
Then, let \(\phi(L) = \sum^\infty_{i=0} \phi_i L^i\) such that \(\phi(L) A(L) = I_K\). Premultiplying by \(\phi(L)\) yields \(y_t = \phi(L) \nu + \phi(L) u_t = \mu + \sum^\infty_{i=0} \phi_i u_{t-i}\),
where \(\phi(L)\) is often denoted as \(A(L)^{-1}\), i.e. the inverse of the lag polynomial. \(A(L)\) is invertible if \(|A(z)| \neq 0\) for \(|z| \leq 1\), which is the stability condition. Each element \(\phi_i\) can be computed recursively with \(\phi_0 = I_K\) and \(\phi_i = \sum^i_{j=1} \phi_{i-j} A_j\), where \(A_j = 0\) for \(j > p\). For a stable process, \(\phi_i \to 0\) as \(i \to \infty\).
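A direct translation of this recursion, assuming NumPy and reusing the illustrative coefficient matrices from above (`ma_coefficients` is a hypothetical helper name):

```python
import numpy as np

def ma_coefficients(coef_matrices, n_terms):
    """Compute phi_0, ..., phi_{n_terms - 1} via phi_0 = I_K and
    phi_i = sum_{j=1}^{i} phi_{i-j} A_j, with A_j = 0 for j > p."""
    p = len(coef_matrices)
    K = coef_matrices[0].shape[0]
    phi = [np.eye(K)]
    for i in range(1, n_terms):
        phi_i = sum(phi[i - j] @ coef_matrices[j - 1]
                    for j in range(1, min(i, p) + 1))
        phi.append(phi_i)
    return phi

A1 = np.array([[0.5, 0.1], [0.0, 0.4]])
A2 = np.array([[0.2, 0.0], [0.0, 0.2]])
phi = ma_coefficients([A1, A2], n_terms=20)
print(np.linalg.norm(phi[19]))  # close to zero: phi_i -> 0 for a stable process
```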
3.1. Moments
The distribution of the process is solely determined by the distribution of \(u_t\). The mean follows immediately from the Wold representation: \(E[y_t] = \mu\).
The covariance is \(\Gamma_y(0) = E[(y_t - \mu)(y_t - \mu)'] = \sum^\infty_{i=0} \phi_i \Sigma_u \phi_i'\).
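Under the stability condition the sum converges, so the covariance matrix can be approximated by truncation. A small self-contained sketch with illustrative matrices and a hypothetical function name:

```python
import numpy as np

def var_covariance(coef_matrices, sigma_u, n_terms=200):
    """Approximate Gamma_y(0) = sum_{i>=0} phi_i Sigma_u phi_i' by
    truncating the convergent sum for a stable VAR(p)."""
    p = len(coef_matrices)
    K = coef_matrices[0].shape[0]
    phi = [np.eye(K)]
    gamma0 = np.array(sigma_u, dtype=float)      # phi_0 Sigma_u phi_0'
    for i in range(1, n_terms):
        phi_i = sum(phi[i - j] @ coef_matrices[j - 1]
                    for j in range(1, min(i, p) + 1))
        phi.append(phi_i)
        gamma0 += phi_i @ sigma_u @ phi_i.T
    return gamma0

A1 = np.array([[0.5, 0.1], [0.0, 0.4]])
A2 = np.array([[0.2, 0.0], [0.0, 0.2]])
Sigma_u = np.array([[1.0, 0.3], [0.3, 1.0]])
print(var_covariance([A1, A2], Sigma_u))
```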
3.2. Autocovariances
Suppose that \(y_t\) is a stationary and stable \(VAR(1)\) process, \(y_t - \mu = A_1 (y_{t-1} - \mu) + u_t\), where \(E[u_t u_t'] = \Sigma_u\) and \(E[y_t] = \mu\). Postmultiplying by \((y_{t-h} - \mu)'\) and taking expectations yields \(\Gamma_y(h) = E[(y_t - \mu)(y_{t-h} - \mu)'] = A_1 \Gamma_y(h-1)\) for \(h > 0\), and \(\Gamma_y(0) = A_1 \Gamma_y(1)' + \Sigma_u\) for \(h = 0\).
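For \(h = 0\), combining \(\Gamma_y(0) = A_1 \Gamma_y(1)' + \Sigma_u\) with \(\Gamma_y(1) = A_1 \Gamma_y(0)\) gives the discrete Lyapunov equation \(\Gamma_y(0) = A_1 \Gamma_y(0) A_1' + \Sigma_u\), which can be solved directly; the higher-order autocovariances then follow from the recursion \(\Gamma_y(h) = A_1 \Gamma_y(h-1)\). A minimal sketch assuming NumPy and SciPy, with illustrative inputs:

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Illustrative stable VAR(1) coefficient matrix and innovation covariance.
A1 = np.array([[0.5, 0.1], [0.0, 0.4]])
Sigma_u = np.array([[1.0, 0.3], [0.3, 1.0]])

# Gamma_y(0) solves Gamma0 = A1 @ Gamma0 @ A1.T + Sigma_u.
gamma = [solve_discrete_lyapunov(A1, Sigma_u)]

# Gamma_y(h) = A1 @ Gamma_y(h - 1) for h = 1, 2, ...
for h in range(1, 6):
    gamma.append(A1 @ gamma[h - 1])

print(gamma[1])  # first-order autocovariance Gamma_y(1)
```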