Week 1

Time Series Introduction

Time series is not a series — it’s a sequence

<aside> 💡 A time series is a sequence of scalar or vector-valued observations recorded at “times” $t \in T$. In this course, $T \subset \mathbb{R}$ (other examples: $T = \mathbb{N}$, $T = S^2$).

</aside>

We will model a time series as a realization of some stochastic process $\{X_t, t \in T\}$.

<aside> 💡 A stochastic process $\{X_t, t \in T\}$ is a family of random variables defined on a probability space $(\Omega, \mathcal{F}, P)$ and indexed by a set $T$.

</aside>

Examples (a short simulation sketch follows the list):

  1. $X_1, ..., X_n$ — i.i.d. random variables
  2. $S_n = \sum_{j=1}^n X_j$, $n = 1,2,3,\ldots$ — a random walk $\{S_n, n \in \mathbb{N}\}$
  3. The Brownian motion $\{W_t, t \in [0,1]\}$:
    1. $W_0 = 0$
    2. $\forall\, 0 < t_1 < t_2 < \cdots < t_k$, $k \ge 3$: the increments $W_{t_2}-W_{t_1}, W_{t_3}-W_{t_2}, \ldots, W_{t_k}-W_{t_{k-1}}$ are independent, and $W_t - W_s \sim N(0, |t-s|)$
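As a quick illustration (not part of the notes), here is a minimal numpy sketch that simulates one realization of each example. The standard-normal choice for the i.i.d. variables, the grid size `n`, and the seed are assumptions made only for the simulation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# 1. i.i.d. random variables X_1, ..., X_n (standard normal here, an assumed choice)
X = rng.standard_normal(n)

# 2. Random walk S_n = X_1 + ... + X_n, n = 1, 2, ...
S = np.cumsum(X)

# 3. Brownian motion on [0, 1], approximated on the grid t_j = j/n:
#    W_0 = 0 and independent increments W_{t_j} - W_{t_{j-1}} ~ N(0, 1/n)
increments = rng.normal(loc=0.0, scale=np.sqrt(1.0 / n), size=n)
W = np.concatenate(([0.0], np.cumsum(increments)))

print(S[-1], W[-1])  # one realization of S_n and of W_1
```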

Existence: does such a process exist? (See the Kolmogorov existence theorem below.)

<aside> 💡 Distribution function of $(X_{t_1}, \ldots, X_{t_k})$, for $k \ge 1$, $t_1, \ldots, t_k \in T$. A stochastic process is characterized by its finite-dimensional (f.d.) distributions $F_{t_1, \ldots, t_k}(x_1, \ldots, x_k) = \Pr(X_{t_1} \le x_1, \ldots, X_{t_k} \le x_k)$ (the higher-dimensional analogue of a distribution function). In other words, it is the distribution function (d.f.) of the random vector $(X_{t_1}, \ldots, X_{t_k})$, $k \ge 1$, $t_1, \ldots, t_k \in T$. The collection $\{F_{t_1, \ldots, t_k}, k \ge 1, t_1, \ldots, t_k \in T\}$ is known as the family of finite-dimensional distributions of $\{X_t, t \in T\}$.

</aside>
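For a concrete instance of a finite-dimensional distribution, recall that for Brownian motion $(W_{t_1}, \ldots, W_{t_k})$ is multivariate normal with mean $0$ and $\mathrm{Cov}(W_{t_i}, W_{t_j}) = \min(t_i, t_j)$, so $F_{t_1, \ldots, t_k}$ can be evaluated as a multivariate normal CDF. The sketch below is not part of the notes; the specific times, evaluation point, and the use of scipy are illustrative choices.

```python
import numpy as np
from scipy.stats import multivariate_normal

t = np.array([0.2, 0.5, 0.9])        # times t_1 < t_2 < t_3 in [0, 1]
x = np.array([0.3, 0.1, 0.8])        # evaluation point (x_1, x_2, x_3)

cov = np.minimum.outer(t, t)         # Cov(W_{t_i}, W_{t_j}) = min(t_i, t_j)
F = multivariate_normal(mean=np.zeros(len(t)), cov=cov).cdf(x)
print(F)                             # F_{t_1,t_2,t_3}(x_1, x_2, x_3)
```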


<aside> <img src="/icons/info-alternate_blue.svg" alt="/icons/info-alternate_blue.svg" width="40px" /> Theorem (A. Kolmogorov, P. J. Daniell): Assume that $\{F_{t_1, \ldots, t_k}, k \ge 1, t_1, \ldots, t_k \in T\}$ is a family of distribution functions such that (a) if $\pi$ is any permutation $\pi: \{1, \ldots, k\} \to \{1, \ldots, k\}$, then $F_{t_{\pi(1)}, \ldots, t_{\pi(k)}}(x_{\pi(1)}, \ldots, x_{\pi(k)}) = F_{t_1, \ldots, t_k}(x_1, \ldots, x_k)$, and (b) $\lim_{x_k \to \infty} F_{t_1, \ldots, t_k}(x_1, \ldots, x_k) = F_{t_1, \ldots, t_{k-1}}(x_1, \ldots, x_{k-1})$. Then there exists a stochastic process $\{X_t, t \in T\}$ that has $\{F_{t_1, \ldots, t_k}\}$ as its family of finite-dimensional (f.d.) distributions.

</aside>
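As a purely numerical (non-rigorous) illustration of the two consistency conditions, the sketch below checks (a) and (b) for the Brownian-motion finite-dimensional distributions from the previous example. The tolerance, times, and evaluation points are arbitrary choices; the large value standing in for $x_k \to \infty$ is an approximation.

```python
import numpy as np
from scipy.stats import multivariate_normal

def F(times, x):
    """f.d. distribution function of Brownian motion at the given times."""
    times = np.asarray(times, dtype=float)
    cov = np.minimum.outer(times, times)   # Cov(W_{t_i}, W_{t_j}) = min(t_i, t_j)
    mvn = multivariate_normal(mean=np.zeros(len(times)), cov=cov)
    return mvn.cdf(np.asarray(x, dtype=float))

t = [0.2, 0.5, 0.9]
x = [0.3, 0.1, 0.8]

# (a) permutation consistency: permute the times and the arguments together
perm = [2, 0, 1]
print(np.isclose(F([t[i] for i in perm], [x[i] for i in perm]), F(t, x), atol=1e-4))

# (b) marginal consistency: letting x_3 -> infinity (numerically, a large value)
#     recovers the two-dimensional distribution function at (t_1, t_2)
print(np.isclose(F(t, [x[0], x[1], 50.0]), F(t[:2], x[:2]), atol=1e-4))
```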

Facts about Linear Algebra

Let $A \in \mathbb{R}^{n \times n}$. Then:

  1. $A$ is symmetric if $A = A^T$ (equivalently, $A_{ij} = A_{ji}$ for all $i, j = 1, \ldots, n$)

  2. $A$ is nonnegative (positive) definite ↔ $A$ is symmetric and $\langle Ax, x \rangle = x^T A x \ge 0$ for all $x \in \mathbb{R}^n$ (respectively $x^T A x > 0$ for all $x \ne 0$)

    Fact: the quadratic form $x^T A x$ only depends on the symmetric part of $A$, so one may take $A$ symmetric.

    Reason: $x^T A x = \sum_{i,j=1}^n A_{ij} x_i x_j$ is a scalar, so $x^T A x = (x^T A x)^T = x^T A^T x$ ⇒ $x^T A x = x^T\left(\frac{A+A^T}{2}\right)x$, since $x^T\left(\frac{A+A^T}{2}\right)x = \sum_{i,j=1}^n x_i \frac{A_{ij}+A_{ji}}{2} x_j$, and $\frac{A+A^T}{2}$ is symmetric.


    Fact (Exercise):

    $M \in \mathbb{C}^{n \times n}$ is self-adjoint if $M = \overline{M}^T$ (where $\overline{x+iy} = x-iy$).

    If $\langle Mx, x \rangle \ge 0$ for all $x \in \mathbb{C}^n$, then $M$ is self-adjoint.

    Lemma: show that $\langle Mx, x \rangle = 0$ for all $x \in \mathbb{C}^n$ ⇒ $M = 0_{n \times n}$ (i.e., $M = 0$).

    This is not true for $M \in \mathbb{R}^{n \times n}$: for $n = 2$, $x^T M x = M_{11}x_1^2 + M_{22}x_2^2 + (M_{12}+M_{21})x_1 x_2$, which vanishes for all $x_1, x_2$ whenever $M_{11} = M_{22} = 0$ and $M_{21} = -M_{12}$, even though $M \ne 0$. (A small numerical check of these facts follows.)
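A small numerical check (an illustration, not part of the notes) of the facts above: the real quadratic form only depends on the symmetric part $\frac{A+A^T}{2}$, a nonzero skew-symmetric real matrix has identically vanishing quadratic form, and the same matrix viewed over $\mathbb{C}$ has $\langle Mz, z \rangle \ne 0$ for some $z$, consistent with the lemma. The matrices and random vectors are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))          # a generic (non-symmetric) real matrix
x = rng.standard_normal(n)

# The quadratic form only sees the symmetric part (A + A^T)/2.
sym_part = (A + A.T) / 2
print(np.isclose(x @ A @ x, x @ sym_part @ x))   # True

# A nonzero skew-symmetric real matrix: x^T M x = 0 for every real x, yet M != 0.
M = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
y = rng.standard_normal(2)
print(np.isclose(y @ M @ y, 0.0))                # True

# Over C: <Mz, z> computed as conj(z)^T (M z) need not vanish for this M.
z = np.array([1.0, 1.0j])
print(np.conj(z) @ M @ z)                        # 2j, nonzero
```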

Notation: we write $A \ge 0$ ($A > 0$) when $A$ is nonnegative (positive) definite.
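One practical way (an illustration, not from the notes) to check $A \ge 0$ or $A > 0$ for a symmetric real matrix is via its eigenvalues: all nonnegative versus all strictly positive. The example matrix below is an arbitrary choice.

```python
import numpy as np

B = np.array([[2.0, 1.0],
              [1.0, 2.0]])                 # a symmetric example matrix (an assumption)
eigs = np.linalg.eigvalsh(B)               # eigenvalues of a real symmetric matrix
print(eigs, bool((eigs >= 0).all()), bool((eigs > 0).all()))   # [1. 3.] True True
```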

Properties: let $A \ge 0$ (i.e., $A \in \mathbb{R}^{n \times n}$ nonnegative definite); then