I need to evaluate the relationship between prices in New York (N) and London (L) using a vector error correction model adapted from Joel Hasbrouck. After a lot of searching online I have not made much progress, so I am asking the experts here in the hope of getting some direction on this model.
My dataset is a dataframe with columns date, time, symbol, and price.
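In case it matters, here is roughly how I am turning that dataframe into one price series per market and computing the 15-minute log returns (a base-R sketch; the symbol codes "N" and "L" and the column names are from my data, and the reshape() step is just my own attempt):

```r
# Sketch: pivot date/time/symbol/price into one price column per market.
# Assumes symbol is "N" (New York) or "L" (London) and the timestamps parse cleanly.
prices <- read.csv("prices.csv")          # columns: date, time, symbol, price
prices$datetime <- as.POSIXct(paste(prices$date, prices$time))

wide <- reshape(prices[, c("datetime", "symbol", "price")],
                idvar = "datetime", timevar = "symbol", direction = "wide")
wide <- wide[order(wide$datetime), ]      # gives columns price.N and price.L

p_N <- log(wide$price.N)                  # log prices on the 15-minute grid
p_L <- log(wide$price.L)
r_N <- diff(p_N)                          # equation (1)
r_L <- diff(p_L)                          # equation (2)
```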
The return (r_t) is defined as the log price difference over each 15-minute interval, log(p_t) - log(p_(t-1)), for both New York and London (equations 1 and 2).
The model regresses r_t for New York on the lagged price spread and two lags of the returns in both New York and London (equation 3).
It then regresses r_t for London on the reversed spread and two lags of the returns in both markets (equation 4).
In the superscripts, N and L denote New York and London, respectively, and t indexes time.
r_t^N = \Delta \log(P_t^N) \quad (1)

r_t^L = \Delta \log(P_t^L) \quad (2)

r_t^N = \alpha \left( \log(P_{t-1}^N) - \log(P_{t-1}^L) \right) + \sum_{i=1}^{2} \gamma_i^{N,N} r_{t-i}^N + \sum_{i=1}^{2} \gamma_i^{N,L} r_{t-i}^L + \varepsilon_t^N \quad (3)

r_t^L = \alpha \left( \log(P_{t-1}^L) - \log(P_{t-1}^N) \right) + \sum_{i=1}^{2} \gamma_i^{L,L} r_{t-i}^L + \sum_{i=1}^{2} \gamma_i^{L,N} r_{t-i}^N + \varepsilon_t^L \quad (4)
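Here is as far as I have gotten coding equations (3) and (4) directly with lm(), continuing from the reshaping sketch above; the lag alignment is my own guess and I am not at all sure it is right:

```r
# Attempt at equations (3) and (4) with lm(); p_N, p_L, r_N, r_L come from
# the reshaping sketch above.
n  <- length(r_N)
ec <- (p_N - p_L)[1:n]   # log(P_{t-1}^N) - log(P_{t-1}^L): ec[k] sits one step before r_N[k]

idx <- 3:n               # drop two observations to make room for the two lags

# Equation (3): New York returns on the spread and two lags of each market's returns
ecm_N <- lm(r_N[idx] ~ ec[idx] +
              r_N[idx - 1] + r_N[idx - 2] +
              r_L[idx - 1] + r_L[idx - 2])

# Equation (4): London returns, with the sign of the spread flipped
ecm_L <- lm(r_L[idx] ~ I(-ec[idx]) +
              r_L[idx - 1] + r_L[idx - 2] +
              r_N[idx - 1] + r_N[idx - 2])

summary(ecm_N)
summary(ecm_L)
```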
Any help would be very much appreciated. Thank you in advance!
I am new to R but have worked a bit with SAS and its time series procedures. I saw a link to the vars package, but the examples I looked at do not seem applicable, so I'm pretty stuck. I have run DW statistics, and the two series are cointegrated.
I just can't figure out how to write code for this ...
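The closest I have gotten with a package is the sketch below using urca (which vars builds on, as far as I can tell). If I am reading the docs right, ca.jo() fits a Johansen VECM and, with spec = "transitory", K = 3 lags in levels corresponds to two lagged differences, but I am not sure this maps exactly onto equations (3) and (4):

```r
# Cross-check with the Johansen VECM in urca; p_N and p_L come from the sketch above.
library(urca)

logp <- cbind(N = p_N, L = p_L)
vecm <- ca.jo(logp, type = "eigen", ecdet = "none",
              K = 3, spec = "transitory")   # K = 3 in levels ~ 2 lagged differences
summary(vecm)                               # Johansen cointegration test
cajorls(vecm, r = 1)$rlm                    # VECM coefficients at cointegration rank 1
```

If either of these attempts is on the right track, or completely wrong, I would love to know.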