If you are only interested in the autocorrelation at lag 1, you can generate a first-order autoregressive (AR(1)) process whose parameter equals the desired autocorrelation; this property is mentioned on the Wikipedia page, and it is not difficult to prove.
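For reference, a quick sketch of why this holds (the notation phi and e[t] is mine, not from the original answer): for a stationary process x[t] = phi * x[t-1] + e[t], where e[t] is white noise independent of x[t-1], we have Cov(x[t], x[t-1]) = phi * Var(x[t-1]), so dividing by Var(x) gives a lag-1 autocorrelation of exactly phi.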
Here is some sample code:
import numpy as np

def sample_signal(n_samples, corr, mu=0, sigma=1):
    assert 0 < corr < 1, "Auto-correlation must be between 0 and 1"
The corr parameter sets the desired autocorrelation at lag 1, and the optional mu and sigma parameters let you control the mean and standard deviation of the generated signal.
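The snippet above is cut off after the assertion. As a rough sketch (not the answer's verbatim code), the rest of sample_signal might look like the following, using the standard AR(1) recurrence x[t] = c + corr * x[t-1] + e[t], with the offset c and the noise standard deviation chosen so that the stationary mean is mu and the stationary standard deviation is sigma; the helper compute_corr_lag_1 is added here only to check the result.

import numpy as np

# Sketch of a complete implementation; the original snippet is truncated,
# so everything after the assertion is a reconstruction, not the author's code.
def sample_signal(n_samples, corr, mu=0, sigma=1):
    assert 0 < corr < 1, "Auto-correlation must be between 0 and 1"

    # Offset and noise std chosen so that the stationary process has
    # mean mu and standard deviation sigma.
    c = mu * (1 - corr)
    sigma_e = sigma * np.sqrt(1 - corr ** 2)

    # Sample the AR(1) process: x[t] = c + corr * x[t-1] + e[t].
    signal = np.empty(n_samples)
    signal[0] = np.random.normal(mu, sigma)  # draw x[0] from the stationary distribution
    for t in range(1, n_samples):
        signal[t] = c + corr * signal[t - 1] + np.random.normal(0, sigma_e)
    return signal

def compute_corr_lag_1(signal):
    # Empirical lag-1 autocorrelation, for checking the output.
    return np.corrcoef(signal[:-1], signal[1:])[0, 1]

# Example: the printed autocorrelation should be close to 0.5,
# the mean close to 2.0, and the standard deviation close to 3.0.
x = sample_signal(100_000, corr=0.5, mu=2.0, sigma=3.0)
print(compute_corr_lag_1(x), x.mean(), x.std())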