Hi there.

I have a simple question about how many lags are considered when we apply a fracdiff over a time series. Is all available information considered every time I apply the fracdiff function?

Or is there a default value for a maximum lag? (I'm thinking of the fact that a fractional difference can be thought of as an infinite AR operator, so in practice the expansion must be truncated somewhere.)
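To make the question concrete, here is a minimal sketch of what I mean by truncation, using the standard binomial expansion (1 - B)^d = Σ_k w_k B^k with w_0 = 1 and w_k = -w_{k-1} (d - k + 1) / k. The names `fracdiff_weights`, `fracdiff_truncated`, and `max_lag` are just illustrative, not the library's actual API:

```python
import numpy as np


def fracdiff_weights(d, max_lag):
    # Binomial weights of (1 - B)^d, truncated at max_lag:
    # w_0 = 1, w_k = -w_{k-1} * (d - k + 1) / k
    w = np.empty(max_lag + 1)
    w[0] = 1.0
    for k in range(1, max_lag + 1):
        w[k] = -w[k - 1] * (d - k + 1) / k
    return w


def fracdiff_truncated(x, d, max_lag):
    # Truncated fractional difference: y_t = sum_{k=0}^{max_lag} w_k * x_{t-k}
    x = np.asarray(x, dtype=float)
    w = fracdiff_weights(d, max_lag)
    y = np.full(len(x), np.nan)  # first max_lag points lack enough history
    for t in range(max_lag, len(x)):
        y[t] = w @ x[t - max_lag:t + 1][::-1]  # newest observation first
    return y


# Illustrative usage (hypothetical parameter values)
rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(500))        # a random walk
y = fracdiff_truncated(x, d=0.4, max_lag=100)  # weights beyond lag 100 are dropped
```

So my question is whether the library does something like the `max_lag` cutoff above (and if so, what the default is), or whether each output point uses all observations available up to that time.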

Thanks in advance,

Fernando