> Hi all,
> I tried sending this email last night, but I don't think it got through.
> Perhaps my membership hadn't been processed yet.
>
> I am using 2006-2009 data for the New England pool's day-ahead
> weighted-average prices. I am not very fluent in econometrics, but I have read
> in some papers that GARCH has been used successfully to forecast wholesale
> electricity prices. When I train a GARCH model on one year's worth of data and
> forecast the last 3 days of the training period, I get a mean absolute error
> of 3.6642%. The error increases to 27.826% when I use two years' worth of
> data, and decreases to 17.123% when using three years' worth of past data. Is
> this expected?
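>
> For concreteness, the kind of fit-and-forecast loop I mean is sketched below
> (written in Python with the arch package purely for illustration; the file and
> column names are placeholders, and my actual code differs in its details):
>
> import numpy as np
> import pandas as pd
> from arch import arch_model
>
> # hypothetical file/column names for the daily weighted-average prices
> prices = pd.read_csv("neiso_dayahead.csv", index_col=0, parse_dates=True)["price"]
>
> train, test = prices[:-3], prices[-3:]   # hold out the last 3 days
> res = arch_model(train, mean="Constant", vol="GARCH", p=1, q=1,
>                  rescale=False).fit(disp="off")
>
> fc = res.forecast(horizon=3)
> price_fc = fc.mean.iloc[-1].to_numpy()   # 3-step-ahead conditional mean forecast
> actual = test.to_numpy()
> print(np.mean(np.abs(price_fc - actual) / actual) * 100)   # error in percent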
>
> Also, when I plot the actual and fitted values against time, the GARCH model
> seems to have done a really bad job compared to a default ARIMA model. I'm
> guessing this might be because people are actually using ARIMA models with
> (added) GARCH errors, so a forecast from a plain GARCH model isn't doing
> exactly what they have done. Am I right? Why would the ARIMA be a better fit
> than the GARCH?
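>
> To make that concrete, I suppose they estimate something like (a) or (b)
> below, whereas my plain GARCH has a constant mean, so only the variance is
> dynamic and the fitted/forecast mean is essentially flat (again a Python
> arch/statsmodels sketch for illustration only; placeholders as before):
>
> import pandas as pd
> from arch import arch_model
> from statsmodels.tsa.arima.model import ARIMA
>
> # same hypothetical daily price series as in the earlier sketch
> prices = pd.read_csv("neiso_dayahead.csv", index_col=0, parse_dates=True)["price"]
> train = prices[:-3]
>
> # (a) joint fit: AR(1) conditional mean with GARCH(1,1) errors
> ar_garch = arch_model(train, mean="AR", lags=1, vol="GARCH", p=1, q=1,
>                       rescale=False).fit(disp="off")
>
> # (b) two-step: ARIMA for the mean, then GARCH(1,1) on its residuals
> arima = ARIMA(train, order=(1, 1, 1)).fit()
> garch_err = arch_model(arima.resid, mean="Zero", vol="GARCH", p=1, q=1,
>                        rescale=False).fit(disp="off")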
>
> One author mentioned taking the log of the prices (hourly prices, in their
> case) before fitting a GARCH model. In your opinion, is that an important
> factor in the kind of errors I am getting?
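>
> As I understand it, that would mean fitting to the log prices and then
> exponentiating any forecast back to the price scale before computing
> percentage errors, roughly as below (illustrative only; placeholders as before):
>
> import numpy as np
> import pandas as pd
>
> prices = pd.read_csv("neiso_dayahead.csv", index_col=0, parse_dates=True)["price"]
> log_prices = np.log(prices)   # compresses the price spikes before fitting
>
> # ...fit the GARCH / ARMA-GARCH to log_prices instead of prices...
>
> # map any forecast back to the price scale before computing % errors:
> # price_forecast = np.exp(log_price_forecast)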
> Thanks and best regards
> --
> Muhammad Saqib Ilyas
> PhD Student, Computer Science and Engineering
> Lahore University of Management Sciences