There is a time series variable A = {a1, a2, a3, ..., aT}, and I want to constrain its fluctuation. I did this in AIMMS in the following way: 1) I define a variable series_mean that calculates the mean of the series, and 2) I define a constraint that keeps the variance of the series no larger than 10000.
series_mean = sum(t,A(t))/T;
sum(t,(A(t)-series_mean)^2)/T<=10000;
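For reference, this is roughly how the declarations look in my model (a simplified sketch; the constraint name is just illustrative, t is the set of periods, and T is a parameter holding the number of periods):

Variable series_mean {
    Definition : sum(t, A(t)) / T;
}
Constraint fluctuation_limit {
    Definition : sum(t, (A(t) - series_mean)^2) / T <= 10000;
}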
However, this seems to burden the computation, and the solving time soars even when the constraint is not binding in the optimization. For example, in one scenario I get the same result with or without the constraint, but the solving time with the constraint is 10 times that of the case without it. (The constraint is a must, though.)
So I would like to know: is there a better way to do the same thing in AIMMS?
Thanks.