Solved

# How to calculate variance of a time series variable efficiently in AIMMS?


There is a time series variable A = {a1, a2, a3, ..., aT}, and I want to constrain its fluctuation. I did this in AIMMS as follows: 1) I define a variable series_mean that computes the mean of the series, and 2) I add a constraint requiring the variance of the series to be no larger than 10000.

``series_mean = sum(t, A(t)) / T;``

``sum(t, (A(t) - series_mean)^2) / T <= 10000;``
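For reference, the two expressions above (the mean definition and the variance constraint body) can be checked numerically outside AIMMS. A minimal Python/NumPy sketch with made-up sample data:

```python
import numpy as np

# Hypothetical sample series A(t), t = 1..T (values invented for illustration)
A = np.array([95.0, 110.0, 102.0, 98.0, 105.0])
T = len(A)

# series_mean = sum(t, A(t)) / T
series_mean = A.sum() / T  # → 102.0

# Constraint body: population variance of the series
variance = ((A - series_mean) ** 2).sum() / T  # → 27.6

# The constraint from the model: variance no larger than 10000
assert variance <= 10000
```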

However, this seems to burden the computation: the solving time soars even though the constraint is not active in the optimization. In one scenario, I get the same result with or without the constraint, yet the solving time with the constraint is 10 times that without it. (The constraint is a must, though.)

So I would like to know: is there a better way to do this in AIMMS?

Thanks.


Best answer by Marcel Hunting 15 May 2023, 13:08


### 2 replies


Hi @Chunyang. Models with quadratic constraints are in general harder to solve than models with only linear constraints, so it is no surprise that the solving time increases when you add that quadratic constraint, even if it turns out to be inactive at the optimal solution.

You could try adding the following (redundant) constraints to see whether that helps to improve the performance:

``A(t) - series_mean <= 100 * sqrt(T)   for all t``

``series_mean - A(t) <= 100 * sqrt(T)   for all t``
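These linear bounds are redundant because they are implied by the variance constraint: if `sum(t, (A(t) - series_mean)^2) <= 10000 * T`, then each single squared term is at most the full sum, so `(A(t) - series_mean)^2 <= 10000 * T`, i.e. `|A(t) - series_mean| <= 100 * sqrt(T)`. A quick numeric check of that implication in Python (data invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 24
# Hypothetical series with small enough spread to satisfy the variance constraint
A = 100.0 + rng.uniform(-50.0, 50.0, size=T)
mean = A.mean()

# The original quadratic constraint holds for this data
variance = ((A - mean) ** 2).sum() / T
assert variance <= 10000.0

# Each squared deviation is at most the whole sum:
#   (A(t) - mean)^2 <= variance * T <= 10000 * T
# hence |A(t) - mean| <= 100 * sqrt(T), the redundant linear bounds
bound = 100.0 * np.sqrt(T)
assert np.all(np.abs(A - mean) <= bound)
```

Such redundant linear constraints can tighten the relaxation the solver works with, which is why they sometimes speed up quadratically constrained models.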


Thanks a lot!