Solved

How to calculate variance of a time series variable efficiently in AIMMS?

  • May 9, 2023
  • 2 replies
  • 235 views


There is a time series variable A = {a1, a2, a3, ..., aT}, and I want to constrain the fluctuation of the variable. I did this in AIMMS in the following way: 1) I define a variable series_mean to calculate the mean of the series, and 2) I define a constraint requiring the variance of the series to be no larger than 10000.

 

series_mean = sum(t, A(t))/T;

sum(t, (A(t) - series_mean)^2)/T <= 10000;
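Outside AIMMS, the two expressions above can be checked numerically. A minimal Python sketch with a made-up series (illustrative only, not AIMMS code):

```python
# Numerical check of the mean/variance expressions above (not AIMMS code).
# A is a made-up realisation of the time series; T is its length.
A = [95.0, 102.5, 110.0, 98.0, 87.5, 104.0]
T = len(A)

series_mean = sum(A) / T                               # series_mean = sum(t, A(t)) / T
variance = sum((a - series_mean) ** 2 for a in A) / T  # population variance of the series

print(series_mean, variance)  # 99.5 51.0 for this data
print(variance <= 10000)      # the fluctuation constraint holds here
```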

 

However, this seems to burden the computation: the solving time soars even though the constraint is not active in the optimization. For example, in one scenario I get the same result with or without the constraint, yet the solving time with the constraint is 10 times that without it. (The constraint is a must, though.)

So I would like to know: is there a better way to do this in AIMMS?

Thanks.

Best answer by Marcel Hunting

Hi @Chunyang. Models with quadratic constraints are in general harder to solve than models with only linear constraints, so it is no surprise that adding that quadratic constraint increases the solving time, even if the constraint turns out to be inactive at the optimal solution.

You could try adding the following (redundant) linear constraints to see whether they improve performance:

A(t) - series_mean <= 100 * sqrt(T)   for all t

series_mean - A(t) <= 100 * sqrt(T)   for all t
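These bounds are implied by the variance constraint: if sum(t, (A(t) - series_mean)^2)/T <= 10000, then each single squared deviation is at most the whole sum, i.e. (A(t) - series_mean)^2 <= 10000 * T, which gives |A(t) - series_mean| <= 100 * sqrt(T). A quick Python check on one made-up series (illustrative only, not AIMMS code):

```python
import math

# One made-up series that satisfies the variance constraint; the
# implied per-period linear bounds must then hold as well, so adding
# them to the model cannot cut off any feasible solution.
A = [95.0, 102.5, 110.0, 98.0, 87.5, 104.0]
T = len(A)
mean = sum(A) / T

variance_ok = sum((a - mean) ** 2 for a in A) / T <= 10000
bound = 100 * math.sqrt(T)                          # 100 * sqrt(T)
bounds_ok = all(abs(a - mean) <= bound for a in A)

print(variance_ok, bounds_ok)  # True True
```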

 

2 replies

Marcel Hunting
AIMMSian
  • AIMMSian
  • 258 replies
  • Answer
  • May 15, 2023


 


  • Author
  • Enthusiast
  • 19 replies
  • May 15, 2023


Thanks a lot!


