Optimization of objective function using step functions

I asked this question on Math SE, but the answer was not very satisfactory, so I am asking it again here:

I have an optimization problem with linear inequality and equality constraints:

A*x<=b 
Aeq*x=beq

The problem is that the objective function is a sum of Heaviside step functions.

Here is the objective function (MATLAB-style pseudocode):

function f = stepObjective(k, c, x)
  % Sum of Heaviside steps applied to affine expressions in x
  f = 0;
  for i = 1:length(k.row)
      smallF = 0;
      for j = 1:length(k.column)
          smallF = smallF + k.row(i) * k.column(j) * x(j) + c(j);
      end
      f = f + u(smallF);
  end
end

function y = u(z)
  % Heaviside step: 1 for positive arguments, 0 otherwise
  if z > 0
      y = 1;
  else
      y = 0;
  end
end

My suggestion is to approximate each step function with a smooth function and use a nonlinear optimization solver. But is there anything in the MATLAB toolboxes that lets me solve this problem without converting it to a smooth one?
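To make the smoothing idea concrete, here is a rough sketch of what I have in mind (M and d are just my shorthand for the matrix and vector that collect the affine expressions inside the steps; epsSmooth, x0 and the constraint matrices are placeholders for the actual problem data):

% Sketch only: replace the step with a logistic sigmoid so the objective is
% differentiable, then use fmincon.  fmincon minimizes; negate the objective
% if the goal is to maximize the sum.
epsSmooth = 1e-2;                                   % smaller = closer to the true step
sigmoid   = @(z) 1 ./ (1 + exp(-z ./ epsSmooth));   % smooth stand-in for u(z)
smoothObj = @(x) sum(sigmoid(M*x + d));             % smoothed objective
xEst = fmincon(smoothObj, x0, A, b, Aeq, beq);      % x0 is a feasible starting point

Solving this repeatedly with a shrinking epsSmooth, warm-starting each solve from the previous solution, usually gets closer to the true step behaviour.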

+3
3 answers

See the earlier discussion on Math SE.

+2

As far as I know, Matlab has nothing that optimizes step functions out of the box. However, your problem can be recast as a fitting problem: you have x and y data, and you want to fit a stair-step function to them.

FMINCON can do that, and it accepts your linear constraints directly.

Concretely, suppose the unknowns are the step locations (one per step), the x-values are in the vector xdata and the y-values in the vector ydata, and the goodness of fit is measured by the sum of squared residuals. In the example below there are 3 steps whose locations must sum to 5 (replace this with your own Aeq and beq).

The residuals are the differences between the observed and fitted values (i.e. between ydata and yhat). Each step is assumed to have height 1 and the fitted function starts at 0, so rescale ydata or adjust the code if your steps have different heights.

function out = objFun(loc,xdata,ydata)
%#OBJFUN calculates the sum of squared residuals for a stair-step approximation to ydata
%# The stair-step locations are defined in the vector loc

%# create the stairs. Make sure xdata is n-by-1, and loc is 1-by-k
%# bsxfun creates an n-by-k array with 1 in column k wherever x>loc(k)
%# sum sums up the rows
yhat = sum(bsxfun(@gt, xdata(:), loc(:)'), 2);

%# sum of squares of the residuals
out = sum((ydata(:)-yhat).^2);

Save this as objFun.m somewhere on the Matlab path. Then, with your data in xdata and ydata, an initial guess for the k-by-1 vector of step locations in locInitialGuess, and Aeq chosen so that Aeq*loc == beq expresses your equality constraint (here Aeq is [1 1 1], i.e. the 3 step locations are summed), call

locEst = fmincon(@(u)objFun(u,xdata,ydata),locInitialGuess,[],[],Aeq,5);

locEst will contain the estimated step locations. The inequality arguments are passed as empty here (add your own A and b if you have them), and the 5 is beq, i.e. the step locations are constrained to sum to 5.
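To see the whole thing run end to end, here is a small self-contained example with made-up data (the step locations, the x grid and the initial guess are invented purely for illustration):

% Illustration only: synthetic, noise-free stair data with steps at 0.5, 1.5 and 3
xdata = linspace(0, 5, 200)';
trueLoc = [0.5 1.5 3];
ydata = sum(bsxfun(@gt, xdata, trueLoc), 2);

Aeq = [1 1 1];                 % the three step locations must sum to beq
beq = 5;                       % consistent with 0.5 + 1.5 + 3 = 5
locInitialGuess = [1 2 2];     % feasible starting point (sums to 5)

locEst = fmincon(@(u) objFun(u, xdata, ydata), locInitialGuess, [], [], Aeq, beq);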

+1

Since the objective only changes value when one of the affine expressions inside a step crosses zero, you can do a simple coordinate-wise search over x, optimizing one component at a time:

1) Pick a component xj (for example, cycle through the components in order).

2) For each term i, holding the other components fixed, find the value of xj at which that term flips; the flip changes the objective by +1 or -1 depending on the sign of the coefficient, and the breakpoint is +inf if the coefficient of xj in that term is 0 (the term never flips).

3) Sort the finite breakpoints; between consecutive breakpoints the objective is constant in xj.

4) Evaluate the objective once in each interval (respecting the constraints) and find the interval with the best value.

5) Set xj to a value inside that interval.

Cycle through the components of x and repeat until a full pass produces no improvement; a rough sketch of this is given below. This only guarantees a local optimum, and the constraints have to be honoured when choosing the new value of xj.
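A minimal sketch of steps 1-5, under the assumption (mine, for concreteness) that the i-th step argument can be written as M(i,:)*x + d(i) and that we are minimizing; it ignores the linear constraints, which a real implementation would enforce when choosing the candidate values of x(j):

function x = coordinateSearch(M, d, x)
% Hedged sketch of the coordinate-wise search above.  Objective assumed to be
% sum_i u(M(i,:)*x + d(i)), minimized; x is a column vector.
u = @(z) double(z > 0);
obj = @(x) sum(u(M*x + d));
improved = true;
while improved
    improved = false;
    for j = 1:numel(x)
        rest = M*x + d - M(:,j)*x(j);     % step arguments with the x(j) term removed
        bp = -rest ./ M(:,j);             % value of x(j) where term i flips
        bp = sort(bp(isfinite(bp)));      % drop Inf/NaN: terms that never flip
        if isempty(bp), continue; end
        % one candidate value of x(j) inside every interval between breakpoints
        cand = [bp(1)-1; (bp(1:end-1)+bp(2:end))/2; bp(end)+1];
        bestVal = obj(x); bestXj = x(j);
        for c = cand'
            xTry = x; xTry(j) = c;
            if obj(xTry) < bestVal
                bestVal = obj(xTry); bestXj = c; improved = true;
            end
        end
        x(j) = bestXj;
    end
end
end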


Alternatively, if you do not want to code that up yourself, you could use a derivative-free method such as the Nelder-Mead downhill simplex, which needs no gradients and is built into Matlab (fminsearch).
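For instance (a sketch only; fminsearch is unconstrained, so the constraints are folded into a crude quadratic penalty here, which is my addition; rho and x0 are placeholders, and stepObjective is the asker's objective from the question's pseudocode):

% Sketch: Nelder-Mead on the raw step objective, constraints via a penalty.
% x0 must be a column vector so that A*x and Aeq*x are well defined.
rho = 1e3;                                        % penalty weight (tune as needed)
penalized = @(x) stepObjective(k, c, x) ...       % original objective
             + rho * sum(max(A*x - b, 0).^2) ...  % inequality violation
             + rho * sum((Aeq*x - beq).^2);       % equality violation
xEst = fminsearch(penalized, x0);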


Both of these approaches will give you a local optimum. If you really need a more global solution, you can explore Simulated Annealing or Genetic Algorithm solvers, which are likely to work very well on this type of problem.


Finally, if you have money to spend (I am not sure whether it comes free), I would look at the Matlab Global Optimization Toolbox.
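That toolbox's ga function, for example, accepts linear constraints directly and needs no gradients, so the raw step objective could in principle be handed to it as-is (sketch only; nvars and the other names are placeholders, and stepObjective is the asker's objective from the question):

% Sketch: genetic algorithm with linear constraints (Global Optimization Toolbox).
nvars = 10;   % placeholder: set this to the length of x
xEst = ga(@(x) stepObjective(k, c, x), nvars, A, b, Aeq, beq);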

0

Source: https://habr.com/ru/post/1784480/

