Integer Linear Programming


Revision as of 02:41, 28 September 2011

Summary

Integer Linear Programming (ILP) is a method for:

  • optimizing a linear objective function: maximize c^T x, where the coefficient vector c is known and x is the vector of unknown variables
  • subject to linear equality or inequality constraints: Ax ≤ b, where the matrix A and the vector b are known
  • and where the variables in x can only take integer values

In other words, it is a method to find the optimal solution (i.e. the best assignment of the unknown integer variables, the entries of x) that maximizes the objective function while meeting a list of requirements expressed as linear equality or inequality relationships.

Procedure

Input:

  • the objective function
  • the constraints

Output:

  • the assignment of unknown variables that optimizes the objective function and is consistent with the constraints
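The input/output contract above can be sketched with a tiny brute-force solver. This is an illustration only, not the page's method: the problem instance (maximize 3x + 4y under three assumed constraints) and the function name `solve_ilp` are hypothetical, and real ILP solvers use far more efficient techniques such as branch-and-bound rather than exhaustive enumeration.

```python
from itertools import product


def solve_ilp(objective, constraints, bounds):
    """Brute-force ILP sketch: enumerate every integer point inside the
    given per-variable bounds, keep the feasible ones, and return the
    point (and value) that maximizes the objective."""
    best_point, best_value = None, float("-inf")
    for point in product(*(range(lo, hi + 1) for lo, hi in bounds)):
        if all(c(point) for c in constraints):  # consistent with constraints?
            value = objective(point)
            if value > best_value:
                best_point, best_value = point, value
    return best_point, best_value


# Hypothetical instance: maximize 3x + 4y
# subject to  x + 2y <= 14,  3x - y >= 0,  x - y <= 2,  0 <= x, y <= 14
solution, value = solve_ilp(
    objective=lambda p: 3 * p[0] + 4 * p[1],
    constraints=[
        lambda p: p[0] + 2 * p[1] <= 14,
        lambda p: 3 * p[0] - p[1] >= 0,
        lambda p: p[0] - p[1] <= 2,
    ],
    bounds=[(0, 14), (0, 14)],
)
# solution == (6, 4), value == 34: the integer assignment that optimizes
# the objective while satisfying every linear constraint
```

Note that the bounds here make the search space finite; in general, ILP is NP-hard, which is why practical solvers rely on relaxations and pruning instead of enumeration.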

References / Links


Relevant Papers