OscaR-CP

Learning Outcomes

  • Level 1: Beginning CP Modeler

    • can create a simple model, using a default search
    • knows some important basic global constraints
    • can create an optimization model
    • can use reified constraints
    • can debug a model and measure its efficiency
    • can create a basic search heuristic (variable + value heuristic)
  • Level 2: Advanced CP Modeler

    • can create a custom search
    • can create a simple global constraint
    • has a good intuition of how propagation works
    • understands the different levels of propagation
    • can use large neighborhood search to make CP scale
    • knows how to use the scheduling global constraints
  • Level 3: Expert CP Modeler

    • is able to extend the library

    • can create a global constraint with incremental changes and state

    • can manipulate the trail and reversible objects

    • understands the search mechanism

    • understands some architecture and design choices of OscaR-CP

      • domain representation
      • trailing mechanism: implement your own reversible
      • reversible node
    • can solve multi-objective optimization problems

    • can use variable objective large neighborhood search

Create basic models (L1)

Declare a solver, create variables, add constraints

A first basic model is given next and commented after.

package oscar.examples.cp.userguide

import oscar.cp._

object BasicModel extends CPModel with App {
  val x1 = CPIntVar(Set(1,5,9)) // Dom(x1) = {1,5,9}
  val x2 = CPIntVar(1 to 5)     // Dom(x2) = [1..5]
  val x3 = CPIntVar(1 until 5)  // Dom(x3) = [1..4]


  add(x1 !== x2)
  add(x1 + x2 === x3)

  search {
    binaryFirstFail(Seq(x1,x2))
  } onSolution {
    println("Solution found, value of x1 in this solution:" + x1.value)
  }

  start(nSols = 1)
}

This model mixes in the trait ``CPModel``. Behind the scenes, this trait defines an implicit CPSolver for you, allowing you:

  • to create all variables (attached to this solver) and also
  • to add every constraint of the problem you want to solve

You might view this object as the one containing all the information about your model (or, more simply, as your model itself). Note the different ways to create the variables and their initial domains: you can either give explicitly the set of valid values (a Scala Set) or give a range as argument of the variable creation.

In OscaR, each time you add a constraint with the method add, you trigger the fix-point algorithm, so you can immediately see the effect of propagation when adding this constraint (with interleaved ``println`` statements, for instance).

In this example we use a binary first-fail depth-first search on the sequence of variables [x1,x2].

The start(nSols = 1) asks to start the search and stop as soon as the first solution is discovered. To find all the solutions, you can simply use ``start()``, which is equivalent to ``start(nSols = Int.MaxValue)``. Each time a solution is found during the search-tree exploration, the closure defined in onSolution(…) is executed. In this case, we just show the value of x1 in the solution.

Add an Objective function

The next simple model defines an objective function to maximize.

package oscar.examples.cp.userguide

import oscar.cp._

object BasicOptimizationModel extends CPModel with App {


  val x1 = CPIntVar(Set(1,2,3,9))
  val x2 = CPIntVar(1 to 5)
  val x3 = CPIntVar(1 until 5)


  add(x1 !== x2)
  add(x1 + x2 === x3)

  // define an objective function
  maximize(x1+x3)

  search {
    binaryFirstFail(Seq(x1,x2))
  } onSolution {
    println(s"Solution found $x1 $x2 $x3")
  }

  start()
}

The start will then use a branch-and-bound depth-first search. The onSolution block is called each time an improved solution is discovered. The last solution found is the proven optimal one.

Custom Search (L2)

OscaR CP gives you a lot of control to explore the search tree.

A custom first fail

The search heuristic is generally applied in two steps:

  1. The variable selection: which variable to assign next in the children of the node.
  2. The value ordering: in what order the values should be tried in the child nodes. In DFS (the usual search in CP) we generally try to choose the most promising value in the left-most child node.

These two choices constitute the variable-value heuristic. One of the most popular heuristics is the first-fail principle, which heuristically tries to minimize the size of the sub-tree rooted at the current node when no solution exists there.

One instantiation of this principle on a vector of decision variables x consists of selecting, as the next variable to branch on, the uninstantiated variable with the smallest domain size.
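
To make the selection rule concrete, here is a minimal plain-Scala sketch (independent of OscaR; the domains are illustrative) of picking the unbound variable with the smallest domain:

```scala
object FirstFailSelection {
  // First-fail: among unbound variables (domain size > 1), return the index
  // of the one with the smallest domain, or None if all are bound.
  def selectMinDomain(domains: Seq[Set[Int]]): Option[Int] =
    domains.zipWithIndex
      .filter { case (d, _) => d.size > 1 }   // keep only unbound variables
      .minByOption { case (d, _) => d.size }  // smallest domain first
      .map { case (_, i) => i }

  def main(args: Array[String]): Unit = {
    // Illustrative domains: index 1 is already bound (singleton domain).
    val domains = Vector(Set(1, 5, 9), Set(3), Set(1, 2), Set(4, 5, 6, 7))
    println(selectMinDomain(domains)) // Some(2): smallest unbound domain {1,2}
  }
}
```

The OscaR procedure below expresses exactly this rule with the selectMin helper.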

On the left branch we can, for instance, attempt to instantiate the variable to its smallest possible value, and on the right branch we can force it to take a different value (binary search tree). Here is a procedure implementing this strategy:

package oscar.examples.cp.userguide

import oscar.cp._
import oscar.util._

object FirstFailSearch extends CPModel with App {

  val X = Array.fill(10)(CPIntVar(0 to 10))

  add(X(8)+X(9) === 3)
  add(sum(X) === 9) // the sum must be 9
  add(allDifferent(X)) // must all be different

  onSolution {
    println(X.mkString(","))
  }

  search {
    // select first the unbound variable with smallest domain
    selectMin(X)(!_.isBound)(_.size) match {
      case None => noAlternative // no more children
      case Some(x) => {
        val v = x.min
        branch(add(x === v))(add(x !== v))
      }
    }
  }


  val stats = start() // start the search, and get search statistics
  println("#fails:"+stats.nFails)

}

Adding Constraints to the model (L1)

You can add a constraint (an object of type oscar.cp.core.Constraint) to the model using either the add or the post method (both defined on the implicit CPSolver object). The add method throws an exception during modeling if the model becomes inconsistent, while post does not. If you are using post, you can check whether the model has become inconsistent with the isFailed() method; post also returns a Failure outcome if the solver is in a failed state.

import oscar.cp._

object MyModel extends CPModel with App {

  add(someConstraint) // NoSolutionException may be thrown

  val failed = post(someConstraint) == Failure // you can also check the consistency with solver.isFailed
}

Every call to the add or post method triggers the fix-point algorithm, which lets you immediately observe the effect of propagation. Don't hesitate to print the domains of your variables after each add when debugging your model.

Filtering level (L2)

For some constraints, several filtering strengths are possible. Remember that behind each constraint, a filtering algorithm is implemented (also called a propagator).

For some filtering algorithms, it is possible to characterize precisely the filtering they achieve.

Let \(C\) be a constraint on the variables \(x_1, . . . , x_k\) with respective domains \(D(x_1), . . . , D(x_k)\). That is, \(C \subseteq D(x_1) \times \ldots \times D(x_k)\).

Domain-Consistency (also called Arc-Consistency)
A constraint is domain consistent if every value in each domain can be extended to a solution. More exactly, we say that \(C\) is arc consistent if for every \(1 \le i \le k\) and \(v \in D(x_i)\), there exists a tuple \((d_1,\ldots,d_k) \in C\) such that \(d_i = v\) and \(\forall j \neq i: d_j \in D(x_j)\).
Bound-Consistency
A constraint is bound-consistent if every bound can be extended to a solution considering interval domains (without holes). More exactly, we say that \(C\) is bound consistent if for every \(1 \le i \le k\), there exists a tuple \((d_1,\ldots,d_k) \in C\) such that \(d_i = \min(D(x_i))\) and \(\forall j \neq i: d_j \in [\min(D(x_j))..\max(D(x_j))]\), and there exists a tuple \((e_1,...,e_k) \in C\) such that \(e_i = \max(D(x_i))\) and \(\forall j \neq i: e_j \in [\min(D(x_j))..\max(D(x_j))]\).

Thus a filtering algorithm achieving bound-consistency shrinks the domain intervals as much as possible (without losing any solutions), while an algorithm achieving domain-consistency removes all inconsistent values (possibly in the middle of the domains).

For many constraints, reaching domain- or bound-consistency is too expensive (sometimes NP-complete). In this case, faster filtering algorithms are preferred, but unfortunately it is not always possible to characterize precisely the consistency level they reach.
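
To make these definitions concrete, here is a minimal plain-Scala sketch (independent of OscaR; all names are illustrative) of a bound-consistent filter for the constraint x + y == z over interval domains:

```scala
object BoundConsistencyDemo {
  case class Interval(min: Int, max: Int)

  // Bound-consistent filtering for x + y == z: each bound is shrunk so that
  // it can still be extended to a solution over the interval domains.
  def filterSum(x: Interval, y: Interval, z: Interval): (Interval, Interval, Interval) = {
    val zMin = math.max(z.min, x.min + y.min)
    val zMax = math.min(z.max, x.max + y.max)
    val xMin = math.max(x.min, zMin - y.max)
    val xMax = math.min(x.max, zMax - y.min)
    val yMin = math.max(y.min, zMin - xMax)
    val yMax = math.min(y.max, zMax - xMin)
    (Interval(xMin, xMax), Interval(yMin, yMax), Interval(zMin, zMax))
  }

  def main(args: Array[String]): Unit = {
    // x in [0..10], y in [5..10], z in [0..7]:
    // filtering shrinks x to [0..2], y to [5..7] and z to [5..7]
    println(filterSum(Interval(0, 10), Interval(5, 10), Interval(0, 7)))
  }
}
```

In general such a pass must be iterated until a fix point; that is exactly what the solver's fix-point algorithm does across all constraints.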

The stronger the filtering, the more expensive (in time complexity) each execution of the propagator inside the fix-point algorithm. Experienced users can play with the strength of filtering. Three filtering strengths are available:

  1. Weak
  2. Medium (the default)
  3. Strong

For some hypothetical constraint, one could for instance have Strong ask for domain-consistency while Medium asks for bound-consistency filtering.

This can be specified with the add/post method:

import oscar.cp._

object MyModel extends CPModel with App {

  add(someConstraint, Strong)

}

Sometimes it pays off to activate more sophisticated filtering algorithms, able to remove more inconsistent values. If a constraint proposes several levels of filtering strength, they should be specified in the OscaR Scaladoc.

Expressions and simple constraints (L1)

OscaR-CP does not have expression objects that can be manipulated or simplified automatically by the solver. Instead it provides some operator overloads that allow you to model easily (almost as if expressions were available).

Binary Relation Constraints (L1)

OscaR supports the binary relation constraints <, >, <=, >=, ==. These relations are just operator-overload methods defined on CPIntVar objects. Each of these methods returns the corresponding constraint.

Arithmetic Constraints (L1)

Supported binary arithmetic constraints are +,-,*.

Example
Consider add(x+y-x > 0). This constraint is not simplified into y > 0. Instead, x+y returns a variable w linked by the constraint x+y=w, added to the model for you. Then w-x returns a variable z linked by w-x=z. Finally z > 0 returns a binary constraint added to the model.

Arithmetic constraints with an integer operand must have the integer on the right of the operator (i.e., write x+1 rather than 1+x).
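
The decomposition described above can be mimicked in plain Scala (the Model class, names, and string representations here are purely illustrative, not OscaR internals):

```scala
object ExprDecompositionDemo {
  import scala.collection.mutable.ListBuffer

  // Each binary operator introduces a fresh auxiliary variable w_i linked to
  // its operands by an equality constraint, exactly as sketched in the text.
  class Model {
    private var auxCount = 0
    val posted = ListBuffer.empty[String]
    private def fresh(): String = { auxCount += 1; s"w$auxCount" }
    def plus(a: String, b: String): String  = { val w = fresh(); posted += s"$a + $b == $w"; w }
    def minus(a: String, b: String): String = { val w = fresh(); posted += s"$a - $b == $w"; w }
    def greater(a: String, c: Int): Unit    = posted += s"$a > $c"
  }

  def main(args: Array[String]): Unit = {
    val m = new Model
    // x + y - x > 0 is NOT simplified to y > 0; it posts three constraints:
    m.greater(m.minus(m.plus("x", "y"), "x"), 0)
    m.posted.foreach(println)
    // x + y == w1
    // w1 - x == w2
    // w2 > 0
  }
}
```

This is why writing fewer intermediate expressions (or using the dedicated sum constructs below) can reduce the number of auxiliary variables.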

Since it is common to write summations over a large number of variables (like \(\sum_i x_i\)), OscaR has special constructs for that. There are also modeling constructs to sum over two indices. You can take the negation of a variable or its absolute value. This returns a fresh variable linked to the previous one by the appropriate constraint. An example of usage of these constructs is given next:

package oscar.examples.cp.userguide

import oscar.cp._

object ArithmeticConstraints extends CPModel with App {

  val x = Array.fill(10)(CPIntVar(0 to 3))
  add(sum(x) === 10)
  add(sum(0 until x.size)(i => x(i)) === 10) // equivalent to the above line

  val y = Array.fill(10,10)(CPIntVar(0 to 3))
  add(sum(0 until 5, 0 until 5){case(i,j) => y(i)(j)} === 10)
  add(sum(4, 4){case(i,j) => y(i)(j)} === 10) // equivalent to previous line

  val z = CPIntVar(0 to 3)
  add(-z === -2)
  add(z.abs === 2) // the absolute value of z equals 2

}

You can also use Scala constructs combined with + (for/yield, reduceLeft, foldLeft, etc.).

Reified and logical constraints (L1)

One main asset of CP is its ability to model logical constraints easily. By logical constraints, we mean constraints between CPBoolVar's. A CPBoolVar is nothing else than a 0-1 CPIntVar, with 0 representing false and 1 representing true. Notice that since the CPBoolVar class extends CPIntVar, it can be used anywhere a CPIntVar is expected.

OscaR supports the logical or “||”, the logical and “&&”, and the implication “==>”.

When you express a logical constraint between boolean variables, the result is a fresh boolean variable (for instance b1 || b2). Since it is very common to constrain a boolean expression to be true, you can add a boolean variable directly to the model; it is then implicitly constrained to be true. You can also build complex boolean expressions using the logical operators (&&, ||, ==>). This is illustrated in the next example:

package oscar.examples.cp.userguide

import oscar.cp._

object Logical extends CPModel with App {

  val A = CPBoolVar()
  val B = CPBoolVar()
  val C = CPBoolVar()
  val D = CPBoolVar()

  add(A && B) // is equivalent to add((A && B) == 1) which is equivalent to add((A && B).constraintTrue())

  add(((A ==> B) || C) && D)

  search {
    binaryStatic(Seq(A,B,C,D))
  } onSolution {
    println(A+" "+B+" "+C+" "+D)
  } start()

}

A very useful tool to model logical constraints is reified constraints. Reified constraints are nothing else than constraints whose truth value is reflected into a CPBoolVar, which you can further constrain (for instance, force it to be false to negate the constraint). Note that the logical constraints above are themselves reified, since they return a CPBoolVar. Standard reified relation constraints are also present in OscaR:

  • ?<= is the reified less-or-equal (and ?>= the reified greater-or-equal),
  • ?< is the reified strictly-less-than (and ?> the reified strictly-greater-than),
  • ?=== is the reified equality.

For instance you could count the number of variables taking a value between 2 and 3 (inclusive) as follows:

package oscar.examples.cp.userguide

import oscar.cp._

object Reified extends CPModel with App {

  val X = Array.fill(5)(CPIntVar(0 to 5))
  val count = sum(0 until 5)(i => (X(i) ?>= 2) && (X(i) ?< 4))

}

count is a CPIntVar representing the number of variables in X taking their value in the interval [2..3]:

  • The result of X(i) ?>= 2 is a CPBoolVar (0-1 variable) equal to true (1) if and only if X(i) is larger than or equal to 2.
  • The result of X(i) ?< 4 is a CPBoolVar (0-1 variable) equal to true (1) if and only if X(i) is strictly smaller than 4.
  • The && between the two CPBoolVar's is true (1) only if both are equal to true.
  • Then the sum function returns a variable representing the summation of the 5 CPBoolVar's.
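
The 0/1 semantics of this reified sum can be checked on plain integers (a plain-Scala sketch, independent of OscaR; the function name is illustrative):

```scala
object ReifiedCountDemo {
  // Each comparison is reified as a 0/1 value and the count is their sum,
  // mirroring sum(0 until 5)(i => (X(i) ?>= 2) && (X(i) ?< 4)) on fixed values.
  def countInWindow(xs: Seq[Int]): Int =
    xs.map(x => if (x >= 2 && x < 4) 1 else 0).sum

  def main(args: Array[String]): Unit =
    println(countInWindow(Seq(0, 2, 3, 5, 2))) // 3 values fall in [2..3]
}
```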

Element Constraints (indexing array with variable) (L1)

CP allows you to index an array of integers or of variables with another variable. These are called element constraints. You can model with such constraints naturally using standard Scala array accessors:

package oscar.examples.cp.userguide

import oscar.cp._

object Element extends CPModel with App {

  val x = Array(1,3,2,7)

  val matrix = Array(Array(1,3,2,7),
                     Array(4,1,2,7),
                     Array(5,3,1,6),
                     Array(3,2,1,3))
  val w = Array.fill(4)(CPIntVar(0 to 10))
  val y1 = CPIntVar(0 to 3)
  val y2 = CPIntVar(0 to 3)

  val y = CPIntVar(0 to 3)
  val z = CPIntVar(0 to 5)


  add(x(y) === z) // index on array of integers
  add(w(y) === z) // index on array of variables
  add(matrix(y1)(y2) === 3)
}

In the above example, x(y) and w(y) both return a CPIntVar.

You can also access 2D arrays of constants with two index variables as shown by indexing matrix with y1 and y2.

Global Constraints (L1-L2)

A global constraint is a constraint potentially involving more than two variables in its scope. A global constraint generally offers better filtering than a decomposition into simpler constraints:

  • A better speed efficiency, and/or
  • A stronger filtering, generally able to remove more values.

Using global constraints when possible also permits writing more readable and maintainable CP models. Any good CP modeler should:

  1. know the important global constraints available in the solver and
  2. be able to recognize when they can be used (more an art than a science).

The global constraints available in OscaR are all listed and documented in the Constraints trait. Describing each of them is out of the scope of this document, but an advanced user should be familiar with most of them. Indeed, using them can drastically impact the solving time.

Implement your own constraint (L3)

A new constraint can be implemented either in Java or in Scala, but for this explanation we will consider a Scala implementation. A CP constraint has two responsibilities during the solving process:

  1. Check that it is consistent according to its definition
  2. Remove inconsistent values from the domains of the variables in its scope

To implement a constraint you have to extend the Constraint abstract class. The minimum requirement is to override these methods:

  • def associatedVars(): Iterable[CPVar] This method returns the variables in the scope of the constraint.
  • def setup(l: CPPropagStrength): Unit This method is in charge of checking whether the constraint is consistent according to its semantics and the current domains; it can also do a first propagation to remove inconsistent values. It is also in this method that you register the constraint to be notified of modifications of the domains of the variables in its scope.
  • def propagate(): Unit This method is called whenever the domain of a registered variable has been modified, so that the constraint can remove newly inconsistent values or detect that it is inconsistent.

Inside these methods, three outcomes are possible:

  • Failure: a domain-modifying method (such as updateMin or updateMax) empties a domain and signals a failure to the solver, meaning the constraint is inconsistent and a backtrack should occur.
  • Success: calling deactivate() tells the solver that the constraint will remain satisfied whatever further domain reductions occur, so it never needs to be propagated again.
  • Suspend: simply returning means the constraint has done its work for now, but wants to be called back again if something changes.

Less or equal constraint

This constraint enforces that a variable X is less than or equal (<=) to another variable Y. Let us call our constraint MyLessOrEqual; it is defined in the next example.

package oscar.examples.cp.userguide

import oscar.cp._
import oscar.cp.core._
import oscar.cp.core.variables.CPVar

object SimpleUserConstraint extends CPModel with App {

  class MyLessOrEqual(val X: CPIntVar, val Y: CPIntVar ) extends Constraint(X.store, "MyLessOrEqual") {

    override def associatedVars(): Iterable[CPVar] = Array(X, Y)

    override def setup(l: CPPropagStrength): Unit =  {
      X.callPropagateWhenBoundsChange(this)
      Y.callPropagateWhenBoundsChange(this)
      propagate()
    }

    override def propagate(): Unit = {
      if (Y.min >= X.max)
        deactivate()
      else {
        Y.updateMin(X.min)
        X.updateMax(Y.max)
      }
    }
  }

  val x = CPIntVar(0 to 10)
  val y = CPIntVar(0 to 5)

  add (new MyLessOrEqual(x,y))
}

Let’s explain each line of the propagate method first:

  • Clearly, if the minimum value in the domain of Y is already larger than or equal to the maximum of the domain of X, then whatever values are removed from either domain in the future, the constraint will remain satisfied. This is why we call deactivate() in this case: the constraint never needs to be propagated again.
  • Every value in the domain of Y smaller than the minimum of X is inconsistent. This is why we set the minimum of the domain of Y to the minimum of the domain of X using the method updateMin. This method directly modifies the domain of Y and could potentially make it empty; in that case it signals a failure to the solver, triggering a backtrack.
  • The next update is symmetric with respect to X.
  • Finally, since the constraint is not deactivated in this case, it remains registered and will be called again if something changes in the domain of X or Y.

In the setup() method we ask for the propagate method to be called whenever a bound of either X or Y changes, and we run a first propagation.

Large Neighborhood Search (L2)

LNS is a technique to solve difficult optimization problems with Constraint Programming. The goal of LNS is to avoid being stuck in a region of the search tree for too long by restarting frequently to explore other regions. In a huge search tree, the early decisions are never reconsidered and all the time is spent searching in only one small region of the search space. This phenomenon is illustrated in the next figure, where only the small green triangle is explored within the allowed computation time.

Search tree

With LNS, when you realize that the search is stuck for too long, you restart it and visit another part of the search space. The exploration of the search space is thus more diverse:

Search tree LNS

The principle is the following:

  • Keep a current best solution to your problem and repeat the following two steps (relax + restart) until a stopping criterion is met.
  • relax: relax the current best solution (typically by fixing only a fraction of the decision variables to their values in the current best solution).
  • restart: try to improve the current best solution with CP, replacing it whenever a better solution is found. This restart has a limited number of failures.
[SHAW98] Paul Shaw. Using Constraint Programming and Local Search Methods to Solve Vehicle Routing Problems. 1998.
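
The relax + restart loop can be sketched in plain Scala on a toy assignment problem (no OscaR here; the greedy random refill stands in for the failure-limited CP search, and all names are illustrative):

```scala
import scala.util.Random

object LnsSketch {
  // Minimize sum_i cost(i)(perm(i)) over permutations via relax + restart.
  def solve(n: Int, seed: Int, restarts: Int = 100): (Array[Int], Int) = {
    val rnd = new Random(seed)
    val cost = Array.fill(n, n)(rnd.nextInt(100))
    def totalCost(perm: Array[Int]): Int =
      perm.zipWithIndex.map { case (loc, i) => cost(i)(loc) }.sum

    var best = rnd.shuffle((0 until n).toList).toArray
    for (_ <- 1 to restarts) {
      // relax: keep each position of the best solution with probability 50%
      val keep = (0 until n).filter(_ => rnd.nextBoolean()).toSet
      val keptLocs = keep.map(best(_))
      val freeLocs = rnd.shuffle((0 until n).filterNot(keptLocs).toList)
      // restart: refill the relaxed positions (a failure-limited CP search
      // would go here) and replace the best solution if the candidate improves
      val candidate = best.clone()
      var k = 0
      for (i <- 0 until n if !keep(i)) { candidate(i) = freeLocs(k); k += 1 }
      if (totalCost(candidate) < totalCost(best)) best = candidate
    }
    (best, totalCost(best))
  }

  def main(args: Array[String]): Unit =
    println(solve(8, seed = 42)._2) // cost of the best permutation found
}
```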

The next section illustrates how to do that on a quadratic assignment problem:

There is a set of n facilities and a set of n locations. For each pair of locations, a distance is specified, and for each pair of facilities a weight or flow is specified (e.g., the amount of supplies transported between the two facilities). The problem is to assign all facilities to different locations so as to minimize the sum of the distances multiplied by the corresponding flows.

package oscar.examples.cp.userguide

import oscar.cp._

object QuadraticAssignment extends CPModel with App {

  val n = 9

  var w: Array[Array[Int]] = Array.ofDim(n,n)  //weight matrix
  var d: Array[Array[Int]] = Array.ofDim(n,n) //distance matrix

  // fill distance and weight matrix randomly
  val random = solver.getRandom
  for (i <- 0 until n; j <- i+1 until n) {
    w(i)(j) = random.nextInt(100)
    w(j)(i) = w(i)(j)
    d(i)(j) = random.nextInt(100)
    d(j)(i) = d(i)(j)
  }

  val x = Array.fill(n)(CPIntVar(0 until n))
  val D = Array.tabulate(n, n)((i, j) => d(x(i))(x(j))) // matrix of variables representing the distances

  add(allDifferent(x), Strong)
  minimize(sum(0 until n,0 until n)((i, j) => D(i)(j) * w(i)(j)))


  search {
    binaryFirstFail(x)
  }
  println(start())

}

The first step to transform this model into an LNS model is to keep track of the current best solution in the bestSol array, updated in the onSolution block. Our LNS relaxation procedure then randomly restores 50% of the variables to their values in the current best solution. We implement 100 restarts, each with a limit of at most 1000 failures, in a simple for loop, using the method startSubjectTo to launch each new run. Note that constraints added in the startSubjectTo block are reverted at each new run.

package oscar.examples.cp.userguide

import oscar.cp._

object QuadraticAssignmentLNS extends CPModel with App {

  val n = 9

  var w: Array[Array[Int]] = Array.ofDim(n,n)  //weight matrix
  var d: Array[Array[Int]] = Array.ofDim(n,n) //distance matrix

  // fill distance and weight matrix randomly
  val random = solver.getRandom
  for (i <- 0 until n; j <- i+1 until n) {
    w(i)(j) = random.nextInt(100)
    w(j)(i) = w(i)(j)
    d(i)(j) = random.nextInt(100)
    d(j)(i) = d(i)(j)
  }

  val x = Array.fill(n)(CPIntVar(0 until n))
  val bestSol = Array.fill(n)(0)
  val D = Array.tabulate(n, n)((i, j) => d(x(i))(x(j))) // matrix of variables representing the distances

  add(allDifferent(x), Strong)
  minimize(sum(0 until n,0 until n)((i, j) => D(i)(j) * w(i)(j)))



  onSolution {
    (0 until n).foreach(i => bestSol(i) = x(i).value)
  }
  search {
    binaryFirstFail(x,_.randomValue)
  } start(nSols = 1)

  for (r <- 1 to 100) {
    // do a new run, not limiting the number of solutions, with a 1000-failure limit
    startSubjectTo(failureLimit = 1000) {
      // relax randomly 50% of the variables
      add((0 until n).filter(i => random.nextInt(100) < 50).map(i => x(i) === bestSol(i)))
    }
  }

}