A suitable optimization algorithm for eq. (3) is Powell's method [13], since the optimal solution lies close to the starting point. Powell's method finds good search directions with a small number of evaluations of the error function (3). Gradient descent algorithms are problematic here, since no derivatives of the error function are available.
Powell's method performs function minimization through a sequence of one-dimensional line minimizations [13]. From the starting point P in the n-dimensional search space (the concatenation of the 3-vector descriptions of all planes), the error function (3) is minimized along a direction u using a one-dimensional minimization method, e.g., Brent's method [14].
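A single line minimization step can be sketched as follows. This is a minimal sketch: it uses a simple golden-section search in place of Brent's method, and the function f, the starting point p, and the direction u are illustrative placeholders, not the actual error function (3).

```python
import math

def line_minimize(f, p, u, a=-10.0, b=10.0, tol=1e-8):
    """Minimize g(t) = f(p + t*u) over t in [a, b] by golden-section
    search (a simple stand-in for Brent's method, which converges
    faster by adding parabolic interpolation)."""
    def g(t):
        return f([pi + t * ui for pi, ui in zip(p, u)])
    invphi = (math.sqrt(5.0) - 1.0) / 2.0  # 1 / golden ratio
    while b - a > tol:
        c, d = b - invphi * (b - a), a + invphi * (b - a)
        if g(c) < g(d):
            b = d
        else:
            a = c
    t = 0.5 * (a + b)
    return [pi + t * ui for pi, ui in zip(p, u)], t

# Toy error function with minimum at (1, 2) -- a placeholder for eq. (3).
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] - 2.0) ** 2

# One line minimization along the first unit basis direction:
p1, t = line_minimize(f, [0.0, 0.0], [1.0, 0.0])
# only the coordinate along u has moved; p1 is close to (1, 0)
```

Each such call reduces the error function along one direction only; the quality of the overall method therefore rests entirely on how the set of directions is chosen.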
Conjugate directions are good search directions, whereas the unit basis directions are inefficient for error functions with narrow valleys. At the line minimum of a function along a direction u, the gradient is perpendicular to u. To derive the conjugacy condition, the n-dimensional error function f is approximated at point P by a Taylor series, using P as the origin of the coordinate system:
f(\vec{x}) \approx f(\vec{P}) + \sum_{i} \frac{\partial f}{\partial x_i}\Big|_{\vec{P}} x_i + \frac{1}{2} \sum_{i,j} \frac{\partial^2 f}{\partial x_i \partial x_j}\Big|_{\vec{P}} x_i x_j   (14)

= c - \vec{b} \cdot \vec{x} + \frac{1}{2} \vec{x}^T A \vec{x},   (15)

with

c = f(\vec{P}), \qquad \vec{b} = -\nabla f|_{\vec{P}}, \qquad A_{ij} = \frac{\partial^2 f}{\partial x_i \partial x_j}\Big|_{\vec{P}}.   (16)

In this quadratic approximation the gradient of f is

\nabla f = A \vec{x} - \vec{b},   (17)

and a move by \delta\vec{x} changes the gradient by

\delta(\nabla f) = A \, \delta\vec{x}.   (18)

Suppose f has been minimized along a direction \vec{u}, so that the gradient at the line minimum is perpendicular to \vec{u}. A subsequent move along a direction \vec{v} does not spoil this line minimization if the change of the gradient stays perpendicular to \vec{u}:

0 = \vec{u}^T \delta(\nabla f) = \vec{u}^T A \vec{v}.   (19)

Directions \vec{u} and \vec{v} that satisfy eq. (19) are called conjugate.
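The conjugacy condition u^T A v = 0 can be checked numerically on a small quadratic example: exact line minimizations along two A-conjugate directions reach the minimum of f(x) = 1/2 x^T A x - b^T x in exactly n = 2 steps, where the gradient A x - b vanishes. The matrix A and vector b below are assumed example values, not taken from the paper.

```python
# Quadratic model f(x) = 1/2 x^T A x - b^T x with assumed example values.
A = [[2.0, 1.0], [1.0, 3.0]]
b = [1.0, 1.0]

def matvec(M, x):
    return [sum(Mij * xj for Mij, xj in zip(row, x)) for row in M]

def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

def line_min_quadratic(x, u):
    # Exact minimizer of f(x + t*u): t = u . (b - A x) / (u^T A u)
    r = [bi - gi for bi, gi in zip(b, matvec(A, x))]
    t = dot(u, r) / dot(u, matvec(A, u))
    return [xi + t * ui for xi, ui in zip(x, u)]

u = [1.0, 0.0]
v = [1.0, -2.0]     # u^T A v = 2*1 + 1*(-2) = 0, so u and v are conjugate
x = line_min_quadratic([0.0, 0.0], u)   # minimize along u
x = line_min_quadratic(x, v)            # minimize along v
grad = [gi - bi for gi, bi in zip(matvec(A, x), b)]
# grad is (0, 0): two conjugate line minimizations reach the minimum
```

Minimizing along v leaves the earlier minimization along u intact, which is exactly what the conjugacy condition guarantees; with non-conjugate directions the two line searches would interfere and more sweeps would be needed.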
The following heuristic scheme is implemented for finding new directions [14]. The starting point P is the current description of the planes, and the initial directions u_1, ..., u_n are the unit basis directions. The algorithm repeats the following steps until the error function (3) reaches a minimum [14]:

1. Save the starting position as P_0.
2. For i = 1, ..., n, move P_{i-1} to the minimum of the error function (3) along direction u_i and call this point P_i.
3. Discard the direction along which the error function made its largest decrease and replace it with the direction P_n - P_0.
4. Move P_n to the minimum along this new direction and call this point the new P_0.
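The direction-set iteration above can be sketched in a few lines. This is a minimal sketch under simplifying assumptions: it uses a golden-section search in place of Brent's method, replaces the oldest direction with the net sweep displacement rather than applying the largest-decrease heuristic of [14], and runs on an assumed toy valley function instead of the error function (3).

```python
import math

def golden_min(g, a=-50.0, b=50.0, tol=1e-10):
    """1-D minimization by golden-section search (stand-in for Brent)."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0
    while b - a > tol:
        c, d = b - invphi * (b - a), a + invphi * (b - a)
        if g(c) < g(d):
            b = d
        else:
            a = c
    return 0.5 * (a + b)

def powell(f, p, sweeps=20):
    """Simplified direction-set minimization in the spirit of Powell."""
    n = len(p)
    dirs = [[float(i == j) for j in range(n)] for i in range(n)]
    for _ in range(sweeps):
        p0 = list(p)
        for u in dirs:                       # line-minimize along each direction
            t = golden_min(lambda s: f([pi + s * ui for pi, ui in zip(p, u)]))
            p = [pi + t * ui for pi, ui in zip(p, u)]
        d = [pi - qi for pi, qi in zip(p, p0)]   # net displacement of the sweep
        norm = math.sqrt(sum(di * di for di in d))
        if norm < 1e-12:
            break                            # sweep moved nowhere: converged
        d = [di / norm for di in d]
        dirs = dirs[1:] + [d]                # discard oldest, keep sweep direction
        t = golden_min(lambda s: f([pi + s * di for pi, di in zip(p, d)]))
        p = [pi + t * di for pi, di in zip(p, d)]
    return p

# Toy narrow valley rotated off the axes, minimum at (1, 1):
f = lambda x: (x[0] + x[1] - 2.0) ** 2 + 100.0 * (x[0] - x[1]) ** 2
p = powell(f, [0.0, 0.0])
```

On this valley the unit basis directions alone would zig-zag slowly, while the accumulated sweep directions quickly align with the valley floor, illustrating why the direction-update heuristic matters.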
Experimental evaluations in the test environments show that the minimization algorithm finds a local minimum of the error function (3) and that the set of directions remains linearly independent. The computed description of the planes fits both the data and the semantic model.