A new polygonal approximation algorithm based on genetic evolution is presented. In the proposed method, a chromosome represents a polygon as a binary string, where each bit, called a gene, indicates whether the corresponding point on the given curve is retained as a polygon vertex. Three genetic operators, namely selection, crossover, and mutation, are designed to find an approximating polygon whose error is bounded by a given norm. Extensive experiments show that the algorithm converges and yields optimal or near-optimal solutions. Compared with the Zhu-Seneviratne algorithm, the proposed algorithm reduces the number of segments under the same error bound in the polygonal approximation.
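The encoding and operators described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the fitness function, penalty weight, tournament selection, one-point crossover, and bit-flip mutation rate are all assumptions chosen to show the general scheme of a bit-string GA for polygonal approximation.

```python
import random

def seg_error(points, i, j):
    # Max perpendicular distance of the skipped points to the chord points[i]-points[j].
    (x1, y1), (x2, y2) = points[i], points[j]
    dx, dy = x2 - x1, y2 - y1
    length = (dx * dx + dy * dy) ** 0.5 or 1.0
    return max((abs(dy * (x - x1) - dx * (y - y1)) / length
                for x, y in points[i + 1:j]), default=0.0)

def approx_error(points, chrom):
    # Bits mark retained vertices; the two endpoints are always treated as kept.
    idx = [k for k, bit in enumerate(chrom) if bit or k in (0, len(chrom) - 1)]
    return max(seg_error(points, a, b) for a, b in zip(idx, idx[1:]))

def fitness(points, chrom, eps):
    # Minimise the vertex count subject to the error bound eps (penalty method;
    # the penalty weight 1000 is an arbitrary illustrative choice).
    penalty = 1000.0 if approx_error(points, chrom) > eps else 0.0
    return -sum(chrom) - penalty

def evolve(points, eps, pop_size=30, gens=200, pm=0.05, seed=0):
    rng = random.Random(seed)
    n = len(points)

    def new_chrom():
        c = [rng.random() < 0.5 for _ in range(n)]
        c[0] = c[-1] = True
        return c

    pop = [new_chrom() for _ in range(pop_size)]
    for _ in range(gens):
        def pick():  # binary tournament selection
            a, b = rng.sample(pop, 2)
            return a if fitness(points, a, eps) >= fitness(points, b, eps) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n)                 # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [bit ^ (rng.random() < pm) for bit in child]  # bit-flip mutation
            child[0] = child[-1] = True               # endpoints always retained
            nxt.append(child)
        pop = nxt
    return max(pop, key=lambda c: fitness(points, c, eps))
```

Given an L-shaped curve and a small error bound, the evolved chromosome should keep only the endpoints and the corner, discarding the collinear interior points.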
Number of pages: 18
Journal: International Journal of Pattern Recognition and Artificial Intelligence
Publication status: Published - May 2000
All Science Journal Classification (ASJC) codes
- Computer Vision and Pattern Recognition
- Artificial Intelligence