This paper combines inductive learning techniques from Artificial Intelligence (AI) with traditional Operations Research (OR) methodologies to solve scheduling problems. The problem of interest is sequencing jobs on a single processor to minimize total tardiness, denoted (n/1/T). The motivation for this study is the recognition that most traditional optimization algorithms are computationally inefficient because they treat all problems uniformly and solve them with identical procedures; as a result, one may waste time applying a complicated algorithm to a relatively simple problem. This paper concentrates on dividing the problem space into a number of sub-spaces and developing multiple algorithms to match them. The methodologies used are 1) Li's Algorithm for sequencing jobs to minimize total tardiness, as the basis of the optimization techniques, and 2) attribute-value object representation techniques, as the tool to decompose the domain problem space.
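The total-tardiness objective the paper targets can be made concrete with a short sketch. This is an illustration of the (n/1/T) objective only, not of Li's Algorithm; the job data are hypothetical:

```python
def total_tardiness(jobs):
    """Total tardiness of a job sequence on a single processor.

    jobs: list of (processing_time, due_date) pairs, in processing order.
    A job's tardiness is max(0, completion_time - due_date); the objective
    is the sum of these over all jobs.
    """
    t = 0          # completion time of the most recent job
    tardiness = 0  # accumulated total tardiness
    for p, d in jobs:
        t += p
        tardiness += max(0, t - d)
    return tardiness

# Hypothetical instance: three jobs as (processing_time, due_date).
# Completion times are 4, 6, 9; tardiness is 0 + 1 + 1 = 2.
jobs = [(4, 4), (2, 5), (3, 8)]
print(total_tardiness(jobs))
```

Sequencing algorithms for this problem search over the orderings of `jobs` for one that minimizes this sum.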
Number of pages: 7
Journal: Journal of the Chinese Institute of Engineers, Transactions of the Chinese Institute of Engineers, Series A/Chung-kuo Kung Ch'eng Hsuch K'an
Publication status: Published - 1992 Sep