This study establishes a correlation between the physical properties of 50 yarn samples and human perceptions of those properties by using a "back-propagation neural network training technique". Human perceptions were divided into two categories: touch perception and touch/visualization perception. Ten pairs of adjectives were offered as choices in each category for the human assessment of the yarn samples. The questionnaire results for the 50 samples were separated into a training group (45 samples) and a verification group (5 samples) and served as the output layer of the back-propagation neural network. The physical properties in question included the color, texture, and material/appearance of the yarn samples. Color coherence vectors of the yarn-sample images represented the color features of the yarn samples. Image-analysis methods, namely LBP (Local Binary Pattern) and three center-symmetric covariance measures (SCOV, SAC, and VAR), were used to extract the texture features of the yarn samples. Meaa Time Enterprise Co., Ltd., a yarn manufacturer, provided information on the material/appearance of the yarn samples. In the touch/visualization perception experiments, seven combinations of physical properties were chosen as the input layers for the back-propagation neural network trainings; in the touch perception experiments, three combinations were chosen. After the training processes, the verification group (5 samples) was used to verify the accuracy of the trained neural networks. Notably, the Type VII neural network produced good predictions (accuracy rate of 88.7%) in the "touch/visualization perception" category when the color features, texture features, and material/appearance of the yarn samples were used as the input layer.
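The training setup described above can be sketched as a small back-propagation network that maps a yarn's combined feature vector (color, texture, material/appearance) to 10 adjective-pair scores, with a 45/5 train/verification split. This is a minimal illustration only: the feature length, hidden-layer width, learning rate, and the synthetic data are assumptions, not values from the study.

```python
import numpy as np

# Hypothetical dimensions; only the 50-sample count, 45/5 split, and
# 10 output scores come from the study itself.
rng = np.random.default_rng(0)
n_features = 20          # assumed combined feature length
n_hidden = 12            # assumed hidden-layer width
n_outputs = 10           # 10 adjective pairs, as in the questionnaire

# Synthetic stand-in for the 50 questionnaire samples.
X = rng.random((50, n_features))
Y = rng.random((50, n_outputs))
X_train, Y_train = X[:45], Y[:45]   # training group
X_test, Y_test = X[45:], Y[45:]     # verification group

W1 = rng.normal(0.0, 0.5, (n_features, n_hidden))
W2 = rng.normal(0.0, 0.5, (n_hidden, n_outputs))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(2000):
    # Forward pass: features -> hidden layer -> perception scores.
    H = sigmoid(X_train @ W1)
    P = sigmoid(H @ W2)
    # Backward pass: gradient of mean squared error through the sigmoids.
    dP = (P - Y_train) * P * (1.0 - P)
    dH = (dP @ W2.T) * H * (1.0 - H)
    W2 -= lr * H.T @ dP / len(X_train)
    W1 -= lr * X_train.T @ dH / len(X_train)

# Evaluate on the held-out verification group.
pred = sigmoid(sigmoid(X_test @ W1) @ W2)
print("verification MSE:", np.mean((pred - Y_test) ** 2))
```

In the study, different feature combinations (Types I-VII) would correspond to different choices of columns in the input matrix, with the network retrained for each combination.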
The Type VI neural network also produced an excellent prediction (accuracy rate of 87.4%) in the "touch perception" category when texture features and material/appearance were used as the input layer. This is the first study to combine the color features, texture features, and material/appearance of yarn samples in back-propagation neural network training for Kansei Engineering. With the trained neural networks, designers can predict Kansei perception data for new yarn samples from their color features, texture features, and material/appearance. In addition, designers can retrieve recommended samples from the yarn database by inputting the 10 pairs of Kansei perception values they prefer. This increases efficiency in the yarn design process and enhances the design capability of yarn designers.
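The retrieval step described above can be sketched as a nearest-profile lookup: each yarn in the database carries 10 Kansei scores, and the designer's preferred 10 values are matched against them. The function name, distance metric, and data below are illustrative assumptions, not the study's implementation.

```python
import numpy as np

def recommend(database_scores, preferred, k=3):
    """Return indices of the k yarns whose 10-value Kansei profiles are
    nearest (by Euclidean distance) to the designer's preferred profile.
    A hypothetical sketch; the study does not specify its matching rule."""
    dists = np.linalg.norm(database_scores - preferred, axis=1)
    return np.argsort(dists)[:k]

# Illustrative database: 50 yarns x 10 adjective-pair scores.
rng = np.random.default_rng(1)
db = rng.random((50, 10))
query = rng.random(10)           # designer's preferred Kansei profile
print(recommend(db, query))
```

In practice the database scores could come from the trained network's predictions, so newly designed yarns become searchable without another questionnaire round.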