Single-image super-resolution is an important technique for high-resolution display applications. Example-learning-based approaches can recover rich image detail from a trained dataset, and regression-based methods among them reduce memory usage by learning mapping functions instead of storing a huge dictionary. However, searching for the nearest cluster to select the desired mapping function remains the system bottleneck, and the problem becomes critical as the number of mapping functions grows. This work presents an operator, termed the local multi-gradient level pattern, that describes the local geometry of a patch cluster quickly and effectively. The corresponding cluster can then be identified by a simple lookup table. Furthermore, the potential cluster misclassification induced by this simplified clustering feature is alleviated by the proposed model-combining scheme. Simulation results show that the proposed method achieves about an 8x speedup with even higher SSIM than the related k-means-based method.
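To make the lookup-table idea concrete, the following is a minimal sketch of how a quantized gradient pattern can replace a nearest-cluster search: each patch is reduced to a small integer code, and that code directly indexes a table of regression models. The binary thresholding operator, the 3x3 patch size, and the toy 16-model table below are all illustrative assumptions, not the paper's actual multi-gradient level pattern.

```python
import numpy as np

def local_gradient_pattern(patch):
    """Pack the signs of the 8 center-to-neighbor differences of a 3x3
    patch into one byte. This is a simplified stand-in for the paper's
    local multi-gradient level pattern, shown only to illustrate how a
    patch maps to a small integer code."""
    center = patch[1, 1]
    neighbors = np.delete(patch.flatten(), 4)   # drop the center pixel
    bits = (neighbors > center).astype(np.uint8)
    return int(np.dot(bits, 1 << np.arange(8)))

# Hypothetical lookup table: 256 possible codes -> 16 regression models.
# At run time the cluster is found by one table read, with no distance
# computation against cluster centers.
lut = np.arange(256) % 16

patch = np.array([[10, 20, 30],
                  [10, 25, 30],
                  [10, 20, 30]], dtype=float)
code = local_gradient_pattern(patch)   # -> 148 for this patch
model_id = int(lut[code])
```

The contrast with a k-means-style classifier is the source of the speedup: the pattern code costs a few comparisons per patch, whereas a nearest-cluster search costs one distance evaluation per cluster center.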