The Conformal Predictions (CP) framework is a recent development in machine learning that associates reliable measures of confidence with the results of classification and regression. The framework is founded on the principles of algorithmic randomness (Kolmogorov complexity), transductive inference, and hypothesis testing. While its formulation guarantees validity, its efficiency depends greatly on the choice of classifier and of appropriate kernel functions or parameters. Although the framework has extensive potential in many applications, a lack of efficiency can limit its usability. In this paper, we propose a novel kernel learning methodology to maximize efficiency in the CP framework. The method is validated using the k-Nearest Neighbors classifier on three different datasets, and our results show strong promise for obtaining efficient conformal predictors that are practically useful.
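To make the framework concrete, the following is a minimal sketch of a split conformal predictor using a nearest-neighbor nonconformity measure, in the spirit of the k-NN setting described above. All function names, the ratio-based nonconformity score, and the toy usage are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def knn_nonconformity(X, y, x, label):
    """Illustrative nonconformity score: distance to the nearest
    same-label training point divided by distance to the nearest
    different-label point. Large values mean x conforms poorly
    to `label`."""
    d = np.linalg.norm(X - x, axis=1)
    return d[y == label].min() / d[y != label].min()

def conformal_predict(X_train, y_train, X_cal, y_cal, x, epsilon=0.1):
    """Return the prediction set at significance level epsilon:
    every candidate label whose p-value exceeds epsilon."""
    labels = np.unique(y_train)
    # Calibration scores: nonconformity of each held-out calibration
    # example with respect to its true label.
    cal_scores = np.array([
        knn_nonconformity(X_train, y_train, xc, yc)
        for xc, yc in zip(X_cal, y_cal)
    ])
    pred_set = []
    for lab in labels:
        score = knn_nonconformity(X_train, y_train, x, lab)
        # p-value: fraction of calibration scores at least as
        # nonconforming as the test score (with the +1 correction).
        p = (np.sum(cal_scores >= score) + 1) / (len(cal_scores) + 1)
        if p > epsilon:
            pred_set.append(lab)
    return pred_set
```

Validity here means the true label falls outside the prediction set with probability at most epsilon; efficiency, the property the paper's kernel learning targets, corresponds to keeping these prediction sets small.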