The geometric nearest neighbor problem is to choose, from a fixed collection of points (the library) in a d-dimensional space, the point nearest to a given query point. Many classification and pattern recognition problems are exact analogues of this geometric problem once coordinates and a distance measure are suitably defined. Most cutoff algorithms that are effective in low-dimensional spaces deteriorate for d greater than about 10. Analyzing these algorithms as a mapping from a high-dimensional space to a low-dimensional one shows that, if the mapping function and image space are suitably chosen, cutoff algorithms can remain effective at higher dimensionality. It is shown that in most cases the image space must have dimension approximately d/2, and that principal components analysis can be used to optimize the choice of image space. Results were published in the Journal of the Institute of Mathematics and its Applications.
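The cutoff idea described above can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it assumes the mapping is an orthogonal projection onto the top principal components, so the distance between image points never exceeds the true distance and can serve as a pruning (cutoff) bound. The function names and the NumPy-based implementation are my own.

```python
import numpy as np

def pca_projection(library, k):
    # Principal components of the library points (rows are points).
    # Rows of vt are orthonormal principal directions; keep the top k.
    mean = library.mean(axis=0)
    _, _, vt = np.linalg.svd(library - mean, full_matrices=False)
    return mean, vt[:k]

def nearest_neighbor(library, query, k=None):
    # Image space of dimension ~ d/2, as the abstract suggests.
    d = library.shape[1]
    k = k if k is not None else max(1, d // 2)
    mean, proj = pca_projection(library, k)

    # Map library and query into the low-dimensional image space.
    lib_img = (library - mean) @ proj.T
    q_img = (query - mean) @ proj.T

    # Visit candidates in order of image-space distance. Because the
    # projection is orthogonal, image distance is a lower bound on the
    # true distance, so once it exceeds the best true distance found
    # so far, no remaining candidate can be nearer (the cutoff).
    img_dist = np.linalg.norm(lib_img - q_img, axis=1)
    best_i, best_d = -1, np.inf
    for i in np.argsort(img_dist):
        if img_dist[i] >= best_d:
            break
        full_d = np.linalg.norm(library[i] - query)
        if full_d < best_d:
            best_i, best_d = i, full_d
    return best_i, best_d
```

Because the bound is exact (never overestimates), the search always returns the true nearest neighbor; the PCA choice of image space concentrates variance in the retained coordinates, which tightens the bound and lets the cutoff fire early.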