To facilitate collaboration between computers and people, computers should be able to perceive the world in a manner that correlates well with human perception. A good example of this is face image retrieval. Mathematically-based face indexing methods that are not grounded in how humans perceive faces can produce retrievals that are disappointing to human users. This raises the question "Can human faces be automatically indexed in a manner that correlates well with human perception of similarity?" Humans use words to describe faces - words such as braided, graybearded, bearded, bespectacled, bald, blondish, blond, freckled, blue eyed, mustached, pale, Caucasian, brown eyed, dark skinned, or black eyed. Such words represent dimensions that span a shared concept space for faces, so they might provide a useful guide to indexing faces in an intuitive manner. This paper describes research that uses descriptive words such as these to index faces. Each word guides the design of one feature detector that produces a scalar coefficient, and those coefficients collectively define a feature vector for each face. Given these feature vectors, it is possible to compute a similarity measure between pairs of faces, and to compare that computed similarity to the similarity as perceived by humans.
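The feature-vector scheme above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the detector outputs are stand-in values, and cosine similarity is an assumed choice of similarity measure (the paper does not specify one here).

```python
import math

# Hypothetical scalar outputs of word-guided feature detectors
# (e.g. "bearded", "bespectacled", "blond", "pale"); each coefficient
# is assumed to lie in [0, 1], expressing how strongly the word applies.
face_a = {"bearded": 0.9, "bespectacled": 0.1, "blond": 0.0, "pale": 0.6}
face_b = {"bearded": 0.8, "bespectacled": 0.0, "blond": 0.1, "pale": 0.7}

def similarity(u, v):
    """Cosine similarity between two word-coefficient feature vectors."""
    words = sorted(u)                      # fix a shared dimension ordering
    a = [u[w] for w in words]
    b = [v[w] for w in words]
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

print(round(similarity(face_a, face_b), 3))
```

Ranking the database by this score against a query face would yield the retrieval ordering that can then be compared with human similarity judgments.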