Online dating platforms such as Tinder, Bumble, and Coffee Meets Bagel have become not only culturally accepted but remarkably common ways of meeting new people. However, user profiles on these platforms rely heavily on visual data, which creates a barrier to access for individuals with visual impairments. Navigating these applications with a screen reader such as VoiceOver is a confusing experience that withholds much of the information the profiles contain. We propose a method of giving users with visual impairments access to the information encoded in the images of other users' profiles. To determine which visual features matter most in a dating profile, we surveyed individuals both with and without visual impairments, and we then built a series of classifiers for the most popular features identified by the survey. Because the dating applications mentioned above do not offer public APIs, we created an application that works via screenshots. When users would like to know more about a given profile, they activate the native screenshot functionality on their phone; our application finds the image in the phone's screenshot folder and sends it to our web service, where the image is securely analyzed and descriptive text is generated. This text is sent back to the phone and read aloud, giving the user a coarse idea of what the image contained and helping them make a more informed decision about whether they are interested in the person in the profile.
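The final step of the pipeline above, turning classifier outputs into the text that is read aloud, could be sketched as follows. This is a minimal illustration, not the system's actual implementation: the feature labels, confidence threshold, and phrasing are all hypothetical assumptions.

```python
def describe_profile(predictions, threshold=0.6):
    """Convert {feature: confidence} classifier outputs into a single
    spoken-friendly sentence, keeping only confident predictions.

    Labels, threshold, and phrasing are illustrative assumptions.
    """
    # Hypothetical mapping from classifier labels to spoken phrases.
    phrases = {
        "smiling": "the person is smiling",
        "outdoors": "the photo was taken outdoors",
        "glasses": "the person wears glasses",
        "group_photo": "several people appear in the photo",
    }
    # Keep only predictions above the confidence threshold.
    kept = [phrases[label] for label, conf in predictions.items()
            if label in phrases and conf >= threshold]
    if not kept:
        return "No visual features could be identified with confidence."
    return "In this photo, " + ", and ".join(kept) + "."

print(describe_profile({"smiling": 0.92, "outdoors": 0.71, "glasses": 0.30}))
# → In this photo, the person is smiling, and the photo was taken outdoors.
```

In a deployed system, the returned string would be sent back to the phone and spoken by the device's text-to-speech engine; filtering by confidence keeps low-certainty guesses from misleading the user.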