Suitable databases are essential for research on audio-visual affect recognition. In this work, we present a re-acted audio-visual database in Turkish, consisting of recordings of subjects expressing various emotional and mental states. The database contains synchronized facial recordings of each subject captured with a frontal stereo camera and a half-profile mono camera. The subjects first watch visual or audio-visual stimuli on a screen in front of them; the stimuli are designed and timed to elicit specific emotions and mental states. The subjects then answer questions about the stimuli in an unscripted way. The target emotions we aim to elicit are the six basic ones (happiness, anger, sadness, disgust, fear, surprise) as well as boredom. We also aim to elicit several mental states: unsure (including confused and undecided), thinking, concentrating, interested (including curious), and complaining. The database additionally contains short acted recordings of each subject.