Social interactions require continually adjusting behavior in response to sensory feedback. For example, when having a conversation, sensory cues from our partner (e.g., sounds or facial expressions) affect our speech patterns in real time. Our speech signals, in turn, are the sensory cues that modify our partner's actions. What are the underlying computations and neural mechanisms that govern these interactions? To address these questions, my lab focuses on the acoustic communication system of Drosophila. We have discovered that male song patterns are continually sculpted by interactions with the female, over timescales ranging from tens of milliseconds to minutes. On the listener side, we have found that courtship song representations are widespread throughout the brain, but that subsets of neurons are critical for extracting complex song features and driving behavioral responses. In this talk, I will focus on how internal brain states modulate these processes and shape the dynamics of social interactions.