Our Connection Machines project investigates the verbal and nonverbal behaviors of two partners in a conversational setting in the context of rapport. We have collected data on teenagers conducting a peer tutoring task in linear algebra. We are currently investigating the participants' nonverbal behaviors, such as smile, gaze, prosody, and pitch, and their verbal content, such as second-person pronoun use. We are also interested in designing and implementing automatic detection of features that arise from the interaction of the two interlocutors, such as entrainment and mimicry, from a multimodal perspective, that is, examining verbal and nonverbal behavior simultaneously. Another interest of ours is applying machine learning techniques to correlate these multimodal inputs with objective measures, such as the rapport between the two participants, in order to make predictions on new data. We are also interested in investigating the relationship between rapport and student learning gains. Based on the human-human interaction data, we will derive verbal and nonverbal patterns that serve as guidelines for virtual human design. Ultimately, we will create a fully automatic embodied conversational agent (ECA) that can replace one of the participants in the peer tutoring setting and fulfill the task of engaging students in learning.
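As a minimal sketch of the correlation step described above, the following example computes the Pearson correlation between per-session multimodal feature values and annotated rapport scores. The feature names (smile rate, second-person pronoun rate, pitch variance) and all values here are hypothetical illustrations, not our actual data or pipeline; in practice, features with strong correlations would become candidate predictors for a learned model.

```python
# Hypothetical sketch: correlating multimodal features with rapport
# scores as a first step before training a predictive model.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# One row per tutoring session: illustrative feature values and a
# rapport score annotated by human judges (e.g., on a 1-7 scale).
features = {
    "smile_rate":         [0.20, 0.35, 0.50, 0.15, 0.40],
    "second_person_rate": [0.05, 0.09, 0.12, 0.04, 0.10],
    "pitch_variance":     [12.0, 18.0, 9.0, 14.0, 20.0],
}
rapport = [3.0, 4.5, 5.5, 2.5, 5.0]

for name, values in features.items():
    print(f"{name}: r = {pearson(values, rapport):+.2f}")
```

A real pipeline would of course use many more sessions and feed the selected features into a supervised learner, but the underlying idea, mapping from multimodal behavior to a rapport measure, is the same.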