To provide adaptive instruction in heterogeneous classes, teachers must consider performance-related information when deciding which task difficulty level best fits a particular student. Focusing on prospective teachers’ data-based decision-making, we investigate their use of proximal and distal indicators of students’ ability and readiness to deal successfully with domain-specific tasks, which were available for 32 student cases in a digital test environment. To address several hypotheses about participants’ information-processing behaviors and decisions, we conducted a within-subject experiment in which we systematically varied the informational consistency of the presented student cases. Using a mixed-methods approach, we measured observable information-processing behavior and decision-making. Via linear dynamic panel-data modeling, we assessed the effects of consistent versus inconsistent cases on the amount, sequence, and perceived relevance of the selected information, the type of information processed, the time needed for information processing, and participants’ subjective confidence when making a decision.