Human cognition is, in many important respects, predictive (e.g., Clark, 2015). A critical source of information that humans use to anticipate the future actions of others and to perceive their intentions is bodily movement (e.g., Ansuini et al., 2014; Becchio et al., 2018; Koul et al., 2019; Sciutti et al., 2015); intentions can be inferred from past and current actions alike. The purpose of this abstract is to address anticipation in terms of levels of processing in visual perception and to present experimental results that demonstrate high-level semantic processing in the visual perception of various biological motion displays. These results (Hemeren & Thill, 2011; Hemeren et al., 2016; Hemeren et al., 2018) show that social aspects and future movement patterns can be predicted from fairly simple kinematic patterns in biological motion sequences, and they highlight the environmental constraints (gravity and perspective) and bodily constraints that contribute to understanding our social and movement-based interactions with others. Understanding how humans anticipate the actions of and perceive the intentions of one another should help us create artificial systems that can likewise perceive human anticipation and intention.