As Virtual Reality (VR) becomes more viable for creating immersive learning environments, a promising area of study is monitoring and assessing learner engagement. Several related works in this area have used gaze tracking; unfortunately, gaze tracking in VR is currently limited by hardware availability. This paper presents an approach to engagement monitoring that does not require the cost of a more sophisticated hardware setup. A simple combination of video processing techniques was used to show which content captures the user's attention, similar to the information current gaze trackers provide. However, this method has so far been validated only on a sample video; refinements are still needed to handle unpredictable changes in the field of view. Once refined, this approach could help instructional VR content developers determine which content most effectively captures their learners' attention.
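To make the idea concrete, the sketch below illustrates one way such a video-processing pipeline could look; it is not the pipeline evaluated in this paper. It assumes a screen recording of the rendered VR viewport, uses the spectral-residual static saliency detector from opencv-contrib-python, and treats the viewport centre as a rough head-direction proxy for where the learner is looking; the function name, input file name, and weighting parameter are hypothetical.

```python
# Illustrative sketch only (assumptions noted above), not the authors' method:
# accumulate per-frame saliency, weighted toward the viewport centre, to estimate
# which on-screen content drew attention across a recorded VR session.
import cv2
import numpy as np

def attention_heatmap(video_path: str, centre_sigma: float = 0.25) -> np.ndarray:
    """Return a normalised heatmap of centre-weighted saliency over all frames."""
    cap = cv2.VideoCapture(video_path)
    saliency = cv2.saliency.StaticSaliencySpectralResidual_create()
    heatmap = None
    centre_weight = None

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        success, sal_map = saliency.computeSaliency(frame)  # values in [0, 1]
        if not success:
            continue
        h, w = sal_map.shape
        if heatmap is None:
            # Gaussian weight peaking at the viewport centre: in VR the centre of
            # the rendered view follows head direction, a coarse attention proxy.
            ys, xs = np.mgrid[0:h, 0:w]
            centre_weight = np.exp(
                -(((xs - w / 2) / (centre_sigma * w)) ** 2
                  + ((ys - h / 2) / (centre_sigma * h)) ** 2)
            )
            heatmap = np.zeros((h, w), dtype=np.float64)
        heatmap += sal_map * centre_weight

    cap.release()
    if heatmap is None or heatmap.max() == 0:
        raise ValueError("no frames could be processed")
    return heatmap / heatmap.max()

# Hypothetical usage: overlay the heatmap on a reference frame to inspect
# which instructional content accumulated the most estimated attention.
# heat = attention_heatmap("vr_session_capture.mp4")
```

In a refined version, the fixed centre weighting would need to follow the actual field of view, which is one of the unpredictable factors noted above.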