At that time, the computing curriculum was only just being developed for lower secondary, and so we based our Key Stage 3 curriculum on the Open University’s My Digital Life course. This was the first time I saw the benefit of using physical computing to teach computational thinking. The SenseBoard, developed by the Open University, allowed students to see that there were other input and output devices beyond just the keyboard and the monitor. We were able to create projects such as a voice game controller and a noisy classroom indicator, and even allow the students to send a word between two computers using infrared light.
The department has developed an ethos of giving students opportunities to use computational thinking and coding in as many different situations as possible, allowing them to find the area of computer science that interests them. As part of this, we had started looking at developing a robotics curriculum.
Challenges in the first national lockdown
When schools moved to remote learning, we needed to find a way to offer students an experience that was as near to that of the classroom as possible. Using our normal resources would not be an option, because we knew that not all of the students could run the Sense programming software on their own computers. We also knew that we would not have sufficient resources to supply every student with a SenseBoard.
Therefore, we decided to find an online solution that would run on a reasonable broadband connection and had a coding environment similar to the Scratch-based Sense software. For the first few weeks, we didn’t teach live lessons, so the choice of coding environment was very important. Students needed to feel comfortable with the software, as they could not get spontaneous feedback to a question or reassurance from the teacher as they could in the classroom.
The other challenge was finding an environment that created a convincing illusion of physical computing.
I had previously come across the block-based environments used by Snap! and Robot Mesh, and the department could see a use for both in a classroom, in different ways. Snap! would give us the ability to develop the students’ knowledge using its more complex blocks and libraries, but it did not have the ability to use a variety of sensor inputs and outputs, as the SenseBoard did. The Robot Mesh software offered a possible solution to our problem in that it has a realistic display called a Mimic that allows you to run your program on a virtual robot. It also enables you to create your own obstacles and barriers, and programs can be downloaded onto an actual VEX robot. However, the software required us to create the robots and fields that we needed ourselves. From our experience, we knew that the Mimics needed to be created on a computer with a fast processor, which not all of our laptops had.
At school we have been involved with the VEX Robotics Competition for a few years, and through the VEX community I was introduced to the newly released VEXcode VR. Like Snap!, the software uses a blocks environment, but unlike Robot Mesh, it has pre-built robots with a wide range of sensors, from push buttons to distance sensors and gyroscopes. It also has several fields, known as Playgrounds, with obstacles already created. VEX had also produced a number of activity worksheets with differentiated activities linked to the Playgrounds.
How we used VEXcode VR in lockdown
From our pre-existing worksheets and the activities created by VEX, we created a mini robotics course taking the students through the basics of sequencing and selection. The number of Playgrounds in VEXcode VR allowed us to create different exercises for each topic area and use them to scaffold the learning for the students. Additionally, we developed a website containing videos to introduce the activities and give students support in completing the tasks. The students immediately engaged with the new environment and liked the way that it allowed them to see the solution working in a physical environment. They also liked the way that it gave realism to the problem: environmental conditions meant the robot didn’t always perform in exactly the same way, just like a real robot. As students became more motivated by the environment, we were able to send them the more complex activities already created by VEX.
Using VEXcode VR back in the classroom
Throughout the pandemic, VEX have continued to develop the software and have increased the number of Playgrounds and associated worksheets. This has meant that we’ve been able to look at the software as a more long-term part of the curriculum. Since September, we have altered our lower secondary curriculum so that the students are able to learn and develop their computational thinking skills using Snap! and VEXcode VR, choosing the most appropriate environment. We can continue developing the physical computing ethos in the lessons, knowing that students can continue to work on projects at home without needing access to physical hardware. By the time our students reach Year 9, they should have enough familiarity and confidence with the software, and coding skills, to further develop their prowess using the physical VEX IQ robots that we’ve bought.
The ability to write programs in Python and the complexity of the Playgrounds mean that we can use the software with Key Stages 4 and 5 (ages 14–18). For example, the Playgrounds will allow us to create maze algorithms in Key Stage 5. I can see us being able to extend the use of the VEX IQ robots across all year groups, which is a very good point to be able to make when trying to get funding for physical computing equipment!
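To give a concrete picture of the kind of maze algorithm meant here, below is a minimal sketch in plain Python of a right-hand wall follower on a grid. The maze representation and the `solve_maze` function are my own illustration, not part of the VEXcode VR API; in the real software, each cell step and direction change would be replaced by the robot’s drive and turn commands.

```python
# Illustrative only: a right-hand wall follower on a grid maze, the kind of
# algorithm students might prototype before porting it to a virtual robot.
# The maze format and function below are not part of VEXcode VR.

def solve_maze(maze, start, goal):
    """Walk from start to goal keeping the right hand on the wall.

    maze: list of equal-length strings, '#' = wall, ' ' = open cell.
    start, goal: (row, col) tuples. Returns the path as a list of cells,
    or None if the goal was not reached within the step budget.
    """
    dirs = [(-1, 0), (0, 1), (1, 0), (0, -1)]  # up, right, down, left

    def is_open(r, c):
        return 0 <= r < len(maze) and 0 <= c < len(maze[0]) and maze[r][c] == " "

    (r, c), d = start, 0  # start facing 'up'
    path = [start]
    for _ in range(10_000):  # safety bound in case the goal is unreachable
        if (r, c) == goal:
            return path
        # Right-hand rule: prefer turning right, then straight, left, back.
        for turn in (1, 0, -1, 2):
            nd = (d + turn) % 4
            nr, nc = r + dirs[nd][0], c + dirs[nd][1]
            if is_open(nr, nc):
                d, r, c = nd, nr, nc
                path.append((r, c))
                break
    return None

maze = [
    "#####",
    "#   #",
    "# # #",
    "#   #",
    "#####",
]
path = solve_maze(maze, (1, 1), (3, 3))  # path ends at the goal cell (3, 3)
```

In VEXcode VR itself, the inner loop’s “move to the next cell” would become a fixed drive distance and the turns would become 90-degree turn commands, which is exactly the translation step that makes the exercise useful at Key Stage 5.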