Early June brings the Apple Worldwide Developers Conference (WWDC) and, with it, new updates to Apple’s operating systems: iOS, iPadOS, watchOS, and macOS. Ever since the iPad’s release in 2010, I have been fascinated by it. I’ve had the opportunity to use many different versions, and my current primary device is a 12.9” iPad Pro. I think the iPad is one of the best-designed devices available and the one with the greatest potential to reinvent the use of technology in education.

One of the best, probably most misunderstood, and sometimes disliked commercials is “What’s a Computer?”, which Apple made to showcase the power of the iPad in education. Apple was showing us the unfulfilled potential not just of the iPad in education but of technology in general. Too often in education, we are limited in our vision of what technology can and should do for us. One of the most powerful models for the use of technology in education is the SAMR model, which describes four levels of technology integration: substitution, augmentation, modification, and redefinition. Often, the use of technology in education is stuck at the substitution stage (e.g., we type instead of handwriting). One of the biggest complaints about iPads has always been the lack of an integrated keyboard: we want our students to type the answers to their tests or assignments. Substitution is a great initial step in the introduction of technology, and Apple has accommodated those requests with support for both Apple and third-party keyboards. As a matter of fact, I’m typing this post on an iPad with an external keyboard. However, as the commercial encourages us, we need to move beyond mere substitution to augmentation, modification, and ultimately redefinition. It is those last two stages that are transformative rather than merely an enhancement.

The commercial talks about the power of mobile computing: the ability to do our work any place, any time. It talks about the power of using the cameras to interact with our environment and augment the educational process. It talks about the use of publishing to enhance and redefine the tasks at hand, so that our students can share their work with a worldwide audience and receive feedback from other students all over the world. It hints at the ability of VR and MR to completely redefine the educational experience, to bring us to new and old worlds without having to leave the classroom. Underpinning all of this on the iPad is iPadOS.

iPadOS is still an operating system in transition. While at the start of the iPad revolution the devices relied on the same iOS operating system that the phones did, Apple has slowly transitioned the iPad to its own operating system. It’s one that is still growing and changing, and I’m glad to see Apple experimenting with new options in iPadOS 16 (a misnomer, since it’s only been a couple of years since its split from iOS). It’s clear that Apple is trying hard to find a place for it between iOS and macOS, and I think that’s the right approach. We don’t need the iPad to run macOS; that’s what we have laptops for. And we don’t need it to run iOS; that’s what we have phones for. It needs to take advantage of the large touchscreen and the amazing Apple Pencil created for it. One of the places the OS has stumbled is multitasking. iPadOS 16 takes another crack at switching between apps with Stage Manager, a new way of seeing the various apps open on the device. It also enhances the ability to resize app windows to take advantage of the larger screen real estate, possibly making multitasking easier. And it adds Freeform, a collaborative whiteboard; the ability to lift subjects from the background of a picture and use them in other applications; and many accessibility enhancements that continue to make the iPad usable by all.

I’m a huge fan of the iPad in education, and I’m excited about the changes that iPadOS 16 will bring. What are your thoughts on the iPad? Do you think it has untapped potential in education? If yes, how can we unlock that potential?

While not over, the pandemic has given way to a new normal. Remote work and online courses are now part of our vocabulary more than ever. As we learn to live in this new world, hybrid seems to be the key word.

Hybrid courses, with some meetings online and some in person, and hybrid work schedules, with some days in the office and some working from home, seem to be the compromise acceptable to a lot of people. CEOs of big companies want employees to return to the office full time; a lot of employees, on the other hand, are not so thrilled about the prospect. The compromise, it seems, is hybrid work.

In higher education, hybrid courses seem to be the future as well. Online courses, while useful in a pandemic and convenient in many respects, are a form of learning that requires a completely different approach than in-person instruction. Faculty cannot just take an in-person course and transfer it online, and students have to adjust their learning styles to a modality that requires more independence and self-motivation. Hybrid can help bridge that gap.

There are a few different hybrid models worth mentioning. One is the traditional hybrid model, in which some classes are in person and some are online. Students have to arrange their schedules to participate in the on-campus classes. This model allows the instructor to employ a flipped model of learning, with lectures recorded in advance and class time used for interactive activities such as discussions or hands-on exploration.

A second hybrid model is the HyFlex model. While much harder to implement, it offers students much more flexibility. In the HyFlex model, students have the choice of attending in person or online. The instructor is present in the classroom for every class during the semester, and students who choose to participate in person come to campus. Classes are also streamed using video conferencing software and cameras placed in the classroom, so students who choose to participate virtually can join remotely.

The HyFlex model is highly dependent on technology access, both in the classroom and at home. If an institution opts for static cameras in the classroom, the instructor has to constantly adjust the camera angle to capture the blackboard, the instructor, or the rest of the peers in the classroom. This puts an undue burden on the instructor and can be a distraction from the learning experience. More expensive auto-tracking cameras can be installed; they follow the instructor as they move between the podium and the blackboard, making the experience easier to manage. Even so, the ability to conduct group work is limited, and the students at home may feel more isolated than their peers in the classroom. While in-person students can bring laptops and join the video conference to work with their remote classmates, that process takes longer to set up and requires more advanced technology skills, both on the part of the instructor and on the part of the students. Technology hiccups can occur, leaving the remote students stranded from the rest of the class.

Regardless of which hybrid model is chosen, training, both in technology and in pedagogy, is important. Technology can make our lives both easier and more complicated, and it’s our job to make sure it’s the former.

Digital technology has long held the promise to “revolutionize” education; a quick search in Google Scholar turns up articles going as far back as the 1960s. However, as much as the topic has been discussed, the results have been spotty. The current pandemic has thrown both K-12 and higher education into a remote environment, with schools trying to use digital technology to teach their students. Again, the results have been spotty. Student and faculty access to technology has been an impediment, as has knowledge of how to use digital technology to teach and learn.

So what can the future hold?

VR (virtual reality), AR (augmented reality), and MR (mixed reality) are terms that have been bandied about recently. While many of the applications revolve around games, this virtual interaction with our reality can have powerful applications in education. Imagine that while sitting in your living room, you put on a pair of goggles and you’re instantly transported into your classroom. Every bone in your body tells you that this is your classroom. You see your students, you see the desks, you can interact with the materials on your desk, you pick up a piece of paper and show it to the class. Teaching geography? Push a button and all of a sudden the entire class is transported to Antarctica. Teaching history? Push a button and you’re instantaneously transported into the middle of the Battle of Waterloo, examining the unit placements of the British, French, and Prussian armies. Teaching biology? Be instantly transported through the human body.

With the mobile technology available today, and the addition of Google Cardboard, most students can transform their smartphones into virtual or augmented reality viewers. For example, Google Arts & Culture allows students to virtually tour museums across the world, Google Expeditions lets them explore a wide variety of historic and geographic locations, and The New York Times VR app can take them to Mecca during the religious pilgrimage. How else could this technology enhance our classrooms? What potential might it have for our students’ future? What does that mean for learning and engagement?