A common misconception is that the limiting factor for new technologies is the computers they run on. It’s understandable, because many of us (myself included) grew up before the Internet and the cloud. Computing paradigms have been through many evolutions, and they are evolving yet again. This new evolution is actually a throwback to an older computing model.
In the 1970s and early 1980s we had the age of the Mainframe. Mainframe computers were very expensive and the size of a room, so companies could typically only afford to have one. To let many people use them, a machine called a terminal was invented: basically a monitor and a keyboard connected to the mainframe. You could use it like a computer, but the real computation happened on the big machine. What is important to note here is that network performance far outpaced local computational performance, so it was cheaper to do it this way. Even though the mainframe itself was expensive, overall IT costs were lower.
In 1981 IBM created its Personal Computer. This miniaturized the mainframe and put a smaller version on the desks of the users. While each computer was slower than a mainframe, together they added up to much more performance. This jump in computation meant that most data was stored on the local machine, because networks couldn’t keep up with the speeds required. Local computation had leapfrogged networking. We saw a proliferation of disk drives and tape drives for transferring data between machines.
Increasing network speeds then created the Internet age in the mid-1990s. This model was more of a hybrid: the remote web server served the content, but the local computer had to display it. Both sides of the system shared the computation fairly evenly.
That brings us to today. While everyone can see the dramatic speed increases in computers (and smartphones and tablets), we don’t as readily notice the even faster increase in network bandwidth over the last 15 years. This is pushing the balance between network and local computation back onto the network. We have seen this happening with the iPhone’s Siri. When you ask Siri a question, the phone transcribes your question (processing locally), but then it sends your query to the main Siri server to actually find the answer. It’s not quite a mainframe, but it is a significant offloading of computation to the cloud. (I wrote more about Siri here.)
Another example is the $4,829-per-hour supercomputer built on the Amazon cloud to fuel cancer research. When you connect a whole bunch of computers into one large virtual “cloud” supercomputer, you are recreating the mainframe model in virtual form.
This brings us back to my original point. When you are assessing what is possible with mobile devices, you can’t just look at the CPU inside. Once you connect to the cloud, you are connecting to a massively powerful virtual computer to which you can offload big computing tasks. This trend will only accelerate as network speeds continue to improve.
The reason this matters for education is 1:1 and BYOD initiatives. If much of the computation is offloaded, then the computers don’t need to be that fast anymore. The iPad is a great example. While fast, it is in no way as powerful as a state-of-the-art PC. Many of its best features (YouTube, Netflix) are cloud-based, so the iPad doesn’t need a hard drive. If computers don’t have to be on the leading edge of performance, the price drops significantly. More importantly, the IT costs plummet. Already many schools are adopting cloud-based email systems, and some are even using cloud LMS systems.
This leads me to the reason I wrote this post. Using computers to translate voice and text is going to be hugely disruptive to education, because it opens up the whole world as an education community. It looks like the technology is almost ready. Check out this video. (h/t Engadget)
Notice what the scientist says at the end: the system will live on the AT&T network. It will be a service that takes advantage of the computing power of the cloud and is available to any device on their network. Certainly everyone else will have products like this soon (I know Google is working on it). This sort of disruptive technology will enable overseas competitors to enter the US education market at a much lower cost. It will happen quietly at first. US schools will outsource tutoring services first, then maybe some fringe classes will go online because of budget cuts. Of course, it can work both ways: US teachers will be able to teach classes of overseas students if the costs work out.
This is why I have been referring to the Internet as the disruptive agent, not computers or smartphones. Just remember, in the oncoming Education Stormfront, the biggest disruptive force is the Cloud.