Posted by: crudbasher | May 17, 2012

When Information Has Context

Most people think of Google as a search company but I think of it as V’Ger.

Bonus geek points if you know what V’Ger is…

H/T Memory Alpha

V’Ger was the alien threat in Star Trek: The Motion Picture from 1979. Without giving the whole movie away, V’Ger was a space probe with a simple mission: to learn all that was learnable and return home. Google is like that. It doesn’t just want to search the Internet, it wants to search EVERYTHING. To know everything, all information, anywhere, from any time.

It’s a big job with several stages. The first stage is simply to ingest information. Google has been doing this from the beginning and has amassed a vast amount of it. The second stage is to connect data together to derive additional meaning from it. They do this when they overlay traffic on Google Maps. The third stage is to assign contextual meaning between pieces of data. Google is now doing this with a project called the Knowledge Graph. (Another name for the idea is the semantic web.) For example, you can Google search for the Eiffel Tower and it will come up with a bunch of info on it. With contextual meaning, though, the search engine will understand you are looking for an architectural landmark: who built it, what it is made of, how many people visit each year, what city it is in, etc… The list is almost endless when you start to connect data together. In fact it’s an exponential expansion.
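The idea of connecting data together can be sketched with a toy example. A minimal knowledge graph is just a set of (subject, relation, object) facts that you can query and hop across. The entities and relation names below are illustrative only, not Google's actual data or API:

```python
# A toy "knowledge graph": facts stored as (subject, relation, object) triples.
facts = [
    ("Eiffel Tower", "is a", "tower"),
    ("Eiffel Tower", "designed by", "Gustave Eiffel"),
    ("Eiffel Tower", "located in", "Paris"),
    ("Eiffel Tower", "made of", "wrought iron"),
    ("Paris", "is capital of", "France"),
]

def about(entity):
    """Return every (relation, object) fact whose subject is the given entity."""
    return [(rel, obj) for subj, rel, obj in facts if subj == entity]

def follow(entity, relation):
    """Hop one link in the graph, e.g. Eiffel Tower -> located in -> Paris."""
    return [obj for subj, rel, obj in facts if subj == entity and rel == relation]

print(about("Eiffel Tower"))
# Connecting data up: find the city, then ask what we know about the city.
city = follow("Eiffel Tower", "located in")[0]
print(city, about(city))
```

This is where the "exponential expansion" comes from: every new fact about Paris is automatically one hop away from every fact that links to Paris.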

Here’s a video from Google explaining the Knowledge Graph.

Computers are getting better at determining this sort of information automatically. For example, they are starting to be able to look at photos and videos and determine what is in them. This will be a vast new source of data for the Internet.

So how does this apply to education? As I have mentioned before, our devices will be listening to and watching what is happening pretty soon. They will then be able to start understanding what is happening too. Right now I can record audio on my phone, but it’s just a data file; the phone doesn’t understand it. Imagine if it could… I could have an app that would take notes for me in class. Not just a transcription, but a linked list of the main ideas and concepts. It could go and find other resources for me. Perhaps it could even create a lesson based on additional information it finds online?
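As a crude illustration of pulling "main ideas" out of a recording, here is a toy sketch that counts the most frequent content words in a transcript. Real understanding would go far beyond this, and the stopword list and sample transcript are my own made-up stand-ins:

```python
# Hypothetical sketch: treat the most frequent non-stopword terms in a lecture
# transcript as candidate "main ideas". A crude stand-in for real understanding.
from collections import Counter
import re

STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "is", "it", "that", "this"}

def main_ideas(transcript, n=3):
    words = re.findall(r"[a-z']+", transcript.lower())
    content = [w for w in words if w not in STOPWORDS]
    return [word for word, _ in Counter(content).most_common(n)]

notes = main_ideas(
    "Photosynthesis converts light energy. Plants use photosynthesis "
    "to turn light and carbon dioxide into sugar. Light drives the process."
)
print(notes)  # 'light' and 'photosynthesis' top the list
```

The real app would then use each extracted concept as an entry point into a knowledge graph to find linked resources.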

What happens to schools when we can look at something in the world and say to our phone, “teach me about that”? This will be learning on demand, anywhere, anytime.

Things are about to start moving very quickly, folks. Buckle up.



  1. Your phone already understands. If you tell your phone (iPhone, anyhow), “remind me to buy milk when I leave work,” it will use its GPS to figure out when you are leaving and then tell you, “remember to buy milk.” It can then let you know where to buy it, if you want.

    • Hi mweisburgh,

      You are correct that your phone understands location info now. But when you tell it to remind you about getting milk, it doesn’t understand the request itself. It just shows you the reminder you entered when it reaches that spot. It doesn’t really know what milk is; that’s my point. Does that make sense?

      Thanks for commenting!!

  2. […] what are they up to? Well I wrote about it a bit here but basically they want to be able to have a digital, contextual representation of the whole world, […]
