
August 04, 2010

Comments

Kim

Lovely chart

>It'd be amazing to see this graphic done year in, year out.

In gapminder no less!

Without looking at the background info, does "youtube" fall under video/movies or social networks/blogs? (Knowing that sometimes these things get classified funny.)

If it's in videos/movies then HA! Suck it youtube! :-)

bob_d

I've only seen this data compared to the previous year, but, unsurprisingly, in this latest year time spent on social networks and with games went up, while time spent on portals was reduced.

Richard Bartle

In the early 1980s, when there were few graphics on the Internet, protocol analysis was used to determine what users were spending their time doing. It was discovered that 10% of bits being passed around the Internet belonged to MUDs.

Although that used a different method of calculating usage, the figure of 6 minutes per hour for games is strangely satisfying.

Richard

Alice

I work for a broadcaster. Making games (mostly). So the games vs video slice reinforces a bunch of stuff, very useful.

Meanwhile Eric S today, blowing my mind with DATA....

Richard Bartle

Something else to note, by the way, is that 40% of Facebook usage time is spent playing games. I'd hazard a guess that they haven't factored that into the data they used for their pie chart.

Richard

bob_d

@ Alice:
When they say, "Every two days now we create as much information as we did from the dawn of civilization up until 2003," they of course mean digital information, which is a lot less impressive (for most of human history we produced none) and slightly misleading, given the discussion. Actually, to be fair, the information being discussed is not just originally digital information, but also information that was recorded before the digital age and considered worth digitizing afterwards, which means the majority of that older information was simply lost or ignored. (I mean, flint arrowheads embody information, but we have no file format to digitize them.) Last I heard, most of the information being generated was produced automatically by software, not directly by people, so if it's true that user-generated content is now a significant part of that, it would have to be YouTube, rather than the tweets and IMs they imply. In which case, we're really talking about file sizes per medium. If we had a file format for arrowheads we might well be saying, "a neolithic hunter produced more information in one year than a modern person does in their lifetime."

My grandmother no doubt produced a great deal of information as a teenager, at least as much as a modern teen, especially since she grew up in an era when the idea of being a consumer of culture, rather than a producer, was an alien one. None of the information she produced was saved or digitized, however, and thus it never existed in an infrastructure that allows infinite copying, instant searching, or procedural transformation. And that's what the real discussion seems to have been about: that we, as a culture, are unprepared for this, because at no previous time in human history has it been the case, and we lack the cultural mores needed to respond to the situation. Reducing it to "information produced" simplifies the discussion in a misleading way, I think.

Alice

Yep, I agree. Someone said, "we've always had conversations, but now they're all recorded," which sums it up nicely.

Or, horribly.

Sulka Haro

bob_d - I think you're incorrect. The estimates I've seen have actually tried to work out how many bits of information people used to create in the past by converting the analog data into bits, and it turns out most people didn't actually create much persistent data.

One huge factor here is that modern types of data consume a hell of a lot more bits than older data did. You don't have to go far back in time to reach a period where photography didn't exist and most people never wrote anything down. One high-resolution photo from a modern camera contains more bits than everything most people of that era wrote in their lifetimes.
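Sulka's photo-versus-lifetime-writing claim is easy to sanity-check with rough numbers. The sketch below uses assumed figures (a ~5 MB JPEG, and a pre-photographic-era person writing perhaps 50,000 words over a lifetime); these are illustrative assumptions, not data from the thread.

```python
# Back-of-envelope check of the "one photo > a lifetime of writing" claim.
# All figures below are assumptions for illustration, not data from the thread.

photo_bytes = 5 * 1_000_000   # a typical high-resolution JPEG, ~5 MB
lifetime_words = 50_000       # assumed lifetime written output (letters, notes)
                              # of a person in a pre-photographic era
bytes_per_word = 6            # ~5 characters plus a space, 1 byte each

writing_bytes = lifetime_words * bytes_per_word  # 300,000 bytes, ~0.3 MB

print(f"photo:   {photo_bytes / 1e6:.1f} MB")
print(f"writing: {writing_bytes / 1e6:.1f} MB")
print(f"ratio:   {photo_bytes / writing_bytes:.0f}x")
```

Under these assumptions, the single photo carries roughly 17 times the raw bits, and for the many people who wrote far less than 50,000 words, the gap is larger still, which supports the point about data volume (though not necessarily information value).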

IMO the flint arrowhead analogy is not appropriate, unless the specific shape (the exact curvature of the head and so on) actually meant something, which I doubt it did. Hence the head itself wasn't a piece of information as such. If the user of the head made a mark on it to indicate a kill, that mark is information.

I think what you're saying is that there should be a distinction between the quality and the quantity of the data produced over time. You're absolutely right, and the key question is whether we're talking about the amount of data produced or the amount of information produced.

I'd argue that the data remaining from the past was more information-dense overall, and hence more valuable. For example, in the past, when film was expensive, it was typical to take just one photo of a given subject. Now, with digital photography, it's easy to take a hundred photos just to make sure you can pick the perfect capture later. The amount of data produced is a hundredfold greater for the digital camera, but the amount of information conveyed is probably the same.

The same probably goes for a lot of the text produced as well: we're creating a great deal of data, but its overall information value is not that great.

