As social networks keep growing and social media account for most of the traffic on the Internet, scientists are developing new theories to explain how cyberspace works. Some of them, such as the long tail or the six degrees of separation, are quite well known. But there are many more.
1. Dunbar’s Law. It states that no human being can maintain more than about 150 stable, trusting relationships. Facebook caps friends at 5,000, but it has never explained why, and it is clear there is no real trust in most of those connections. 150 is the Dunbar number, named after British anthropologist Robin Dunbar. Curiously, BT’s innovation head JP Rangaswami thinks social software might help raise the Dunbar number.
2. The Long Tail. It states that there is more value in the sum of the many things with small individual demand than in the few high-demand hits (there is a small numerical sketch of this at the end of the post). It has been applied to content (sites getting most of their traffic from Google searches), to advertising and to electronic commerce (Amazon getting more sales from obscure books than from bestsellers). Wired magazine editor Chris Anderson wrote a book about this theory and maintains a blog under the same name.
3. The six degrees of separation theory. It refers to the idea that a chain of no more than six acquaintances connects you to anybody else on Earth. American psychologist Stanley Milgram is the researcher best known for testing this idea.
4. Zuckerberg’s Law of Information Sharing. It states that the amount of content people share online (pictures, videos, etc.) doubles every year (see the small arithmetic example at the end of the post). Mark Zuckerberg is the founder of Facebook, the main social network.
5. Metcalfe’s Law. It states that the value of a network grows with the square of the number of users it has, because every new user can potentially connect to every existing one (see the sketch at the end of the post). It is an old idea originally applied to telecommunication networks (the telephone, for example), but it holds for social networks as well.
6. Nielsen’s Law of 1/9. It states that roughly 10% of users create 90% of online content; Nielsen himself formulates it as the 90-9-1 rule: 90% of users only lurk, 9% contribute occasionally and 1% do most of the posting. It is a version of the Pareto principle (the 80/20 ratio). Jakob Nielsen is a usability expert. Anyway, I think this law might be changing because of how easy social networks have made it for users to upload videos or photos. There are fewer lurkers than before.
7. Permanent beta. It states that a site is always being tested by its users, so there is never a final version. This is because users can report errors or suggestions so easily that innovation never stops. It is not clear who came up with this theory, though Google’s Gmail was the first popular product to apply it.
8. Crowdsourcing law. It states that products built through the collaboration of many people are better than those made by individuals. There is even a book about this theory (a good one, BTW): “The Wisdom of Crowds”, by James Surowiecki. “Why the many are smarter than the few”, as its subtitle puts it. There is also a blog about crowdsourcing by Jeff Howe.
9. Hackers’ Law. It states that hackers and geeks set the trends in technology. It explains why people working for most Internet companies never wear ties and their unofficial uniform is tennis shoes, jeans and t-shirts. I think Tim O’Reilly is the person who defends this principle most aggressively.
10. Everything is free. Because duplicating online content costs nothing, and even creating it is free when users do it themselves, it should never be sold. This pushes Internet companies to make their money from advertising, which in turn creates problems in times of crisis. Charging users is considered outdated, though it will probably become more popular as the current crisis keeps worsening.
I could add an extra one: Wikipedia’s Law. It states that nobody is really important until they have a Wikipedia entry. It has led many people to create or edit their own entries, something the community frowns upon. It could also be called Jimmy Wales‘ law, after Wikipedia’s founder, who has himself been accused of editing the entry about him.
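For the more numerically minded, here is a toy sketch of the long tail (law 2), which is also the power law idea. The catalogue size, the Zipf-style exponent and the cut-off for what counts as a “hit” are made-up parameters, chosen only to show the shape of the idea, not real sales data:

```python
# Toy long-tail model: demand for the item ranked r is assumed to follow
# a Zipf-like power law, 1 / r**s. All numbers here are illustrative.

def demand(rank: int, s: float = 1.0) -> float:
    """Hypothetical demand for the item at a given popularity rank."""
    return 1.0 / rank ** s

CATALOGUE = 100_000   # number of distinct items (e.g. book titles)
HITS = 100            # how many top-ranked items we call "bestsellers"

head = sum(demand(r) for r in range(1, HITS + 1))
tail = sum(demand(r) for r in range(HITS + 1, CATALOGUE + 1))

print(f"demand captured by the {HITS} hits:     {head:.1f}")
print(f"demand captured by the remaining tail: {tail:.1f}")
# With these toy numbers the tail outsells the head, which is the claim
# behind "Amazon sells more copies of obscure books than of bestsellers".
```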
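Zuckerberg’s Law (law 4) is just compound doubling. A minimal sketch, with an arbitrary starting volume:

```python
# If shared content doubles every year, the volume after t years is 2**t
# times today's. The starting unit is arbitrary.
for year in range(6):
    print(f"year {year}: {2 ** year:>2}x today's volume of shared content")
# After 5 years the volume is 32 times today's: exponential, not linear, growth.
```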
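And Metcalfe’s Law (law 5), counting the possible pairwise connections as a rough proxy for the value of the network; the user counts are arbitrary:

```python
# Metcalfe's Law: network value is taken to be proportional to the number
# of possible connections between users, n * (n - 1) / 2, i.e. it grows
# with the square of n rather than linearly.

def metcalfe_value(users: int) -> int:
    """Number of distinct pairwise connections among `users` members."""
    return users * (users - 1) // 2

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} users -> {metcalfe_value(n):>12,} possible connections")
# Ten times the users yields roughly a hundred times the connections, which
# is why each new member adds more value than the previous one did.
```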
Hi Jose,
Thanks for this interesting recap of social network laws. Some of the ones you mention, the 80-20 law for instance, can be grouped under what is also referred to as a power law distribution. See http://en.wikipedia.org/wiki/Power_law for more info.
Regards and keep rockin’
Álvaro
Hi Jose,
Nice compilation of Cyberspace laws. But don’t you think that the Crowdsourcing law and Nielsen’s law of 1/9 contradict each other? One says that the bigger the crowd, the better the result, and the other says that only the top 1% ever contributes to the online content. So if the same top 1% always contributes, how can you have crowdsourcing? I would be very thankful if you could share your thoughts on this.
Not necessarily. You can extract lots of info from passive users: for example, where they click, how much time they spend on a site, which sites they visit the most…
Hi,
Thanks for your reply. I agree with you, but crowdsourcing means people actively contributing towards a particular thing, rather than us extracting info from them. My basic question is: can we apply both of these laws in an online business and derive some benefit from them?
Also, is there any reference for the 11th law, i.e. Wikipedia’s law? Can we verify it?
No reference so far 😉 I made it up myself.
I don’t think that crowdsourcing is necessarily an active action; it can also be a passive one. It’s actually what Google does with its algorithm.