Experts vs Crowds

Experts fighting back
The current balance of power between crowds and experts hugely favours crowds, largely because the social media and networking movement of recent years has given the crowd the tools it needs to collaborate and communicate efficiently, and this has naturally focused on the lowest common denominator. Experts absolutely have a role to play, and now that the crowd has benefitted from practically every recent internet innovation, attention appears to be shifting towards providing the same for experts, correcting the balance whether intentionally or not. Predictably, this is being advanced principally in two fields: search and social media.

Within the realm of the search engines, today's prevailing algorithms harness the crowd to determine where a site should rank in a search engine results page. Loosely, this works by looking at the number and quality of sites linking to another site as a determinant of where the latter should rank. No real distinction is made in those algorithms as to whether the links are trustworthy; instead, algorithms like Google's classic version of PageRank treat links as votes from one site to another, with certain votes carrying more weight if the voting site is in turn linked to by others. This generally works well, but it applies no real weighting for trust or genuine authority when determining the power of those links.
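The links-as-votes idea can be sketched in a few lines. Below is a minimal illustration of the classic PageRank recursion, not Google's production algorithm: each page splits its current score as votes across its outgoing links, so a vote from a well-linked-to page carries more weight.

```python
# Minimal PageRank sketch (illustrative only): links are votes, and a
# vote from a highly-ranked page carries more weight than one from an
# obscure page. Dangling pages simply drop their vote (a simplification).

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)  # vote split across outlinks
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Three-page web: A and C both link to B, B links back to A.
ranks = pagerank({"A": ["B"], "B": ["A"], "C": ["B"]})
# B ranks highest: it receives votes from both A and C.
```

Note that C ends up with the lowest score despite casting a vote: in this model, what matters is being linked to, not linking out, which is exactly why untrustworthy links can distort the results.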

Within that context, search engines have made solid moves towards factoring more trust and authority into how they operate; part of that is about validating the views of the crowd rather than being blindly led by mob rule. For example, Google's most recent algorithmic update of note, the so-called Vince Update, places more emphasis on how trustworthy the links to a site are and how much authority they pass. We can speculate that Google is now using a variation of TrustRank, an algorithmic approach that passes authority via links, diminishing that authority as it filters through the internet. Site A, for example, could be flagged as an expert in its field and inherently trustworthy; sites it links out to would be identified as somewhat trustworthy too, by virtue of the link relationship; sites a further link down would benefit slightly less, and so on.
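That trust-decay idea can be sketched as a propagation outwards from hand-picked seed sites, with trust diminishing at each hop. This is a simplified reading of the TrustRank concept described above; the site names and the halving decay factor are invented for illustration.

```python
from collections import deque

def propagate_trust(links, seeds, decay=0.5):
    """Spread trust outwards from expert seed sites, diminishing per hop.
    links: dict mapping a site to the list of sites it links to.
    seeds: sites hand-flagged as inherently trustworthy (trust = 1.0)."""
    trust = {site: 1.0 for site in seeds}
    queue = deque(seeds)
    while queue:
        site = queue.popleft()
        passed = trust[site] * decay  # trust diminishes one hop down
        for target in links.get(site, []):
            if passed > trust.get(target, 0.0):
                trust[target] = passed  # keep the best trust path found
                queue.append(target)
    return trust

# A seed expert links to a blog, which links to a forum.
links = {"expert.org": ["blog.example"], "blog.example": ["forum.example"]}
trust = propagate_trust(links, seeds=["expert.org"])
# expert.org: 1.0, blog.example: 0.5, forum.example: 0.25
```

The key contrast with plain PageRank is the seed list: trust does not emerge from the link graph alone, but radiates from sites a human has vouched for, fading with distance.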

Wolfram Alpha, a relatively new entrant into the search space (or rather, the information-retrieval space), also places more focus on trust and experts. It does not search the web; instead it offers a number of searchable databases that have been approved as expert documents or expert data.

Bing, Microsoft's new search engine, delivers information on symptoms, diagnoses, and medical procedures from nine trusted medical sources when you make health-related searches, rather than leaving the armchair doctors amongst us to pollute the returned listings.

In the realm of social media and social networking, innovations are moving in the right direction too, towards platforms that limit the effects of the unsolicited crowd. For example, LinkedIn allows its members to ask questions within their own network of contacts, essentially of people within their direct sphere of trust. The momentum there is experts asking questions of other experts and receiving expert responses, which is producing a wealth of expert information on all manner of topics. However, LinkedIn is a closed community where you can only see that knowledge if you are part of the group, so it suffers from the 'access to experts' problem we have already brought attention to.


Expert Crowds & Curators
Most sites and networks rely on either crowds or experts, rarely both. The Guardian Online and the BBC rely on experts for most of their content (political correspondents, economists, etc.), with moves towards augmenting it with crowd content. Amazon relies on crowds for its reviews, but makes very little use of experts in any form. The use of one or the other represents two extremes, and neither extreme is right in every case.

For example, consider the difference between arguably the two most popular sources of movie reviews: IMDb and Rotten Tomatoes. The first uses crowds to score movies, pooling star ratings into an aggregate score. The second looks to movie critics (the relative experts) and aggregates their scores into a freshness rating. Scores from the two sources tend to correlate highly, but sometimes differ wildly: 2008's Seven Pounds, starring Will Smith, has an IMDb score of 76% versus just 27% at Rotten Tomatoes, which is essentially the difference between calling a film one of the greatest ever made and one of the worst.

These two extremes pose a serious problem: is Seven Pounds a good movie? Should people 50 years from now, exploring our cinema through their version of the internet, stop and watch Seven Pounds or ignore it? How we decide who is right, and who has the loudest voice, determines what gets passed on to our children, and theirs. Perhaps the integration of these extremes, bringing expert and crowd data together, solves this problem.

Consider the role of search engines in this scenario: a search for 'seven pounds' in Google returns IMDb as the first result, with Rotten Tomatoes sixth. The visibility of a position-one listing is exponentially greater than that of a position six, so IMDb (the people) has far more influence than Rotten Tomatoes (the critics) over the overall perception of Seven Pounds. By favouring IMDb in its rankings, the search engine raises the visibility of Seven Pounds as a good movie rather than a bad one, because more people see IMDb's positive ratings than the far less impressive critic scores at Rotten Tomatoes. The crowd being more visible than the experts essentially defines Seven Pounds' place amongst every film ever made, according to the collective store of knowledge we will pass on.

The verticalisation of the internet will catalyse an integration of the extremes, producing a fairer assessment of society's view of Seven Pounds, or a book, or a political standpoint, by bringing these disparate viewpoints together. With verticalisation we'll see the emergence of Expert Crowds: groups of experts empowered by networking technologies that give them a louder voice online, more influence, and the tools required to catalyse focussed discourse around their specific areas of expertise. 'Every crowd has a silver lining', as P.T. Barnum once said.

Glimpses of how these expert crowds might work are now materialising, albeit within less serious pursuits. As part of the launch of the computer game Halo 2, for example, US cinema ads compelled people to visit a particular URL. The site purported to belong to a beekeeper who had mysteriously gone missing, and her honey-based recipes had been replaced by what appeared to be random lists of numbers.

Over the next four months, 600,000 people joined in to solve the mystery. Participants set up blogs and forums, and a central group of 4,000 called 'the Beekeepers' became the core nerve centre. In an impressive display of collaboration they determined that the numbers represented 210 global coordinates, each pointing to a pay phone somewhere in the world. The game continued with challenges that demanded serious coordination, involving 1,000 pay phones and information passed between the people answering them, with each clue provided only within 15 minutes of it being required.

If game developers can orchestrate this kind of mass collaboration within an expert crowd (in this case, computer geeks), surely the same approach could give us the structures we need to make significant progress with more serious pursuits: finding cures for diseases, setting up and running charitable projects, even selecting the perfect movie recommendation for you. Wikipedia is the closest thing we have right now to an open network of verticalised experts, and nobody can claim Wikipedia isn't game-changing, but it is just the seed of what's to come.

Going one step further, a person could have their own personal 'expert crowd': a group of people mathematically determined to be influential to you personally, whom you knowingly or unknowingly follow. Sometimes I find Rotten Tomatoes' critics share my view of a movie; sometimes IMDb is closer to the mark. The most accurate measure, though, is to ask the five friends with whom I've always shared similar tastes (essentially my expert crowd for movies); their assessment is almost always more accurate.
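One way to determine that group mathematically is to rank your contacts by how closely their past ratings track your own. Below is a minimal sketch using Pearson correlation over shared movie scores; the friends, the ratings, and the helper names are invented for illustration, and a real system would need far more data per person.

```python
def pearson(xs, ys):
    """Pearson correlation: +1 means identical tastes, -1 means opposite."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def expert_crowd(my_ratings, friends, size=5):
    """Return the friends whose movie scores best correlate with mine."""
    scored = []
    for name, ratings in friends.items():
        shared = [m for m in my_ratings if m in ratings]
        if len(shared) >= 2:  # need at least two shared movies to compare
            r = pearson([my_ratings[m] for m in shared],
                        [ratings[m] for m in shared])
            scored.append((r, name))
    return [name for r, name in sorted(scored, reverse=True)[:size]]

me = {"Seven Pounds": 8, "Up": 9, "Watchmen": 5}
friends = {
    "alice": {"Seven Pounds": 7, "Up": 9, "Watchmen": 4},  # tastes like mine
    "bob": {"Seven Pounds": 3, "Up": 4, "Watchmen": 9},    # opposite tastes
}
crowd = expert_crowd(me, friends)
# alice ranks first: her scores move with mine, bob's move against them
```

The same machinery generalises beyond movies: swap the ratings dictionaries for scores on books, restaurants, or news sources and the crowd adapts per vertical.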

Above these crowds and expert crowds will sit the trusted sources, the 'curators', making sense of what's happening in these complex communities and summarising their findings and contributions: essentially filtering useful findings from the crowd and the experts through to the masses, and filtering out the noise. The natural heirs to that role are the likes of the BBC, Wikipedia, The Register, and the respectable press: entities that are implicitly trusted and have as close to perfect editorial integrity as you're likely to find. Indeed, their existence will rest on their impartiality.

On this basis, the battle between crowds and experts, combined with the rapid evolution of our technologies and capabilities, will create a new entity, the personal expert crowd, and you'll never need to sit through a movie you hate again. This concept will pervade multiple platforms and shape how search engines work, how social networks operate, and how we create and store data for future generations.



© Copyright 2018 Greenlight. All Rights Reserved.