Or, rather: easier access to information gives us a less balanced world-view, thereby making us less informed overall.
The idea is really rather simple: if we see something we disagree with, we are more likely to investigate it, and as a direct consequence we are more likely to find evidence against it. This is simple confirmation bias.
Broadening the scope a smidgen: people we disagree with are more likely than people we agree with to find evidence against something we believe (because they are more likely to disagree with it and therefore to investigate it, as argued above). That makes it easy to dismiss their argument as “political” or “biased.” This is a mix of groupthink, naive realism, and the in-group effect.
Combine this with a bit of filter bubble and out-group homogeneity, and there is only ever a need to believe what we already believe and question what we disbelieve, and we are unlikely to ever see anybody questioning what we believe.
On top of that, there will always be a study stating what we believe. That doesn’t even require bad actors; it is a simple consequence of the fact that, at the conventional significance threshold of p = 0.05, roughly one in twenty studies of a nonexistent effect will report a positive result by pure chance. (The real number is higher, because researchers, just like the rest of us above, search for errors in their experiments when they get unexpected results but not when they get the expected ones.)
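The one-in-twenty figure is easy to see for yourself. Here is a minimal sketch (a hypothetical simulation, not from any particular study) of many experiments testing an effect that does not exist: under a true null hypothesis, p-values are uniformly distributed, so about 5% of them fall below the 0.05 threshold anyway.

```python
import random

# Simulate "studies" of an effect that is actually zero.
# Under a true null hypothesis, a study's p-value is uniform on [0, 1],
# so the chance of p < ALPHA is exactly ALPHA -- a false positive.
random.seed(42)

ALPHA = 0.05
N_STUDIES = 10_000

false_positives = sum(1 for _ in range(N_STUDIES) if random.random() < ALPHA)

print(f"{false_positives} of {N_STUDIES} studies "
      f"({false_positives / N_STUDIES:.1%}) found an effect that isn't there")
```

Run it a few times with different seeds: the rate hovers around 5%, which means that out of the thousands of studies published every year on any contested topic, there is nearly always one “confirming” whatever we already believe.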
Easy access to information means it is easy to find all of the studies and cherry-pick the ones disproving what we disagree with. Any study disproving what we agree with is always contradicted by another study, or can, with a bit of naive realism and the out-group homogeneity effect, simply be dismissed because it is shared by somebody who is wrong.
Thus, our actual knowledge is severely harmed; we never question what we know and agree with and can easily dismiss what we disagree with.
I don’t believe there’s a simple solution, but that’s part of the point: we need to not search for the simple solution. We need to not dismiss a study because of where it comes from, nor trust one merely because we agree with it. We need to not extrapolate a study from a highly editorialised and click-optimised headline of a reprint of a telegram written by an article farm based on a Wikipedia article about a blog post mentioning a tweet about a study in passing.
Time Person of the Year 2006, Nobel Peace Prize winner 2012.