Are Google Algorithms Reinforcing Anti-Science Positions? 3 Feb. 2021
As the anti-science movement seems to gain strength and undermines the campaign for COVID vaccination, there has been increased interest in the origins, strength and tactics of the movement.
It is blamed on the Russians, who presumably are trying to weaken and divide the West, and on civil libertarians, who want to politicise medical common sense. But when it is helped along by people like Trump in the White House and Kelly in Australia, the conspiracy theories are put into perspective, as the anti-science views are given legitimacy.
But amid the fuss about Google withdrawing from Australia, or not covering Australian politics, I wondered what effect this might have and tried a different search engine, DuckDuckGo. The difference is that Google gives me a personalised feed, while DuckDuckGo gives everyone the same results for the same keywords.
Search engines at a basic level give a 'top of the pops' ranking of a topic: the pages with the greatest number of clicks go to the top. This may be fine if you are looking for a movie review, but if you want older material it will be a long way down. Scientific articles rank a lot lower than mainstream ones, and the algorithm is influenced by the viewer's previous viewing habits. If a person has viewed a lot of conspiracy articles, these are presumably more likely to come up again and reinforce the viewer's existing views. If the feed is continually biased towards a point of view, the viewer is likely to come into contact with more of that view and with people who share it, so that they are eventually in a bubble or subculture of people with this belief, unaware that their reality has been changed.
As an example, my son went to school in NZ with a boy whose father controlled feral pests for a living, which meant shooting rabbits, ferrets, deer, pigs, cats and possums, which are pests on various farms in NZ. He kept in touch with his friend and they played video games online. But his friend went shooting quite a lot with his father, joined a gun club and started to receive the literature of this subculture. His previously non-political, mainstream views are now hugely influenced by the American gun lobby and rabidly right wing. This is quite unusual in rural NZ. My son commented, 'In the end, you think what you get in your feed'.
The algorithms exist to make you happy and to keep you clicking in order to get you to buy things. But the result might be quite different: the creation of a bubble environment where everyone's opinions tend to be magnified, sometimes in a bad direction.
How this can be controlled is a question. If we all got the same feeds, would the sensible people make sensible articles come up first? Presumably, if most people were well educated; we had better work on that too. And which Big Brother will tell Google how to write its algorithms?
(A longer version of this article is available via a link at its end.)
This entry was posted in Accountability, Civil Rights, Constraints, Education, Government, Health, Public Health and tagged Accountability, Algorithms, COVID-19, Fake News, Google, Social Media.