However, that is not the kind of scenario this article discusses. This article was written to discuss something else about AI, something more relevant to modern times. Artificial Intelligence has reached a point where applications and platforms suggest content to people based on their history of activity on those platforms. The systems that make this possible are known as recommendation systems. They make extensive use of proven, effective algorithms from Machine Learning, a major sub-field of Artificial Intelligence.
Content recommendation systems are mostly used in shopping and social media platforms. Shopping is a highly subjective activity, so many e-commerce platforms have recommendation systems that observe the purchase patterns of their users and recommend products based on those patterns. This feature has been around for a long time.
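To make the idea of pattern-based recommendation concrete, here is a minimal sketch (a hypothetical illustration, not any real platform's algorithm): items that frequently appear in the same orders as a user's past purchases are suggested first. The function and data names are invented for this example.

```python
from collections import Counter

def recommend(purchase_histories, user_items, top_n=3):
    """Suggest items that most often co-occur with the user's past purchases.

    purchase_histories: list of item sets, one per past order (across all users).
    user_items: set of items this user has already bought.
    """
    co_counts = Counter()
    for basket in purchase_histories:
        # Only baskets that share an item with the user tell us anything.
        if basket & user_items:
            for item in basket - user_items:
                co_counts[item] += 1
    return [item for item, _ in co_counts.most_common(top_n)]

# Toy data: each set is one customer's order.
orders = [
    {"phone", "case", "charger"},
    {"phone", "case"},
    {"laptop", "mouse"},
    {"phone", "charger"},
]
# A user who bought a phone is shown the accessories other phone buyers chose.
print(recommend(orders, {"phone"}))
```

Real systems use far more sophisticated collaborative-filtering and learned models, but the underlying intuition, "people who bought what you bought also bought this", is the same.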
These systems recommend or generate a variety of content based on the activity and preferences outlined by a particular user profile. Social media and e-commerce platforms can be regarded as prime examples of the effectiveness of recommendation systems, and recommendation and suggestion algorithms are now being adopted in several other facets of the internet.
The technological caveats of these systems are widely discussed, but they are not the subject of this article. The subject discussed here is more of a timeless character (or flaw) of the human condition, and how it is amplified by a cutting-edge AI technology.
That flaw is cognitive bias: put simply, the clouding of your judgement by beliefs, emotions and convictions that distort rational thinking.
Of course, who and what to follow on social media has always been purely a matter of individual choice. There is a saying that in this information age, ignorance is a choice. Given the advancements in technology, there is both truth and untruth in it. The truth is that you now have swift and easy access to information and knowledge; you can delve deeper into a subject to refine your opinions, should you wish to do so. The untruth is that there are also ways to reinforce your ignorance.
You have ideological echo chambers at your fingertips: you can easily filter the information you receive, and view it through the ideological lens of those who provide it. With the advent of content recommendation systems, you can get information automatically tailored to your leanings, which reinforces cognitive biases such as confirmation bias. The more content you receive that matches a viewpoint detected in your profile by machine learning algorithms, the more you project that this viewpoint is widely shared by others in general. Platforms such as Facebook employ a concept called the “filter bubble” to tailor content according to the preferences of their users. Facebook has its own tools to filter the information you want to see, to decide whose content you should read and, of course, to automatically generate content.
Search engines such as Google and Yahoo are also said to use the filter bubble concept, tailoring search results to the preferences of a profile.
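A crude sketch of how such personalization could work (again a hypothetical illustration, not how any real search engine actually ranks results): results whose topics match a profile's inferred interests get their scores boosted, so two users see the same query answered differently. All names and weights here are invented.

```python
def personalize(results, interest_weights):
    """Re-rank results by base relevance plus a boost for topics
    the profile has engaged with before.

    results: list of (title, topic, base_score) tuples.
    interest_weights: dict mapping topic -> engagement weight (0..1).
    """
    def score(result):
        _, topic, base = result
        return base + interest_weights.get(topic, 0.0)
    return sorted(results, key=score, reverse=True)

results = [
    ("Climate policy debate", "politics", 0.6),
    ("New phone review", "tech", 0.5),
    ("Local election recap", "politics", 0.4),
]
# A profile that mostly clicks tech content sees tech pushed to the top,
# even though a politics result had the higher base relevance.
tech_fan = {"tech": 0.4, "politics": 0.05}
print([title for title, _, _ in personalize(results, tech_fan)])
```

The same mechanism that makes results feel relevant also narrows what a profile is shown, which is exactly the filter bubble effect.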
The caveat is that machines will have a profound influence on human perception. Currently, this is not a major downside of using social media platforms; one can still resist cognitive biases by simply perusing different sorts of information from different parties. It is in users' best interest to be aware of how content recommendation works in social media. In the end, only we hold the ability to mitigate our own biases.