
TikTok's Algorithms and Their Challenges

TikTok recently announced that its European Union users would soon be able to disable its infamously engaging content-selection algorithm.

This change is being driven by the EU's Digital Services Act (DSA), part of the region's broader effort to regulate artificial intelligence and digital services in line with human rights and values.

TikTok's algorithm picks up on user activity, such as how long someone watches a video, what they like, and what they share, to produce a highly customized and immersive experience that can shape their mental states, tastes, and behaviors without their full knowledge or consent.
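
To make that mechanism concrete, here is a minimal, purely illustrative Python sketch of engagement-based ranking. The signal names, weights, and the score function are assumptions for illustration only; TikTok's actual model is proprietary and far more complex.

```python
from dataclasses import dataclass

@dataclass
class EngagementSignals:
    """Hypothetical per-video signals of the kind described above."""
    watch_ratio: float   # fraction of the video the user watched (0.0 to 1.0)
    liked: bool          # did the user tap "like"?
    shared: bool         # did the user share the video?

def score(signals: EngagementSignals) -> float:
    """Toy ranking score: the weights are invented for illustration.
    A real recommender learns these relationships from vast interaction
    logs rather than using fixed, hand-picked weights."""
    return (
        0.6 * signals.watch_ratio
        + 0.25 * (1.0 if signals.liked else 0.0)
        + 0.15 * (1.0 if signals.shared else 0.0)
    )

# Videos the user engaged with most heavily float to the top, which is
# what makes the resulting feed so personalized and "sticky".
candidates = {
    "video_a": EngagementSignals(watch_ratio=0.95, liked=True, shared=False),
    "video_b": EngagementSignals(watch_ratio=0.30, liked=False, shared=False),
}
ranked = sorted(candidates, key=lambda v: score(candidates[v]), reverse=True)
print(ranked)  # ['video_a', 'video_b']
```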

An opt-out provision helps protect cognitive liberty: the essential freedom of self-determination over our own minds and mental experiences.

Rather than being limited to the algorithmically generated For You page and live streams, users will have access to popular videos in their country and language, as well as a "Following and Friends" feed that shows the creators they follow in chronological order. This gives popular content from their area precedence over content chosen purely for its stickiness.
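
In practice, opting out roughly means switching the sort order of the feed. The sketch below, with an invented Post structure and fields, contrasts an engagement-ranked ordering with the chronological "Following and Friends" ordering; it is an assumption-laden illustration, not TikTok's actual implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Post:
    creator: str
    posted_at: datetime
    predicted_engagement: float  # what an engagement-driven recommender optimizes for

now = datetime.now()
followed_posts = [
    Post("creator_a", now - timedelta(hours=1), predicted_engagement=0.2),
    Post("creator_b", now - timedelta(hours=5), predicted_engagement=0.9),
]

# "For You"-style ordering: whatever the model predicts will keep you watching.
ranked_feed = sorted(followed_posts, key=lambda p: p.predicted_engagement, reverse=True)

# Opt-out ordering: the creators you follow, newest first, no prediction involved.
chronological_feed = sorted(followed_posts, key=lambda p: p.posted_at, reverse=True)

print([p.creator for p in ranked_feed])         # ['creator_b', 'creator_a']
print([p.creator for p in chronological_feed])  # ['creator_a', 'creator_b']
```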

The law also prohibits targeted advertising to users between the ages of 13 and 17, and it provides additional information and reporting tools to help users flag inappropriate or harmful content.

Protecting cognitive liberty is becoming more and more important as artificial intelligence, big data, and digital media continue to influence our society.

While such legislation and plans are making progress, they frequently concentrate on narrow aspects of the issue, such as privacy by design or data minimization, rather than laying out a clear, comprehensive strategy for safeguarding our right to free thought.

Without strong legal frameworks in place everywhere, the creators and suppliers of these technologies can avoid responsibility. This is why small, incremental modifications alone will not cut it. Legislators and businesses urgently need to revise the commercial models that underpin the tech ecosystem.


A well-structured strategy calls for a mix of regulations, incentives, and commercial redesigns that put cognitive liberty first. Regulatory standards must govern user-engagement methods, information sharing, and data privacy.

There must be robust legal protections against invading and manipulating the privacy of the mind. Companies have a responsibility to evaluate, disclose, and implement safeguards against improper influence. They also have a responsibility to be transparent about how the algorithms they use work.

Similar to corporate social responsibility requirements, businesses should be legally obligated to evaluate their technology's influence on cognitive liberty, offering transparency on algorithms, data use, content-moderation procedures, and cognitive shaping.

An impact assessment tool for cognitive liberty would specifically examine AI's effect on self-determination, mental privacy, freedom of thought, and decision-making, with an emphasis on transparency, data practices, and mental manipulation.

The required data would include in-depth explanations of the algorithms, information on data sources and collection, and proof of how the technology alters user cognition.
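
As a thought experiment, such an assessment could be captured as a structured record. The field names below are assumptions meant only to mirror the items listed above; no standard schema of this kind exists today.

```python
from dataclasses import dataclass, field

@dataclass
class CognitiveLibertyAssessment:
    """Hypothetical record mirroring the disclosures described above."""
    algorithm_explanation: str      # in-depth, plain-language description of the ranking logic
    data_sources: list[str]         # where the training and interaction data come from
    collection_methods: list[str]   # how that data is gathered from users
    cognitive_impact_evidence: str  # evidence of how the system shapes attention or behavior
    mitigations: list[str] = field(default_factory=list)  # safeguards against improper influence

report = CognitiveLibertyAssessment(
    algorithm_explanation="Ranks videos by predicted watch time and interaction likelihood.",
    data_sources=["watch history", "likes", "shares", "device signals"],
    collection_methods=["in-app event logging"],
    cognitive_impact_evidence="Internal tests measuring session length and return frequency.",
    mitigations=["chronological feed option", "notification controls"],
)
```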

Technology firms ought to adopt design principles that support cognitive liberty. Options such as customizable feed settings on TikTok or finer control over notifications are steps in the right direction.

Other self-determination-promoting features, such as asking users to engage critically with an item before sharing it, or identifying material with “badges” that indicate whether it was produced by a human or a machine, should spread across digital platforms.
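
As a rough illustration of how those two features might look in code, here is a minimal Python sketch. The provenance field and the confirmation prompt are invented for illustration and do not reflect any platform's actual implementation.

```python
def provenance_badge(provenance: str) -> str:
    """Return a simple label indicating whether an item was made by a human or a machine."""
    return "AI-generated" if provenance == "machine" else "Human-made"

def share(item_title: str, provenance: str) -> bool:
    """Ask the user to pause and engage critically before re-sharing an item."""
    print(f"[{provenance_badge(provenance)}] {item_title}")
    answer = input("Have you watched or read this in full? Share anyway? [y/N] ")
    return answer.strip().lower() == "y"

if __name__ == "__main__":
    share("Viral clip about the DSA", provenance="machine")
```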

In conclusion, if we embrace technology and its inputs in a deliberate, controlled manner, we can uphold our cultural and social norms while preserving our freedom of thought.
