Naver announces new feature to combat comment manipulation: All you need to know


On April 28, Chosun Biz reported that Naver, the popular South Korean search engine and news portal, will launch a special service to counter 'manipulated commenting,' the practice of flooding an article with coordinated comments to sway public opinion. With this service, media outlets can filter comments when they spot such manipulation.

On the same day, Naver announced via its affiliate channel:

"A new feature will be added to the Smart Content Studio on the 29th to alert media companies when there is a sudden surge in user reactions in the comments section."

The feature will notify media outlets of any sudden surge in likes or dislikes that occurs within a 24-hour period.
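
Naver has not published how its surge detection actually works. Purely as an illustrative sketch, and not Naver's implementation, a threshold check that compares the last 24 hours of likes and dislikes against an article's earlier average might look roughly like this (all function names and thresholds below are invented for illustration):

```python
from datetime import datetime, timedelta

def looks_like_reaction_surge(reaction_times, now, window_hours=24,
                              multiplier=5, min_count=100):
    """Hypothetical check: flag an article when likes/dislikes in the last
    `window_hours` far exceed its earlier per-window average."""
    window = timedelta(hours=window_hours)
    recent = [t for t in reaction_times if now - t <= window]
    older = [t for t in reaction_times if now - t > window]

    if len(recent) < min_count:
        return False  # too little recent activity to call it a surge

    if older:
        # Average reactions per 24-hour window over the earlier history
        # (treat the history as spanning at least one full window).
        history_span = max((now - window) - min(older), window)
        baseline = len(older) / (history_span / window)
    else:
        baseline = 0.0

    return len(recent) > max(baseline * multiplier, min_count)

# Example: 300 reactions arriving every 5 minutes with no prior history
# would be flagged as a surge.
now = datetime(2025, 4, 29, 12, 0)
events = [now - timedelta(minutes=5 * i) for i in range(300)]
print(looks_like_reaction_surge(events, now))  # True
```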


What led Naver to adopt a new feature for filtering comments?

Recently, with the news of Kim Soo-hyun and Kim Sae-ron's dating scandal and President Yoon's impeachment, netizens have taken to social media platforms to express their opinions. However, with the rise of sub-sections within online communities and the growing use of AI, there have been noticeable instances of coordinated mass commenting on particular topics intended to sway public opinion.

Such comments distort public opinion and make it harder for readers to form their own view. Hence, the company has devised a remedial measure to combat mass manipulation.

The new feature to filter 'manipulated' or 'targeted' comments will let media outlets change the comment sorting method for articles suspected of being targeted by such comments. Explaining the concept, the company said:

"The list of detected articles can be checked in the newly revamped comment menu, and each media company can directly adjust the comment sorting if deemed necessary for a particular article."

Comment manipulation will be identified according to pre-defined criteria. The South Korean online giant will launch the service in beta, with scope to expand and refine it further.

This service was first mentioned on April 18 by Choi Soo Yeon, CEO of Naver, who appeared before the National Assembly's Science, ICT, Broadcasting, and Communications Committee and said:

“We are aware of the issue of targeted commenting and deeply regret not having taken technical measures in advance.”

The company had set a timeline of around the end of April "to alert media companies to any abnormal activities and implement technical measures to immediately notify users."

Elaborating on the implementation, Lee Jung Kyu, head of Naver's Service Operations Support Division, said during the committee's inquiry that Naver is working on a system that will help media outlets moderate comments according to their own operating policies.


These measures are expected to help media outlets filter coordinated mass comments that follow a scandal or controversy.