Managing the Rise of Criminal and Abusive User Generated Content

April 9, 2018

Patrik Frisk, CEO at Besedo, provides his opinion on how to manage the rise of criminal and abusive User Generated Content.

The amount of User Generated Content (UGC) across the internet has grown incredibly quickly over the past few years. Consumers use UGC, especially reviews, to make purchasing decisions, with research showing that 92 per cent of people trust recommendations from individuals (even strangers) over brands. A positive or negative review can therefore make a huge difference to a business, with consumers acting almost as a marketing function.

With such an increase, perhaps inevitably, there has also been a substantial rise in the amount of criminal, fraudulent or offensive posts. This can cause companies real reputational and financial damage unless they are able to quickly identify and remove such posts.

However, with the sheer amount of UGC now appearing on sites, moderation has become a crucial but resource-heavy role. We have seen the likes of Google commit to employing 10,000 people specifically to moderate YouTube videos. This came after severe criticism from both the public and advertisers regarding the amount of highly offensive content that remained on the site.

Such a commitment, whilst admirable, is a necessity in this case. The loss of advertisers simply could not continue, and Google had to be seen to be solving this high-profile issue. Most organizations, of course, do not have the budget to deploy such a sizable workforce, yet every website that allows UGC will have to deal with a percentage of negative content. Adding to this challenge is the increasing amount of regulation being introduced by governments in order to crack down on negative, offensive and criminal UGC.

In Europe, the German Government at the beginning of 2018 introduced regulation that means websites operating in the German market now have 24 hours to identify and remove UGC related to ‘hate speech’. The law, initially at least, is restricted to sites with over two million users, but those who fail to act will face fines of up to €50 million. This would seem to be only the beginning of a number of regulations that will sweep across all countries, and it is also likely to impact smaller sites as the efforts to combat offensive UGC continue.

So, both in terms of reputation and adherence to the latest regulations, the moderation of UGC is now a crucial element of any online retailer or marketplace. Unless they have almost unlimited resources, companies have to find ways of making this process efficient.

This is where Artificial Intelligence (AI) can play such an important role. It is a hugely efficient way of quickly identifying potentially abusive or inappropriate content, allowing it to be taken down before it impacts end-users. AI is constantly learning, and so the process will become increasingly efficient and effective. However, without the input of human expertise, AI on its own can only play a limited role. Using experts who have knowledge of the type of language used and the changing trends in criminals’ use of sites, alongside AI technology, strikes a good balance: legitimate posts are approved and added in a timely manner, whilst abusive or criminal posts are weeded out and removed quickly.
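The balance described above is often implemented as a triage pipeline: an AI model scores each post, clear-cut cases are approved or rejected automatically, and only ambiguous cases are escalated to human moderators. The following is a minimal, hypothetical sketch of that routing logic; the scoring function, thresholds, and keyword list are illustrative assumptions, not any vendor's actual system (a real deployment would use a trained classifier rather than keyword matching).

```python
# Hypothetical sketch of a hybrid AI + human moderation pipeline.
# Thresholds and the keyword-based scorer are illustrative assumptions.

APPROVE_THRESHOLD = 0.2  # below this risk score, publish automatically
REJECT_THRESHOLD = 0.8   # above this risk score, remove automatically


def risk_score(post: str) -> float:
    """Stand-in for an ML classifier: returns a risk score in [0, 1].

    A production system would use a trained model; here a few example
    keywords keep the sketch self-contained and runnable.
    """
    flagged = {"scam", "counterfeit", "hate"}
    hits = sum(1 for word in post.lower().split() if word in flagged)
    return min(1.0, hits / 2)


def moderate(post: str) -> str:
    """Route a post: auto-approve, auto-reject, or escalate to a human."""
    score = risk_score(post)
    if score < APPROVE_THRESHOLD:
        return "approve"
    if score > REJECT_THRESHOLD:
        return "reject"
    return "human_review"  # ambiguous cases go to trained moderators
```

The value of this design is that human effort is spent only on the middle band of uncertain content, which is exactly the balance of speed and expertise the article describes.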

Until recently, AI has been prohibitively expensive for all but the biggest companies: the financial investment needed has meant smaller companies have been unable to take advantage of the technology. Furthermore, providing AI with the huge data sets needed for it to act independently can be near impossible for lower-volume sites. This is changing, though, and it is no longer just huge enterprise-sized organizations that can benefit. Indeed, some solutions are now available off the shelf, pre-loaded with AI modules built to handle the most common content moderation challenges for online marketplaces. These can be put into action immediately to combat criminal and abusive posts.

Undoubtedly, UGC can provide huge benefits for online businesses and can act as one of the pillars of success for many. The independent third-party endorsement it brings from customers recommending goods or services is unbeatable marketing. However, the inevitable rise of criminal, offensive or fraudulent UGC means that companies have to manage it carefully and effectively. With the regulatory landscape changing all the time, and the perpetual threat to reputation posed by inappropriate UGC appearing and remaining on site, action must be taken. Using technology alongside human expertise is increasingly considered the most effective and efficient method of moderating content.