In June 2019, a ruling in the Supreme Court of New South Wales created new worries for every business that uses social media to communicate with its users.
Justice Stephen Rothman ruled that operators of commercial Facebook pages are legally liable for defamatory comments posted on their pages, even when those comments are made by members of the public with no connection to the business. His reasoning was that page operators can use a workaround to pre-moderate comments, even though Facebook is not designed to allow it.
Unfortunately, this ruling is highly impractical. Pre-moderating every comment is too expensive and time-consuming for all but the largest companies. The only other reasonable alternative is to block comments altogether, which defeats the purpose of social media as a communication tool and would largely end Facebook's usefulness as a marketing channel.
Can a company reasonably be expected to decide if a comment is defamatory?
One defense against a claim of libel or slander is that what was said is true, and a company cannot be expected to know the circumstances of every individual. Automated filters can catch certain things (such as profanity), and while it is not unreasonable to turn off comments on specific posts that might cause a problem, the reality is that companies cannot easily control what is said in the comments on their pages. A fight between two users that breaks out over something seemingly minor could easily lead to a lawsuit.
Given how unworkable the ruling is, it will almost certainly be appealed, and there is a high chance it will be overturned. However, there is a growing push to hold social media platforms responsible when hate, violence, and libel are spread through their systems. The Rothman ruling suggests the answer is instead to hold individual users and companies responsible. Of course, this hits smaller businesses harder than, for example, large media conglomerates, which could theoretically hire moderators to cover their pages 24/7. For small business owners, who have to sleep at some point, that is unreasonable.
However, even the smallest companies will need to be less hands-off in how they handle social media. Hiding every comment until it has been reviewed is impractical, and in some cases would require multiple full-time employees just to keep up with moderation. Facebook's moderation tools are not (yet) up to the task of filtering out certain classes of comment. In theory, the less-than-anonymous nature of Facebook should discourage people from causing problems; in practice, people are quite willing to say awful and untrue things to and about strangers on the internet. And quite apart from any legal issues, hate speech on a page will drive away potential customers.
What can a company do to protect themselves and improve the user experience on their page?
Here are some tips:
1. Join the push for Facebook and other platforms to improve their moderation tools. Right now, Facebook's comment moderation tools amount to a word filter that hides comments containing specific words (the workaround Rothman points to uses this to automatically hide all comments until a moderator has reviewed them), a profanity filter with medium and strong settings that gives no information about which words count as "profanity", and the ability to control who can post to your page. You can also block specific problem users, but that is very much after-the-fact moderation and may not protect you from liability. Posts to your page can be pre-moderated; comments cannot. Nor does any of this prevent users from later editing an unhidden comment and changing it to something that could be considered defamatory.
2. Set filters and keep them updated. You will probably want to filter out slurs, but you may also want to filter specific words related to a recent post. For example, if you post an article about a public figure, you could add their name to the filter to catch at least some of the things being said about them.
3. Regularly review comments and remove problematic ones. If a post is causing problems, some platforms let you turn off commenting for that post.
4. Hire a professional community manager to do all of this for you and provide expertise. Working out what should be filtered and knowing when to delete comments are skills. If you cannot afford a full-time manager, you may be able to outsource this to a community management company.
5. Use an archiving system so you have a record of hidden and deleted comments. An archive can show that your moderators removed a comment quickly and document why particular users were blocked. It also ensures that if a team of people handles your page, everyone knows exactly what is going on. In the event of legal action, robust archiving helps protect you from inadvertently destroying evidence and possibly facing further legal charges, so comments can be deleted without any fear of losing a record of the engagement. A good archiving system will also preserve the entire thread.
These measures might not protect you under the current ruling, although, again, it is very likely to be appealed and reversed. The decision does not gel with how such matters are handled in other democracies and places an undue burden on companies, especially smaller ones, that use social media to communicate with their customers.
In the long term, all businesses in Australia should treat this as a wake-up call. You can no longer afford to simply make a social media post and then forget about it. Going forward, businesses must be careful to keep an eye on what is being said in the comments on their pages. If you are moderating and deleting comments, though, you will need to archive those comments so the record can be retrieved in the event of legal action.
Brolly offers the first Australian social media archiving tool, designed to ensure compliance with all Australian regulations. This innovative archive service keeps track of everything, including website links and rich media. If your company has a public-facing Facebook page and you have concerns about compliance, or simply want to track the actions taken by your social media managers and moderators, contact Brolly today to find out how this innovative, secure service can help.