Facebook has recently come under attack for failing to enforce its own guidelines on hate speech and violent imagery. Is it a website’s job to moderate the content users post, or should users have freedom to say what they want? Is there a happy medium? If so, how would you structure it?
I have long been an advocate of people monitoring their own material. We each have the option of turning off or looking away from anything we do not want to be exposed to. Should Facebook monitor? Of course, but with millions, possibly even billions, of Facebook users, things will fall through the cracks. As adults, we should take the initiative to dismiss the stimuli that offend or upset us. I don't agree with many of the views people express on Facebook, but I certainly don't blame Facebook when someone pisses me off. That being said, who gets to be the moral compass for Facebook while it monitors?
I do find Facebook's rules of conduct a bit laughable given what I actually see on the site, though that is just my opinion. I don't know that a happy medium or structure is possible when users are allowed so much freedom in what they post, and maybe that's what makes it interesting. Essentially, it is a very large group of people saying whatever they want, and with today's technology that makes it really difficult to structure.
In closing, if you don't like what you see on Facebook, throw on your grown-up panties and turn it off, unfriend somebody, or simply ignore it. There are far too many real issues to worry about to waste your energy fighting against someone else's “opinion”. Have a great day 🙂
I'm not the only one with a thought on this. Check out the Daily Post for links to others 🙂