In the wake of the Steve Stephens murder video and other viral video posts showing suicides and the beating of a disabled man, should Facebook be punished or penalized for not filtering this sort of content?

Prescott Valley, AZ Correspondent-Facebook should be penalized in some way for not filtering this sort of content, which has been going on for entirely too long. Ever since Facebook Live launched almost a year ago, predators, degenerates, sociopaths and other criminals of varying persuasions have been enabled to publicize and live stream their crimes. Shootings, rapes, torture, suicides and other incidents have been posted to Facebook by witnesses, but Steve Stephens announcing his intent to commit murder and then uploading the act itself was beyond the pale, and something Facebook was not prepared to take down.

Mark Zuckerberg, Facebook’s founder, actually launched “Live” with the intention of “supporting whatever the most personal and emotional and raw and visceral ways people want to communicate are as time goes on.” He should have anticipated that this kind of communication would be a recipe for disaster, but to a liberal ideologist like him, consequences are inconsequential until they happen in real time and on his watch.

Zuckerberg simply didn’t foresee a Steve Stephens scenario, though Facebook did pull the video shortly after it was posted and stated, “This is a horrific crime and we do not allow this kind of content on Facebook. We work hard to keep a safe environment on Facebook, and are in touch with law enforcement in emergencies where there are direct threats to physical safety.”

Facebook apparently does try to filter violent, hardcore video postings, but with millions of users enjoying 24-hour access to a widely disseminated social media network, the removal of the Stephens video did not come soon enough: thousands had already viewed it before it could be deleted.

Facebook has resisted demands to censor videos, citing free speech concerns, but it has also explained that teaching computers to recognize a real-time incident like the Stephens one is a difficult task. Facebook relies on employees to sift through the endless stream of videos uploaded to its site, and with every new upload, recorded or live, it becomes harder for those employees to watch and remove offending content in time. Because employees cannot watch a video before it even posts, Facebook often relies on its users to flag videos that need to be removed.

Instead of relying on steadfast and honest users to flag videos that need to be taken down, Facebook needs to do the sifting, sorting, and removing itself rather than leaving it to users and an understaffed review team. It needs to change its guidelines and policies so it can make critical decisions in the case of a Stephens video and others like it.

Yes, Facebook must filter its content itself or risk being fined under strict rules and regulations that force it to straighten up and fly right. It should find the right blocking and review processes to keep a Stephens video from ever appearing, period. The First Amendment and other liberal policies should not give Facebook an excuse for letting murder and other horrendous crimes be broadcast in real time. Facebook needs to get on it big-time, in real time.

Owatonna, MN Correspondent-Facebook has become one of the most popular social media sites in the world, reaching billions of people. Until recently, it was only a social networking site. With the advent of Facebook Live, real-time videos can be posted. This brings Facebook up to the level of broadcasting companies such as TV stations, even though the live content on Facebook is not professionally recorded, reported, edited, or vetted.

This double standard of not playing by the same “rules” as network and cable TV stations is what allowed the above-mentioned violent and deadly videos to be seen by thousands, if not millions, of people before they were removed by Facebook officials. Any network or local affiliate news station broadcast of something like the Stephens video would have resulted in immediate censure and fines or other disciplinary action by the Federal Communications Commission. Intentionally broadcasting live video to a mass audience of a real person being killed or raped is not tolerated.

Yet, there is another double standard at play. TV stations aren’t averse to showing previously recorded videos of violence, such as police shootings of civilians, when they feel it qualifies as news because the public wants and needs to be informed of police brutality, especially against minorities or unarmed civilians. One that leaps to mind is the policeman who fired more than a dozen shots and killed a black man in Chicago who was apparently not an immediate threat to the officer. The implication here is that broadcasting certain violent images is permissible if a valid reason exists to show such footage.

But someone has at least made an editorial decision and is theoretically held accountable for that decision. With Facebook Live, there is no filtering, no editorial staff who decides what is allowed, no consequences for the aftermath of what has been posted.

To insist that Facebook filter all content through an editing or vetting process before publication hints at censorship and suppression of an individual’s freedom of speech and expression. It would also cost an enormous amount of money and take an equivalent amount of time to clear the massive volume of information that is posted every day on Facebook. This could either make Facebook decide to close down or force the company to charge a subscription fee to users—a move that might drive away users and ultimately put the company out of business.

Facebook should take all reasonable and necessary steps to ensure that live videos like the one Stephens posted never make it onto the site, but it should not be penalized after the fact because it couldn’t or didn’t foresee something like this happening.

Gastonia, NC Correspondent-Facebook and other social media platforms such as Flickr, Snapchat and Twitter find themselves on the horns of a dilemma when it comes to content uploaded by users. Until recently, it has been sufficient to let the user community police content that’s put into the systems, with the offended parties able to “flag” objectionable content for perusal by company monitors. The content would then be allowed to stay or removed based on the monitor’s judgment.

With cases like the Steve Stephens murder committed while he broadcast it on Facebook Live, it becomes clear that a more active sort of supervision needs to be in place. However, I would recommend that it only apply to streaming video content until screening algorithms can be improved to the point that photos and images uploaded can be screened by some form of artificial intelligence. Having humans vet every single picture uploaded worldwide would take a ridiculous amount of time and manpower.

There is far less video content, however. I would think building in a 30- to 60-second delay before a stream hits the Facebook page would allow a monitor to click on and see what’s being broadcast before allowing it online. Yes, this would be adding a layer of “Big Brother” to what’s previously been a user-monitored environment, but something like the Stephens incident must never again be allowed to happen. I’m honestly surprised there haven’t been a dozen copycats since that video ran. The sickness and depravity of our fellow human beings, I have learned, is a hole with no bottom.

In the interest of decency and upholding some vestige of community morals, there has to be an impartial set of eyes on all streaming content, hopefully someone who can tell the difference between a gang murder and a hip-hop music video.

Sheffield, Jamaica Correspondent-Those experiences are all tragic and grievous. I do feel compassion for those who have watched videos and viewed images without wanting to.

However, I do not believe that Facebook should be penalised for failing to filter content of that nature.

I’ll present two reasons why I take this position.

Facebook is a news outlet. Yes, those pictures and videos might be graphic in nature, but they are able to reach a wider audience. In some cases, people who aren’t abreast of what’s happening in the news (as viewed on a TV screen) will be able to stay in the loop. I’m such a person. Facebook is my news outlet. If it weren’t for Facebook, I wouldn’t even know about the Steve Stephens situation and many others. Thank you, Facebook.

Facebook is also a public space with a lot of privacy features. If you’re not up for viewing certain images or videos (especially of suicides), you can opt not to. What I’m simply saying is, some people don’t mind but others do. If you don’t wish to view graphic content of people being shot or committing suicide, use Facebook’s privacy options, scroll past such posts as you encounter them, or deactivate your Facebook account. You’re not being forced to watch those videos or to be on Facebook.
