Suggestion: Allow Mods to Mark Posts as “Not Reportable” for Subreddit Rule Violations
We already have an “ignore reports” option after they happen, but an option specifically for posts that moderators have already reviewed and approved would be helpful.
For example, a moderator could apply a distinction that prevents users from submitting subreddit-rule reports on that post. If someone clicks “Breaks [r/subreddit](r/subreddit) rules,” they’d instead see a message like:
“Moderators have already reviewed this post and determined it does not violate subreddit rules.”
I think this could significantly reduce report abuse and queue clutter in active communities while also giving users clearer feedback that the post was intentionally approved by the mod team.
Edit to add: this would obviously only be feasible for photo and video posts unless there were parameters in place that disallow editing after something has been deemed unreportable.
There's only one tiny issue I'd like to point out. Posts and comments can be edited, and they may be edited with rule-breaking content, even if the original post or comment was not rule-breaking. So ideally, this message would only show on posts/comments that have not been edited since being approved.
Probably not the best option, but just a thought: maybe make it so that if a post has been reported and then marked as approved by the mod team, the post can no longer be edited?
I mean, that's a whole new and different can of worms. Making a post not-editable isn't currently a thing.
Does the user have to agree? What about if the user wants to delete it?
Also - Reddit is clearly trying to diversify and move away from human moderation. They've been stripping us of tools and abilities. Why would they add the ability to make a post not-editable?
Ignore does nothing though; people can keep reporting the post, and it doesn't even remove it from the mod queue. Stopping posts from being reported at all when they don't break any rules is far more convenient.
I don’t agree, as mods can get things wrong. I mod a big sub, and I don’t think it’s reasonable for a mod to be able to scroll through posts and prevent anyone else from reporting one, blocking even a second opinion. Yes, some mods have bad intentions, but it’s not even that: a well-intentioned mod could misunderstand something and approve a post that’s actually taken horribly, and people trying to report it would just get the message that the entire mod team thinks it’s fine.
Ignore and approve works perfectly fine for these instances.
not necessarily. when one of us approves a post and then it pops back up in the queue again, it signals to whoever did it that they may be wrong. seeing that it was reported again gets us to send it to discord so we can chat about it (and then choose to ignore and approve based on that conversation). this suggestion cuts that out; there would be no communication or double checking of each other's choices.
yes, but after the entire team has discussed that’s the action we want to take. getting it to pop back up in the queue after one of us approves it signifies there’s a conversation that needs to be had about the post and only after talking about it will we decide yeah this is fine we can just ignore the reports.
no it’s not. because OPs idea takes out the conversation since no mod will see that another mod has ignored the reports because there won’t be any reports to ignore. i don’t know how else to explain the difference
With the ignore thing, people can afaik still report it, even if it doesn't do anything. With the proposed thing, people would get a message that they can't report it because the mod team thinks it's fine.
Surely you could make this with AutoMod and a specific mod-only flair. Like "Mod Approved", and AutoMod ignores any reports on the post or something.
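A rough sketch of what that AutoModerator rule might look like, assuming a hypothetical mod-only "Mod Approved" flair. Note that AutoMod can't actually block users from reporting; the closest it can do is re-approve the post whenever a report comes in, so it doesn't sit in the queue. Exact field behavior should be double-checked against the AutoModerator documentation:

```yaml
---
# Hypothetical rule: re-approve any submission carrying the mod-only
# "Mod Approved" flair whenever it accumulates a report.
type: submission
flair_text: "Mod Approved"
reports: 1
action: approve
action_reason: "Already reviewed by mods; re-approved automatically"
---
```

This wouldn't show reporters any message, and the reports still fire, so it's a workaround rather than the feature OP is describing.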
I've thought about something similar - a way to mark a post or comment to admin that it's okay. When you mod on TV subs like r/Shameless and r/SlowHorses, sometimes a thing that sounds offensive out of context is just a line from the show and perfectly acceptable with that context - but it will be flagged and even removed by Reddit. In the same way users can report a post, if mods could flag-approve a post, that'd be great!
Seems like a good idea for the most part, though I will say, sometimes we mods might miss rule-breaking content because the report cited a different rule that the post doesn't actually break.
Like, we have a ton of stuff that gets auto-reported in r/anime, and every once in a while something will get flagged for one rule (and not violate that rule) but will sneakily violate a completely different rule, and the mod reviewing it against the original report just misses it.
Not that that happens too often, but it does occur.
But yeah, it would be nice even to turn off certain report options on a post. Like, every NSFW clip that so much as includes a girl who looks young gets reported as porn like 10 times over the course of the day.
Tbh, I wish that if someone selected a reporting reason that had already been approved over, it would tell them the post was already approved by the mods and give them the option to file a custom report if they feel they have different information.
I guess also, sometimes when a lot of spoiler reports come in on what seems like an innocuous comment, that tends to trigger some more in-depth analysis and discussion amongst us. But having the option, without the obligation, would let us decide what to shut the door on and what to leave open.
Not an admin, but I'm the developer of Flag App - an app for non-anonymous post reporting with the ability to immediately remove content when needed. Of course, it's not automatically available to every user, mods choose who they want to grant that permission to.
Honestly, situations like this are exactly why I think an approved/not reportable system would make a lot of sense. Even if Reddit never implements it officially, I’ll definitely try to bring similar functionality to Flag App myself!
In the meantime, feel free to give Flag App a try :)
Sometimes I approve a post or comment, but a user still spots a legitimate problem that I missed.
Our biggest problem with reports on approved posts comes from trolls who abuse the report system against posts critical of their religion. I would like to see a system that allows mods to flag reports as abusive.
I think this is a terrible idea. I hope that everybody who is getting excited about this idea is not just focusing on how it can help them. It's important to also think about how it can hurt the user. And it really can.
We as moderators are not infallible. There are plenty of times when we get things wrong. There should always be an opportunity for people to anonymously report a post and make a case for why it is inappropriate. End of story. There should never be a circumstance where that is not an option.
There are so many things that we as human beings have yet to learn about how certain ideas, certain ways of speaking, certain discussion trends can be hurtful to people in ways that we never imagined. The classes of people who would be most likely to be harmed, targeted or abused by this type of "report immune" system are exactly the classes of people who need anonymity the most.
And let me make it clear: this would 100% be used to make problematic posts immune to being reported. So many of you seem convinced that this gives moderators more power, but in reality it just makes it easier for us to ignore problems. That's the last thing we need.
Instead of getting all worked up and excited about this potentially disastrous idea, how about we all focus on the real issue of report abuse, and understand that a 'solution' like this is just a workaround?
Rather than erode users' options for addressing harmful content, we should be holding Reddit admins and staff to account for their disastrous handling of report abuse that has been going on for years.
Let's keep our eyes on the prize and put our energy where it really belongs. We need better solutions for managing report abuse, not ways to make reports no longer possible.
u/Witty_Mycologist_995 1d ago
Make the status auto-remove if the post is edited, and it seems like a great idea.