From Neos Wiki

The Neos Moderation Team's Weekly office hours.

Here are the notes from 29 May 2022.

These are rough notes typed by CanadianGit, shamelessly stealing ProbablePrime's formatting. If there are errors please edit away!


Please do remember to check our Roadmaps, as a lot of questions here are partially covered there. Do continue to ask them, but the roadmaps are there for you to read.

Please make tickets! We love tickets!

The next Moderation Team Office hours will be back next week!


DoubleStyx — Today at 16:05 The updated guidelines state, “If Adult Content is present on the avatar in any capacity, regardless of the system or method intended to hide it or remove it, it can only be used in sessions that are not publicly visible, only contain users over 18 years old, and if all users have consented to the presence of Adult Content.”

I’ve bolded the phrase I’m commenting on. Does it imply that an avatar cannot have any features that facilitate the use of Adult Content, even if those features aren’t inherently Adult Content? An example would be a set of dynamic variables under a slot that provide configuration data for a slot or system consisting of Adult Content.

  • No - it relates directly to features defined as Adult Content. If a feature contains no Adult Content, such as a plainly named snap point, it is allowed.

epicEaston197 — Today at 16:05 Q: is it okay to crash somebody as the host? Because that happened to me once and I had to restart my whole computer

  • No, crashing users remains against the guidelines even when done by the host or session moderators.

Lexevo — Today at 16:10 Are there any separate moderation policies/techniques that you guys internally use for any things that are unsupported in Neos? (For example, using mods to access things that you shouldn't) Or do you guys just follow the same set of guidelines?

  • The current guidelines apply to most actions a user could take in Neos. We have additional policies covering mods, and will write more if other areas require additional clarity.

rampa_3 (UTC +1, DST UTC +2) — Today at 16:10 Any chance of a set of LogiX nodes for writing automoderators coming in the future? (e.g. auto-kicking people who use systems the host has disallowed in the world, etc.)

  • No current ETA, but it sounds like an excellent GitHub issue.

Lexevo — Today at 16:12 Rephrased Q: If there's some very technical guideline-breaking which very few people, and perhaps many moderators, might not even know could happen, is there any special way you guys handle that?

  • Moderation responds to the tickets submitted. If users encounter someone breaking guidelines, even in a very technical sense, we'd be happy to look into the issue.

Earthmark — Today at 16:14 Not sure if this is a valid rephrasing, but I read this as: If a user makes a mod that lets them bypass world permissions does that have additional moderation consideration?

  • This would violate our current guidelines and modification policy.

Lexevo — Today at 16:14 Like, someone with technical understanding knows that someone else is breaking guidelines, but no moderator has the understanding to take any action

  • If a ticket is submitted where the moderation team fails to understand the technical aspect of the report, we can bring dev team members into the ticket to provide input. If they still cannot understand the issue, we will respond to the ticket asking for clarity.

Lexevo — Today at 16:16 There's certain categories of tickets that not even mods see right? Like security ones?

  • Correct - exploit reports are generally sent to a dev-only response category.

Lexevo — Today at 16:16 Would dealing with spam/advertisements in VR/in world, be any different than handling it in text chats and other traditional services?

  • Unwanted advertisements being spammed into worlds the host does not want would constitute harassment, and the moderation team would be happy to assist with reports on it.

Earthmark — Today at 16:21 As a prelude to this, I recall there was a section of the moderation team to handle concerns with users who may be more vulnerable than your average users.

If we see interactions with users that raise eyebrows in a suggestive way, almost grooming like behavior, how can I report that to moderation? It's not a direct conflict issue, it's more like I have a feeling like "This person is probably not doing anything wrong, but it feels like they're possibly taking advantage of this other person who may not know better."

In this case it felt almost like a scam call, it was a conversation that was funky enough that normal users would be like "nah bruh" but users who are vulnerable enough to not realize that funk may be put in a risky situation.

  • Report it directly through the ticket system with whatever details you can provide. Understandably, when it comes to hunches, we're not going to issue account restrictions, but having that information brings the concern to our attention. Letting moderation be aware of potential concerns like that may be essential in filling in the broader picture for other incidents reported at a later date.

Lexevo — Today at 16:23 An example I could think of is someone hops between a bunch of public sessions, and drops something annoying in all these worlds advertising/spamming something in every session they hop to. Would that take longer to deal with than for example, someone spamming an ad in every discord channel?

  • Probably a bit longer, in the sense that the Discord mods can often clear a scam within minutes if not seconds of being pinged about it, but it would be a similarly straightforward case of violating our guidelines. A report to the moderation ticket system should yield a quick response.

Lexevo — Today at 16:28 (As a world host) Would having something that some people would find annoying (like forcing people into a certain spot) to make them accept a host's additional rules, before they can move on, have any grounds to be breaking guidelines? Like, is there a point of how annoying it could be before it might be breaking guidelines?

  • Annoying your guests is not against the guidelines, though obviously some users may decide to leave rather than join. This would only become problematic if whatever system you force on users joining your world is so disorienting or otherwise abusive as to be potentially harmful.

Earthmark — Today at 16:29 Because my brain wants to take this to a funky extent:

When a new user joins my world, am I allowed to lock them in a small room for an hour before they can move around the world? They can still use their dash, but can I put them in a box for a while?

  • Barring the box being somehow particularly disorienting or, as above, otherwise harmful, this would be perfectly acceptable - even if many users would probably leave before the hour expires.