From Neos Wiki

The Neos Moderation Team's Weekly office hours.

Here are the notes from 06 Feb 2022.

These are rough notes typed by CanadianGit, shamelessly stealing ProbablePrime's formatting. If there are errors please edit away!


Please do remember to check our Roadmaps, as a lot of questions here are partially covered there. Do continue to ask them, but the roadmaps are there for you to read.

Please make tickets! We love tickets!

The next Moderation Team Office hours will be back next week!


3x1t_5tyl3 This is a multipart question with 2-3 questions and very likely a complex answer. I have noticed that a large part of the community is straight up not giving a single fuck about the rules and does whatever they want (as moderation only enforces upon ticket).

Does moderation have a plan to change this so that rules are generally more enforced, or will this general "No ticket, no problems" mentality become more common? And will this also render the guidelines basically useless, given that most rules are broken anyway while moderation isn't actively seeking out anyone who is breaking them?

Context: I had a debugging software attached to my game and I noticed a grotesque amount of adult content being loaded in public sessions with avatars. A lot of people seem to just not care as "Disabled means hidden right?"

The secondary issue that arises is that as soon as you want to actually enforce this rule in your own world, you get ostracized for asking people to take a look into their avatar to make sure it's up to the Neos guidelines and your own world's rules. And to be quite honest, it's currently the main reason why I'm staying away until either moderation becomes more active or there are better tools to enforce these more "vague" rules in your own world. Currently there are no tools other than external debuggers (such as RenderDoc) or modded clients to ensure that people keep within the rules, and looking into avatars is generally seen as taboo (for what I believe are obvious reasons; and I agree that me, a normal user, shouldn't be doing any of this shit).

I also realize that me asking this will just make more people ban me from their sessions for saying all of this. But I genuinely think that the community is steering in a dangerous direction with breaking the rules.

  • The issue raised here is one the Moderation Team is concerned with as well, and we're looking into answering it in four ways:
  • 1. Revisions to the guidelines, where we're looking to empower hosts to feel more confident in establishing session rules of their own in addition to the Neos Usage Guidelines. A subsection on users deliberately disregarding rules set out by the host will be included under harassment.
  • 2. Improvements to user tooling - upcoming features on blocking and hard permissions are expected to help users better deal with problematic individuals they encounter, and let hosts better curate what can be done in their session.
  • 3. Community PSAs. We start each of our office hours with repeating announcements about community issues we have in Neos, and encourage users to push for change. Part of any issue stemming from norms in the community will be the community itself pushing for change, and we try to encourage this.
  • 4. We're considering ways the Moderation Team can help identify rules being broken in public sessions, but we feel there are still a number of issues with deploying mods as "beat cops" or free-roaming session moderators.

Earthmark If this is a cross-Neos policy, it may be the host in violation anyway. What if I as a host say "screw that rule"?

  • Hosts are allowed to add rules in their session beyond those listed in the Neos Usage Guidelines, but they cannot create sessions that ignore those rules. If you encounter a session host declaring that sexual harassment is fine in their world, report them.

rampa_3 (UTC +1, DST UTC +2) Well, age is a problematic thing, Veer... How do you check age for sure without asking users to spawn ID pics?

  • We do not collect information on potentially underage users, but a number have openly admitted to being under our age requirements and received account restrictions to keep them off the platform until they're of age.

Earthmark But how could the host be moderated? There is the force-unlisted thing I think? Hmm...

  • Users in sessions seeing hosts engage in behaviour violating Neos Usage Guidelines are encouraged to report those hosts, and account restrictions can prevent a user from inviting others to their world, or joining based on invitations.

3x1t_5tyl3 Tickets are easy enough for me, tbh. But it's easy to just triangulate who reported someone by timing, so it's easy to find out who reported anyone.

I have had several people pre-ban me because of this.

  • Users are allowed to do this, of course, but we're trying to push a shift in community culture to encourage users to make use of the reporting system and enforce their own world's rules as a host.

Revi Wouldn't "beat cops" require something along the lines of "force Builder", which would definitely be exploited by modders?

Well, how could they moderate? Wouldn't people get banned if it's discovered that they are a cop? A lot of people would pre-ban.

  • Moderators do not have permissions above any other user when they join your world, and we have no plans to change that. This is consistent all the way up to the dev team, who are identical to regular users who join your worlds in regards to permissions.
  • This is one of several reasons we're hesitant to employ mods as active enforcement, patrolling open worlds.

Revi Are there checks to stop people just making a new account in the case they get banned? It's a free game, after all.

  • Yes, though we cannot go into detail on the system that handles it.

VRGameZone How would you find out if someone is lying about their age?

  • A surprising number of young users openly admit it, and it leads to account restrictions until they're of age.

Earthmark Given issues like IPv6 in the JP community, where they've needed to build their own world manager/listings, would users having their own private unlisted world listings still be subject to the main Neos moderation guidelines? For instance, if a world list allowed mature content, and verified that users using the list were mature, would that be allowed, or even governable, by moderation?

  • Systems like this can still be governed by users who engage with them submitting tickets if they see elements of the system violating the Neos Usage Guidelines.
  • However, this sounds like a system that, if working correctly, would meet our requirements for hosting adult content.
  • Our broad rule of thumb for adult content sessions is: 1) Make sure it is not publicly visible, and 2) Make sure that minors, and anyone else who does not consent to seeing that content, cannot see it.

epicEaston197 Q: If a user does something "bad", how do I prove it? Like, if a user did X and Y, what is the standard for proof? Videos? Photo metadata?

  • There is no single standard of proof we look for; it is contingent on the guideline being broken and what is appropriate proof for that type of action. That said, provide all the information you can in reports, and we will frequently ask for anything that may have been missed.

vilk Are there any plans to scale moderation if Neos's user count goes up by an order of magnitude?

  • Yes. A moderation application is in the works and will be released when a need for additional mods approaches.

Nammi What law does Neos usually go by when it comes to moderation? UK law?

  • I am not a lawyer, but Neos has a company in the UK and Czech Republic.

TheBasementNerd (she/her) How is community moderation, in the form of users trying to settle debates or assist in handling of issues between users across the community, viewed by the moderation team?

  • It is happily encouraged, both on an individual level to help resolve conflicts in the community, and to help us push PSAs for the platform as a whole.

Charmizar Piggybacking on Easton's question, how can you determine the validity of proof on a very creative platform, where its tools can be used to twist the truth or stage your own scene?

I know a lot of questions would be asked, but there are the possibilities of copying avatars, names, and badges.

  • We look for corroborating evidence whenever we can and place a higher degree of trust in evidence that would be more difficult to fabricate.
  • All users have the option to appeal any account restrictions if they believe they have been issued based on fabricated evidence.

Dionamus Suggestion: marking sessions as 18+ and then enforcing that through account flags that aren't exposed in sessions?

  • The Neos Team has not yet finalized design plans for adult content tagging systems, but they remain a topic of discussion, including that solution.

epicEaston197 Q: What if a group of people, or one person rallying a group, mass reports someone? Like, person A makes a report about C, A tells B to do the same, and the cycle keeps going; this might result in C getting banned.

what would you do to prevent or know about this?

  • Account restrictions are all handed out by people; none of it is automated. Mass reports can substantiate a claim if evidence is provided in each of them, but they do not increase the restrictions we would apply.
  • Mass fake reports constitute harassment, and they are very easy for real staff, rather than automated systems, to spot.