From Neos Wiki




In this Office Hours, Probable Prime talks about:

  • Saving items
  • Exporting items
  • Stencils
  • Performance


(Please note that the transcriptions will not be 100% accurate)

Okay, welcome to Prime Time. It is 4 p.m. so we need to go ahead and get started.

We've got some questions lined up in the Office Hours text chat, so if you have additional questions, please line them up.

I'll get to them in the order that they appear. I'm just gonna @tag Office Hours one more time to let them know that we're starting, and then we'll get going.

All right, so questions. So, Alex says, "Hey Prime, how does Neos generate thumbnails for inventory items?"

I know approximately, but I don't know where in the code it might be.

I can probably get there if I go to the inventory browser and I look at the code for the plus button.

I think that should be easier to find than the context menu one, so I'll look for that.

And I'll search for saving held items to the inventory. Here it is.

Add item. Here it is. Here we go. Add item.

inventory link.

Link for save and handle.

Here we go. Item helper.SaveItem. This looks like a good thing.

Save item internal. Reference proxies can save item slot.

It's an avatar. If there's metadata.

Okay, if there's no thumbnail...

Yeah, so there's a function on slots called renderToAsset, which will basically take a screenshot of a particular slot hierarchy.

That's a cool function.

It does that by looking at your head position, but it also tries to take into account bounding box and stuff like that.

Which gets annoying sometimes when you've got really large items, and sometimes there's like...

It sounded like either a baby or a small deer was injured in the corridor. I apologize for that delay in my brain.

You know when deer make really loud noises? That's what I just heard from my corridor.

Anyway, yeah. So it basically just takes a photo of the item from your head as like a source position, but trying to encompass the entire item's bounding box.

It is similar to the RenderToTexture LogiX node. Everything's kind of like the same code path.

I wouldn't be surprised if they used like the same code. I'd have to dig into it more though.

I've stepped into about five functions. I think that'll do for now.
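The camera-fitting step he describes (take a photo from a head-like position while encompassing the item's bounding box) boils down to standard lens math. A hypothetical sketch of that idea, not the actual renderToAsset code:

```python
import math

def camera_distance_to_fit(bounds_radius: float, fov_deg: float) -> float:
    """Distance a camera must stand back so a bounding sphere of the
    given radius fills the vertical field of view."""
    half_fov = math.radians(fov_deg) / 2.0
    return bounds_radius / math.tan(half_fov)

# A unit-radius item seen by a 60-degree FOV camera:
d = camera_distance_to_fit(1.0, 60.0)
```

Bigger items push the camera further back, which is roughly why very large items produce those awkward zoomed-out thumbnails.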

So we'll move on to the next question.

So Erin says, "How do stencils work?"

I actually found a good guide to stencils in Unity, which helped me understand them inside Neos.

So I searched for stencils from Probable Prime in our Discord. No, not ProScrub. I don't know who ProScrub is, but hooray.

Is that the guide I meant? I don't know.

This is the one that I previously linked to a few people. It's a guide about how stencil shaders work.

It isn't directly applicable to Neos, but you can kind of figure it out if you are technically minded and you take a look at that.

Additionally, do take a look at Visualibo's public folder. They've got a bunch of stencil stuff there.

A variant of which is inside my meme or cheese folder, which is the cheese hole, that again is a stencil.

They are like weird ways to write on top of or below the current render buffer, so you can do really cool stuff.

It's similar to how portals and stuff work, although portals are usually just cameras these days.

He wanted a link to Visualibo's public folder. Okay, let me see if I can get that as well.

I feel like a weird data retrieval person right now, because my tax accountant is asking for 2022's documents today.

I've spent three hours just downloading PDFs and documents from various places I don't remember my passwords for.

So that's been fun.

I need every document that has a single dollar sign on it ever. Please, please.

Oh, Froppy's got it. Fantastic.

We'll move on to the next question, depending on what you want to do, Aaron. Maybe I can help you out later.

Fuzzy says, "Is there a way to edit something in Neos that has been baked already?"

Depends what type of baking you're after.

If you mean the sort of baking where you're baking a procedural mesh, you can't unbake a procedural mesh. Nope.

If you mean an avatar into a static avatar like the statue maker, then again, nope. That's not possible.

So it really does depend on what you mean explicitly, but usually the answer is no, you can't unbake it.

You can take it to Blender and then do stuff there, resplit it apart.

Do remember that if you export it with a type of model file that supports complex stuff, then you'll be able to do more than if you export it with something that does simple stuff.

For example, if you export as STL or OBJ, it will usually collapse the materials down to something sort of unusable.

But if you use glTF, which I recommend, then you should be able to pull apart the different materials that exist up there.

For those who are unaware: glTF for everything, please.

If you have any issues with glTF, please do let me know because it worked for everything I could do.

GLB is OK. Basically, if your file format starts with GL, you're totally fine.

If it starts with FBX, come on guys, it's 2023. Let's make 2023 the year of the glTF.
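Part of why glTF keeps materials separable is that a .gltf file is plain JSON with a documented schema, so the material list is trivially inspectable. A minimal sketch; the document here is invented for illustration:

```python
import json

# A .gltf file is plain JSON, which is one reason separate materials
# survive the round trip. Tiny made-up document for illustration:
gltf_text = json.dumps({
    "asset": {"version": "2.0"},
    "materials": [{"name": "Body"}, {"name": "Eyes"}],
})

doc = json.loads(gltf_text)
names = [m.get("name", "<unnamed>") for m in doc.get("materials", [])]
print(names)  # ['Body', 'Eyes']
```

GLB is the same data packed into a single binary container, which is why either works.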

Ozi asks a good question actually. I read it really fast and now I have to read it out loud for the recording.

Hey Prime, a bit of a cloud feature question, but from my understanding, the Neos cloud has a lot of moving parts and services.

Are there any plans to expose the state of health of these individual services, similar to how other cloud-based services show such?

I would love to do that. One of the initiatives I want to do, once this whole mess (see the announcements channel) is fixed, is to sort of up our transparency on stuff.

Like there's of course stuff that we can't tell you, for example like legal stuff or anything like that,

but there's a bunch of stuff that we just aren't telling you for just sort of like, we're just not, and we totally can.

Examples there might be what percentage of the Patreon fund goes to paying for the servers? That's good information.

We could even do what percentage goes to wages if we want to.

That wouldn't be what individual people's wages were, that would just be like,

"Patreon, this percentage goes to the actual developers so they can buy cheese." It's all information that we can do.

Of course, with each of those pieces of information, I have to get clearance from the rest of the team, and Froox of course, to share it,

but there isn't anything preventing us from sharing some of that information.

I always post this, and people get tired of it sometimes, but take a look at the platform Glimesh.

They're one of the streaming platforms for video game streaming that sprang up when Mixer died.

They try and do everything as open as possible, so their board meetings are just entirely open, you can go watch a YouTube video of them,

and I don't see why that can't happen with us. And then they share metrics.

They share their daily active users, their monthly active users, how much money they made in t-shirt sales, how much money they made with subscriptions.

And this is above and beyond what I think we'll be able to do, but there's no reason why we can't.

So I'd like to see what we can do in that regard. So hopefully server status will be included in that.

So Rickerbus asks what normals are, and a few people answered below.

So Lou said normals are textures, that's a normal map, not normals.

Normal maps are a way of mapping normal data into an image.

So, how to explain this without visuals?

Okay, so if you shine a light at a mirror, let's say a laser actually, that's a bit better.

A laser at a mirror, it will reflect off of that mirror, according to a certain mathematical calculation that we have determined.

So if you do that in the real world with a laser pointer, if you've got one, I have one, long story, it'll just work, right?

But in computing and in games, we need to write code and maths to make that work.

And part of what we need to know there is the normal, which is sort of like the way that the surface of the object points.

So if you think of that mirror, the normal of the mirror is straight back towards you.

That's the normal of that mirror, because it's a flat rectangular surface pointing directly at you.

The normal of that surface is straight back at you.
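The "certain mathematical calculation" for a mirror is the standard reflection formula, r = d - 2(d.n)n, where d is the incoming direction and n is the unit surface normal. A small sketch:

```python
def reflect(d, n):
    """Reflect incoming direction d off a surface with unit normal n:
    r = d - 2*(d.n)*n  (the standard mirror-reflection formula)."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * dot * ni for di, ni in zip(d, n))

# A laser heading straight into a mirror whose normal points back at you
# bounces straight back:
print(reflect((0, 0, -1), (0, 0, 1)))  # (0, 0, 1)
```

The same formula is what lighting code evaluates at every shaded point, which is why the normal matters so much.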

However, let's think about the normal of maybe a sphere, right?

That's complicated, because for each point that you hit in the sphere, it's got to then bounce straight back up towards you from every point.

And that's complicated. So if you go into Blender, you can go ahead and just add a sphere, and see exactly what Froppy has posted for a sphere.

So that's the normal for a sphere.

So the normal map then encodes that data from your texture creating application or your baking application into an image, which then saves the data.

Because we don't need so much data in the actual mesh, we can use it from the normal map.

So for example, on Froppy's visual there, you'll see that there are quite large quads, like for example, the quad right in the middle of it.

Just pick any of the quads right in the middle of it.

There is only one normal for that quad.

But if you have a normal map with an appropriate UV, you could have, I don't know, 10, 20, 30, 50 normal values for that quad that is in the middle of that sphere.

And that's why you might get, I don't know, a divot or a raised part, like maybe it's been roughed up with some sandpaper or something.

And that's why when you combine all of that together, you get normal maps, which allow you to make surfaces rough or patchy and stuff like that.
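The usual encoding (this is the common tangent-space convention, not necessarily exactly what Neos does internally) maps each normal component from [-1, 1] into an 8-bit colour channel. A sketch of decoding one pixel back into a normal:

```python
def decode_normal(r, g, b):
    """Map an 8-bit normal-map pixel back to a direction vector.
    Each channel stores a [-1, 1] component remapped into [0, 255]."""
    return tuple(c / 255.0 * 2.0 - 1.0 for c in (r, g, b))

# The flat "default" normal-map colour (128, 128, 255) points almost
# straight out of the surface, which is why untouched normal maps
# look that characteristic lilac colour:
print(decode_normal(128, 128, 255))
```

Every texel can carry its own direction, which is how a single flat quad ends up with thousands of effective normals.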

Do take a look inside the Neos Essentials materials, and then just pick something that you wouldn't think would have normals.

So pick metal or something like that.

And then try one of the really rusted metals in there and look at its normal map.

And then look at it on a flat surface like a quad.

And then you'll start seeing, remove the normal, add the normal, you'll start seeing what's going on there.

Moving forward, the programmer says, "Is there a way to save the texture of a material back to the computer?"

Yeah, sure. So if you grab the texture, and you need to make sure you grab the texture itself.

So like drag it out from the inspector or something.

Like if you grab like a quad that has the texture on it, then that might not work.

So make sure it's like the texture itself.

Like as an example here, if you bring in an image and then you grab the image, that's the texture itself.

Like it's, but if you put that image on another object, then that might not work.

If you want more on that, I'll make a video or something.

Anyway, you grab the image itself, you go to the file browser, and then you hit like the plus in the file browser.

That'll let you save it.

There's also, depending on the component setup, if it's an exportable component,

there'll be an export option in the context menu. That's usually for screenshots and things like that.

It'll be able to export that image back there.

Same with meshes, again, grab the mesh, go to the file browser, hit plus, select glTF.

Moving forwards, I found there's a tween node. This is from Tiki, sorry.

I found there's a tween node to transition from one value to another, and it seemed very interesting to drive a float for some effects,

but the output seems to be only possible on component variables and not a generic register.

Is there another node or a workaround, so I don't need to create a component?

I thought tween would work for registers. I don't know why it wouldn't.

I'm not in Neos right now, but if anyone is inside Neos, could you try making a tween node and targeting a value register?

Thanks, that should work.

If it doesn't work, you can also inspect the register in the inspector and find the component which represents the register,

and then just tween the value of the register from the component version of the register.

Because remember, Logix is just components with squares.
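Whatever the tween node ends up targeting, register or component field, the underlying idea is just interpolating a value over time. A hypothetical sketch of the linear case:

```python
def tween(start, end, t):
    """Linear interpolation: t=0 gives start, t=1 gives end."""
    t = max(0.0, min(1.0, t))  # clamp, as tweens usually do at the ends
    return start + (end - start) * t

# Drive a float from 0 to 10, sampled at five points along the tween:
samples = [tween(0.0, 10.0, t / 4) for t in range(5)]
print(samples)  # [0.0, 2.5, 5.0, 7.5, 10.0]
```

Real tween nodes usually add an easing curve on top of t, but the write-a-blended-value-each-tick structure is the same.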

Ogeba says, what is the maximum number of trackers Neos can use?

There isn't a limit. There probably is a limit. It probably is close to a binary number, I think like 32, 64, 128, something like that.

But you'd probably run out of body space before you run out of trackers inside Neos.

The reason I say that is because at a certain point they'll start overlapping in terms of their dots and stuff,

and then the base stations will get confused or whatever.

11-point tracking is all we recommend for avatars, but you can add additional trackers to track things such as:

someone has a drink tracker, someone had a cat tracker, someone in the really, really, really early days of Neos had a chair tracker,

which I thought was brilliant. They put one on their desk chair, and then they knew where their chair was if they needed to sit down in VR.

And I've had that problem where I don't know where my chair is, and I've had a couple of close calls with the floor.

You know, you keep lowering your butt and you're like, that's not a chair, that's not a chair, that's not a chair,

and at some point you're like, I'm going to fall over if I keep going. Let me double check where my chair is.

So they did that. Random anecdote here, we do have time though.

I saw on a TV show called Station 19, which is a firefighter show, which was spun off from Grey's Anatomy,

they were doing firefighter training where they had the nozzle of a firefighter's hose, a fake one of course,

but it had a Vive tracker on the end of it, so they could track where the nozzle was.

And I was like, cool. That's actually what the people at Vive thought you would use Vive trackers for.

They thought that you would be adding it to objects such as firefighter hoses, tools, maybe even sort of, you know,

mock weapons, et cetera, for training and management, et cetera.

And then VRChat and Social VR came along and was like, we're going to put like 11 of them on us,

and then lie in front of a mirror for 12 hours. And Vive's like, what?

And that's why sometimes when you're like, Vive stuff seems a bit weird.

That's because they clearly were targeting a more professional market to start with.

That's also why the Vive ones exist and are really bad. They were targeting a non-gaming market, by the looks of things.

Anyway, moving forwards. Kazu says that on import of glTF data, sometimes Neos omits shape keys from the original data, and sometimes it omits necessary shape keys.

How do you disable this omitting heuristic? I'm not sure. We'll have to get a Neos GitHub issue on that one, with a model that has that issue.

And then I can run it through the importer whilst adding some debug stuff.

Rampa says, on my mention that FBX is a fossil, that the problem is that people are used to Unity.

Unity does FBX. Yeah, I get it. Like, Unity does FBX. But when you're doing like, Neos stuff,

what I want you to do is when you export from Blender or something, take like a split second and be like, hmm, FBX? No. GLTF.

Skant says, for meshes exporting, you need to grab the slot, right? There are like seven different ways to export a mesh.

And sometimes you literally get the mesh export dialog exported.

And it's a sort of running joke or problem, because every time Frooxius adds an additional way to export a mesh,

someone else finds an additional way to export the mesh export dialog. And it does get frustrating.

Like sometimes I'll get the mesh orb. That's the green orb with the mesh inside it.

Sometimes I'll get the mesh export dialog. A couple of times I've got some tools.

If in doubt, find the static mesh component, grab that component, and export that. That should work.

Interesting that the tween node doesn't accept registers. If someone could make that a bug on the GitHub, that would be good.

I'm certain that that is a bug.

So a group of us asks: how does the peer-to-peer networking system work?

Routers don't let other computers connect to my computer. How does Neos make that possible?

So Neos uses UDP for like most of its main core networking.

That isn't for like downloading assets or messages or anything like that.

But if you're in a session and you're looking at me, talking to me or handing me an object, that's going over UDP.

So UDP is the, I don't want to say newer because I don't know, but it is newer than the traditional TCP, the Transmission Control Protocol.

This stuff's all kind of like computer science-y weirdness stuff.

Essentially, TCP is like more controlled. It's like a conversation.

You're like, ah, yes, I would like the next packet, please, waiter.

And then the server's like, here you go. It's the cheeseburger you ordered. Please eat it.

Whereas UDP is basically people screaming.

Like you just go into a room, there's people just screaming and projectile vomiting everywhere. That's UDP.

So that's UDP. So that's the first benefit that we get there is UDP.
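The "screaming" model shows up directly in code: UDP sockets have no handshake and no delivery guarantee, you just fling datagrams at an address. A minimal loopback sketch:

```python
import socket

# UDP is connectionless: no handshake, no ordering, no delivery
# guarantee - you just fling datagrams at an address.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("", 0))          # let the OS pick a free port
port = recv.getsockname()[1]

send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send.sendto(b"hello session", ("", port))  # fire and forget

data, addr = recv.recvfrom(1024)
print(data)  # b'hello session'
send.close()
recv.close()
```

Real-time sessions prefer this because a lost position update is better dropped than retransmitted late, which is what TCP would do.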

And then what we do there is we put UDP over what's called NAT punch-through, or Universal Plug and Play (UPnP) stuff.

And what that can do is it can sort of set up a way to get through your router's port forwarding and stuff like that, such that you don't need it.

If that doesn't work or your router is incompatible with that, then we also do what's called UDP relay, where we go via a relay server.

So your computer receives the UDP stuff from a relay server.

You can find out more information about that on the network information page.

I made a pig's ear of explaining that because it's very complicated.

But here you go. Networking information. You can see all the information there.

People often ask like, what port? And it's like, whatever port works. Networking is very complicated.

I'm so glad the days of port forwarding are behind us though. It's good.

I do hope at some point we'll be able to move forwards with Steam networking sockets for like everyone.

I know some people don't like those or something. I don't know. We'll figure it out.

All right, moving down, making sure we're not missing any questions.

Froppy posted the cat tracking video. Thank you.

So Svekn says, "Am I right in thinking that drives aren't really affected by desync but impulses and writes are?"

Yes. Okay. So I've been trying to explain this in a way that makes sense for a long time.

Each time I do it, I try and get a little bit better at doing it, so I'll try again.

Drives are local. And so it's the equivalent of, let's say, the session host telling your computer,

"This cat that is spinning is spinning at this speed. Take care of it. Don't talk to me about it. Don't inform me about it.

Don't tell me about it. Just keep the cat spinning at this speed."

And that would be the spinner, right? The spinner is a drive component.

That doesn't have any network traffic, right? For the rest of time, your computer is like, "All right, I'm going to make the cat spin."

Until such time as maybe you, as a Logix developer, you write the enabled property to false using an impulse.

At that point, there needs to be a network packet that goes from the person that initiated that write or that impulse

through the server host and then back down to you, which says, "Yo, bro, stop spinning the cat."

And then your computer is like, "Okay, I've stopped spinning the cat." That's the difference there.

So desync isn't affected by that. But the control signals for those drives are.

For example, if you're desynced, that "please stop the cat spinning right" packet might not come through

because it's queued behind, I don't know, 5,000 other packets that are ahead of it.

I hope that made sense. If it didn't, I don't know. Ask again. I'll try again.
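A hypothetical sketch of that difference: the drive is evaluated locally every frame with zero network traffic, while a write costs a packet through the host. All names here are invented for illustration:

```python
# Contrast a drive (local, no traffic) with a networked write.
packets_sent = 0

def drive_spin(angle, speed, dt):
    # Evaluated locally every frame on every client: no network traffic.
    return angle + speed * dt

def networked_write(state, field, value):
    # A write must travel via the session host to every client.
    global packets_sent
    packets_sent += 1
    state[field] = value

angle = 0.0
for _ in range(1000):                 # a thousand frames of spinning...
    angle = drive_spin(angle, 90.0, 1 / 60)

state = {"spinner_enabled": True}
networked_write(state, "spinner_enabled", False)  # ...one packet stops it
print(packets_sent)  # 1
```

If that one packet is stuck behind a desynced queue, the cat keeps spinning locally, which is exactly the behaviour described above.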

Lou, let me just double-check that I haven't missed any questions before Lou's question. I haven't.

So, oh, Specs does say that glTF will darken vertex colors. Do you know if that's glTF itself,

or is that something Neos is doing? If that's something Neos is doing, make sure there's a bug open up on our GitHub.

We'll take a look at vertex colors. If it's something glTF is doing, then maybe we can detect vertex colors on glTF

and brighten them inside Neos. I don't know. More information on the GitHub issue, please. If there is already one, great.

I don't know if there's actually one. So Lou's question says, when teaching new users about things that involve the dashboard,

we're usually leading blindly. Is there a way to show a mirror of the dashboard in world space, like the legacy inventory?

There is not, but what you can actually do, and some people have done for parts of the dash,

is you can kind of export that dash into world space as a copy of it. Lots of stuff won't work, but the visuals will.

And so you can basically be like, this is your dash, this is what it looks like.

You can also take videos and pictures and stuff like that to kind of help.

There is not a way to show it, like, your actual dash inside world.

Tiki says, in telephony, STUN and ICE are usually used for networking.

I have heard so much about STUN and ICE for a long time. I don't want to know.

You can look them up. They are networking protocols used for telephony.

Moving forwards, Aaron says, do you think there will ever be the functionality to transfer hosts to another player?

Yes, there is a GitHub issue open for that. Go ahead and look for it.

Arigabus asks, is there a way I can check my messages in Neos without opening Neos, like a web interface or something?

There is not an official one. There are some community-made things that will allow you to do that.

I can't vouch for them. I haven't used them. I don't know how they work. I haven't read their code.

But not currently. We do have plans for one in the future, but we don't have one currently.

I'd like to get it to the point where we have a similar web UI to VRChat's, hopefully better.

I'd like it to be similar in some respects to Steam's, but you don't have to go to a website. It's like an application.

It's like, here, people are here, people are here. You can talk to them.

All sorts of stuff like that. I'd love to get that in the future.

Kazu says, update Blender to see if the glTF is better there.

glTF is a newer standard, and more and more people are adopting it.

For example, the glTF standard is why VRMs exist.

VRMs are basically just syntactic sugar on top of the glTF spec.

And the glTF spec, by the way, can have behaviour in it.

It's not just models and armatures and blend shapes. It can have behaviour in it, and lights and cameras. It's really cool.

There's also this other thing, like sometimes you'll see it when you download from Sketchfab.

It's like another file format. I don't know anything about it.

I don't even know if Blender or Neos supports it.

I'll try and get its name. I need to log into Sketchfab to get that.

I'll log in with Google on Sketchfab. Yeah, I do.

Welcome to Prime Time, where I log into Sketchfab.

Yeah, USDZ. Universal Scene Description.

Don't know anything about it, but it's something new and cool too.

Tiki talks about ICE and STUN and TURN not being used yet. Probably not.

You can take a look at network information for all the information there. If you've got more questions, please let me know.

Zvekin says, "Is it the USDZ?" Yeah, it is. It's the USDZ. Yeah.

I don't know anything about it. I just know that it's also on Sketchfab and that Sketchfab were trying to push it at a certain point.

So, yeah.

Oh, I missed a question from Skant. I'm sorry. Let me take a look.

Oh, I did. I did. Yeah, it didn't have the question emoji when I went through it last time.

"Does Neos see a VPN network like Tailscale or ZeroTier as local networks?" I don't know.

I'll go ahead and try it. I'm not quite sure how LAN works in that regard.

I want to try Hamachi. Someone should try Hamachi.

Sometimes when I think of Hamachi, I think of Hibachi, which is the food restaurant.

Let's do Hibachi.

Anyway, moving forwards.

Rigby says, "Is there a way I can block someone's avatar? Sometimes people join with avatars, lots of shaders, and my FPS goes to zero."

Not currently, but there is a user-to-user blocking system which is on our feature list.

And that will be more for sort of moderation, personal space, harassment reasons, rather than performance reasons.

I'm not sure on our current reasoning about performance blocking.

I haven't spoken to any of the team about performance blocking features that are similar to VRChat.

For those who are unaware who maybe don't use VRChat, there is a performance blocking system there whereby

if an avatar is at a certain limit in terms of mesh, shaders, renderers, materials, particle effects, and stuff like that,

you can elect to not see it.

And I have my settings on very high, and 95% of the avatars on VRChat are not rendered, so it's really fun.

But there we go. Maybe that will be a feature in the future.

A feature in the future, not a future in the future.

I'll have to look through the GitHub and see if there's anything open about that.

Specs says that Blender 3.4 might have fixed the vertex colors.

That's a really short fix. I have no idea on the context, but it sounds good.

Kazu says VRM is compatible with glTF, so they've been telling people to rename VRM to glTF and try to put it into Neos.

That's a good thing, yeah, that might work.

Tiki says there is a normal scale slider in most materials, but they couldn't figure out how to make it work.

They don't see any change in normal heights while changing it.

That slider might not be big enough, so try typing in a really large number and a really small number and seeing if there's any changes.

I have definitely seen changes.

Usually what it'll do is make things look really, really weird at a certain place, and then you know you've got sort of the sweet spot.

And again, try it with some of the CC0 materials that we've got inside Neos Essentials, because those have really good normals.

Rigabet asks, does Neos use Mono or .NET Core as the C# runtime? It uses Mono, as part of Unity, unfortunately.

We would love to not be dependent on Mono; we would love to be free to do all the cool new C# stuff. But we are stuck on C# .NET Framework 4.7.2 or 4.7.6, I can never remember which, because Unity is stuck on that one.

4.6.2, you see, I can never remember which way around it is, like 4.7, 4.6, I know it's 4-something.

When I need to check, I just right-click the project in Visual Studio and I go like, "Here, what are you again?"

And it's like, "Yeah, I'm 4.6.2." It's like, "Thank you for telling me I will forget this in three seconds."

Seeing the hibachi GIF above from Fuzzy reminds me to remind you all: if you haven't seen it yet, go watch the film Everything Everywhere All at Once.

You'll eventually see a hibachi scene, which is really good, and won't make any sense unless you've seen the film.

So please go look at the film, it's a good film.

Rampa doesn't want to throw dirt, but says that Unity not only has a bad rap; he feels it's more duct-taped together with each version.

The problem there is stability. What they need is, they've got so many moving parts that they need to keep working, and that's why they are down-rev.

If you think that's bad, remember that there are still many, many organizations on the planet that require Internet Explorer 6.

Even though 6 is just completely gone, 7's gone, 11's gone. I remember having weird arguments with the web dev team, like the front-end team at Mixer, about our support of Internet Explorer 11.

I'm like, "You know, like, 1% use it or something?" And they're like, "Show me the data." And I'm like, "Okay, 1% of our viewers use 11. Can we just drop it, please?"

At a certain point, what you have to do in web dev, especially in web front-end, is basically be like, "No."

Because if you're like, "Yes," then you start going like, "Oh, but we don't support Safari Mobile 2!"

"One person who lives in the middle of nowhere uses that, don't worry about it."

You can always go to, where are we,, is it? Yeah,

That shows you if you can use various browser features. It even has a browser usage table.

Ramper points out that ActiveX is the reason why IE6 is still being used. Yep, ActiveX is. ActiveX is really insecure. That's cool.

I was reading some messages from someone the other day that said that they were using a DOS program at work.

And this is like a big company that they work with. I won't tell you what company it is or anything like that, but you know,

it's a big company that they worked at, they needed to use a DOS program.

Actually, we're bad on time here, so Ramper, Lax, and Tiki, if you have a question, go for it; otherwise we'll go ahead and stop.

Kazu, I'll also add you to that, because I was mid-sentence. So, any more questions other than those people? Nope, we are running out of time.

So Rigibus has a question which says, "How does the voice transmission component of Neos work?

Is it like Discord with WebRTC, or a WebSocket? What type of data does it transmit? Is it pieces of an audio file?" No.

So the voices are an Opus stream. And again, streams are literally like walking into a room and everyone's vomiting data everywhere.

It's not transactional. WebSockets are transactional; this is not. It's an Opus stream, over that UDP.

Lex says, "Speech to text for browsers is Chrome only." Interesting. Let me check Can I Use for that.

Speech recognition, "Method to provide speech input in web browser." It is partially supported everywhere.

And not supported everywhere else. It's only partially supported in Chrome, wow. So yeah, no one really supports that.

Speech synthesis though, everywhere except IE supports that. Or Opera Mobile doesn't support it.

So you know, if you use Opera Mobile, it uses that.

Speech to text is also supported by Edge. Well, Edge is Chrome, so yeah.

No, it says, uh, Edge speech recognition, Edge version 108 does not support it. Maybe that's out of date, I don't know.

I haven't looked at Can I Use in a long time, because I don't really do much web development these days.

Anyway, that is all the questions that I have got here.

On another note of DOS and ActiveX and old software, take a look into what happened with Southwest. Southwest did a bad bad during the winter storm in America and cancelled like 2,000 flights or something.

It was a complete mess. And the reason behind that is that their scheduling software was incredibly ancient and just couldn't cope with the load.

So look into that if you want to hear horror stories about how out of date software is.

Other than that, yep, I will see you again next week. I'll be processing the notes and recording.

I'm using, just a shout out here, I'm using Lex's super cool script to process all the Office Hours stuff now that Sounder has shut down. Very annoying.

That does mean, however, that we are not on any podcasting platforms or Spotify or RSS feeds right now.

I'll also be making slight edits to Lex's script so that we don't have the naming issues we had last week, though they may have already been fixed. I don't know, I'm just going to make them anyway.

I'm looking for an alternative source, but whilst this current issue exists, like it's just, the Wiki seems like the place to go.

Once we start getting back into the swing of things, maybe we'll do something else.

Like I would love for these office hours, for example, and for Canadian Git's Office Hours, to be actually uploaded to an official Neos channel.

Like maybe there's an official Neos presence on Spotify or other podcasting platforms and all of the office hours go there.

You could go there and be like, yo, give me Prime's office hours, give me the moderation office hours, give me the Workshop Wednesday stream, the Friday stream, just like, wah wah wah wah wah wah wah wah.

Anyway, I will see you guys next week. Spotify is just like, it's a podcast aggregator, right? It'll collect it from wherever the hell it is. Like you can listen to other platforms.

It's just an example. When I say Spotify, I mean like streaming things. Anyway, see you next week.