ChatGPT chatter dilutes the meaning
"Something bad happened to a mixer the other day"

All the bartender chatter gets run through ChatGPT so that it's not verbatim word-for-word what the person submitting the chatter said. That makes sense. But as of a few weeks ago, the chatter has been diluted so much by ChatGPT that it doesn't even make the chatter system worth using in the first place.

I think we should go back to the old system that didn't use ChatGPT, but just added on some NPC-specific vocal tics, keeping the gist of the chatter pretty much the same as what was submitted.
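For illustration, the old approach described above could look something like this minimal sketch. This is not Sindome's actual MOO code; the NPC name, tic lists, and function name are all invented. The point is that the submitted chatter passes through verbatim, with only NPC-specific tics bolted on around it:

```python
import random

# Hypothetical tic table; real NPCs and phrasing would differ.
VOCAL_TICS = {
    "bartender_joe": {
        "prefix": ["Word is,", "So I heard,", "Get this:"],
        "suffix": ["...or so they say.", "Crazy, huh?", "Didn't hear it from me."],
    },
}

def add_vocal_tics(npc, chatter, rng=random):
    """Wrap the submitted chatter in the NPC's vocal tics without rewording it."""
    tics = VOCAL_TICS[npc]
    return f"{rng.choice(tics['prefix'])} {chatter} {rng.choice(tics['suffix'])}"
```

Because the original chatter string is never rewritten, every specific detail the player paid for survives intact.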

It's very frustrating when I paid an immy specifically to spread chatter about something specific happening, and then the chatter got recombobulated into the incredibly vague, "Something bad happened to a mixer the other day."

(Edited by svetlana at 9:17 pm on 9/29/2025)

(Edited by svetlana at 9:19 pm on 9/29/2025)

My first thought is to ask you to submit an @bug privately with what was told to the bartender and what the bartender is saying now. That way I can tune GPT to respond better. Also, are you sure the person you paid spread the rumor correctly? Please include what you think they said, rumor-wise, so I can compare it to what we have in the system.

Could be a case of GPT being too vague, could be a case of the person not getting your rumor right.

GPT is good for things like this because people talk in certain ways, and the bartenders do not. There is no way for the MOO to easily convert "I hear a baka was ghosted by some tall choom with a pop-pop." so that it sounds like the bartender is talking in their own voice rather than the voice of the person who spread the rumor. GPT can do that with the right context and prompt. Maybe when we upgraded to the newer GPT version it got ambitious; I should be able to fix that with the right info.
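One plausible fix along those lines is to make the prompt explicitly forbid generalizing. This is a hedged sketch, not Sindome's real prompt: the function name, persona string, and message format (an OpenAI-style chat message list) are all assumptions made for illustration. It only builds the messages; actually sending them to a model is out of scope here:

```python
def build_rumor_prompt(npc_persona, rumor):
    """Build a chat-style message list that rewrites a rumor in an NPC's
    voice while forbidding the model from dropping or vaguing-up details.
    Hypothetical sketch; not the game's actual prompt."""
    system = (
        f"You are {npc_persona}. Rewrite the rumor below in your own voice. "
        "Preserve every concrete detail (who, what, where, when); "
        "do not generalize, summarize, or omit specifics."
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": rumor},
    ]
```

With a constraint like this, "something bad happened to a mixer" would violate the instruction, since the who/what/where of the submitted rumor has to survive the rewrite.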

Did this get @bugged? I didn't see it.

I've made some changes. Please let me know if they are not sufficient to correct this problem.