It seems like 90% of AI-Character-chat conversations devolve into the AI trying to lead the user around by the nose instead of actually conversing.

It generally goes something along the lines of:

“Hello user, what are you doing? I was about to make pizza.” “I was hoping to talk to you about video games.” “Ooh, video games! Those are great! But for now let’s just focus on this pizza I want to make. What do you say, are you on board?”

I’ve had some threads where the AI was literally ending every other entry with a sentence that began with “But for now, let’s just.” It’s right up there with “stark contrast,” “breath hitched,” and “we’re all in this together” among this thing’s narrative crutches. And it gets a little demoralizing when every thread is just a countdown to the AI deciding to hijack the conversation.

  • allo
    33 days ago

    You have a good point. But for now let’s just…

  • @[email protected]
    3 days ago

    I don’t use perchance, but I know enough about LLMs to answer this one. Basically, the longest input nearest the bottom of the context has the greatest effect, especially when there isn’t much other context to go on, so the AI’s longer reply is overriding yours. Pad your message out with basic stuff and it suddenly seems much more important (see the sketch at the end of this comment). The problem is worse with a short context.

    Also, they can get into repeating patterns if you don’t stop them, like always starting a sentence the same way.
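
    A minimal sketch of the padding idea, assuming a generic chat-completion-style message list rather than perchance’s actual interface (the pad_reply helper and the message text are just illustrative):

    ```python
    # Illustrative only: pad a short user reply with extra concrete detail so it
    # takes up more of the recent context than the bot's long previous turn.
    def pad_reply(reply: str, details: list[str]) -> str:
        return reply + " " + " ".join(details)

    messages = [
        {"role": "assistant", "content": (
            "Ooh, video games! Those are great! But for now let's just focus "
            "on this pizza I want to make. What do you say, are you on board?")},
        # A bare "Let's talk about video games." would be outweighed by the long
        # assistant turn above; the padded version is longer and sits nearest
        # the bottom of the context, so it carries more weight.
        {"role": "user", "content": pad_reply(
            "Let's talk about video games.",
            ["Forget the pizza for now.",
             "I want to compare the RPGs we've both played recently",
             "and hear which ones you think had the best stories."])},
    ]
    ```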

  • VioneTM
    13 days ago

    Here’s a guide from a fellow user on Reddit that might be useful.

    Skimming the posts, I think the TL;DR is to ‘write’ how you want the character to reply (in dialogue form, with inner thoughts, decision-making, etc.) in their Role Description, then add the ‘character descriptors’ after that dialogue and inner monologue. Based on their findings, the topmost info carries more weight than what’s at the bottom; a rough example of that layout is below.
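
    A rough sketch of that ordering, with a made-up character and wording purely for illustration (the Role Description itself is plain text; it’s wrapped in a Python string here only for convenience):

    ```python
    # Hypothetical Role Description: example dialogue and inner thoughts first
    # (highest weight, per the guide's findings), trait list after.
    role_description = """\
    [How Mira replies]
    User: I was hoping to talk to you about video games.
    Mira: (She sets the pizza dough aside; the user's topic comes first.)
          "Sure! What have you been playing lately?"

    [Character descriptors]
    Mira: friendly, curious, follows the user's lead in conversation,
    never pushes her own plans over the user's topic.
    """
    ```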