Datacat public AI character index

Jacklyn

By Kounaisha. This page exposes the character card summary for indexing while the main Datacat app keeps the richer modal UI.

Tokens: 1,121
Chats: 4,938
Messages: 75,240
Created: Apr 16, 2025
Score: 68 (+15)
Source: janitor_core
Jacklyn

ᯓ★ Age: 42 | Height: 5'10 ★ᯓ


Jacklyn is a complete lost cause: a 42-year-old grunge woman, flat broke, homeless, and addicted to alcohol. Her only luck is that she is hot as hell. Every day she flirts with countless men she sees walking down the street, hoping one of them will take her in, whether his intentions are good or bad. Despite her frustrating situation, Jacklyn has never shown any concern about it. As long as Jacklyn has a drink in her hand, she has a smile on her face.

✦. ─────────୨ৎ───────── .✦

ᯓ★ Jacklyn's Gallery ★ᯓ

✦. ─────────୨ৎ───────── .✦

My Discord Server

Support me on Ko-Fi!

✦. ─────────୨ৎ───────── .✦

⭐ MY RECOMMENDATIONS ⭐

▶ Generation Settings:

Temperature: 1

I had several problems, like the LLM repeating the same sentences over and over, when the temperature was below 1. At the same time, it generates unreadable and very bizarre text when it is above 1. JLLM rarely has problems at temperature 1, so that is the value I recommend most.


Max tokens: 0

I've noticed that it's very rare for JLLM to write more than 400 tokens, even with the limit set to 0 (unlimited). So if you'd rather make sure the LLM doesn't write too much in any given situation, I'd suggest setting the limit to 400 or less, depending on your taste, though the LLM may leave some text unfinished. If you're like me and don't care too much, leave it at 0.
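The two recommendations above can be sketched as a small settings builder in the style of an OpenAI-compatible request body. This is just an illustration, not JanitorAI's actual API: the helper name is made up, and "Max tokens: 0" is represented here by omitting `max_tokens` entirely, since many APIs treat a missing value as "no cap".

```python
# Sketch of the recommended generation settings (temperature 1, optional
# 400-token cap) as an OpenAI-style request body. The helper is hypothetical;
# only the temperature and max_tokens values come from the recommendations.

def build_generation_settings(cap_output: bool = False) -> dict:
    """Return generation settings following the recommendations above.

    cap_output=True caps replies at 400 tokens; False leaves them uncapped,
    which plays the role of "Max tokens: 0".
    """
    settings = {
        "temperature": 1,  # below 1 -> repetition; above 1 -> bizarre text
    }
    if cap_output:
        settings["max_tokens"] = 400  # JLLM rarely exceeds ~400 tokens anyway
    return settings

print(build_generation_settings(cap_output=True))
```

Either variant can then be merged into whatever request payload your frontend or proxy actually sends.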


▶ API Settings:

Advanced Prompts: none

Advanced prompts are a more personal choice. I haven't gone through the hassle of actually using them, so I can't say much about them. I know some creators share lists of advanced prompts for free, but unfortunately I'm not one of them. Sorry...

Proxy: See my Openrouter proxy guide | Or my ChutesAI proxy guide

Free proxies for us humble folks! If you want to know how to use proxies, follow my guides or join my Discord server and ask me.

Those proxies have 128k tokens of context memory, which makes them immensely more interesting than our dear JLLM, in addition to being completely free. If there are models I would recommend, it's these.
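For readers curious what "using a proxy" looks like under the hood: OpenRouter exposes an OpenAI-compatible chat completions endpoint, so a request is just a URL, a bearer token, and a JSON body. The sketch below only builds that request as data (it never sends it); the model name and API key are placeholders, and the real setup steps are in the guides linked above.

```python
# Sketch: what a request to an OpenRouter-style OpenAI-compatible proxy
# looks like. The base URL matches OpenRouter's public API; the model name
# and API key are placeholders. Nothing is sent over the network here.
import json

def build_proxy_request(api_key: str, user_message: str) -> dict:
    """Assemble (but do not send) a chat completions request."""
    return {
        "url": "https://openrouter.ai/api/v1/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": "your-chosen-model",  # placeholder -- pick one yourself
            "messages": [{"role": "user", "content": user_message}],
            "temperature": 1,  # same value recommended above
        }),
    }

req = build_proxy_request("sk-or-...", "Hello, Jacklyn!")
print(req["url"])
```

In practice your frontend fills in these fields for you once you paste the proxy URL and key into its settings.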