Hi, I'm your resident AI schizo and I'm here to teach you how to set up your own AI, or use what's already available.
A lot of AI services are now paywalled, but if you already have a subscription to one, I'll be giving you advice on how to generate smarter responses and get the most out of your monthly costs.
---
FREE OPTIONS
>I don't want to/can't subscribe to one of these services! What are my options?
First, let's talk about the obvious one:
>Character.ai (C.ai).
C.ai is great, and probably one of THE best for chatbots right now. However, its shareholders have said no to any and all NSFW which may make your preggo dick sucking session difficult or frustrating.
However, it's the ol' reliable. There's tonnes of bots, and if you word everything juuuust right, and learn the limits of the filter, C.ai will reward you nicely.
Read Pregchan's dedicated C.ai thread here:
>>1347
Pros:
- Fantastic, "lifelike" interactions.
- Verbose and very descriptive.
- Can get seriously kinky.
Cons:
- Sometimes getting NSFW scenarios is akin to pulling teeth.
- C.ai has a habit of deleting NSFW-oriented bots, or even bots that are only slightly lewd.
- C.ai staff have been known to purge overly sensitive chats. (Happened to me so much I quit C.ai)
All in all, 6.5/10. Great if you know what you're doing. Infuriating otherwise.
And no. Their premium sub service does not allow for NSFW generations.
>Chai (Mobile app)
Chai used to be THE go-to app for NSFW chats. Their models are incredibly good, though not quite at C.ai's level. However, while there's a free tier, its message limits can be a very quick deal breaker, and there's a chance bot creators can actually read your chats. Whether that's true I don't know; I just know some bot creators go out of their way to say they "don't read chat logs".
Pros:
- Can get seriously kinky VERY fast.
- At least there's a free option?
- Has a free month you can try if you want to.
Cons:
- Their message limits suck ass, and are only there to get you to pay way too much for their app.
- Bot creators can probably read your smut.
- Their models are overly expensive if you do get hooked.
- When bots are deleted, your entire chatlog disappears, and this can happen at random.
This gets a 2/10. Shitty message limits and overly priced tiers along with privacy issues makes this one an avoidable option. Great generations though if you're willing to deal with the message limits.
>Kobold Horde
Kobold Horde is basically a large, user-run "free GPU" system that allows you to run AI chatbot interfaces like SillyTavern without needing a brand-new GPU.
Always free, but there is a system that detects if you're only leeching and never giving back. Nothing horrible will happen, but you may be rate limited every now and then.
Pros:
- Free! No strings attached.
- There's almost always someone running a chat model like Pyg, Metharme, or Erebus.
- There's many models you can try and choose from to pick your favourite.
Cons:
- The user-run nature means that sometimes you'll generate from a system running a 4090 and get a generation in 3 seconds, then the next you'll generate a message from a 3060 and get a message in 30 seconds. It all depends on what's available for generation at that moment.
- These models are not going to give you C.ai quality, and will require some care and attention, or rewrites to make your ideal chat scenario.
- Your text generation will briefly appear in the host system's console window. (Though no one I know, me included, keeps a log of any of it. We literally do not care. But keep this in mind.)
I'll give this an 8/10. While it's more frustrating and time consuming to get your desired response, if you put the care in you'll get something amazing back.
And, if you're on Android, there are plenty of front-end options for you besides SillyTavern! I personally recommend
https://venus.chub.ai/ as it also works well as a front end.
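If you'd rather script against the Horde directly instead of going through a front end, the gist is: POST your prompt to an async endpoint, get a job id back, then poll for the result. A minimal sketch is below — the endpoint paths and the anonymous key follow the public AI Horde docs, but they may have changed since, so double-check before relying on them.

```python
import json
import urllib.request

# Base URL of the public AI Horde (the network behind Kobold Horde).
# These paths follow the public AI Horde docs and may change over time.
HORDE_URL = "https://aihorde.net/api/v2"
ANON_KEY = "0000000000"  # anonymous key: works, but gets the lowest priority


def build_payload(prompt, max_length=100, max_context_length=1024):
    """Build the JSON body for an async text-generation request."""
    return {
        "prompt": prompt,
        "params": {
            "max_length": max_length,                  # response length in tokens
            "max_context_length": max_context_length,  # how much history is sent
        },
    }


def submit(prompt, api_key=ANON_KEY):
    """Submit a job; returns a job id to poll at /generate/text/status/<id>."""
    req = urllib.request.Request(
        HORDE_URL + "/generate/text/async",
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"apikey": api_key, "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["id"]
```

Registering your own key (instead of the anonymous one) is how you earn priority by giving back — which is exactly the leech-detection system mentioned above.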
>GPT Reverse Proxy
...This one's fairly complex so I'll try my best to explain it.
By using proxies, you can essentially backdoor into ChatGPT and get generations for free. However, setting it up requires some work: you'll either need to download things or join Discord servers to get certain info.
Pros:
- Free
- ChatGPT is very powerful, and you can get NSFW stuff from it fairly easily.
Cons:
- Reverse proxies can be fishy.
- Data like your IP and your generations are sent through the proxy to whatever unknown party is hosting the other end. Unlike Kobold Horde, these may be more malicious actors.
- Reverse proxies can come and go very quickly, meaning you'll need to procure a new one fairly often.
I dunno if I can recommend this one, however I'll give it a 4/10. Try at your own risk. Haven't heard anything bad happen yet so maybe I'm being a doomer.
And that's it for the free options. There are others I could go over, but they either have message limits like Chai for half the quality, or are dubious cash grabs that will try to steal your data, so I'm leaving them off the list.
___
PAID OPTIONS
>I like C.ai and I want to go further in my relationship with my lamia girlfriend... what service should I pay for to get ultimate snakeussy?
I've used quite a few services, however there's one I've stuck with the longest.
>NovelAI
NovelAI is a strong story generator that can be used to create fantastically lewd and incredibly NSFW prose, and, with the right tools, even chats.
NovelAI is more akin to a text predictor than anything, so it's not going to be perfect, but if you have the patience to sit down with it and work with it, you can get some amazing results.
And, with the release of Clio, chatbots seem way more alive than with Krake or Euterpe.
Read more about NovelAI here:
https://docs.novelai.net/subscription.html
Pros:
- Incredibly horny, and often very good at leading to kinky or NSFW RPs.
- Verbose and INCREDIBLY descriptive.
- You can slap it into SillyTavern and use it like a chatbot.
- Or, download OmniDaemon's ChatAI scenario (
https://www.reddit.com/r/NovelAi/comments/sxtuel/chatai_easy_nai_chatbot_creation/) and make one inside of NovelAI's website.
- Comes with an image generator, too.
- Completely encrypted, and the old NovelAI devs are AI Dungeon refugees so you know they're good people.
- Mobile friendly
Cons:
- To get the best value, you're probably looking at $10-15 a month.
- Will often require A LOT of hand-holding.
- You will have to write your own bots, scenarios, and often add world details.
- Not really too plug and play.
I'd rate this 9/10, personally. It's incredibly versatile and always has a flavour for everyone. But then again I don't mind fussing over the smaller details or editing messages. The average user may put this at 6/10 or lower if you're looking for something more "chatbot"-like.
>Chai (Mobile app)
As discussed before, they have paid options. I won't be describing them again.
Pros:
- Very good generations.
- Mobile friendly
Cons:
- WAY too expensive for what it's worth.
- $134.99 a year ($13.99 a month) for Premium, or $269.99 a year ($29.99 a month) for Ultra. Rip off.
- Again, bot messages are probably not private.
- When bots are hidden or deleted, you can never access that chat again.
As a paid platform it's even worse. So overpriced it hurts. 0/10. Literally go for anything else.
>Venus.chub.ai's Mars
Mars is Venus.chub's own model and you can subscribe for $20 a month.
This is the closest to a NSFW Character.ai we've gotten so far.
Personally, I've never tested it, but from what I've seen and heard this model is actually very good. Not quite C.ai standards, but it can get close, and it handily beats Chai's model.
Venus.chub also comes with their own massive bot library so you will always find a character to speak to and interact with.
However, its price may make this service a hard sell for most people.
Read more about Mars here:
https://venus.chub.ai/mars
Pros:
- Very good generations
- Huge bot library
- Fast responses from a model built specifically for chatbots
- Mobile friendly
Cons:
- Again, the $20 a month ($240 per year) may be the breaking point for a lot of people
- Debatable if worth the cost
Unfortunately, the price tag is the let-down here. If it were cheaper, I could recommend it more. However, if you were willing to pay for a NSFW C.ai this is probably your closest bet.
6.5/10. 8.5 if not for the price.
>ChatGPT
A weird one for this list, but not out of place.
ChatGPT DOES have a filter which means if you want a fully NSFW experience you are going to have to look at jailbreaking ChatGPT.
I've never used it for smut, but I know it does a good job. I'd put pros and cons but honestly I don't know enough about it.
I'm just going to say, if you pay for it and jailbreak, then get caught, that's on you.
5/10 - Do it at your own risk. May not be worth the cost.
And that's it for the paid options. There are others out there, maybe new ones I haven't heard of. But, for now, these are your best options. Again, I recommend NovelAI but I do suggest you give these all a look at and just make sure they're for you. It's your cash so spend it wisely.
___
AI GENERATION FROM HOME
>I have an old GPU/my laptop is shit! What's my hope for using AI on it?
Depends on the VRAM of your GPU. If your laptop is ooooooooooooooooold then ditch these thoughts now and seek a free or paid option above.
If your GPU, laptop or otherwise, has over 4GB of VRAM, I can help you.
First, download 0cc4m's Kobold (
https://github.com/0cc4m/KoboldAI) or KoboldCPP (
https://github.com/LostRuins/koboldcpp). These can run "quantized" models, which let you load bigger, smarter models and generate faster results at the sacrifice of some quality.
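To give you an idea of what launching one of these looks like, here's a sketch of a KoboldCPP command line. The flag names follow KoboldCPP's README and may differ between versions (check `--help`), and the model filename is just a placeholder for whatever quantized file you actually downloaded:

```shell
# Hypothetical KoboldCPP launch with a quantized Pygmalion model.
# --usecublas:   offload through CUDA (NVIDIA cards)
# --gpulayers:   how many model layers to keep on the GPU (see below)
# --contextsize: context window in tokens -- keep this low on small cards
python koboldcpp.py pygmalion-6b.q4_0.bin --usecublas --gpulayers 28 --contextsize 1024
```

The `--gpulayers` and `--contextsize` numbers are the two knobs you'll be tuning in the troubleshooting section further down.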
>My GPU has 4GBs of VRAM:
- Look into Pygmalion 2.7B, 1.3B, or 350M. These models will be SIGNIFICANTLY WORSE than C.ai. Don't go in expecting C.ai quality. Don't bother with any quantized models at this point, the quality isn't worth it.
>My GPU has 6GBs of VRAM:
- Look into possibly Pygmalion 6B 4bit, or even Pygmalion 7B 4bit/2bit. Keep context sizes below 1024.
>My GPU has 8GBs of VRAM:
- Look into Pygmalion 6B 8bit and 7B 4bit, or maybe even Pygmalion/Metharme 13B SUPERHOT 2bit. Not tested, but may work? Generations will be slow for 13B. Context sizes should be no higher than 1024.
>My GPU has 10GBs of VRAM:
- Look into Pygmalion/Metharme 13B 4bit. Context size should be no higher than 1400.
>My GPU has 12GBs (or more) of VRAM:
- Look into Pygmalion/Metharme 13B 4bit (or 8bit if you want to up the quality). Depending on your VRAM, you could probably crank context sizes to 2048 or even higher.
From here, any more VRAM is a bonus and will make generations faster.
>Okay I've done this but it's so slow! What do?
Make sure all Kobold layers are on your GPU. If your PC or laptop locks up, cut the layer number in half, then increase it by 1 at a time until your PC/laptop starts to struggle, and go back one layer.
>Alright, all the layers are assigned to my GPU, but it takes 30 seconds - 120 seconds to generate text!
Try lowering your context size. If it's maxed out at 2048, lower it to 1800, then 1400, then 1250, etc., until your speeds pick back up.
Also, make sure your response length is under 300. I find 40 tokens is good for smaller chats, while 100 tokens is a good default.
>I put half my layers into GPU and half into disk/cpu... why isn't it faster??
Always put as many layers as you can into your GPU. The more layers going through your VRAM the better.
The moment you put a layer onto disk or CPU you will get slowdowns. If you can't help it then you've got to live with it.
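If you'd rather start from a ballpark number than trial-and-error from scratch, you can do some napkin math: layer size is roughly the model file size divided by the layer count, minus some VRAM reserved for context. The numbers below (file size, layer count, overhead) are illustrative guesses, not measured values — always watch your actual VRAM usage and back off if you lock up.

```python
# Rough rule of thumb for how many layers fit on your card. Assumes
# layers are evenly sized and reserves some VRAM for context/cache
# overhead -- this is napkin math, not a measurement.
def layers_that_fit(vram_gb, model_size_gb, n_layers, overhead_gb=1.5):
    per_layer_gb = model_size_gb / n_layers
    usable = max(vram_gb - overhead_gb, 0)
    return min(n_layers, int(usable / per_layer_gb))

# e.g. a ~4 GB 4-bit 6B model with 28 layers on an 8 GB card:
print(layers_that_fit(8, 4.0, 28))  # -> 28 (the whole model fits)
```

Start from whatever this spits out, then nudge up or down one layer at a time as described above.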
>My bot keeps talking for me... Why?!
Sometimes the AI isn't perfect. In fact, you'll sometimes see C.ai bots do this too. It's just the nature of the game. C.ai is better at filtering it.
Just erase it and move on, or if you like the sound of it, fold it into your previous message!
>My bot keeps forgetting basic stuff like who I am or where they are... Why?!
That's a context size issue. Keep in mind that bots you download online sometimes have incredibly long descriptions that make managing your context size difficult. Trim away unnecessary OC fluff to get better memory from your bots, and add the things you actually want them to remember to their description.
You fucked the foxgirl and now she's pregnant with 16 babies and due next week? Add it like this:
'{{char}} is 9 months pregnant. {{char}} is due in 1 week. {{char}}'s belly is heavy and makes her waddle. {{char}} is at the market with me looking for apples.'
This way, your current area and important info is always saved.
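A quick way to sanity-check whether your card plus memory lines even fits the context window is the old ~4-characters-per-token heuristic. This is a rough estimate, not the model's real tokenizer, and the sample card text below is made up — but it's good enough to spot a bloated description before the bot starts forgetting things:

```python
# Very rough token estimate (~4 characters per token for English text).
# Heuristic only, not the model's real tokenizer.
def rough_tokens(text):
    return max(1, len(text) // 4)

# Hypothetical card + memory lines, in the {{char}} style shown above:
card = "{{char}} is a shy foxgirl who runs an apple stall at the market."
memory = "{{char}} is 9 months pregnant. {{char}} is due in 1 week."

used = rough_tokens(card) + rough_tokens(memory)
print(used, "of 1024 context tokens used (estimate)")
```

Whatever the estimate says is left over is what the bot actually has for remembering your chat history.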
>Can I run this on mobile? Can my iPhone 15 SE Max 2022 Edition run an AI?
No :) Soz
___
And that's it!
If you have any questions don't hesitate to ask.
PS: If you just skipped to the end to say "TL;DR" then type "Fuck you, TL;DR" to let me know you skipped everything. I just think it'd be funny to see a wall of "Fuck you, TL;DR".