It’s there for every time after the first. My first two I did manually. Excellent learning experience, very glad to do it. I’m old and lazy now.
New Instance, Same Grenfur
- 0 Posts
- 8 Comments
Grenfur@pawb.social to linuxmemes@lemmy.world • Every now and again a voice comes into my head telling me to install Arch. • 27 · 1 month ago
Here’s the thing. When I talk to friends interested in Linux, it’s always Debian or Fedora that I suggest. I think they draw a good line for what the average user wants and needs, and they’re stable. In fact, I used Fedora for a long time, and all my homelab stuff runs Debian. It wasn’t until computers themselves became a hobby that I switched to Arch. And I think that’s likely the cutoff. If you’re a computer user, stable distros are great. If you’re more a hobbyist… well, the Arch wiki can own your free time.
Grenfur@pawb.social to linuxmemes@lemmy.world • Every now and again a voice comes into my head telling me to install Arch. • 11 · 1 month ago
I want to switch to Nix… the idea of Nix is compelling. In practice, every time I try to test it out I remember that I’m an idiot with a keyboard and I should stop.
Grenfur@pawb.social to No Stupid Questions@lemmy.world • how do i make my own limitation free ai? • 3 · 1 month ago
Most of the options mentioned in this thread won’t act independently of your input. You’d need some kind of automation software. n8n has a community edition that you can host locally in a Docker container. You can link it to an LLM API, emails, Excel sheets, etc. As for doing “online jobs”, I’m not sure what that means, but at the point where you’re trying to get a single AI to interact with the web and make choices on its own, you’re basically left coding it all yourself in Python.
Grenfur@pawb.social to No Stupid Questions@lemmy.world • how do i make my own limitation free ai? • 6 · 1 month ago
Ollama can be run from the CLI.
Grenfur@pawb.social to No Stupid Questions@lemmy.world • how do i make my own limitation free ai? • 11 · 1 month ago
Not entirely sure what you mean by “limitation free”, but here goes.
First thing you need is a way to actually run an LLM. For me, I’ve used both Ollama and Koboldcpp.
Ollama is really easy to set up and has its own library of models to pull from. It’s a CLI interface, but if all you’re wanting is a locally hosted AI to ask silly questions to, that’s the one. Something of note for any locally hosted LLM: they’re all dated, so none of them can tell you about things like local events. Their data is current as of when the model was trained, generally a year or longer ago. If you wanted up-to-date news, you could use something like DDGS and write a Python script that calls Ollama. At any rate.
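The DDGS-plus-Ollama idea above might look something like this minimal sketch. Assumptions: Ollama is serving on its default port (11434), the model name `llama3` is just a placeholder for whatever you’ve pulled, and the search step (commented out) would use the `duckduckgo_search` package.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint


def build_prompt(question, snippets):
    """Fold fresh search-result snippets into the prompt, so the model can
    answer with newer information than its training cutoff."""
    context = "\n".join(f"- {s}" for s in snippets)
    return f"Using these search results:\n{context}\n\nAnswer this: {question}"


def ask_ollama(question, snippets, model="llama3"):
    """POST a non-streaming generate request to the local Ollama server.
    (Model name is a placeholder; use whatever you've pulled.)"""
    payload = {
        "model": model,
        "prompt": build_prompt(question, snippets),
        "stream": False,
    }
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # The search step would be something like:
    #   from duckduckgo_search import DDGS
    #   snippets = [r["body"] for r in DDGS().text("local news", max_results=5)]
    snippets = ["Example snippet one.", "Example snippet two."]
    print(build_prompt("What happened today?", snippets))
```

Nothing fancy, but it shows the shape of it: search, stuff the results into the prompt, hand it to the local model.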
Koboldcpp. If your “limitation free” is more spicy roleplay, this is the better option. It’s a bit more work to get going, but it has tons of options to let you tweak how your models run. You can find .gguf models on Hugging Face; load ’em up and off you go. Kobold’s UI is kinda mid, though more granular than Ollama’s. If you’re really looking to dive into some kind of roleplay or fantasy-trope-laden adventure, SillyTavern has a great UI for that and makes managing character cards easier. Note that ST is just a front end and still needs Koboldcpp (or another back end) running for it to work.
Models. Your “processing power” is almost irrelevant for LLMs; it’s your GPU’s VRAM that matters. A general rule of thumb is to pick a model whose download size is 2-4 GB smaller than your available VRAM. If you’ve got 24 GB of VRAM, you can probably run a model that’s 22 GB in download (roughly a 32B model, depending on the quant).
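That rule of thumb fits in a one-liner. A rough sketch; the default headroom value here is my assumption of typical overhead for context/KV cache, so bump it toward 4 for longer contexts:

```python
def max_model_size_gb(vram_gb, headroom_gb=2):
    """Rule of thumb: model download size should be roughly 2-4 GB smaller
    than available VRAM, leaving headroom for context/KV cache."""
    return max(vram_gb - headroom_gb, 0)


# A 24 GB card comfortably fits a ~22 GB download (roughly a 32B quant).
print(max_model_size_gb(24))  # → 22
```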
Final note: I could have misunderstood, and this whole question was about image gen, hah. InvokeAI is good for that. Models can be found on CivitAI (careful, it’s… wild). I’ve also heard good things about ComfyUI but have never used it.
GL out there.
Grenfur@pawb.social to Games@lemmy.world • The Expanse: Osiris Reborn Announcement Trailer • English • 491 · 3 months ago
The Expanse is by far my favorite sci-fi universe ever. I’m not sure how it’ll go for Owlcat, but damn does this make me excited.
So I recently switched to vim as my text editor and started using vimwiki for notes. But I must know: what insanity could one possibly do with a text editor other than… well, text edit?