[-] catty@lemmy.world 1 points 18 hours ago

Damn, I should have ended the post with /s for people like you.

[-] catty@lemmy.world -1 points 18 hours ago

See, here's the thing: why would anyone want to host ALL their stuff on one Pi? That is not what they were designed for. Ollama on a Pi? Are you out of your mind? I'd run the biggest model I can on a modern GPU, not some crappy old computer or Pi. Right tool, right job. And why is dropping containers "less secure"? Do you mean "less cool"? Less easy to deploy? But you're not deploying it, you're installing it. You sound like a complete newb, which is fine, but just take a step back from things and get some more experience. A Pi is a tool for a purpose, not the be-all and end-all. Using an old laptop is not going to save the world, and arguing that it's just better than a Pi (or a similar alternative) is just dumb. Use a laptop for all I care; I'm not the boss of you.

As for an *arr stack, I'm really disappointed with the software and don't use it, and those who do have way too much time, both to set it up and then to make use of it!

[-] catty@lemmy.world 0 points 22 hours ago

I can self-host what I want on a Pi Zero. But I do have some 30 years of experience, so I can probably do things some won't understand or bother with.

[-] catty@lemmy.world 5 points 23 hours ago

I'm sure Silicon Valley is stepping on itself, vying to get its hands on these super-cheap laptops for its 24/7 AI training.

[-] catty@lemmy.world 1 points 23 hours ago* (last edited 23 hours ago)

It's also worth pointing out that you can disable various parts of the Pi so it uses and needs even less juice.
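For example, on Raspberry Pi OS the on-board radios can be switched off with device-tree overlays in the boot config (the exact path varies by OS version; this is a sketch, not an exhaustive list):

```ini
# /boot/firmware/config.txt (or /boot/config.txt on older Raspberry Pi OS)
dtoverlay=disable-wifi       # power down the Wi-Fi radio
dtoverlay=disable-bt         # power down Bluetooth
dtparam=act_led_trigger=none # stop the activity LED blinking
```

On a headless box you can also cut the HDMI output at runtime with `vcgencmd display_power 0`. Each of these only shaves off a fraction of a watt, but on an always-on machine it adds up.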

[-] catty@lemmy.world 1 points 23 hours ago

> Pi's are ARM-based, which still to this day limits the scope of their applicability.

Untrue.

> Also, you should absolutely inspect a laptop before buying. Many, if not most, old laptops will run just fine for the next few years.

Until the battery needs replacing (costing more than a Pi), one key on the keyboard dies, etc.

[-] catty@lemmy.world 0 points 23 hours ago

Please be specific rather than lumping all Raspberry Pis together. Different models have very different characteristics.

[-] catty@lemmy.world 2 points 23 hours ago

This is generally not true. A small server running on an old Pi will have hardly any draw when idling. It will cost pennies to run for the whole year.
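A back-of-envelope sketch of the yearly cost, assuming a ~2 W idle draw (plausible for an older headless Pi with the radios off) and an electricity price of 0.30 EUR/kWh; both figures are assumptions, so plug in your own:

```python
# Yearly running cost of an always-on, mostly idle Pi (assumed figures).
idle_watts = 2.0           # assumed idle draw of an older headless Pi
hours_per_year = 24 * 365  # always-on
price_per_kwh = 0.30       # assumed electricity price, EUR/kWh

kwh_per_year = idle_watts * hours_per_year / 1000  # watt-hours -> kWh
cost_per_year = kwh_per_year * price_per_kwh

print(f"{kwh_per_year:.2f} kWh/year -> {cost_per_year:.2f} EUR/year")
# -> 17.52 kWh/year -> 5.26 EUR/year
```

So at these assumptions it is a few euros a year rather than literal pennies, but still negligible next to a desktop or an old laptop left running 24/7.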

[-] catty@lemmy.world 1 points 1 day ago

But... that's so uncool...

[-] catty@lemmy.world 1 points 1 day ago

That's only the start-up cost. What about the ongoing 24/7 running costs after two years?

[-] catty@lemmy.world 2 points 1 day ago

ODroids don't meet European legal hazard limits on poisonous fumes. I bought one back in the day, and they explained they won't apply for the test because of "the cost", and not, say, because they use cheap solder that doesn't meet lead limits.

[-] catty@lemmy.world 8 points 1 day ago* (last edited 1 day ago)

I dislike posts like this. Technology moves quickly. Pis are great for hobby electronics where you need a little computer. Want a cheap computer to run a few things 24/7, and know what you're doing? Pi it is. You don't need to run containers on a Pi if you have the skills to install the dependencies manually. They cost pennies to run 24/7.
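Installing directly instead of using containers usually just means a package install plus a unit file so the thing survives reboots. A minimal sketch of a systemd service; the `myapp` name, user, and path are placeholders for whatever you actually run:

```ini
# /etc/systemd/system/myapp.service  (hypothetical name and path)
[Unit]
Description=My always-on app
After=network-online.target

[Service]
User=pi
ExecStart=/home/pi/myapp/run.sh
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Then `sudo systemctl enable --now myapp` starts it and enables it at boot, and `journalctl -u myapp` gives you the logs.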

I think of pis as beefed-up calculators. I have made lots of money using a pi zero running code I needed to run 24/7. Code I developed myself.

An old laptop with outdated parts takes up a lot of space, weighs a lot, and has components (fan, keyboard, trackpad) that will most likely die soon and need replacing: an additional concern you don't want.

Someone below saying to use an old laptop if you're living with your parents and don't pay the electricity bill is a bit lame. Do your part for the world; someone will be paying for it.

Ultimately, use what you want, but if you're just starting with servers, use a virtual machine on your computer and log in to it. You can dick about with it as much as you want and reset it back to a working state in seconds.

1
submitted 3 days ago* (last edited 3 days ago) by catty@lemmy.world to c/selfhosted@lemmy.world

I was looking back at some old Lemmy posts and came across GPT4All. Didn't get much sleep last night because it's awesome, even on my old (10-year-old) laptop with a Compute Capability 5.0 NVIDIA card.

Still, I'm after more. I'd like to be able to generate images and view them in the conversation, and, if it generates Python code, to be able to run it (I'm using Debian and have a default Python env set up). Local file analysis would also be useful. CUDA Compute Capability 5.0 / Vulkan compatibility is needed too, with the option to use some of the smaller models (1-3B, for example). A local API would also be nice for my own Python experiments.

Is there anything that can tick those boxes, even if I have to hop between models for some of the features? I'd prefer a desktop client application over a docker container running in the background.
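On the local API point: the GPT4All desktop app can expose an OpenAI-compatible HTTP API (off by default; when enabled in its settings it listens on port 4891, if I recall the default correctly). A minimal sketch of building a request body for such an endpoint; the model name is a placeholder for whatever you have loaded:

```python
import json

def build_chat_payload(prompt: str, model: str = "any-local-model",
                       max_tokens: int = 256) -> dict:
    """Build an OpenAI-style chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

payload = build_chat_payload("Write a haiku about Debian.")
body = json.dumps(payload)

# To actually send it (server running, API enabled in settings):
#   import urllib.request
#   req = urllib.request.Request(
#       "http://localhost:4891/v1/chat/completions",
#       data=body.encode(),
#       headers={"Content-Type": "application/json"})
#   print(urllib.request.urlopen(req).read().decode())
```

Because the API shape is OpenAI-compatible, the same payload works against other local servers (Ollama, llama.cpp's server, etc.) by changing only the URL.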

1
submitted 3 days ago* (last edited 3 days ago) by catty@lemmy.world to c/lemmyshitpost@lemmy.world

I'm watching some retro television and this show is wild! Beauty contests with 16-year-old girls (though at the time it was legal for 16-year-olds to pose topless for newspapers), old racist comedians from working men's clubs doing their routines, Boney M, English singers of the time, and happy dance routines!

vid


catty

joined 3 days ago