Ollama is a big thing. Do you want it to be fast? Then you will need a GPU. How large is the model you will be running? 7/8B on CPU isn't as fast but no problem; 13B is slow on CPU but possible.
"Eigentlich fertig" ("actually done") was for an IP subnet calculator that I programmed with a fellow student.
Probably, for the AI. If I had to guess, the backend is using something like LocalAI, koboldcpp, or llama.cpp.
sudo apt install steam ;D
Kinda reminds me of Far Cry 5, where you can finish the game in 5 minutes by not arresting the preacher and just AFKing.
Mine's good. From the Reddit thread, the majority of people installed it from F-Droid, and maybe Play Protect removed it because the package name and signatures didn't match. At least that's what's being suspected.
Seems interesting. I hope this will also be a local feature and not dependent on a cloud service, but since the company behind this is using some AI and ML, I doubt it will run on the computer, especially since Firefox is known to run on basically anything.
That's nice. One question though: is RUNE CODEX now? I saw on ElAmigos that RUNE/CODEX refer to the same people.
Mom, I finally got mentioned! That's me.
I really liked it. I used it whenever I needed to send some 100 MB files to friends. God, I will miss that site, it was great.
KVM/QEMU is really good too, you should try it out. Or is there a special feature holding you back on VirtualBox?
They also created Ghidra! Probably the second best.