There are plenty of open-source models that don't really have any restrictions; you just have to host them yourself (which you can do on your own computer if you have a decent GPU).
For example: Mixtral 8x7B.
Just use koboldcpp or something similar to run the GGUF files and you're good.
Actually, that's not 100% true: you can offload part of the model into system RAM to save VRAM, so you can skip the crazy expensive GPU and still run a decent model, it just takes longer to generate. I personally can wait a minute for a detailed answer instead of needing it in 5 seconds, but of course YMMV.
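For the RAM/VRAM split, a rough sketch of what that looks like with koboldcpp (the model filename and the layer count here are just placeholders, tune `--gpulayers` to whatever fits your VRAM; anything not offloaded stays in system RAM on the CPU):

```shell
# Run a GGUF model with koboldcpp, putting only some layers on the GPU.
# --gpulayers 20  -> offload 20 layers to VRAM, rest runs from RAM (slower but cheaper)
# mixtral-8x7b-instruct.Q4_K_M.gguf is an example quantized file name, not a fixed path
python koboldcpp.py mixtral-8x7b-instruct.Q4_K_M.gguf --gpulayers 20 --contextsize 4096
```

Start low on `--gpulayers`, watch VRAM usage, and raise it until you're just under your card's limit; every layer you can fit on the GPU speeds things up.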