So far, running LLMs has required a large amount of computing resources, mainly GPUs. When run locally, a simple prompt to a typical LLM takes, on an average Mac ...
This is a multi-user version of shadowsocks-python. It requires a MySQL database or a panel that supports the SS MU API. Run the Docker container (random free ports will be allocated ...
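A minimal sketch of what running the container might look like, assuming it reads its MySQL connection details from environment variables. The image name and every variable name below are assumptions for illustration, not the project's documented interface, since the snippet above is truncated:

```shell
# Hypothetical invocation: image name and env-var names are placeholders.
# --network host is used because a multi-user Shadowsocks instance binds
# one port per user, which is awkward to map individually with -p.
docker run -d \
  --name ss-mu \
  --network host \
  -e MYSQL_HOST=127.0.0.1 \
  -e MYSQL_USER=ss \
  -e MYSQL_PASS=secret \
  -e MYSQL_DB=shadowsocks \
  shadowsocks-mu
```

With a panel that speaks the SS MU API instead of direct database access, the MySQL variables would be replaced by the panel's URL and API key, under the same caveat that the exact names depend on the project's actual configuration.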