
Re: OpenAI Whisper in Debian?



On Sun, 2023-04-16 at 22:20 +0200, Petter Reinholdtsen wrote:
> The three packages needed to get OpenAI Whisper working in Debian
> bookworm are now available in the deeplearning team git repositories:
> 
>  * <URL: https://salsa.debian.org/deeplearning-team/tiktoken >.
>  * <URL: https://salsa.debian.org/deeplearning-team/triton >.
>  * <URL: https://salsa.debian.org/deeplearning-team/openai-whisper >.
> 
> The package is tested on a machine without a relevant GPU and found to be
> working.  My initial tests used around 2.2 GiB RAM for the 'small' model
> and 5.0 GiB RAM for the 'medium' model.  My test machine lacks enough RAM
> to run the 'full' model, but I assume it will use even more memory.

Cool! Generally, as long as the small model works, I don't really expect
the largest model to go wrong, since in most cases they are expected to
share the same code paths.
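
If anyone wants to double-check the memory figures, something along these
lines should do it (a rough sketch on my side, not from the original test
setup; it assumes the upstream Python API is kept and uses "sample.wav" as
a placeholder input):

  # Rough sketch for measuring peak memory of a transcription run.
  # Assumes the openai-whisper Python API; "sample.wav" is a placeholder.
  import resource
  import whisper

  model = whisper.load_model("small")
  model.transcribe("sample.wav")

  # On Linux, ru_maxrss is the peak resident set size in KiB.
  peak_kib = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
  print("peak RSS: %.1f GiB" % (peak_kib / 1024.0 / 1024.0))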

> There is not yet a model package available.  The whisper binary will try
> to download the requested model (model 'small' by default) from the
> Internet on first invocation and store it in ~/.cache/whisper/.

Just as I expected. Some existing packages, like NLTK, already do this.
One thing that might be tricky is which archive area this package
should go into. Maybe it should go into the contrib section -- not sure.

https://www.debian.org/doc/debian-policy/ch-archive.html#the-contrib-archive-area
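
For context, here is roughly what that first-run download looks like from
the Python side (a minimal sketch, assuming the packaged module keeps the
upstream openai-whisper API; "recording.wav" is just a placeholder input
file):

  # On the first call, load_model() fetches the weights over the network
  # and stores them in ~/.cache/whisper/; later calls reuse the cached file.
  import whisper

  model = whisper.load_model("small")        # 'small' matches the default mentioned above
  result = model.transcribe("recording.wav")
  print(result["text"])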

> I suspect a good next step would be to upload tiktoken and triton to
> experimental.  I am unsure I have the capacity to properly maintain more
> packages myself, and thus unsure if I want to do so.  I am barely able
> to maintain the packages in my list as it is.

I don't plan to start working on pytorch 2.0 and triton in the near
future, given my current bandwidth. If you need to upload triton, please
don't wait for me.

In fact I'm sort of enjoying a Debian break before the stable release,
and will also slack off for a while after the stable release.

