# intellij
v
Hello! There was an announcement about a fine-tuned CodeLlama-7b for Kotlin: https://huggingface.co/JetBrains/CodeLlama-7B-KStack. Can I somehow use it locally with Fleet or IntelliJ IDEA?
a
Hi Pavel 👋 In theory, it's possible to write an IDE plugin that uses that model locally, but generally it wasn't intended for that. What exact tasks do you have in mind? I'm asking because for code completion the IDE uses much, much smaller models trained specifically for that job. For code comprehension, modification, and natural-language generation or consumption, the IDE uses models in the cloud, because these tasks require a lot of computing resources (e.g. for inference) and memory.
v
Hello Anton! Long time no see :) First of all, I wanted to understand the reason for open-sourcing this model, to figure out how I can use it. One option I see is offline work (without Internet). About the plugin... I see that CodeGPT supports LM Studio. So in theory I can run CodeLlama-7b with LM Studio and use it via CodeGPT in IntelliJ?
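For reference, LM Studio exposes an OpenAI-compatible HTTP API once a model is loaded, so a plugin (or any script) can talk to it directly. A minimal sketch, assuming the default `localhost:1234` endpoint and that the model name `codellama-7b-kstack` is whatever identifier your local LM Studio shows for the loaded model (both are assumptions to check against your setup):

```python
import json
import urllib.request

# LM Studio's local server is OpenAI-compatible; localhost:1234 is its
# default port -- verify against your own LM Studio settings.
LM_STUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_completion_request(prompt: str, model: str = "codellama-7b-kstack") -> dict:
    """Build the JSON payload for a chat-completion call.

    "codellama-7b-kstack" is a placeholder model identifier; use the
    name LM Studio reports for the model you actually loaded.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,  # low temperature suits code generation
        "max_tokens": 256,
    }

def complete(prompt: str) -> str:
    """Send the request to the local LM Studio server and return the reply text."""
    payload = json.dumps(build_completion_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        LM_STUDIO_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Since CodeGPT's LM Studio integration goes through the same local endpoint, this is essentially what the plugin does under the hood.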
a
More or less that applies to almost all code-related use cases, except those where you need high speed and/or a small model. Mostly this model is about code generation, conversion, and comprehension, for producing tests/comments/commit messages/etc. afterwards. It's open-source, so you don't need to pay other vendors for using their models and infrastructure, but most likely you'll need to set up the infrastructure for your project yourself. The model takes ~15 GB of memory, and inference also needs computational resources. Personally, I haven't tried it locally, but I believe you'll be satisfied with the speed only on quite powerful notebooks or desktop PCs. Also, this model is a demonstration of the quality of the dataset it was fine-tuned on; that dataset can be used for wider purposes, with some additional fine-tuning.
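The ~15 GB figure lines up with a back-of-the-envelope estimate: 7 billion parameters at 16-bit precision is about 14 GB of weights alone, before activations and KV cache. A quick sketch of that arithmetic, using the standard bytes-per-parameter figures for fp16/int8/int4 quantization (the quantized variants are illustrative, not something the model card promises):

```python
def model_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GB (ignores activations and KV cache)."""
    # params_billion * 1e9 parameters, bytes_per_param bytes each, / 1e9 for GB
    return params_billion * bytes_per_param

# 7B model at common precisions:
for name, bpp in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"{name}: ~{model_memory_gb(7, bpp):.1f} GB of weights")
```

This is why a quantized build is usually what makes a 7B model comfortable on a typical laptop.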
> So in theory I can run CodeLlama-7b with it and use it via CodeGPT in IntelliJ?
Yep, I don’t see a reason why it would not work.