# random
s
Since I burned through my Junie quota for the month in no time, I'm thinking of giving local models a shot. I've got an RTX 2080 and can run Deepseek-Coder R1 8B without much trouble. There are tons of options at ollama.com/search, and I'm sure some of them will run just fine on my setup. My question is: can anyone recommend a good local model for Kotlin, or something that comes close to Junie's capabilities? Edit: Never mind. Junie doesn't support local models. I believed it did, but I fell for a UX issue.
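(Rough sketch of how I'd poke at a pulled model from plain Kotlin, outside of Junie entirely. This assumes Ollama's default local HTTP API on localhost:11434; the model tag "deepseek-r1:8b" below is just a placeholder for whatever you actually pulled.)
```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Non-streaming generate request against Ollama's local HTTP API.
    // The model tag is a placeholder; use whatever `ollama pull` gave you.
    val body = """
        {
          "model": "deepseek-r1:8b",
          "prompt": "Write a Kotlin data class for a 2D point with a distance function.",
          "stream": false
        }
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:11434/api/generate"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())

    // Raw JSON comes back; the generated text is in the "response" field.
    println(response.body())
}
```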
p
I'm having the same trouble 😵‍💫
s
I see a lot of people on Reddit who have used up their quota. ^^
I think I would prefer daily / hourly limits like ChatGPT has. Being able to do nothing for the next 30 days... it's a bit harsh.
p
I was barely able to finalize a POC for a simple app... now I need to do the work myself 😂
s
Maybe, but I'm not giving up quite yet. I asked Grok and it told me that CodeGemma 7B may be a good model for Kotlin. I just installed LM Studio and will give it a shot.
I wish JetBrains would recommend local models.
CodeGemma 7B produces working Kotlin code, but the solutions are very hacky compared to ChatGPT. What did I expect? ^^
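(In case anyone wants to reproduce this: LM Studio can expose an OpenAI-compatible server locally, on port 1234 by default as far as I can tell, and you can hit it from plain Kotlin. The model id "codegemma-7b" below is a placeholder; use whatever id LM Studio shows for the model you loaded. Just a sketch, not wired into Junie or anything.)
```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Chat-completions style request against LM Studio's local
    // OpenAI-compatible server (start it from LM Studio first).
    // Model id is a placeholder for whatever model you loaded.
    val body = """
        {
          "model": "codegemma-7b",
          "messages": [
            {"role": "user", "content": "Write a Kotlin extension function that reverses a String."}
          ],
          "temperature": 0.2
        }
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:1234/v1/chat/completions"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())

    // Raw JSON; the generated code sits in choices[0].message.content,
    // so parse it with your JSON library of choice.
    println(response.body())
}
```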
p
Getting ChatGPT power for free? 😂
s
I still have hopes that some day a local model will be good enough ^^
Right now I would describe it as barely that...
Working with AI agents is still like having an extreme rookie on your team who doesn't test their stuff ^^
d
s
Yes, but those are the big cloud-hosted models. I'd like the same for local models you can run on your own machine. I can't connect Junie to ChatGPT myself.
I'm confused... I believed Junie works with local models, but maybe it does not. I asked it what model it is and it responded with Claude. I pulled the network cable and it doesn't work anymore.
d
I did not check it, but as far as I know Junie doesn't have a local LLM connectivity feature; only the AI Assistant can connect to a local LLM, and only in the chat. But maybe my knowledge is outdated.
s
Yes, it turned out to be just a UX/UI issue. I have two licenses, which can be managed in an extra menu I wasn't aware of. In online mode I need to choose the license with quota myself, but in offline mode Junie automatically uses the license with quota left. So there may indeed be no support for local models in Junie ... that would have been great.