Benchmarks and leaderboards:
Roughly speaking, a MacBook's RAM correlates with the parameter count of the largest model you can run locally. A 34B model is enough to make the MacBook's fans spin up.
ollama run phind-codellama:34b-v2
{
  "models": [
    {
      "title": "Phind CodeLlama",
      "provider": "ollama",
      "model": "phind-codellama-34b",
      "api_base": "http://localhost:11434"
    }
  ],
  "model_roles": {
    "default": "Phind CodeLlama"
  },
  "slash_commands": [
    { "name": "edit", "description": "Edit highlighted code", "step": "EditHighlightedCodeStep" },
    { "name": "comment", "description": "Write comments for the highlighted code", "step": "CommentCodeStep" },
    { "name": "share", "description": "Download and share this session", "step": "ShareSessionStep" },
    { "name": "cmd", "description": "Generate a shell command", "step": "GenerateShellCommandStep" }
  ],
  "custom_commands": [
    {
      "name": "test",
      "prompt": "Write a comprehensive set of unit tests for the selected code. It should setup, run tests that check for correctness including important edge cases, and teardown. Ensure that the tests are complete and sophisticated. Give the tests just as chat output, don't edit any file.",
      "description": "Write unit tests for highlighted code"
    }
  ]
}
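The config's api_base points at Ollama's local HTTP server, which you can also exercise directly: Ollama exposes a /api/generate endpoint that accepts a JSON body with the model name and prompt. A minimal sketch of building such a request body (the helper name build_generate_request is our own, not part of any library; set stream to false to get a single JSON response rather than a stream of chunks):

```python
import json

def build_generate_request(model: str, prompt: str, stream: bool = False) -> str:
    """Build a JSON request body for Ollama's /api/generate endpoint."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

body = build_generate_request(
    "phind-codellama:34b-v2",
    "Write a Python function that reverses a string.",
)
# POST this body to http://localhost:11434/api/generate, e.g. with
# urllib.request or curl, while `ollama run` (or `ollama serve`) is active.
print(body)
```

This only constructs the payload; actually sending it requires the Ollama server to be running locally on port 11434.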