The state of local AI today: llama3.1 (the 8B version) runs well on my MacBook Air with 24GB of RAM. I run it with Ollama. There’s a decent FOSS terminal app, Aider, that turns it into an AI pair programmer: it can edit files directly in your git repo and commit the changes.
Install those and you can have a reasonable coding assistant that’s 100% free and 100% local.
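Setup is roughly: `ollama pull llama3.1` to fetch the model, `pip install aider-chat` for Aider, then point Aider at the local model (something like `aider --model ollama/llama3.1`; check Aider’s docs for the exact model prefix and any `OLLAMA_API_BASE` env var it wants).

If you want to script against the model outside Aider, Ollama also exposes a local HTTP API on port 11434 by default. Here’s a minimal sketch, assuming llama3.1 is already pulled and `ollama serve` is running; the `ask` helper name is just for illustration:

```python
import json
import urllib.request

# Ollama serves a local HTTP API on port 11434 by default.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask(prompt: str) -> str:
    """Send a one-shot prompt to the local llama3.1 model and return its reply."""
    payload = json.dumps({
        "model": "llama3.1",
        "prompt": prompt,
        "stream": False,  # ask for a single JSON object instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("Write a Python one-liner that reverses a string."))
```

Stdlib only, no dependencies, and nothing ever leaves your machine.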