Everyone wants to code with AI at the moment, but if you're not careful and use a hosted service like OpenAI's, everything you push to and pull from ChatGPT is there for the taking.
A good example: a friend of mine recently had to understand some code a since-fired dev had written. Once they understood it, they asked ChatGPT for an example of how to do something similar, and guess what? ChatGPT gave back almost the same code, line for line. Was it good? Was it right? Sure, but the dev hadn't used it correctly. That's not the point, though; the point is that security and privacy should be at the forefront of everyone's minds.
So, I wanted to share a way to do it locally without sharing your code with the world.
Firstly, I am running a Mac Studio M2 Max with the following specs:
- 12-core CPU with eight performance cores and four efficiency cores
- 30-core GPU
- 16-core Neural Engine
- 400GB/s memory bandwidth
- 32GB unified memory
I am running LM Studio with the OpenCodeInterpreter-6.7B language model, and I want to integrate it with Visual Studio Code.
So first, I download the model in LM Studio:

Then we load the model and start the local LM Studio server:
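Once the server is running, LM Studio exposes an OpenAI-compatible API on localhost (port 1234 by default; yours may differ if you changed it). Here's a quick sketch, using only the Python standard library, to verify the server is reachable before wiring up the editor:

```python
import json
import urllib.error
import urllib.request

# LM Studio's default local server address; adjust if you changed the port
LMSTUDIO_URL = "http://localhost:1234/v1"

def list_local_models(base_url=LMSTUDIO_URL):
    """Return the model list from a local LM Studio server, or None if unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/models", timeout=5) as resp:
            return json.load(resp)
    except (urllib.error.URLError, OSError):
        return None

models = list_local_models()
print(models if models else "LM Studio server is not running")
```

If you get `None`, check that the server tab in LM Studio shows it listening and that the port matches.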


Once it's started, we move over to Visual Studio Code and install the CodeGPT Chat & AI Agent extension.

Next, we open CodeGPT and select the provider:

Now we can use the chat; for example, to document code, I can simply select the code and ask for it to be explained, documented, or refactored...
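Under the hood, a request like this is just an OpenAI-style chat completion posted to the local server. A minimal stdlib-only sketch of the same call (the model identifier here is an assumption; use whatever name LM Studio shows on its server page):

```python
import json
import urllib.request

# LM Studio's default chat endpoint; adjust the port if you changed it
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_explain_request(code, model="opencodeinterpreter-6.7b"):
    # model id is hypothetical; copy the exact identifier from LM Studio
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful coding assistant."},
            {"role": "user", "content": f"Explain this code:\n\n{code}"},
        ],
        "temperature": 0.2,
    }

def explain(code):
    """Send the selected code to the local server and return the explanation."""
    payload = json.dumps(build_explain_request(code)).encode()
    req = urllib.request.Request(
        LMSTUDIO_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Nothing leaves your machine; the request goes straight to localhost.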

Now, if we want to double-check that we are sending the chat request to LM Studio, we can go back and check the server logs:

Let's try something a little more complex, like finding problems or refactoring:
Results from asking it to find problems in the code:

Results from asking it to refactor the code:

Something so simple and quick to set up makes it easy to get assisted coding. Note that CodeGPT does offer autocomplete, but from what I can tell, it still uses third-party providers' APIs, so be warned.