
zero-ext README

This is the README for a VS Code extension called "Zero", used to run local LLMs... for free.

Features

This VS Code extension works in conjunction with Ollama and runs models such as DeepSeek.
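As a rough sketch of how the extension and Ollama fit together: Ollama serves an HTTP API on http://localhost:11434 by default, and the extension sends prompts to it. The helper names below are hypothetical, not the actual functions in extension.ts:

```typescript
// Shape of a request to Ollama's /api/generate endpoint.
interface GenerateRequest {
  model: string;
  prompt: string;
  stream: boolean; // false = wait for the full response in one JSON object
}

// Build the JSON payload for Ollama's /api/generate endpoint.
function buildGenerateRequest(model: string, prompt: string): GenerateRequest {
  return { model, prompt, stream: false };
}

// Send a prompt to the locally running Ollama server and return the reply.
// Requires Ollama to be running; Node 18+ provides a global fetch.
async function askLocalModel(model: string, prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest(model, prompt)),
  });
  const data = (await res.json()) as { response: string };
  return data.response;
}
```

With `stream: false`, Ollama returns a single JSON object whose `response` field holds the generated text, which keeps the extension-side code simple.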

Requirements

The model you can run depends on your computer's hardware. If you are unsure, I recommend starting with a smaller model, e.g. DeepSeek-R1-Distill-Qwen-1.5B.

Rough guide by system RAM:

  • 8 GB RAM - DeepSeek-R1-Distill-Qwen-1.5B
  • 16 GB RAM - DeepSeek-R1-Distill-Llama-8B
  • 32 GB RAM - DeepSeek-R1-Distill-Qwen-14B
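The guide above can be sketched as a simple lookup. The helper below is illustrative only, not part of the extension:

```typescript
// Map available system RAM (in GB) to a suggested model from the rough
// guide above. Thresholds follow the guide; anything below 16 GB gets the
// smallest distill.
function suggestModel(ramGb: number): string {
  if (ramGb >= 32) return "DeepSeek-R1-Distill-Qwen-14B";
  if (ramGb >= 16) return "DeepSeek-R1-Distill-Llama-8B";
  return "DeepSeek-R1-Distill-Qwen-1.5B";
}
```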

Release Notes

Users appreciate release notes as you update your extension.

1.0.0

Initial release of ...


Guide & Requirements

This project requires the following dependencies:

  • Node.js - https://nodejs.org/en
  • Yeoman - https://yeoman.io

Scaffold the VS Code extension template:

npx --package yo --package generator-code -- yo code

Alternate extension build: https://www.npmjs.com/package/generator-code

Download and install Ollama from https://ollama.com (see also the FAQ: https://github.com/ollama/ollama/blob/main/docs/faq.md#how-does-ollama-handle-concurrent-requests), then verify the install by running a model:

ollama run llama3.2

Within Ollama, download the model you would like to use, e.g. DeepSeek: https://ollama.com/library/deepseek-r1:8b

  • Ensure the model is appropriate for the spec of the machine it is being run on.

Additionally, add the es6-string-html extension in VS Code for syntax highlighting.

Once installed, replace the extension.ts, package.json and tsconfig.json in your VS Code extension path.
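For orientation, the package.json you drop in declares the command the extension contributes. The snippet below is only a hedged sketch of the general shape of a VS Code extension manifest; the command id, title, and engine version are made up here, and the real values come from the file you copy in:

```json
{
  "name": "zero-ext",
  "main": "./out/extension.js",
  "engines": { "vscode": "^1.80.0" },
  "contributes": {
    "commands": [
      { "command": "zero.start", "title": "Zero: Start Chat" }
    ]
  }
}
```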

To get it up and running, start debug mode in VS Code, then press Cmd + Shift + P and search for the extension command. Run the extension and you should now be running a local VS Code LLM.


Initial tutorial by Fireship.io. Check out their channel for more tutorials: https://www.youtube.com/watch?v=clJCDHml2cA and https://fireship.io/courses/

Enjoy!
