# Second Mate
An open-source, mini imitation of GitHub Copilot using EleutherAI GPT-Neo-2.7B (via Huggingface Model Hub) for Emacs.
This is a much smaller model, so it will likely not be as effective as Copilot, but it can still be interesting to play around with!
## Setup
### Inference End / Backend
- Set `device` to "cpu" or "cuda" in `serve/server.py`.
- The "priming" is currently done in Python. If you want, modify it to another language or turn it off (priming subjectively seems to help).
- Launch `serve/server.py`. This starts a Flask app that lets us sample the model via a REST API (see the sketch after this list).
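For orientation, here is a minimal sketch of what a backend along these lines could look like, using the Huggingface `transformers` and Flask libraries. The endpoint name, sampling parameters, and priming string are illustrative assumptions, not the repository's actual code.

```python
# Hypothetical sketch of a GPT-Neo sampling server; endpoint and parameters are assumptions.
from flask import Flask, request, jsonify
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cpu"  # set to "cuda" to run on GPU

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-2.7B")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-2.7B").to(device)

# "Priming": a short prefix prepended to the user's context, which subjectively
# nudges the model toward code-like completions.
PRIMING = "# Python 3\n"

app = Flask(__name__)

@app.route("/complete", methods=["GET"])
def complete():
    context = request.args.get("context", "")
    inputs = tokenizer(PRIMING + context, return_tensors="pt").to(device)
    output = model.generate(
        **inputs,
        max_new_tokens=32,
        do_sample=True,
        temperature=0.8,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Return only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return jsonify({"completion": tokenizer.decode(new_tokens, skip_special_tokens=True)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```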
### Emacs
- In `emacs/secondmate.py`, set the URL to "localhost" or the address the API is running on (a minimal client sketch appears after this list).
- Configure the Python and script paths in `emacs/secondmate.el`. NOTE: The local Python script is a temporary patch that will be replaced by a GET request made directly from Elisp.
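As a rough illustration of the client role such a script plays, the sketch below sends the buffer context to the backend with a GET request and prints the completion for the Elisp side to insert. The URL, endpoint, and response format are assumptions matching the server sketch above, not the repository's actual code.

```python
# Hypothetical sketch of a small completion client; URL and endpoint are assumptions.
import json
import sys
import urllib.parse
import urllib.request

# Point this at wherever the backend server is running.
API_URL = "http://localhost:5000/complete"

def fetch_completion(context: str) -> str:
    query = urllib.parse.urlencode({"context": context})
    with urllib.request.urlopen(f"{API_URL}?{query}") as resp:
        return json.loads(resp.read().decode("utf-8"))["completion"]

if __name__ == "__main__":
    # Emacs pipes the buffer text up to point on stdin; the completion is
    # printed so the Elisp caller can insert it at point.
    print(fetch_completion(sys.stdin.read()), end="")
```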