I hope the next step is going to be a CLI command that merges “get a query phrase from this question, find related documents, ask the question again with the closest documents provided for context”, right? (I think this is similar to what LlamaIndex does.)
It’s already really cool though - I’m happy it supports Llama 2.
Yeah, that’s absolutely where I’m going with this. That’s most likely going to be a plugin for it.
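To make the idea concrete, here is a rough sketch of that retrieve-then-prompt flow in Python. This is not the actual tool or the planned plugin: embed() and ask_llm() are caller-supplied placeholders, and the in-memory list of documents stands in for a real embeddings store.

    import math

    def cosine_similarity(a, b):
        # Cosine similarity between two equal-length vectors
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

    def answer_with_context(question, documents, embed, ask_llm, top_k=3):
        # embed(text) -> list[float] and ask_llm(prompt) -> str are
        # placeholders supplied by the caller; documents is a list of
        # dicts with "text" and a precomputed "embedding".
        query_vector = embed(question)
        ranked = sorted(
            documents,
            key=lambda doc: cosine_similarity(query_vector, doc["embedding"]),
            reverse=True,
        )
        context = "\n\n".join(doc["text"] for doc in ranked[:top_k])
        prompt = (
            "Answer the question using only the context below.\n\n"
            "Context:\n" + context + "\n\n"
            "Question: " + question
        )
        return ask_llm(prompt)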
This is very useful. Some great ideas in the blog post, especially about embedding summaries and auto-generated questions for documents.
Follow-up: my Symbex tool now has the ability to spit out every symbol (class, method, function) in your codebase in a format that LLM can accept and use to generate embeddings: https://github.com/simonw/symbex/releases/tag/1.4
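As a sketch of what you could do with that output: the snippet below embeds each extracted symbol and writes the vectors to disk. The (name, code) pairs and the embed() function are assumptions for illustration, not the actual Symbex output format or the LLM tool’s API.

    import json

    def build_symbol_index(symbols, embed, path="symbol_embeddings.json"):
        # symbols: iterable of (name, source_code) pairs, standing in for
        # whatever Symbex emits; embed(text) -> list[float] is a placeholder.
        index = [
            {"name": name, "code": code, "embedding": embed(code)}
            for name, code in symbols
        ]
        with open(path, "w") as fp:
            json.dump(index, fp)
        return index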