It seems important to note that this project’s source has not been released. Even though the article glosses over this by stating that they can be reached on GitHub, the repository they link to contains only a brief README and a handful of release notes - no actual code (there is already an issue asking about this). This means that other editors that support LSP will not benefit from this server implementation.
So I did this. That “onnxruntime” led me to this: https://github.com/microsoft/onnxruntime, which describes itself as:

ONNX Runtime is a cross-platform inferencing and training accelerator compatible with many popular ML/DNN frameworks, including PyTorch, TensorFlow/Keras, scikit-learn, and more. aka.ms/onnxruntime
Many users can benefit from ONNX Runtime, including those looking to:
Improve inference performance for a wide variety of ML models
Reduce time and cost of training large models
Train in Python but deploy into a C#/C++/Java app
Run on different hardware and operating systems
Support models created in several different frameworks
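For context, here is a minimal sketch of what onnxruntime is typically used for from Python: loading an exported .onnx model and running inference on it. This is not from the article or the project discussed here; the model path and input shape are placeholders.

```python
# Minimal sketch of onnxruntime inference from Python.
# "model.onnx" and the (1, 3, 224, 224) input shape are placeholders.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx")        # load an exported ONNX model
input_name = session.get_inputs()[0].name           # name of the first input tensor
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {input_name: dummy_input})  # None = return all outputs
print(outputs[0].shape)
```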
Not sure what to think of all this. It might be a very sophisticated language server, and they do point out that it is built on top of https://github.com/microsoft/pyright.
Anyhow. I might revert to the old server.
Looks really great, I will give it a try next week at work!