How big can a project be and still call itself ‘tiny’? This looks neat, but the repository is 700 files with over 100k lines of code (although the tiny_dnn directory has only 115 files and 22k lines of code).
Anyway, it looks like a neat library! I get that it’s smaller than other implementations, and it has no dependencies. I just think ‘tiny’ is still an odd adjective here.
It could mean ‘tiny’ in comparison to other implementations of deep learning, not overall lines of code. Then again, I don’t know what the expected size of a deep learning implementation written in C++ is.
Or perhaps it’s tiny merely in scope compared to other deep learning libraries. But again, I’m only guessing, since I don’t know what the scope of an “average” deep learning implementation is.