Now I’m considering that maybe buying an Intel NUC for a closet server wasn’t such a good deal. It occurred to me as I was selling my five-year-old Android phone on Craigslist for $40 what an incredible piece of technology it is that is now adjacent to junk. Multiple photo/video cameras, microphone, battery (built-in UPS!), screen with touch input, GPS, GPU, decent CPU, Wi-Fi, 4G antenna (stay online even if your apartment loses power). Getting a Raspberry Pi-like board with these capabilities would set you back a pretty penny. Anybody else have experience using these as mini servers?
This is a great point. I have several old Android phones kicking around that I’ve wanted to start to hack on. My first idea was to try to build a cyberdeck with my old Pixel 4a, but I bet they could be used for mini server applications! Do you know of a way to easily get some Linux running on them by chance? Seems like that’d be the first step.
Great find! According to the story submission guidelines, though:
> When submitting a URL, the text field is optional and should only be used when additional context or explanation of the URL is needed. Commentary or opinion should be reserved for a comment, so that it can be voted on separately from the story.
I know communities like those on Reddit have a practice of including text with link submissions. However, here on Lobsters, homepage articles include a preview of the article’s text. Submission text overrides that preview.
As such, it’s preferable to just let the first few lines of the article speak for themselves, rather than a description from a secondary or tertiary source written in the third person. This is part of what helps people find stories they’d actually be willing to continue reading.
The article says:

> My preliminary speed tests were fairly slow on my Macbook. However, once I deployed the app to an actual iPhone the speed of OCR was extremely promising (possibly due to the Vision framework using the GPU). I was then able to perform extremely accurate OCR on thousands of images in no time at all, even on the budget iPhone models like the 2nd gen SE.

Not clear why the MacBook was so much slower, however!
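For anyone curious, the Vision call at the center of this is roughly the following — a minimal sketch, not the article’s actual code, and the file path is a placeholder:

```swift
import Foundation
import Vision

// Minimal sketch (macOS 10.15+ / iOS 13+): recognize text in one image.
// "page.png" is a placeholder path, not from the article.
let url = URL(fileURLWithPath: "page.png")

let request = VNRecognizeTextRequest { request, _ in
    let observations = request.results as? [VNRecognizedTextObservation] ?? []
    for observation in observations {
        // Each observation carries ranked candidate strings; take the best one.
        if let best = observation.topCandidates(1).first {
            print(best.string, best.confidence)
        }
    }
}
request.recognitionLevel = .accurate   // .fast trades accuracy for speed

do {
    try VNImageRequestHandler(url: url, options: [:]).perform([request])
} catch {
    print("OCR failed:", error)
}
```

The same request runs unchanged on both platforms, which is presumably why deploying to the phone needed no code changes, only a faster backend.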
I wonder if there’s a reason they didn’t use a fleet of macOS machines running multiple iOS simulators.
I was wondering that as well, but it’s likely a cost/performance tradeoff rather than lack of functionality on macOS.
I wonder if his MacBook is an Intel one or an Apple Silicon one. The latter has an architecture closer to the iPhone’s.
Starting with a few $40 iPhone SE 2s probably factored into the calculation.
Or run the Vision APIs on macOS, when it’s documented as supported…
I’m aware of ocrit, a command-line tool that uses Apple’s Vision for OCR.
Wasn’t he simulating an iPhone in Xcode when testing on his MacBook, though?
Maybe save the OCR’d text in the JPEG header or the lowest bitplane?
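The lowest-bitplane idea is classic LSB steganography. A toy sketch over raw decoded pixel bytes (the function names and the NUL-terminator convention are mine, not from any library):

```python
def embed_lsb(pixels: bytearray, text: str) -> bytearray:
    """Hide NUL-terminated UTF-8 text in the least-significant bit of each byte."""
    payload = text.encode("utf-8") + b"\x00"
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("carrier too small for payload")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # clear the LSB, then set it to the payload bit
    return out


def extract_lsb(pixels: bytes) -> str:
    """Read LSBs back out, eight at a time, until the NUL terminator."""
    payload = bytearray()
    current = 0
    for i, value in enumerate(pixels):
        current |= (value & 1) << (i % 8)
        if i % 8 == 7:
            if current == 0:  # hit the terminator
                break
            payload.append(current)
            current = 0
    return payload.decode("utf-8")
```

Note this only survives lossless storage (PNG, TIFF, raw pixels); JPEG’s lossy compression would scramble the low bitplane, which makes the header (e.g. an EXIF or comment field) the more practical of the two ideas for JPEGs.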