Ironically, “square peg in a round hole” is the first thing that comes to mind. At some point I think I would consider reevaluating my software platform. What makes OS X an irreplaceable component here?
They mentioned they use the built-in graphics libraries in OS X. I imagine it’s done for stability reasons and an interest in consistent hardware performance. If they used custom boxes running a Linux flavor, they might have to bring in another company, or create a new role focused on building and configuring those systems.
One nice bit about what OS X does is compiling the filter graph into a single pass that’s then executed on the GPU. The typical image-processing tools people run on Linux (ImageMagick, etc.) don’t do that. Halide does do it, but it’s quite recent and a lot rougher around the edges (it’s code from a PhD thesis, though much better than typical “research code”).
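The fusion idea can be sketched in a few lines of Python. This is a toy model of the semantics only, not how Core Image or Halide are actually implemented, and the filter names and constants are made up:

```python
def brighten(p, amount=10):
    # One filter "node": in an unfused pipeline this pass writes a
    # full intermediate image before the next filter reads it.
    return min(max(p + amount, 0), 255)

def contrast(p, factor=1.2):
    # A second node: another full pass over memory.
    return min(max((p - 128) * factor + 128, 0), 255)

def unfused_pixel(p):
    # Two separate passes with an intermediate in between,
    # which is roughly what ImageMagick-style pipelines do.
    return contrast(brighten(p))

def fused_pixel(p):
    # Both filters composed into one per-pixel expression -- the form
    # a fusing compiler produces when it collapses the filter graph
    # into a single GPU kernel, with no intermediate image.
    return min(max((min(max(p + 10, 0), 255) - 128) * 1.2 + 128, 0), 255)

# Same result either way; the fused form just does one pass instead of two.
assert all(abs(unfused_pixel(p) - fused_pixel(p)) < 1e-9 for p in range(256))
```

For large images the win is mostly memory bandwidth: the fused version never materializes the intermediate buffer between the two filters.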
The imaging pipeline is vastly superior to anything currently available on Linux. The real question is why not use Windows, which has, if anything, a technically superior story to OS X, and is certainly much more at home in a datacenter.
I’m really interested in why (or in what ways) OS X is better than Linux, as well as why Windows would be better than OS X. Do you have any resources to read more about this?
You can do things with OS X’s imaging and video pipeline that you’d have to spend considerable effort building a bespoke Linux solution to replicate. It’s not prima facie ridiculous.
Reminded me of http://macminicolo.net/ .
Interesting, but this also sounds like someone made some very bad decisions way up the pipe.
Also, “Once a rack hits the datacenter floor, it can be processing images in as little as two hours” seems like a really funny idea of “rapid.”
It’s definitely not rapid by most measures, but the power that becomes available (44 Mac Pros) is pretty significant.