Does anyone know where Servo stands relative to other browsers? I’ve been following the project for years, and from what I learned talking to the developers a while back, end-to-end it’s a lot slower than Chrome and Firefox.
I haven’t been following it very closely, but the impression I get is that it’s quite a bit faster in layout and rendering than anything else. For JS, it uses the SpiderMonkey engine just like Firefox, so it should be equivalently fast. The biggest weaknesses at the moment are the smaller, fiddly things like form fields and some of the more rarely used CSS features. There’s a reason the alpha only targets four specific websites. :)
Oh god, it can only render specific sites…
It only specifically targets four sites. Obviously it’s expected to render far more than that correctly, and any failure to render something Firefox handles is presumably a bug, but those four are prioritized because they’re fairly complicated test cases and they make dogfooding easier.
All web browsers can only render specific sites; it’s just that at some threshold of market share it starts to become the sites’ problem rather than the browser’s problem. :-)
Well, browsers and websites also follow specific agreed-upon standards. Market share isn’t the be-all and end-all decider of everything…
Theoretically, sure. Experience running unusual browser setups does not back up this being how it works in practice. The standards are large and full of edge cases and complying with them is often not enough to make things work.