1.  

    This is great! Are you aware of any desktop tools like that?

    1.  

      The constituent compiler/toolchain tools are what’s in use under the covers. gcc/clang/objdump etc.

    1.  

      Willing to bet the recent issue with YouTube’s piracy filter blocking MIT courses and the Blender Foundation is the result of The Machine being the ultimate decider.

      Yeah, I suppose he’s right. I can imagine someone executing a query to tell YT about net value for particular videos and being concerned about some outliers. “zero-point-eight percent of our traffic comes from videos that the participant hasn’t opted in to any monetization. It’s in our T&C that we can make this a requirement. Let’s just do it — we are subsidizing these videos. These users probably just neglected to opt-in and some of their content ‘went viral’.”

      This kind of conversation would sound pretty rational and maaaaaaaybe someone would pipe up with “but what if they didn’t forget to opt-in, they intentionally didn’t opt-in?” If anyone thought that, it was likely dismissed pretty quickly.

      1. 2

        There’s a lot of convoluted logic and repetition that I’d like to eliminate, but there are no tests to help me not break things.

        It’s great that you’ve identified the problem – work on this first! Create tests for the existing code. Unfortunately, the best way to validate that the tests’ expected results are accurate is by interviewing the team that created the existing code.

        IMO if your team thinks that refactoring the existing code is a good way to go, the best move is to make these tests first. You will inevitably create incomplete test cases and discover gaps after you start refactoring. But the good news is that you will force yourself and your team to go through the exercise of evaluating “what is the intent of the existing design?” I predict that you will end up with net fewer bugs as a result.

          1. 4

            Can anyone help me understand why Metal was designed? Apple’s a heavy hitter in Khronos, right? So what was it that they felt like they couldn’t accomplish with OGL/OCL? Are there non-Mac targets that support Metal?

            1. 6

              OpenGL is a tired old API that is too high level for high-performance graphics work. At the time Metal was being developed, folks were working on lower-level APIs that expose the GPU more, like Mantle and DirectX 12, and Metal was Apple’s offering. I believe Mantle eventually evolved into Vulkan, but for some reason Apple is continuing to promote Metal. It’s a nicer API for Swift users, but that’s about it. I would have preferred that they make a safe API over Vulkan for Swift, like Vulkano. Instead, they seem to be under some weird impression that they’ll be able to trap devs in their platform with their own, proprietary API. Or maybe they just can’t bear to give up all the sunk cost.

              1. 2

                they seem to be under some weird impression that they’ll be able to trap devs in their platform with their own, proprietary API

                Is it not working quite well for Microsoft with DirectX?

              2. 1

                As I vaguely recall, it started on iOS as a way to utilize their graphics chips faster and more efficiently (lower overhead).

              1. 13

                “Truly immutable timestamps could be useful”

                As with most stuff, there’s already a standard for that. One company, Surety, even puts the hash of their timestamp ledger (hash chains) in the New York Times to create a paper trail. I’m sure the decentralized checking part could be scaled horizontally a bit without much change in protocol or energy usage. The individual operations are still simple enough to do on chips that are a few bucks each.
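
                As a rough illustration of how small the core mechanism is, here is a minimal hash-chain sketch in Python (illustrative only – the field names and format are made up, not Surety’s actual scheme):

                ```python
                import hashlib, json, time

                def append_entry(chain, document_bytes):
                    """Append a timestamped entry whose hash also covers the previous entry."""
                    prev_hash = chain[-1]["hash"] if chain else "0" * 64
                    entry = {
                        "time": int(time.time()),
                        "doc_sha256": hashlib.sha256(document_bytes).hexdigest(),
                        "prev": prev_hash,
                    }
                    entry["hash"] = hashlib.sha256(
                        json.dumps(entry, sort_keys=True).encode()
                    ).hexdigest()
                    chain.append(entry)
                    return entry

                def verify(chain):
                    """Recompute every link; tampering with any old entry breaks the chain."""
                    prev = "0" * 64
                    for e in chain:
                        body = {k: e[k] for k in ("time", "doc_sha256", "prev")}
                        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
                        if e["prev"] != prev or e["hash"] != digest:
                            return False
                        prev = e["hash"]
                    return True
                ```

                Publishing the newest entry’s hash somewhere hard to alter (a newspaper ad, other people’s servers) is what anchors the whole history.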

                1. 7

                  there’s already a standard for that. One company…

                  The big feature that Bitcoin and other blockchains bring to the table is decentralization. If you can rely on a company for stewardship of your ledger, then by all means use a permissioned database like Surety does.

                  On the trusted timestamping page you linked, if you skip to the decentralized section, you can see it immediately starts talking about Bitcoin.

                  I’m not sure how much Surety’s service costs, but piggybacking on the Bitcoin or Ethereum blockchains is likely far cheaper. Here is a tutorial on how to store a message as an Ethereum contract. The cost varies with the string length, but in this case it was only about $0.20. It works by deploying a Solidity contract that is just a couple of string variables. The output is observable on Etherscan.
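
                  For reference, a similar effect can be had without deploying a contract at all, by putting the hash in the data field of an ordinary transaction. A rough sketch (not the linked tutorial’s method), assuming web3.py-style names, a placeholder RPC endpoint, and a placeholder key:

                  ```python
                  import hashlib
                  from web3 import Web3
                  from eth_account import Account

                  # Placeholder endpoint and key -- substitute your own.
                  w3 = Web3(Web3.HTTPProvider("https://eth-node.example"))
                  acct = Account.from_key("0x" + "11" * 32)

                  message = b"statement to be timestamped"
                  payload = hashlib.sha256(message).digest()  # store only the hash, not the text

                  tx = {
                      "to": acct.address,  # send to yourself; value stays zero
                      "value": 0,
                      "nonce": w3.eth.get_transaction_count(acct.address),
                      "gasPrice": w3.eth.gas_price,
                      "chainId": 1,
                      "data": "0x" + payload.hex(),
                  }
                  tx["gas"] = w3.eth.estimate_gas(
                      {"from": acct.address, "to": tx["to"], "value": 0, "data": tx["data"]}
                  )

                  signed = acct.sign_transaction(tx)
                  # The attribute is raw_transaction on newer web3.py releases.
                  print(w3.eth.send_raw_transaction(signed.rawTransaction).hex())
                  ```

                  Anyone can later hash the original text and compare it against the transaction’s calldata on Etherscan.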

                  1. 4

                    In my model, several foundations in different countries run by different people would agree on a protocol. It would store stuff in SQLite, FoundationDB, or something similarly fast/resilient. A web or app server with plenty of cache would give snapshots of the ledgers. They’d charge a fixed price for bandwidth and storage which could go up as the tech improves.

                    This setup, for something small like hashes with a niche audience, could run on $5/mo VMs. Even with dedicated servers and 5-way redundancy, years of compute, storage, and bandwidth would be just over $1,000 a month. The components they’d use are so vanilla the admins could be part-time. How much does Ethereum or Bitcoin cost in comparison?

                    1. 4

                      Check it out, this message cost $0.80. Zero sysadmin effort on my part due to leveraging a preexisting system. Also, the message won’t vanish if I stop paying VPS bills.

                      If you’re a large corporation that wants to timestamp thousands or millions of messages, the centralized approach could very well be cheaper. For me, verifying maybe a handful of messages per year, it’s way easier to piggyback on a large blockchain project.

                      1. 4

                        That’s a decent point. If you’re just externalizing and aiming for low cost, you can post the messages in threads on diverse forums, Pastebin, etc. I used to do that with hashes on blogs. Never cost a cent.

                        1. 3

                          verifying maybe a handful of messages per year

                          What’s your actual use case for this? I struggle to see viable use cases for the blockchain beyond speculation, so it’s interesting to hear what people consider valid use cases for it.

                          1. 2

                            I should have worded that differently… I don’t timestamp messages all that often. What I meant to convey is that $1000/mo is definitely overkill for anyone with intermittent needs.

                            Pastebin and forum posts are fine, but centralized. If Pastebin ever goes down, or starts manipulating old posts, then the integrity of your verification is compromised. Embedding the message in Ethereum’s blockchain is a much stronger guarantee of permanence and immutability.

                            What sorts of blog posts need such tamper-proofing? Anything dealing with warrant canaries, reverse engineering, or low-level firmware might deserve it.

                            1. 3

                              I should have worded that differently… I don’t timestamp messages all that often. What I meant to convey is that $1000/mo is definitely overkill for anyone with intermittent need

                              The $1,000/mo is for the hardware and bandwidth to run the alternative to a blockchain. On the blockchain, you’re a user who pays for the tiny portion that you use. In the alternative, you’d similarly pay for a tiny portion that you use. Maybe a membership fee that covers the general cost of operations, with you paying for the usage parts at cost. I gave the example of $5 VMs to illustrate the contrast with whatever Bitcoin is doing for mining or transactions. I imagine that takes a bit more hardware than $5/mo.

                              The other article today said companies were paying $10,000 a unit for what supports this system. My hypothesis was that you’d get orders of magnitude better performance, with a year of usage at the same price, plus five-way checking. Adding actors that don’t trust each other adds only a small cost to the system without dragging down the speed of its main DBs. Whereas the folks buying the ASICs are spending tens of millions to support almost nothing in terms of transactions. The traditional tech is so cheap that I was using blogs to do my version of it. They didn’t even notice. That’s the difference between cryptocurrency tech and traditional tech w/ decentralized checking.

                              1. 3

                                Pastebin and forum posts are fine, but centralized. If Pastebin ever goes down, or starts manipulating old posts, then the integrity of your verification is compromised. Embedding the message in Ethereum’s blockchain is a much stronger guarantee of permanence and immutability.

                                It doesn’t solve the permanence problem, but just signing text is sufficient to address tampering, and that doesn’t use a lot of electricity. So is being permanent the selling point? There is also IPFS, which doesn’t require PoW but is decentralized; would that + signing be sufficient for your needs?
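
                                A minimal sketch of the signing half with the Python cryptography package (Ed25519 here, but any signature scheme works). It proves the text wasn’t altered, though not when it was written:

                                ```python
                                from cryptography.exceptions import InvalidSignature
                                from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

                                key = Ed25519PrivateKey.generate()
                                message = b"post as originally published"

                                signature = key.sign(message)      # the author signs once, cheaply
                                public_key = key.public_key()      # published alongside the post

                                try:
                                    public_key.verify(signature, message)  # readers can re-check at any time
                                    print("untampered")
                                except InvalidSignature:
                                    print("tampered")
                                ```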

                                Basically, I’m still struggling to figure out what the blockchain does that makes the excessive energy usage worthwhile. Maybe I’m just being narrow-minded, but I still only really see financial speculation as the primary motivator, so if that becomes unviable, why would anyone continue to run a Bitcoin node (and there goes the permanence)?

                              2. 0

                                Yeah, you’d really need to know the use case to try to use it as justification for the Bitcoin blockchain and all its baggage. As the full quote from the linked article says:

                                Truly immutable timestamps could be useful — assuming anyone finds a timestamp use case so important that it warrants a country-sized percentage of the world’s electricity consumption.

                                1. 2

                                  Ah, sure. Immutable timestamps are a fun way to piggyback on the existing Bitcoin and Ethereum blockchains, but timestamping by itself is not a justification for those coins existing.

                        2. 3

                          plenty of hype here too: https://guardtime.com/ for example

                          1. 3

                            Oh Lord… they’ve gotten bitten by the bug. No surprise, though, since it’s a fad with momentum and lots of money. I expect any company that can do a blockchain product to build one just to make money off it. Given their prior work, there’s little reason to think they actually needed a blockchain vs hash chains with distributed checking and/or HSMs. Just cashing in. ;)

                            Btw, do check out that Functional-Relational slide deck I submitted. It shows the Out of the Tar Pit solution is essentially what the new GUI frameworks are doing. It was just years ahead. So, maybe there are some practical uses for some version of their model.

                            1. 5

                              Guardtime’s KSI Blockchain is probably my favourite blockchain hype. The product was first released in 2007 and they only branded it “blockchain” a few years ago … for marketing reasons.

                              I have a post about it here. In one of their white papers, they literally redefine “blockchain” to mean anything containing a Merkle tree:

                              Unlike traditional approaches that depend on asymmetric key cryptography, KSI uses only hash-function cryptography, allowing verification to rely only on the security of hash-functions and the availability of the history of cryptologically linked root hashes (the blockchain).
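
                              For reference, the Merkle-tree part of that is a few lines of hashing. A toy sketch (not Guardtime’s actual format), where each batch gets one root and the roots are what get linked and published:

                              ```python
                              import hashlib

                              def sha256(data: bytes) -> bytes:
                                  return hashlib.sha256(data).digest()

                              def merkle_root(leaves: list) -> bytes:
                                  """Pairwise-hash leaf hashes upward until a single root remains."""
                                  level = [sha256(leaf) for leaf in leaves]
                                  while len(level) > 1:
                                      if len(level) % 2:            # duplicate the last node on odd levels
                                          level.append(level[-1])
                                      level = [sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
                                  return level[0]

                              # Publishing each batch's root linked to the previous one is the
                              # "history of cryptologically linked root hashes" the white paper describes.
                              batch = [b"doc-1", b"doc-2", b"doc-3"]
                              print(merkle_root(batch).hex())
                              ```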

                              I hear cryptocurrency people touting Estonia’s BLOCKCHAIN REVOLUTION as great news for Blockchain, and even great news for cryptocurrency. It’s not even a blockchain.

                              I mean, I have no reason to think there’s anything wrong with it. I’m sure it does its job just fine. But goodness me, it’s the greatest marketing success “blockchain” the buzzword ever saw.

                              1. 4

                                If anything, it was a great way to show we didn’t need a blockchain when our older concepts were working fine. They might benefit by using the buzzword. Yet such misleading usage just reinforces the phenomenon where the BS spreads further.

                                I’m not even sure it’s reversible at any level, given these fads usually either level off or implode, with the name and reputation damage permanently attached to whatever the name touched. The AI Winter, expert systems, and Common LISP are some of the best examples.

                                1. 1

                                  There’s probably a post I need to write on this topic: basically, we’re going to see a resurgence in the popularity of linked lists with hashes, and they’re going to be branded “blockchain(tm)”. There are a few non-bogus projects along these lines, but it’s not actually so great, and in all cases they should have just used a frickin database.

                                  Likely case, we get mostly-working systems that have an eternally painful “blockchain(tm)” implementation at the core that can’t easily be replaced by something sane.

                          2. 2

                            I had no idea about Surety or even the ability to do that — thanks!

                            1. 3

                              Sure thing! Trusted timestamping is actually one of my go-to examples of hash-chain-based tech that predates the blockchain craze. What the timestamping-on-blockchain folks hope to achieve is what such companies have been doing reliably and efficiently for years now. Better to just invest in and improve on efficient models that already work.

                            2. 2

                              The standard isn’t the hard part; the trust is.

                              How much money would it take to bribe the Surety employee who has the fewest scruples? That’s roughly the ceiling on how much you can bet on their authentication service.

                              1. 4

                                The thing I push is centralized, standard ledgers with decentralized checking. For Surety done that way, you’d have to bribe all the checkers. Alternatively, HSMs can mitigate some of the insider risk.
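
                                A toy sketch of the checking side under that model – each independent checker republishes the ledger’s current head hash, and a client only accepts it when they all agree (the URLs here are hypothetical):

                                ```python
                                import urllib.request

                                # Hypothetical mirrors run by organizations that don't trust each other,
                                # each republishing the ledger's current head hash.
                                CHECKERS = [
                                    "https://checker-a.example/head",
                                    "https://checker-b.example/head",
                                    "https://checker-c.example/head",
                                ]

                                def consensus_head() -> str:
                                    """Accept the head hash only if every independent checker reports the same one."""
                                    heads = set()
                                    for url in CHECKERS:
                                        with urllib.request.urlopen(url, timeout=5) as resp:
                                            heads.add(resp.read().decode().strip())
                                    if len(heads) != 1:
                                        raise RuntimeError("checkers disagree: %r" % heads)
                                    return heads.pop()
                                ```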

                            1. 2

                              Why this over something else?

                              1. 16

                                Good bittorrent daemons are hard to find. rTorrent is common but it tacks on this difficult curses interface you have to deal with. Transmission is okay but it tends to get buggy and break down at scale. Deluge is buggy as hell. btpd is too bare bones, a lot of important features are missing. All of these options also have really poor RPC protocols that use a lot of network, are annoying to write clients for, and don’t scale.

                                synapse focuses only on being a good daemon and delivers only that. The UIs are offloaded to separate projects. If that doesn’t seem like much, that’s because it’s not - but surprisingly this is not easy to find. We made it because there were no good options.

                                1. 3

                                  Transmission is okay but it tends to get buggy and break down at scale.

                                    I must not have used it at scale before; it always seems to work for me. What sort of failure modes do you observe? Corrupted downloads? Halted downloads w/ peers available? Other?

                                  1. 9

                                      Transmission crawls to a halt if you have several hundred torrents. The RPC protocol also becomes unwieldy, because it polls for updates and has to resend large amounts of data every time it refreshes. Synapse is more performant with large torrents or a large number of torrents, and the RPC is push-based, with subscriptions and differential updates.

                                  2. 2

                                    Has synapse been tested at scale then?

                                      Everything I’ve tried has been horrible at scale except rTorrent, and most of the non-rTorrent choices can be pretty horrible even with a modest number of torrents (qBittorrent at a certain point ‘invisibly’ adds torrents, etc.). With ruTorrent as the frontend, I’ve been pretty happy with rTorrent.

                                      Synapse looks interesting, though I’m not terribly enthusiastic about the node.js web client (the node.js Flood client uses significantly more resources on my system than the PHP ruTorrent does).

                                    1. 4

                                      Receptor is 100% frontend, pure static content. Node is just used for compiling and packaging it. You don’t even have to install it yourself - a hosted version is available at web.synapse-bt.org.

                                      1. 3

                                        I’ve done load testing (though not particularly realistic) and it appears that both synapse and receptor perform reasonably at the order of 1000 torrents. One of the goals of the project is to perform well at scale and there’s been a fair amount of ongoing work to achieve that.

                                      2. [Comment removed by author]

                                        1. 2

                                          Sequential downloading and file priority are both implemented.

                                        2. 1

                                          offloading UI to separate projects

                                          I love that approach!

                                      1. 1

                                        Bitcoin has undergone enormous changes in the last sixty days or so. The world started buying lots more, so the exchange rate went up. As a direct consequence, fees were more expensive when considered as the fiat equivalent. As an indirect consequence, there were more transactions competing for space in blocks, so the native (satoshis/byte) fees went up. Headlines talked about just how poorly suited bitcoin was for its original mission and likely how overvalued it might be if it’s so expensive to execute transactions.

                                        But in a stunning turnaround, lots of effort has moved into segwit and LN in the last couple weeks. Fees now are lower than ever.

                                        1. 7

                                          At that time, when you turned on your computer, you immediately had a programming language available. Even in the 90’s, QBasic was installed on almost all PCs. Interpreter and editor in one, so it was very easy to enter the world of programming. Kids could learn it themselves with cheap books and magazines full of BASIC program listings. And I think the most important thing - kids were curious about computers. I can see that today the role of BASIC is taken by Minecraft. I wouldn’t underestimate it as a trigger for a new generation of engineers and developers. Add more physics and more logic into it and it will be an excellent playground like BASIC was in the 80s.

                                          1. 5

                                            Now we have the Raspberry Pi, Arduino, Python, Scratch, and so many other ways kids can get started.

                                            1. 10

                                              Right, but at the beginning you have to spend a lot more time showing the kid how to set everything up properly. I admit that this is itself fun, but in the 80’s you just turned the computer on with one switch and the environment was literally READY :)

                                              1. 7

                                                I think the problem is that back then there was much less competition for kids’ attention. The biggest draw was TV – TV that played certain shows on a particular schedule, with lots of re-runs. If there was nothing on, but you had a computer nearby, you could escape and unleash your creativity there.

                                                Today there are perpetual phones/tablets/computers and mega-society-level connectivity. There’s no time during which they can’t find out what their friends are up to.

                                                Even for me, immersing myself in a computer and exploring programming is harder to do than it was ten years ago.

                                                1. 5

                                                  I admit that this is itself fun, but in the 80’s you just turned the computer on with one switch and the environment was literally READY :)

                                                  We must be using some fairly narrow definition of “the ‘80s”, because this is a seriously rose-tinted description of learning to program at the time. By the late ‘80s, with the rise of the Mac and Windows, the only way to learn to program involved buying a commercial compiler.

                                                  I had to beg for a copy of “Just Enough Pascal” in 1988, which came with a floppy containing a copy of Think’s Lightspeed Pascal compiler, and retailed for the equivalent of $155.

                                                  Kids these days have it comparatively easy – all the tools are free.

                                                  1. 1

                                                    Windows still shipped with QBasic well into the 90s, and Macs shipped with HyperCard. It wasn’t quite one-click hacking, but it was still far more accessible than today.

                                                  2. 4

                                                    Just open the web tools in your browser and you’ll have an already-configured JavaScript development environment.

                                                    I entirely agree with you on

                                                    And I think the most important thing - kids were curious about computers.

                                                    You don’t need to understand how a computer program is made in order to use it anymore, which is not necessarily a bad thing.

                                                    1. 4

                                                      That’s still not the same. kred is saying it was the first thing you saw, and you could use it immediately. It was also a simple language designed to be easy to learn. Whereas you have to go out of your way to get to a JS development environment, on top of learning a complex language and concepts. More complexity. More friction. Less uptake.

                                                      The other issue that’s not addressed enough in these write-ups is that modern platforms have tons of games that treat people as consumers, with psychological techniques to keep them addicted. They also build boxes around their minds where they can feel like they’re creating stuff without learning many useful, reusable skills, versus the prior generation’s toys. Kids can get the consumer and creator high without doing real creation. So now they have to ignore that and do the high-friction stuff above to get to the basics of creating that existed for the old generation. Most won’t want to do it because it’s not as fun as their apps and games.

                                                      1. 1

                                                        There is no shortage of programmers now. We are not facing any issues with not enough kids learning programming.

                                                        1. 2

                                                          I didn’t say there was a shortage of programmers. I said most kids were learning computers in a way that trained them to be consumers vs creators. You’d have to compare what people do on consumer platforms versus things like Scratch to get an idea of what we’re missing out on.

                                                  3. 4

                                                    All of those require a lot more setup than older machines where you flipped a switch and got dropped into a dev environment.

                                                    The Arduino is useless if you don’t have a project, a computer already configured for development, and electronics breadboarding to talk to it. The Raspberry Pi is a weird little circuit board that, until you dismantle your existing computer and hook everything up, can’t do anything – and when you do get it hooked up, you’re greeted with Linux. Python is large, and it’s hard to put images on the screen or make noises with it in a few lines of code.

                                                    Scratch is maybe the closest, but it still has the “what programmers doing education think is simple” problem instead of being “simple tools for programming in a barebones environment that learners can manage”.

                                                    The field of programming education is broken in this way. It’s a systemic worldview problem.

                                                    1. 1

                                                      Those aren’t even close in terms of ease of use.

                                                      My elementary school circa 1988 had a lab full of these Apple IIe systems, and my recollection (I was about 6 years old at the time, so I may be misremembering) is that by default they booted into a BASIC REPL.

                                                      Raspberry Pis and Arduinos are fun, but they’re a lot more complex and difficult to work with.

                                                    2. 3

                                                      I don’t think kids are less curious today, but it’s important to notice that back then, making a really polished program that felt professional only needed a small amount of comparatively simple work - things like prompting for all your inputs explicitly rather than hard-coding them, and making sure your colored backgrounds were redrawn properly after editing.

                                                      To make a polished GUI app today is prohibitive in terms of time expenditure and diversity of knowledge needed. The web is a little better, but not by much. So beginners are often left with a feeling that their work is inadequate and not worth sharing. The ones who decide to be okay with that and talk about what they’ve done anyway show remarkable courage - and they’re pretty rare.

                                                      Also, of course, back then there was no choosing among many available on-ramps. You learned the language that came with your computer, and if you got good enough maybe you learned assembly or asked your parents to save up and buy you a compiler. Today, as you say, things like Minecraft are among the options. As common starting points I’d also like to mention Node and PHP, both ecosystems that owe a lot of their popularity to their efforts to reduce the breadth of knowledge needed to build end-to-end systems.

                                                      But in addition to being good starting points, those ecosystems have something else in common - there are lots of people who viscerally hate them and aren’t shy about saying so. A child just starting out is going to be highly intimidated by that, and feel that they have no way to navigate whether the technical considerations the adults are yelling about are really that important or not. In a past life, I taught middle-school, and it gave me an opportunity to watch young people being pushed away by cultural factors despite their determination to learn. It was really disheartening.

                                                      Navigating the complicated choices of where to start learning is really challenging, no matter what age you are. But for children, it’s often impossible, or too frightening to try.

                                                      I agree with what I took to be your main point, that if those of us who learned young care about helping the next generation to follow in our footsteps, we should meet them where they are and make sure to build playgrounds that they can enjoy with or without a technical understanding. But my real prediction is that the cultural factors are going to continue to be a blocker, and programming is unlikely to again be a thing that children have widespread mastery of in the way that it was in the 80s. It’s really very saddening.

                                                    1. 1

                                                      They should put this in a cheap laptop - a 15’’ one, not those silly netbooks.

                                                      1. 5

                                                        Snapdragon SoC laptops are supposed to arrive in Q1 2018. I suspect that they will be followed by other ARM-based SoCs soon after.

                                                      1. 2

                                                        Wow. Just imagine a city filled with cameras equipped with realtime (or even ~10s per image) masked object detection. With today’s technology you could easily ask the city’s cameras, “So, what happened today?” Each location is probably normally filled with the same ten or twenty objects doing the same ten things.

                                                        What a beautiful, terrible world.

                                                        1. 1

                                                          I remember watching this video around the time when it was released. With the recent explosion in commercial interest in RISC-V it seems like J-Core is getting left behind.

                                                          I look forward to seeing J-Core SoCs and boards. I expect that unlike J-Core, many RISC-V cores will be packed with proprietary extensions and private toolchains.

                                                          1. 3

                                                            What if it were phased in? Like writing OS behavior to mask-interrupts-and-poll with the existing design, and new chips could be designed to capitalize on the fact that no one uses the interrupts?

                                                            I ask because it seems like a cardinal sin for hardware to make backwards-incompatible changes.

                                                            1. 5

                                                              Working on bugs for raiblocks, a cryptocoin I’ve recently heard about. It’s got a small dev team (it was just solo until ~1-2 months ago) so my contributions feel like they’re really valuable.

                                                              1. 2

                                                                Cool post, thanks for sharing.

                                                                Worth noting that HN has an API

                                                                1. 7

                                                                  Mitigations on the way from Chrome and Firefox.

                                                                  1. 1

                                                                    Does anyone know: how can I see whether the version I have has these mitigation(s)? These announcements aren’t explicit about the version numbers that introduce the change.

                                                                    Seems odd that Project Zero disclosed this six months ago and yet so many seem caught off guard. Was the problem only disclosed to CPU vendors and not to OS, compiler, and browser vendors? And yet many of the mitigations are only now going into compilers and browsers?

                                                                    1. 2

                                                                      The Firefox post has an update at the bottom listing the versions now. If you’re on the regular stable release they’re in 57.0.4, which was released on January 4.

                                                                  1. 5

                                                                    iPhone 6S and subsequent discovery that the performance restored to its full potential after a battery replacement.

                                                                    I was pretty skeptical until I read this bit. At least this feature is bound to the battery performance/age instead of device age as a proxy for battery life.

                                                                    These batteries are notoriously difficult to replace, though, right? How much would it cost to use a repair service to replace the battery on an iPhone 6/6S?

                                                                    1. 5

                                                                      I just had my iPhone 6 battery replaced at a Genius Bar. It cost $80 and took 2 prime-time hours.

                                                                      1. 4

                                                                        Two hours to replace a battery. A few years ago on any Android device this would have taken about 5 minutes and cost $20.

                                                                        1. 3

                                                                          It doesn’t actually take that long. I got my battery replaced at a non-Apple shop and it was 5-10 minutes.

                                                                          1. 2

                                                                            And your phone has the extra overhead of clips, switches, and whatever other components are necessary to make it easy to disassemble. You may prefer that overhead and that’s fine, but I think it’s fairly obvious that Apple doesn’t, which is fine too.

                                                                            1. 1

                                                                              The Nokia 5110 and similar phones had the replaceable battery as the whole back cover.

                                                                          2. 1

                                                                            These batteries are notoriously difficult to replace, though, right? How much would it cost to use a repair service to replace the battery on an iPhone 6/6S?

                                                                            Batteries on iPhones aren’t that bad to replace - you remove the screen, disconnect the battery (and unglue it), and then put the new one in. The problem is, IIRC, that third-party batteries might not work with the sensors, so you’ll still have the throttling.

                                                                            1. 1

                                                                              Is it possible to find a genuine Apple battery without the risk of buying clones that lack the proper control electronics? I think they’re distributed weirdly to authorized companies, so it’s not easy to find genuine batteries.

                                                                          1. 2

                                                                            Google pushes the envelope with compilation and linker features on their browser. I’d wonder whether this is a bug or a feature. Was LTO or ThinLTO enabled for the build? That alone could be responsible for a large portion of the time.

                                                                            Also, any significant compilation work should treat file I/O as a potential bottleneck. Was the filesystem cache cold or hot? Was ccache enabled?

                                                                            Compiling the entire Linux kernel for ARM though

                                                                            Notably this is a project which is entirely C + assembly and no C++.

                                                                            Michael Zolotukhin gave a presentation called “LLVM Compile-Time: Challenges. Improvements. Outlook.” at the 2017 US LLVM Developers’ Meeting. Much of the talk is about performance across commits, but IIRC there was a breakdown of where the time is spent in a single compile. I recall the C++ frontend being a significant contributor to compile time.

                                                                              1. 3

                                                                                IIRC Codeship doesn’t store the build configuration in the repo; it stores the build config in your project on their site. It was kinda nice that I didn’t have to push commits to iterate over the various quirks in the build environment.

                                                                                I’m pretty sure Codeship is free not only for open source projects but even for a limited number of private ones too?

                                                                                1. 1

                                                                                  Thanks for mentioning Codeship!

                                                                                  It seems that in their Basic version they don’t require any configuration file. You configure your commands via the web UI. Not sure about private projects yet, as their pricing page doesn’t mention anything about them.

                                                                                  1. 2

                                                                                    Yes, and they allow you to use ssh to debug failures or investigate/prototype.

                                                                                    I’m almost certain that I have used them in the past with a private Bitbucket repo.

                                                                                    1. 3

                                                                                      I see one problem with Codeship for starters, though. They don’t allow you to add a project without connecting your SCM first, which means granting a lot of permissions to Codeship, i.e. seemingly more than is truly necessary.


                                                                                      GitHub

                                                                                      Personal user data:

                                                                                      • Email addresses (read-only)

                                                                                      Repositories: Public and private

                                                                                      • Code
                                                                                      • Issues
                                                                                      • Pull requests
                                                                                      • Wikis
                                                                                      • Settings
                                                                                      • Webhooks and services
                                                                                      • Deploy keys
                                                                                      • Collaboration invites

                                                                                      BitBucket

                                                                                      • Read and modify your account information
                                                                                      • Read and modify your repositories’ issues
                                                                                      • Access your repositories’ build pipelines and configure their variables
                                                                                      • Read and modify your team’s project settings, and read and transfer repositories within your team’s projects
                                                                                      • Read and modify your repositories and their pull requests
                                                                                      • Administer your repositories
                                                                                      • Delete your repositories
                                                                                      • Read and modify your snippets
                                                                                      • Read and modify your team membership information
                                                                                      • Read and modify your repositories’ webhooks
                                                                                      • Read and modify your repositories’ wikis

                                                                                      GitLab

                                                                                      • Access the authenticated user’s API:
                                                                                        Full access to GitLab as the user, including read/write on all their groups and projects

                                                                                      The second problem is similar to what I had with the old drone.io: old preinstalled packages. I’m mostly referring to the very old GCC 4.8 here.

                                                                                      And I’m still not even sure that building C/C++ projects is actually really supported here, as it’s not mentioned explicitly anywhere.

                                                                                      1. 1

                                                                                        Great points. I used it for Python w/o any native/extension code. It was fine for that but might not be suited for C/C++.