
  2. 10

    Why was this written in C++ and not as a shell script? Kinda seems like a crime. A sample of the code…

    if(pm == "apt") {
    	// apt
    	search = "apt search ";
    	install = "apt install ";
    	uninstall = "apt purge ";
    	autoremove = "apt purge --autoremove";
    	update = "apt update";
    	upgrade = "apt upgrade";
    	upgrade_pkg = "apt upgrade ";
    	clean = "apt autoclean && apt clean";
    }
    
    1. 1

      I submitted a PR to make it more C++-y and the author felt like the series of if-statements was easier to modify ¯\_(ツ)_/¯
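      For illustration, a table-driven alternative along those lines might look roughly like this. This is only a sketch: the struct, field, and function names here are made up, not taken from sysget or the actual PR.

      ```cpp
      #include <iostream>
      #include <map>
      #include <string>

      // One entry per package manager instead of a long if-else chain.
      struct PkgCommands {
          std::string search, install, uninstall, update, upgrade;
      };

      // Hypothetical lookup table; command strings for apt match the
      // snippet above, the dnf entry is illustrative.
      std::map<std::string, PkgCommands> make_table() {
          return {
              {"apt", {"apt search ", "apt install ", "apt purge ",
                       "apt update", "apt upgrade"}},
              {"dnf", {"dnf search ", "dnf install ", "dnf remove ",
                       "dnf check-update", "dnf upgrade"}},
          };
      }

      int main() {
          const auto managers = make_table();
          // Look up the active package manager once, then reuse its commands.
          std::cout << managers.at("apt").install << "vim\n";  // apt install vim
          return 0;
      }
      ```

      Adding a new package manager then means adding one table entry rather than another branch, which is presumably what the PR was going for.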

    2. 7

      I get the idea, but… would this really be an improvement? The mature package-manager programs have some extremely specific --switches, and some behaviours particular to just that package system (or, at best, a very small subset of systems), e.g. the USE flags of Gentoo’s Portage. I’m also a little surprised to see the likes of npm and Rubygems mixed in there, because library/module management is a different animal than system-wide app and library management.

      I’m skeptical about this.

      1. 2

        I guess the advantage would be a UX one: you can just say sysget install and forget about the implementation details of the distro you happen to be on. Most of the time you don’t need package-manager-specific flags, and the standard install/search/info/remove is enough (build-from-source systems like Gentoo’s Portage are the exception, not the rule).

        This also applies to wrapping npm/gem/pip/etc.: instead of having to remember the syntax for all of these tools, you can just use this as a wrapper and do the common operations without much thinking.

        1. 2

          At least in the case of pip: since it is a package installer (not a manager), masking that difference is less a UX convenience and more a bug.

          1. 2

            What does that mean in practical terms? As far as I know pip can install, search, uninstall, etc.?

            1. 2

              At the surface level, there’s no mechanism to resolve dependency conflicts, no guarantee that legacy package versions will continue to be available on PyPI in the future, and other related issues.

              In practical terms, if you pip freeze > requirements.txt a trivial project today and then pip install -r requirements.txt in a few years, you shouldn’t expect it to work, regardless of what you’re doing with virtual environments. Well, unless you set up your own package index, though at that point you might as well check in the source if you’re not managing a large site.

              But there are much deeper issues in other ways due to C bindings, Cython, binary wheels, etc. You can see some relatively trivial examples in shapely or fiona of well-behaved packages that suggest you install other dependencies in a more appropriate way, but there are many (many many) poorly behaved packages indexed on PyPI.

              But you can pip install amazing software, from ansible to numpy to tensorflow. It gets you up and running quickly, and that’s a massive distribution achievement. You may never run into issues if you can avoid making any sudden movements. Still, most publishers of that kind of significant software steer people towards more appropriate installation methods.

              I stick to my OS package manager for software installation. Most important Python software is there in any major Linux distro or BSD flavor. When something isn’t there, I’d approach it with the same caution and discernment as anything I’d be building from source. Installing software outside a complete package manager usually ends in heartache unless you RTFM and understand what it’s doing. Plus, most *nix systems already come with robust package management (Windows users are probably best off with Anaconda, Mac with some duct-taped combination of brew and conda).

              I assume this doesn’t become an issue for gem and npm due to the nature of Ruby & JS (apologies if this is inaccurate, those languages aren’t part of my tool set). Lots of the things that get installed via pip aren’t related to interpreted Python.

              For the stout of heart, there’s a nice tool I came across just this week to convert pip packages into PKGBUILD files for pacman (pipman). I just switched my Linux workstation to Arch with plans to contribute back by adding packages for anything I needed to do my work, but there hasn’t been a single package I’ve needed to build from ports (the AUR) yet, let alone port myself.

        2. 2

          most of the time you just need the basic operations it wraps. i use debian at work, and i long ago got fed up with having to remember which binary (apt-get, aptitude, dpkg, …) i needed for which operation, so i installed wajig instead. i’ve been using it happily for ~10 years, and the only times i’ve needed more specialised flags i’ve copy/pasted commands from some wiki or other to solve a corner case, without needing to remember exactly what i did.

          i agree that mixing in programming language package managers seems like a bad idea, though.

        3. 4

          I think that this would only ever be useful if it mapped package names between distributions too, e.g. sysget install libssl-dev would install libssl-dev on Debian, and openssl-devel on Red Hat.
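          A minimal sketch of what such a name-translation table could look like, using the libssl-dev example. The table layout and entries here are made up for illustration; no existing tool is being quoted.

          ```cpp
          #include <iostream>
          #include <map>
          #include <string>

          // Hypothetical cross-distro table: generic package name ->
          // native name per distro family.
          using NameTable = std::map<std::string, std::map<std::string, std::string>>;

          NameTable make_names() {
              return {
                  {"libssl-dev", {{"debian", "libssl-dev"},
                                  {"redhat", "openssl-devel"}}},
              };
          }

          int main() {
              const auto names = make_names();
              // "sysget install libssl-dev" on a Red Hat system would resolve to:
              std::cout << names.at("libssl-dev").at("redhat") << "\n";  // openssl-devel
              return 0;
          }
          ```

          The hard part, of course, is not the lookup but curating and maintaining the table itself across thousands of packages.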

          1. 4

            It looks like it overlaps with the goals of this recent project:

            https://repl.it/site/blog/upm

            https://news.ycombinator.com/item?id=21418251

            https://github.com/replit/upm (written in Go)

            Since repl.it itself uses it, it apparently works pretty well. I have used repl.it to test out Python, Julia, and R code.

            1. 1

              Honestly, until someone writes it in bash I doubt it’ll catch on very well… The whole problem with something like this written in Go, or C++, or whatever is that you have to build/run it with dependencies…

              1. 2

                I dunno, I’ve tried to write stuff like package managers in shell and it sucks. Shell doesn’t even have a decent flag parser, much less decent data structures. That’s one of the motivations for https://oilshell.org :)

                As far as dependencies go, I actually prefer C++ to bash, because a C++ compiler is everywhere. I agree it’s sort of annoying to have the Go build dependency. For example, I had to learn the hard way that nobody really uses the versions of the Go compiler in Debian; most open-source Go code doesn’t build that way, as far as I remember. You have to get it from the website, etc.

                But still I think that repl.it is doing some interesting stuff and the dependency isn’t a dealbreaker.

                1. 2

                  it’s an if-else block with strings, so a shell script, despite its many shortcomings, would probably be fine. I will say I do think C/C++ is probably better than Go for software like this. For Repl.it, upm makes much more sense, as they control the environment and don’t have to worry about Go deps or even the initial install. Different contexts have different constraints. However, for how I, the end user, would use it, I think sysget < upm < shell script. For a universal package manager, no compile is better than compile, and no runtime is better than runtime. I don’t want to have to use a package manager to install my universal package manager; by the time I do, I don’t need it anymore.

            2. 2

              hmm, the one thing I think this could benefit from is letting me specify a requirements file and indicate directly which package manager to use.

              I would love to provide a file and say “I want this from pip, that from renv, this other thing from the system default manager” and for it to do “the right thing” (along with allowing version pinning when possible)
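              To make that concrete, such a manifest might look something like this. The format is entirely made up for illustration; neither sysget nor upm reads a file like this.

              ```toml
              # hypothetical manifest: one section per package manager,
              # with optional version pins where the manager supports them
              [system]
              openssl = "*"

              [pip]
              requests = "==2.22.0"

              [gem]
              rake = ">= 12.0"
              ```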

              1. 2

                Sounds like something Ansible can do.

              2. 1

                This is, generally, why I just use puppet or ansible to install stuff.

                1. 1

                  So… PackageKit?