1.

  1.

    Before I switched to programming, I was studying to be a physicist. We had the exact same problem: coding is vital to physics, but it’s never taught as part of the core curriculum. There was one class where you were expected to crunch hundreds of charts in Excel spreadsheets. I was the only one in my friend group to get an A, not because I was a particularly good physics student (I was middle of the pack at best), but because I was the only one who knew MATLAB.

    1.

      I remember that at ic.ac.uk, for my physics degree, they taught a session on Visual C++, which was next to useless.

      Worse, they said ‘most people use Fortran in the field’, which hardly motivated anyone.

      Teaching C++ to a bunch of people who had never coded before; the department must have been drunk at the time to think that was a good idea… :-/

    2.

      This is something we’re fighting with at the NRAO, where I work. Radio astronomy data in particular requires quite a bit of post-processing to be scientifically usable. The director of the NRAO considers science-ready data products to be one of our most pressing issues.

      The principal data analysis software, CASA, is basically a suite of Python libraries. I don’t see that changing, but a lot of the work in the new archive we’re building involves processing the data in our cluster prior to delivery, because the datasets are just too big. In the long term, we will start furnishing more capabilities through that interface. It’s difficult for me to imagine us getting to 100% science-ready, “you don’t have to do any coding,” but I think everyone agrees that the situation really can’t afford to get much worse.
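
      For a sense of what that coding looks like today, here is a rough sketch of a post-processing script using the casatasks Python package (the modular, pip-installable form of CASA). The measurement set name and the imaging parameters are invented for illustration; a real pipeline involves many more calibration and flagging steps.

      ```python
      # Rough sketch of CASA post-processing in Python (modular CASA, "casatasks").
      # The file name and parameters below are placeholders, not a real pipeline.
      from casatasks import listobs, flagdata, tclean

      vis = "my_observation.ms"  # hypothetical measurement set from the archive

      # Summarize the observation: antennas, fields, spectral windows.
      listobs(vis=vis, listfile="my_observation.listobs.txt")

      # Flag autocorrelations, which are not wanted for interferometric imaging.
      flagdata(vis=vis, mode="manual", autocorr=True)

      # Deconvolve and image; cell size and image size depend on the
      # array configuration and observing frequency.
      tclean(vis=vis,
             imagename="my_observation_cont",
             specmode="mfs",        # continuum (multi-frequency synthesis) imaging
             imsize=[1024, 1024],
             cell="0.5arcsec",
             niter=1000)
      ```

      Moving steps like these into the cluster before delivery is what the science-ready data products effort is about, so users aren’t forced to write this themselves.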