(In reply to Stefan Brüns from comment #3)
> I have no problem with using plain BLAS on architectures where openBLAS
> does not exist (e.g. RISC-V, its vectorization support does not fit
> openBLAS) or which are so rarely used the extra (theoretical) support
> burden does not pay off.
>
> The current change, regardless of the performance, causes fragmentation
> and extra work:
>
> - The HPC builds do not run the test suite, regressions can go in
>   unnoticed

This should be fixed regardless of this issue.

> - when openBLAS is good enough for HPC, it should be good enough for
>   regular use

This had nothing to do with "good enough". On SLE it is done to keep the
support matrix small, especially on platforms I am less familiar with,
like ppc and s390. BLAS has been available on SLE for a long time and is
consumed by other packages, so it is there already and will not go away
easily. It might be an option to drop it in favor of OpenBLAS, though.

> - Having different code for e.g. Leap/SLE and Tumbleweed doubles the
>   work required when there are bugs specific to either BLAS or openBLAS.

Yes, of course.

> numpy is used as the canonical array format for python bindings of many
> scientific packages, many of these are not available as HPC modules (and
> as far as I can see, HPC modules do not exist for Leap). IMHO with this
> change Leap becomes an unusable toy not meant for scientific work.

Yes, this is what I don't want either. Technically, you could use these
packages with the HPC version of numpy by setting the environment
accordingly, but this would be awkward (see the diagnostic sketch at the
end of this comment).

(In reply to Stefan Brüns from comment #4)
> Another option would be to finally fix update-alternatives for
> BLAS/openBLAS. Currently the openBLAS build mangles the soname so
> update-alternatives does not really work.

I haven't looked at this yet, but wouldn't this require BLAS/OpenBLAS to
be ABI compatible? Not sure if this is the case ...

I will revert the change, but this will most likely not happen this week,
as I'd like to discuss a couple of things with the maintainers beforehand.
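
As an aside, when juggling the distribution numpy and an HPC module it is
easy to lose track of which BLAS is actually in use. A minimal diagnostic
sketch; the /proc parsing is Linux-specific and just an illustration, not
part of any package:

  import numpy

  # Build-time BLAS/LAPACK configuration as recorded by numpy itself
  numpy.show_config()

  # Force the BLAS code path so the library is actually loaded, then
  # list the BLAS shared objects mapped into this process
  # (Linux-specific: parses /proc/self/maps)
  numpy.dot(numpy.ones(2), numpy.ones(2))
  with open("/proc/self/maps") as maps:
      libs = {line.split()[-1] for line in maps if "blas" in line.lower()}
  for lib in sorted(libs):
      print(lib)

Running this under the HPC environment and under the plain distribution
environment should show different library paths, which is exactly the
awkwardness mentioned above.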
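
Regarding the update-alternatives idea, a rough sketch of what I
understand the intended setup to be. The paths and the priority below are
illustrative assumptions, not the actual openSUSE packaging:

  # Inspect the soname openBLAS actually ships; this is where the
  # mangling mentioned in comment #4 would show up:
  readelf -d /usr/lib64/libopenblas.so.0 | grep SONAME

  # If both libraries were resolvable through a common link name,
  # alternatives could switch between them (hypothetical paths and
  # priority):
  update-alternatives --install /usr/lib64/libblas.so.3 libblas.so.3 \
      /usr/lib64/libopenblas.so.0 20

Whether this is safe in practice still hinges on the ABI question above.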