This afternoon I was listening in on a discussion about recommendations for Multi-Messenger Astronomy.(*) The authors recommend a fair bit of boilerplate (cooperation between NASA and NSF, etc.).
However, one of the things they note is the problems and risks that result from data acquisition or analysis or simulation software that is custom-made for an experiment and supported by maybe one grad student. He wrote it, and other folks used it, and now he finds himself called on to support it through OS changes and bug fixes and whatnot for years after he got his degree--and maybe left the collaboration. Support dwindles, though use may increase!
What the scientists value is doing the analysis and finding the physics--not supporting the software. Since the experiment will not succeed without good software, somebody with domain knowledge needs to write and support it. But that luckless person, not so tightly involved with analysis, is not so valued when it comes to new jobs or promotions. So the software support gets short-changed. (Thus the quote above.)
Some experiments hire professional programmers to develop software in cooperation with the scientists. This helps, but is more for the larger groups with more flex in their budgets. It is hard to overstate how vital software is for experiments with large detectors. Even something as simple as archiving isn't simple.
Researchers from small institutions have less flexibility in working on an experiment than those at large ones, where (for example) teaching duties can be shuffled around more easily--and (dirty secret time) sometimes almost completely shuffled off. Hence "the rich get richer." This issue turns up in the section on "Inclusive Workforce Development", along with the boilerplate about women, black, indigenous, and PoC. The first context for that boilerplate is a fairly strange claim that aggressive people tend to be favored in juicy research roles, while these others "face greater social pressures deterring aggressive or assertive behavior." Do I have to note that the majority of those "pushed aside" are not from minority groups, or is that sufficiently obvious from the definition?

The report suggests mitigating this with a Code of Conduct "outlining expectations regarding data sharing and co-authorship in a public document." The first big problem with this is that everybody gets their name in the paper anyhow, and who actually did the analysis is going to be the key thing other scientists pay attention to--the suggestion is facially stupid. The second is that we all know the CoC will have nice ambiguous language that will be selectively interpreted against politically unpopular individuals (not necessarily national politics here). Maybe that will be the jerks, but likely it won't. Why? Because the jerks are often useful for getting the grants that make the experiments run.
Several of the recommendations turn into "you have to budget more money for important stuff like software and archiving." Yep. Very, very true. However, getting money into the budget for that isn't easy--sometimes you get a bare-bones grant that doesn't cover everything you need to do. And "sometimes" is optimistic.
(*) Astronomy is done with different "messengers": light, radio waves, X-rays, cosmic rays, gravitational waves, and neutrinos. If something interesting appears in one detector (light from a supernova, a burst of radio waves, a neutron star merger), you want to notify the others quickly to make sure they're looking in that direction, and quite a bit of work has gone into communication systems. Part of the report has to do with making it easy to compare events in different types of systems--the systematics are always very different.
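To make the idea concrete, here is a toy sketch of what such a cross-notification might carry and how subscribers could react. This is not any real observatory's API or alert format (real networks use standardized messages and brokered distribution); every name below is hypothetical, and the numbers are made up.

```python
# Toy sketch of a multi-messenger alert broadcast. Purely illustrative:
# the Alert fields, subscribe/broadcast functions, and the triage rule
# are all hypothetical, not any real alert network's interface.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Callable, List


@dataclass
class Alert:
    messenger: str            # e.g. "neutrino", "gravitational-wave", "gamma-ray"
    ra_deg: float             # right ascension of the candidate source
    dec_deg: float            # declination of the candidate source
    error_radius_deg: float   # how well the direction is localized
    time_utc: datetime        # when the event was detected


# Subscribers register a callback; each observatory decides for itself
# whether an alert is worth interrupting its observing schedule for.
_subscribers: List[Callable[[Alert], None]] = []


def subscribe(callback: Callable[[Alert], None]) -> None:
    _subscribers.append(callback)


def broadcast(alert: Alert) -> None:
    for callback in _subscribers:
        callback(alert)


if __name__ == "__main__":
    def optical_followup(alert: Alert) -> None:
        # A crude triage rule: only repoint for well-localized events.
        if alert.error_radius_deg < 1.0:
            print(f"Repointing toward RA={alert.ra_deg}, Dec={alert.dec_deg} "
                  f"after a {alert.messenger} alert at {alert.time_utc}.")

    subscribe(optical_followup)
    broadcast(Alert("neutrino", 123.4, -45.6, 0.5,
                    datetime(2020, 1, 1, 12, 0, 0, tzinfo=timezone.utc)))
```

The point of the sketch is just the shape of the problem: one detector's trigger has to become something every other facility can parse and act on quickly, which is why the report spends effort on common formats and fast communication.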
1 comment:
This strikes me as similar to the lesser prestige of attempting replication of published results or of getting a null result for your hypothesis. Everyone agrees that those are "just as important," but no one acts like this is true.