I know it is not normal to feel so good about this, but when processing large amounts of data coming from a slow network file system leaves a machine with 64 cores and 256 GB of memory showing nary a blip in CPU activity or memory pressure … well, one does get a little depressed. As Madeleine Albright once observed, what’s the point of all that processing power and memory if you are never going to use it, or some such thing.
When a simple pipeline works, it feels good:
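Something along these lines, say. This is only a minimal sketch of a fork-per-file pipeline, assuming Parallel::ForkManager with one worker per core and a made-up line-counting routine standing in for the real work; none of the names here come from an actual script.

```perl
use strict;
use warnings;
use Parallel::ForkManager;

# One worker per core; 64 matches the core count mentioned above.
my $pm = Parallel::ForkManager->new(64);

for my $file (@ARGV) {
    # In the parent, start() returns the child's PID, so skip to the next file;
    # in the child it returns 0, so fall through and do the work.
    $pm->start and next;

    process_file($file);

    $pm->finish;
}

$pm->wait_all_children;

# Stand-in for the real per-file work: count lines and report.
sub process_file {
    my ($file) = @_;
    open my $fh, '<', $file or die "Cannot open '$file': $!";
    my $lines = 0;
    $lines++ while <$fh>;
    close $fh;
    print "$file: $lines lines\n";
}
```

Point it at a pile of files and every core gets something to chew on, with each child exiting as soon as its file is done.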
Sometimes it feels too close to the edge:
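For instance, with made-up numbers: if each of 64 workers slurps its whole input file and the files run around 4 GB apiece, peak usage works out to 64 × 4 GB = 256 GB, which is every byte the machine has. A sketch of that shape, again assuming Parallel::ForkManager and hypothetical file sizes:

```perl
use strict;
use warnings;
use Parallel::ForkManager;

# Made-up numbers: 64 workers, each slurping its whole input file.
# At roughly 4 GB per file, peak usage is about 64 x 4 GB = 256 GB,
# i.e. everything the machine has.
my $pm = Parallel::ForkManager->new(64);

for my $file (@ARGV) {
    $pm->start and next;

    # Slurping keeps the per-file code simple, but the entire file
    # sits in the child's memory until the child exits.
    my $contents = do {
        open my $fh, '<', $file or die "Cannot open '$file': $!";
        local $/;    # slurp mode
        <$fh>;
    };

    print "$file: ", length($contents), " bytes read\n";

    $pm->finish;
}

$pm->wait_all_children;
```

Dial the worker count down, or read line by line instead of slurping, and there is breathing room again; leave it as above and you are watching the memory graph out of the corner of your eye.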
See also: “Can Parallel::ForkManager speed up a seemingly IO bound task?”