• @TechLich
    5 points · 3 months ago

    “virtually eliminate fattening hex dumps”

    What is a fattening hex dump in this context‽

    • @Redredme
      8 points · 3 months ago

      You’re clearly not a child of the 80s. It’s literally in the picture. The stack of paper.

      Back then, monitors were shite and small, and disk drives were slow. So debugging was mostly done on paper (print it, read it, mark the lines with errors) because you could read it better, and going back a page or two was easier on paper than on a shitty slow disk drive. It was the time when 640 KB was an insane amount of memory.

      • @TechLich
        1 point · 3 months ago

        Yeah, I think you’re right, but the phrasing is a little weird for that. It makes it sound like the optimiser lets you avoid having to do a “hex dump”, which would somehow be “fattening” for the program, causing it to have worse performance. Might be the marketing people not knowing what they’re talking about.

        Although we did do a lot of printing code on dot-matrix printers back in the day, it would usually be the source code itself, whereas this is a post-pass optimiser: it ran after the COBOL compiler had already turned the human-readable code into object code. Saving the optimised hex on paper might work as a backup solution, but it probably wouldn’t help with debugging.
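
        For anyone who hasn’t stared at one: a hex dump is just the raw bytes of the object code rendered as hexadecimal, usually with an offset column and an ASCII gutter, which is exactly why debugging from a paper stack of it was so tedious. A minimal sketch in Python of that kind of listing (the file name is made up for illustration):

            # Print an xxd-style hex dump: offset, 16 hex bytes per row, ASCII gutter.
            # "program.obj" is a hypothetical object-code file, used only as an example.
            with open("program.obj", "rb") as f:
                data = f.read()

            for offset in range(0, len(data), 16):
                chunk = data[offset:offset + 16]
                hex_part = " ".join(f"{b:02x}" for b in chunk)
                ascii_part = "".join(chr(b) if 32 <= b < 127 else "." for b in chunk)
                print(f"{offset:08x}  {hex_part:<47}  {ascii_part}")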

    • @Redredme
      3 points · 3 months ago

      Back in those days the answer to all three questions was: yes.