When I first learned Fortran back in the mid-1960s, we always drew a slash through the zero to distinguish it from majuscule O on our Hollerith cards.
I rarely see that anymore.
Back when I was in college, one of my computer science profs was really a member of the math department and grudgingly handled a class on logic (I forget which class exactly). As far as he was concerned, "computer science" was a bogus discipline. He'd mark answers incorrect if there was a slash through what should have been a zero digit, because to him that character meant "null" as in "null set" - it wasn't a zero as far as he was concerned.
He also graded strictly on a curve: the top one or two scores got an A, the bottom one or two got an F, and everything in between fell along something close to a normal curve. I remember one quiz where almost everyone's score was bunched together in the nineties. He wasn't too pleased and promised that the next quiz would be much harder.
Which reminds me of an English teacher I had who claimed to grade on a "curve". As he described it, he marked the stairs in his house with "A" through "F" (skipping "E", since that wasn't used for grades in the US back then - no idea what they use now). He would go to the top of the stairs and throw the papers in the air. The step your paper landed on was your grade.