Why these 'strange' numbers for clock frequencies etc.??
I'm sure most people here understand, but for those who don't, it's not always explained, so here goes...
Computers/digital systems (and hardware) are 'attuned' to binary/hexadecimal representations of numbers, like 8-bit bytes or 16-bit words.
So we tend to see typical computing numbers in multiples like 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024 etc. — powers of two. It's the 'K' labels that are the 'rounded' approximations!
In reality, 4K is 'really' 4096 in decimal, and 64K is really 65,536 (with the highest address in a 64K space being 65,535, i.e. $FFFF). They just get 'rounded off' to those tidy-sounding multiples.
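If you've got Python handy, you can check these 'K' numbers for yourself — just a quick sketch:

```python
# The "K" multiples are powers of two, not round thousands
four_k = 4 * 1024               # "4K" really means 4096
sixty_four_k = 64 * 1024        # "64K" really means 65536
top_address = sixty_four_k - 1  # highest address in a 64K space: 65535 = $FFFF

print(four_k, hex(four_k))             # 4096 0x1000
print(sixty_four_k, hex(top_address))  # 65536 0xffff
```

Notice how ugly the decimal versions look, and how clean the hex versions are!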
This further confuses some people when we relate such numbers in hexadecimal, which computers/programmers love!
Most memory banks/blocks are laid out on hexadecimal boundaries, which (as most here will know) are written with a '$' prefix. When working in 'hex', it becomes relatively simple to see the whole picture!! So in the above context, we see/use numbers like $8000 (hex) and $C000 (hex). They are nicely rounded (and accurately depicted now!), in a format that computers/digital hardware fully understand...
And decimal 32768 = $8000 hex, and decimal 49152 = $C000 hex!! So a clock frequency of 32768 Hz is perfect for digital electronics/computers to work with: 32768 is 2^15, so a chain of fifteen divide-by-two stages (simple flip-flops cascading through the numbers) takes it down to exactly 1 Hz — which is exactly why watch crystals run at that frequency!!
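Here's a little sketch of that cascade idea — each divide-by-two stage halves the frequency, and after fifteen of them you land on exactly 1 Hz:

```python
# Simulate a chain of divide-by-two stages (flip-flops)
# starting from the classic 32768 Hz watch-crystal frequency
freq = 32768  # = 2**15
stages = 0
while freq > 1:
    freq //= 2   # each flip-flop stage halves the frequency
    stages += 1

print(stages, freq)  # 15 stages later, exactly 1 Hz
```

Try starting from a 'round' decimal number like 30000 instead — you never hit exactly 1, which is the whole point.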
I used to love playing/coding stuff on the old Commodore 64, writing boot loaders using the 'SYS' command, like 'SYS 32768' to execute code that was loaded into the $8000 region of memory, etc... Was all good fun!!!
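For anyone wanting to double-check those SYS addresses, the hex-to-decimal conversion is easy to verify (BASIC's SYS wants the decimal form):

```python
# Converting the hex memory addresses to the decimal numbers SYS expects
assert int("8000", 16) == 32768  # SYS 32768 jumps to $8000
assert int("C000", 16) == 49152  # SYS 49152 jumps to $C000

print(f"SYS {0x8000}")  # SYS 32768
print(f"SYS {0xC000}")  # SYS 49152
```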