VIH is the usual notation for the voltage above which an input pin (or an I/O pin acting as an input) will detect logic '1'. 'min. VIH' (or 'VIH(min)') refers to the *MINIMUM* voltage at which all (undamaged) chips of the specified type will *ALWAYS* detect logic '1'. It may or may not depend on the Vcc of the chip in question. If it does, it will be expressed as a proportion or percentage of Vcc, as an expression involving Vcc, or as a list of values for various Vcc ranges.
Thanks, again I missed some basics. From what I understand, this VIH has nothing to do with the power supply, but with the pin outputs from the MCU, is that correct? On the other hand, VOH has a minimum of 2.4V; does that mean the MCU pins connected to I/O0~7 would receive at least 2.4V when they are READING from the ROM chip? And then I need to check whether that is sufficient for a "1" signal at the MCU pins.
If interfacing a 5V-supply logic IC with a 3.3V MCU that has 5V-tolerant pins, it's usually the other way around: you need to check whether the 5V logic IC will 'see' the near-3.3V logic high from the MCU as logic high. As I noted above, that's OK this time. It's extremely unlikely that a 5V logic IC won't have a high enough output level to meet a 5V-tolerant 3.3V MCU's min. VIH threshold voltage. However, as the AT28C16 datasheet gives 'min. VOH (@IOH = -400uA)' as 2.4V, then yes, you *SHOULD* check it against your MCU's min. VIH for its 5V-tolerant pins.
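The check described above boils down to comparing the driver's guaranteed worst-case VOH against the receiver's required VIH. A minimal sketch, assuming the AT28C16's 2.4V min. VOH from its datasheet and a hypothetical 2.0V VIH for the MCU's 5V-tolerant input (substitute the real figure from your MCU's datasheet):

```python
# Sketch of the level-compatibility check: the driver's worst-case
# min. VOH must be at or above the receiver's min. VIH, ideally with
# some noise margin to spare.
def noise_margin_high(voh_min: float, vih_min: float) -> float:
    """Logic-high noise margin in volts; negative means incompatible."""
    return voh_min - vih_min

# 2.4 V is the AT28C16's min. VOH at IOH = -400 uA (from its datasheet);
# 2.0 V is a HYPOTHETICAL VIH for a 5V-tolerant MCU input, used here
# only for illustration.
margin = noise_margin_high(2.4, 2.0)
print(f"logic-high noise margin: {margin:.1f} V")
```

With these example numbers the margin is 0.4V, which is tight; that is exactly why the datasheet comparison is worth doing rather than assumed.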
N.B. It's quite common for an MCU to have different input threshold voltages for specific types of pin, and for them to vary with supply voltage. Make sure you are reading the correct VIH line in the datasheet!
Edit: corrected sentence order: "As I noted above, that's OK this time." referred to MCU VOH => EEPROM VIH