Yeah, analog died a long, long time ago. It's just been a matter of scale ever since. First in the largest-scale (high-cost, demanding-performance) applications, like radio and television production. (Easier applications went digital far earlier, of course, like the whole conception of digital computers as we know them; I don't think there was much if any analog computation (beyond mere lab curiosities) past the 60s or so?) Slowly over time, more and more applications opened up; microwave ovens replaced mechanical dials with digital timers or microcontrollers. Eventually even televisions, radios, etc. became wholly digital, save for the interfaces themselves (like VGA cables and CRT displays -- okay, the video path in those was still analog, but entirely digital signal paths have been the norm since ca. DVI and HDMI).
The advantage is obvious. Why represent a signal by continuously varying some physical quantity, at great expense to power consumption and noise floor, and with no means to reconfigure the system, when you can instead sample and compute only every once in a while (might be microseconds, nanoseconds...) and throw a metric shitton of infinitesimally small transistors at it, fully programmable and using less power? For sure, a lot of the battery-operated devices we take for granted would simply be impossible without it. TVs and radios were around back then of course, but they ran through batteries pretty quickly (primary cells at that, replaced weekly perhaps) and performed poorly (not much volume, small and dim picture, etc.). Not that the battery-life situation is much better today (even with great improvements in battery performance), but that's at least in part down to the common (human) element, and the sheer amount of computing power we now have at our fingertips is unquestionable. We're walking around with literal supercomputers (by ca. early-2000s standards) in our pockets!
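To make the "sample every once in a while and compute" point concrete, here's a minimal, purely illustrative sketch (my own, not anything from a real product) of what a microcontroller does in place of an analog circuit: a first-order digital low-pass filter standing in for an analog RC filter. The sample rate, cutoff, and test signal are arbitrary values picked just for the example.

#include <stdio.h>
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Illustrative sketch: a first-order IIR low-pass filter, the digital
 * stand-in for an analog RC filter.  Instead of a resistor and capacitor
 * continuously shaping the signal, we grab a sample every 1/fs seconds
 * and do a little arithmetic on it.  All constants are example values. */

#define SAMPLE_RATE_HZ 48000.0   /* how often we bother to "do something" */
#define CUTOFF_HZ       1000.0   /* -3 dB point of the filter */

int main(void)
{
    /* Coefficient derived the same way you'd size the RC time constant:
       alpha = dt / (RC + dt), with RC = 1 / (2*pi*fc). */
    const double dt    = 1.0 / SAMPLE_RATE_HZ;
    const double rc    = 1.0 / (2.0 * M_PI * CUTOFF_HZ);
    const double alpha = dt / (rc + dt);

    double y = 0.0;  /* filter state: the "capacitor voltage" */

    /* Toy input: a 100 Hz sine plus a 10 kHz one.  The low-pass output
       should keep the former and squash the latter. */
    for (int n = 0; n < 480; n++) {
        double t = n * dt;
        double x = sin(2.0 * M_PI * 100.0 * t)
                 + 0.5 * sin(2.0 * M_PI * 10000.0 * t);

        y += alpha * (x - y);   /* one multiply-accumulate per sample */

        printf("%f\t%f\n", x, y);
    }
    return 0;
}

Change one constant and you have a different filter; the analog equivalent means swapping physical components, which is the whole "fully programmable" point.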
Tim