I mean, I have my nostalgia moments, but while I think OP’s got a point that the LCD monitors that replaced CRTs were in many ways significantly worse technically at the time, I also think that in pretty much every respect, current LCD/LED displays beat CRTs.
Looking at OP’s benefits:
0 motion blur
CRT phosphors didn’t just immediately go dark. They were better than some LCDs at the time, yeah, which were very slow and had enormous mouse-pointer trails. But if you’ve ever watched a flashing cursor on a CRT actually fade out, you know there was some response time.
Response time: 0.01 ms[14] to less than 1 μs,[15] but limited by phosphor decay time (around 5 ms)[16]
https://en.wikipedia.org/wiki/Comparison_of_CRT,_LCD,_plasma,_and_OLED_displays
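To put that quoted ~5 ms figure in perspective, here’s a rough back-of-the-envelope sketch (my own numbers, assuming a simple exponential decay model, so treat it as illustrative rather than authoritative) of how much of the previous frame is still glowing when the next refresh arrives:

```python
import math

# Rough sketch: assume phosphor brightness decays exponentially with a
# ~5 ms time constant (the figure quoted above) and estimate how much of
# the previous frame is still glowing one refresh later.
TAU_MS = 5.0  # assumed decay time constant, in milliseconds

for refresh_hz in (60, 85, 120):
    frame_ms = 1000.0 / refresh_hz              # time between refreshes
    remaining = math.exp(-frame_ms / TAU_MS)    # fraction of brightness left
    print(f"{refresh_hz:>3} Hz: {frame_ms:5.2f} ms/frame, "
          f"~{remaining * 100:4.1f}% of the previous frame still visible")
```

A few percent of residual glow doesn’t sound like much, but against a dark background it’s exactly the slow fade you can see on a blinking cursor.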
0 input lag
That’s not really a function of the display technology. Yeah, a traditional analog CRT television with nothing else involved just spews the signal straight to the screen, but you can stick processing in there too, as cable boxes did. The real problem was “smart” TVs adding stuff like image processing that involved buffering some video.
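To make the buffering point concrete, here’s a tiny back-of-the-envelope sketch (assuming a 60 Hz signal; the frame counts are made up for illustration) of how much lag each buffered frame adds:

```python
# Input lag is mostly about the processing chain, not the panel itself:
# every frame a "smart" TV buffers for image processing costs one full
# frame time before anything reaches the screen.
REFRESH_HZ = 60
FRAME_MS = 1000 / REFRESH_HZ  # ~16.7 ms per frame at 60 Hz

for buffered_frames in (0, 1, 2, 3):
    added_lag_ms = buffered_frames * FRAME_MS
    print(f"{buffered_frames} buffered frame(s): +{added_lag_ms:.1f} ms before display")
```

Two or three buffered frames is already the difference between “feels instant” and “feels mushy”, regardless of the display technology behind it.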
At the time that people started getting LCDs, a lot of them were just awful in many respects compared to CRTs.
As you moved around, the color you saw on many types of LCDs shifted dramatically.
Response times were very slow; moving a cursor around on some LCD displays would leave a trail as the panel sluggishly updated. It looked kind of like some e-ink displays do today.
Contrast wasn’t great; blacks were really murky grays.
Early LCDs couldn’t do full 24-bit color depth, and dealt with it by dithering, which was a real step back in quality (there’s a small sketch of the idea just after this list).
Pixels could get stuck.
But those have mostly been dealt with.
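On that dithering point: here’s a small illustrative sketch (mine, not code any actual panel ran) of the basic idea behind faking 24-bit color on an 18-bit panel, using ordered (Bayer) dithering to trade the lost precision for a pixel-level pattern:

```python
# Illustrative sketch of how an 18-bit panel (6 bits/channel) can fake
# 24-bit color with ordered (Bayer) dithering -- the technique is real,
# but the specific numbers here are just for demonstration.

BAYER_2X2 = [[0, 2],
             [3, 1]]  # classic 2x2 Bayer threshold pattern

def dither_channel(value_8bit: int, x: int, y: int) -> int:
    """Quantize one 8-bit channel value to 6 bits at pixel (x, y)."""
    step = 4                                  # 8-bit steps per 6-bit step
    threshold = BAYER_2X2[y % 2][x % 2]       # 0..3, varies per pixel
    quantized = (value_8bit + threshold) // step * step
    return min(quantized, 252)                # clamp to top 6-bit level

# A single 8-bit value (130) becomes a patterned mix of nearby 6-bit levels:
for y in range(2):
    row = [dither_channel(130, x, y) for x in range(4)]
    print(row)
```

Averaged over neighbouring pixels the value comes out right, but up close you see the checkerboard texture that made early panels look grainy on smooth gradients.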
CRTs had a lot of problems too, and LED/LCD displays really address those:
They were heavy. This wasn’t so bad early on, but as CRTs grew, they really started to suck to work with. I remember straining a muscle in my back getting a >200lb television up a flight of stairs.
They were blurry. That can be a benefit, in that some software, like games, had graphics designed around it, letting the blur “blend” pixels together, which is why old emulators often have some form of CRT-artifact emulation (there’s a small sketch of that at the end of this list). But in a world where software can be designed around a crisp, sharp display, I’d rather have the sharpness. The blurriness also wasn’t always even, especially on flat-screen CRTs; it tended to be worse in the corners. And you could only push the resolution and refresh rate so high, and the higher you went, the blurrier things got.
There were scanlines; brightness wasn’t even.
You could get color fringing.
Sony Trinitrons (rather nice late CRT computer displays) had a faint horizontal line that crossed the screen where a thin wire was placed to stabilize the aperture grille.
They didn’t deal so well with higher-aspect-ratio screens (well, not if you wanted a flat display, anyway). For movies and such, we’re better off with wider displays.
Analog signalling meant that as cables got longer, the image got blurrier.
They used more electricity and generated more heat than LED/LCD displays.
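And on the emulator point above, here’s a tiny sketch of the kind of CRT “look” filter emulators apply (purely illustrative and of my own making, not lifted from any particular emulator): darken alternate rows for scanlines, blur horizontally for beam softness.

```python
# Tiny sketch of a CRT "look" filter: dim every other output row
# (scanlines) and apply a simple horizontal blur (beam softness).

def crt_filter(frame, scanline_gain=0.6, blur=0.25):
    """frame: list of rows, each a list of 0..255 brightness values."""
    out = []
    for y, row in enumerate(frame):
        new_row = []
        for x, v in enumerate(row):
            left = row[x - 1] if x > 0 else v
            right = row[x + 1] if x < len(row) - 1 else v
            # simple 3-tap horizontal blur approximating beam spread
            softened = (1 - 2 * blur) * v + blur * left + blur * right
            # dim every other line to mimic the gaps between scanlines
            gain = 1.0 if y % 2 == 0 else scanline_gain
            new_row.append(int(softened * gain))
        out.append(new_row)
    return out

# Example: a hard vertical edge gets softened and striped.
frame = [[0, 0, 255, 255]] * 4
for row in crt_filter(frame):
    print(row)
```

Run on a hard vertical edge, the output alternates a brighter and a dimmer row, with the edge smeared across a pixel or so, which is roughly what the real hardware gave you for free.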
Also, a lot of CRT nostalgia comes from whatever display a particular person had. With a lot of my CRTs, the screens were sharp enough to give a more pixelated look, and I had to turn off scanline effects in emulators because they just made everything look worse. Except for one really bad monitor I owned, because the previous owner lied that it could do 1024x768@75Hz (it was an old VGA monitor and didn’t like that resolution).
My Trinitron monitor actually had two of those stabilizing wires. They were very thin, much thinner than even a single scan line, but you could definitely notice them on an all white background.
Apparently the dividing line was 15 inches:
Screens 15" and below have one wire located about two thirds of the way down the screen, while monitors greater than 15" have 2 wires at the one-third and two-thirds positions.
https://en.wikipedia.org/wiki/Trinitron
It’s also worth pointing out that OLEDs solve many of the drawbacks of LCDs, particularly around latency and response times.
We just don’t talk about burn-in.
Current-generation OLEDs aren’t any worse than late-generation CRTs for burn-in; they’re just worse than LCDs.
And there’s also my personal qualm with CRT monitors: the constant whine.
Good news, we’re probably too deaf now to hear it 😞
I still can, and I’m using headphones on a daily basis.
Yeah “zero motion blur” tells me OP has literally never seen a CRT and is just repeating something he heard his grandpa say.