I remember CRTs being washed out, heavy, power hungry, loud, hot, susceptible to burn-in and magnetic fields… The screen had to have a curve, so above ~16" you got weird distortions. You needed a really heavy and sturdy desk to keep them from wobbling. Someone is romanticizing an era that no one liked. I remember the LCD adoption being very quick and near universal, as far as tech advancements go.
Anon reflects on e-sports
Submitted 5 months ago by Early_To_Risa@sh.itjust.works to greentext@sh.itjust.works
https://sh.itjust.works/pictrs/image/8e3109d4-bb88-4241-bc26-ac951ac3972b.png
Comments
Bytemeister@lemmy.world 5 months ago
Kit@lemmy.blahaj.zone 5 months ago
As someone who still uses a CRT for specific tasks, I feel that you're misremembering the switchover from CRT to LCD. At the time, LCDs were blurry and less vibrant than CRTs. Technical advancements have solved this over time.
Late model CRTs were even flat to eliminate the distortion you’re describing.
Pulptastic@midwest.social 5 months ago
I had a flat CRT. It was even heavier than a regular one.
rothaine@lemm.ee 5 months ago
Resolution took a step back as well, IIRC. The last CRT I had could do 1200 vertical pixels, but I feel like it was years before we saw more than 768 or 1080 vertical on flatscreen displays.
sugar_in_your_tea@sh.itjust.works 5 months ago
Sure, but they were thin, flat, and good enough. The desk space savings alone was worth it.
I remember massive projection screens that took up half of a room. People flocked to wall mounted screens even though the picture was worse.
deegeese@sopuli.xyz 5 months ago
I miss the <thunk> sound of the degaussing function.
UnsavoryMollusk@lemmy.world 5 months ago
Schdoing !
ILikeBoobies@lemmy.ca 5 months ago
There was always pushback in esports
Smash uses CRTs today because of how much pushback there was/is
WldFyre@lemm.ee 5 months ago
Melee uses CRTs because it’s an old ass game lol
Ultimate is not played on CRTs
cordlesslamp@lemmy.today 5 months ago
Can someone please explain why CRT is 0 blur and 0 latency when it literally draws each pixel one by one using an electron beam running across the screen line by line?
TexasDrunk@lemmy.world 5 months ago
The guy inside it drawing them is insanely fast at his job. That’s also why they were so bulky, to fit the guy who does the drawing.
JargonWagon@lemmy.world 5 months ago
Perfect response.
B0rax@feddit.de 5 months ago
Because it is analog. There are no buffers or anything in between. Your PC sends the image data as an analog signal through VGA, pixel by pixel. These pixels are projected instantly in the requested color on the screen.
accideath@lemmy.world 5 months ago
And no motion blur because the image is not persistent. LCDs have to change their current image to the new one; the old image stays until it's replaced. CRTs draw their image line by line, and only the last few lines are actually on screen at any time. It just happens so fast that, to the human eye, the image looks complete. Although CRTs usually do have noticeable flicker, while LCDs usually do not.
frezik@midwest.social 5 months ago
Of course there’s buffers. Once RAM got cheap enough to have a buffer to represent the whole screen, everyone did that. That was on the late 80s early 90s.
There’s some really bad misconceptions about how latency works on screens.
photonic_sorcerer@lemmy.dbzer0.com 5 months ago
No such thing as instant. There is always some latency.
Hagdos@lemmy.world 5 months ago
That makes 0 latency in the monitor, but how much latency is there in the drivers that convert a digital image to analogue signals? Isn’t the latency just moved to the PC side?
frezik@midwest.social 5 months ago
They don’t have zero latency. It’s a misconception.
The industry standard way to measure screen lag is from the middle of the screen. Let’s say you have a 60Hz display and hit the mouse button to shoot the very moment it’s about to draw the next frame, and the game manages to process the data before the draw starts. The beam would start to draw, and when it gets to the middle of the screen, we take our measurement. That will take 1 / 60 / 2 = 8.3ms.
Some CRTs could do 90Hz, or even higher, but those were really expensive. Modern LCDs can do better than any of them, but it took a long time to get there.
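If you want to play with the mid-screen measurement described above, here's a minimal sketch (the formula and the 60 Hz figure come from the comment; the other refresh rates are just examples):

```python
# Average scanout lag per the industry convention described above:
# measure when the beam reaches mid-screen, i.e. half a frame period.
def midscreen_lag_ms(refresh_hz: float) -> float:
    frame_time_ms = 1000.0 / refresh_hz  # time to draw one full frame
    return frame_time_ms / 2.0           # beam hits mid-screen halfway in

print(f"60 Hz:  {midscreen_lag_ms(60):.1f} ms")   # 8.3 ms, matching the figure above
print(f"90 Hz:  {midscreen_lag_ms(90):.1f} ms")
print(f"120 Hz: {midscreen_lag_ms(120):.1f} ms")
```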
bjoern_tantau@swg-empire.de 5 months ago
Actually, 60 Hz was too low to comfortably use a CRT. I think it started to work well at 75 Hz, better at 80 or 85. I don't know if I ever had a 90 Hz one, especially at a resolution above 1280x960. But if you valued your eyes you never went down to 60.
No idea why 60 Hz on an LCD works better, though.
myplacedk@lemmy.world 5 months ago
Because it draws those “pixels” as the signal reaches the monitor. When half of a frame is transmitted to a CRT monitor, it’s basically half way done making it visible.
An LCD monitor needs to wait for the entire frame to arrive, before it can be processed and then made visible.
Sometimes the monitor will wait for several frames to arrive before it processes them. This enables some temporal processing. When you put a monitor in gaming mode, it disables (some of) this.
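As an illustrative sketch of the buffering point above (the frame counts here are assumptions, not measurements from any specific monitor), the extra delay scales directly with the refresh period:

```python
# Added display lag from buffering whole frames before showing them,
# versus a CRT that scans the signal out as it arrives.
def buffering_lag_ms(frames_buffered: int, refresh_hz: float) -> float:
    return frames_buffered * 1000.0 / refresh_hz

# One frame of buffering at 60 Hz already costs a full frame period:
print(f"{buffering_lag_ms(1, 60):.2f} ms")  # 16.67 ms
# A TV doing temporal processing on, say, 3 buffered frames (assumed):
print(f"{buffering_lag_ms(3, 60):.2f} ms")  # 50.00 ms
```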
Jesus_666@feddit.de 5 months ago
If that’s how TFTs worked we wouldn’t have vsync settings in games.
ShortFuse@lemmy.world 5 months ago
The transmission is still the same with the exception of things like VRR and DSC. We still send a VBLANK signal which is the electronic signal to tell a CRT to move up to the top of the screen. We don’t change the way things are sent. It’s still top down, left to right. VSync and HSync are still used but make less obvious sense on LCDs. Digital displays translate this.
Because LCDs convert these signals, we call the time it takes to do the conversion “draw time” but this isn’t as important today. What matters now is the time it takes for a pixel to change one color to another (response time). Because a CRT would fire electrons, the next frame would essentially vanish pretty quickly. LCDs don’t do this.
Conversely, OLEDs are plenty fast, but can't reproduce the same pixel response without inserting a black frame (Black Frame Insertion), which sacrifices brightness and is slowly being removed.
Still, most "lag" comes from transmission time. It takes 1/60 of a second to transmit a full frame at 60 Hz. Divide that by 2 to get the "average" lag, and CRTs would measure at 8.3333 ms. LCDs were happy to get to 10 ms.
Now we can do 120 Hz, which is way more important: even if CRTs are faster, you can get the whole image out in half the time, which "averages" to 4.1666 ms, making even a "4 ms" slow LCD on PC better than a console running at 60 Hz on a CRT.
And while CRTs could reach high resolutions, they were limited by their HSync speed, which usually meant lower resolution, because a CRT could only move so quickly horizontally.
Today that translates to: an OLED is best for emulating any console that ran at 60 Hz, with better or equal pixel response time if you're willing to do BFI. The main reason the competitive Melee community still uses CRTs is mostly pricing, second to FUD.
Socsa@sh.itjust.works 5 months ago
The motion blur thing is complete nonsense. It’s never been a benefit of CRT and reveals this greentext to be fake and gay.
mindbleach@sh.itjust.works 5 months ago
Those pixels appear basically as soon as the signal arrives at the back of the monitor, and they’re gone within a dozen scanlines. Watch slow-motion video of a CRT and you’ll see there’s only a narrow band that’s bright at any given moment.
bjoern_tantau@swg-empire.de 5 months ago
First rule at our LAN parties: You carry your own monitor.
We’d help each other out with carrying equipment and snacks and setting everything up. But that big ass bulky CRT, carry it yourself!
Inktvip@lemm.ee 5 months ago
Not necessarily if we’re the one walking in with the DC++ server. Getting that thing up and running was suddenly priority #1 for the entire floor.
slazer2au@lemmy.world 5 months ago
Oh boy do I miss DC++
Psythik@lemmy.world 5 months ago
Hell, modern displays are just now starting to catch up to CRTs in the input lag and motion blur department.
It was brutal putting up with these shitty LCDs for two whole decades, especially the fact that we had to put up with 60Hz and sub-1080p resolutions, when my CRT was displaying a 1600x1200 picture at 85Hz in the 90s! It wasn’t until I got a 4K 120Hz OLED with VRR and HDR couple years ago that I finally stopped missing CRTs, cause I finally felt like I had something superior.
Twenty fucking years of waiting for something to surpass the good old CRT. Unbelievable.
Heavybell@lemmy.world 5 months ago
LCDs came in just in time for me to be attending LAN parties in uni. Got sick of lugging my CRT up the stairs once a week pretty quickly and was glad when I managed to get my hands on an LCD. I can’t even remember if I noticed the downgrade, I was so thrilled with the portability.
Aux@lemmy.world 5 months ago
If input lag is the only measure for you, ok. But LCDs have surpassed CRTs in pretty much every other metric at least a decade ago.
Psythik@lemmy.world 5 months ago
Not just input lag (I mean I literally mentioned other things too but you obviously didn’t read my entire comment) but also contrast ratio, brightness in LUX, color volume and accuracy, response time, viewing angle, displaying non-native resolutions clearly, flicker, stutter… Should I go on?
All things that LCDs struggled on and still struggle on. OLED fixes most of these issues, and is the only display tech that I’d consider superior to a CRT.
ZILtoid1991@lemmy.world 5 months ago
Most people didn't own a CRT capable of 1600x1200@85Hz; most were barely, if at all, better in the resolution department than your average "cube" LCD (one of which I'm currently using beside my main 32" QHD display). I have owned a gargantuan beast like that with a Trinitron tube. I could run it at 120 Hz at 1024x768, and at higher resolutions without much flicker, but it had issues with the PCBs cracking, so it was replaced with a much more mediocre and smaller CRT with much lower refresh rates.
Wilzax@lemmy.world 5 months ago
You know we had 1080p 120hz displays 15 years ago, right?
Psythik@lemmy.world 5 months ago
In an OLED? They weren’t affordable 10 years ago.
A 10 year old LCD is not good. The resolution and refresh rate is irrelevant.
Vilian@lemmy.ca 5 months ago
0 input lag lmao
fallingcats@discuss.tchncs.de 5 months ago
Just goes to show many gamers do not in fact know what "input lag" is. I've seen the response time a monitor adds called input lag way too many times. And that mostly doesn't in fact include the delay a (wireless) input device might add, or the GPU (with multiple frames in flight) for that matter.
vardogor@mander.xyz 5 months ago
seems pretty pedantic. the context is monitors, and it’s lag from what’s inputted to what you see. plus especially with TVs, input lag is almost always because of response times.
PieMePlenty@lemmy.world 5 months ago
Let's see if I get this right: input lag is the time it takes from when you make an input (move your mouse) to when you see it happen on screen. So even the speed of light is at play here.
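That definition is basically a sum over the whole chain. As a back-of-envelope sketch (every stage timing here is a made-up illustrative value, not a measurement):

```python
# Illustrative end-to-end input-lag budget; all values are assumptions.
pipeline_ms = {
    "USB/controller polling": 1.0,
    "game update loop":       8.0,   # e.g. waiting on the next game tick
    "GPU render + queue":    10.0,   # worse with multiple frames in flight
    "frame transmission":     8.3,   # mid-screen average at 60 Hz
    "panel response":         4.0,
}

total = sum(pipeline_ms.values())
for stage, ms in pipeline_ms.items():
    print(f"{stage:<24} {ms:5.1f} ms")
print(f"{'total':<24} {total:5.1f} ms")
```

The point is that the monitor's contribution is only two of these rows; the rest happens before the signal ever reaches the display.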
Hadriscus@lemm.ee 5 months ago
Once I tried playing Halo or Battlefield on a friend’s xbox with a wireless controller on a very large TV. I couldn’t tell which of these (the controller, the tv or my friend) caused the delay but whatever I commanded happened on the screen, like, 70ms later. It was literally unplayable
mindbleach@sh.itjust.works 5 months ago
CRTs perfectly demonstrate engineering versus design. All of their technical features are nearly ideal - but they’re heavy as shit, turn a kilowatt straight into heat, and take an enormous footprint for a tiny window. I am typing this on a 55" display that’s probably too close. My first PC had a 15" monitor that was about 19" across, and I thought the square-ass 24" TV in the living room was enormous. They only felt big because they stuck out three feet from the nearest wall!
JackbyDev@programming.dev 5 months ago
The heavy part truly cannot be overstated. I recently got a tiny CRT, not even a cubic foot in size. It's about the same weight as my friend's massive OLED TV. Of course, OLED is particularly light, but still. It's insane!
mindbleach@sh.itjust.works 5 months ago
And it’s a vacuum tube. How does nothing weigh this much?!
Plasma screens weren't much better, at first. I had a 30" one circa 2006, maybe three inches thick, and you'd swear it was solid metal. A decade later we bought a couple 32" LCD TVs, then a few more because they were so cheap, and the later ones weighed next to nothing. Nowadays - well, I walked this 55" up and down a flight of stairs by myself, and the only hard parts were finding somewhere to grab and not bonking any walls.
FrostyCaveman@lemm.ee 5 months ago
That pic reminds me of something. Anyone else remember when “flatscreen” was the cool marketing hype for monitors/TVs?
TSG_Asmodeus@lemmy.world 5 months ago
Anyone else remember when “flatscreen” was the cool marketing hype for monitors/TVs?
We got to move these refrigerators, we got to move these colour TV’s.
ZombiFrancis@sh.itjust.works 5 months ago
Those flatscreen CRTs were pretty great for their time though. Maybe/probably rose tinted glasses but man I remember them being plain better monitors overall.
daltotron@lemmy.world 5 months ago
They probably were in terms of viewing angles at the time of release, and probably were better if you had a technician who could come and adjust them, or could adjust them at the store before they were sold, but I think flatscreen CRTs had a much higher tendency toward image warping over time.
Sanctus@lemmy.world 5 months ago
God that shit was so dope. LAN parties are the shit.
Etterra@lemmy.world 5 months ago
It was a dark day for gamers when the competitive things crawled out of their sports holes.
synapse1278@lemmy.world 5 months ago
Good’old Warcraft III
Bezzelbob@lemmy.world 5 months ago
Fr, if you're worried about 2 ms of input lag then it isn't the lag, you're just bad
sugar_in_your_tea@sh.itjust.works 5 months ago
And it's not input lag, it's screen refresh. Input lag has more to do with peripherals and game update loops than screen rendering.
fox2263@lemmy.world 5 months ago
I miss my CRT
tal@lemmy.today 5 months ago
I mean, I have some nostalgia moments, but while OP's got a point that the LCD monitors that replaced CRTs were in many ways significantly worse at the time, I think that in pretty much all aspects, current LCD/LEDs beat CRTs.
Looking at OP’s benefits:
0 motion blur
CRT phosphors didn't just immediately go dark. They were better than some LCDs at the time, yeah, which were very slow and had enormous cursor trails. But if you've ever watched a flashing cursor on a CRT actually fade out, you know that there was some delay.
en.wikipedia.org/…/Comparison_of_CRT,_LCD,_plasma…
Response time: 0.01 ms[14] to less than 1 μs,[15] but limited by phosphor decay time (around 5 ms)[16]
0 input lag
That’s not really a function of the display technology. Yeah, a traditional analog CRT television with nothing else involved just spews the signal straight to the screen, but you can stick processing in there too, as cable boxes did. The real problem was “smart” TVs adding stuff like image processing that involved buffering some video.
At the time that people started getting LCDs, a lot of them were just awful in many respects compared to CRTs.
- As one moved around, the color you saw on many types of LCDs shifted dramatically.
- There was very slow response time; moving a cursor around on some LCD displays would leave a trail, as it sluggishly updated. Looked kind of like some e-ink displays do today.
- Contrast wasn't great; blacks were really murky grays.
- Early LCDs couldn't do full 24-bit color depth, and dealt with it by dithering, which was a real step back in quality.
- Pixels could get stuck.
But those have mostly been dealt with.
CRTs had a lot of problems too, and current LED/LCD displays really address those:
- They were heavy. This wasn't so bad early on, but as CRTs grew, they really started to suck to work with. I remember straining a muscle in my back getting a >200lb television up a flight of stairs.
- They were blurry. That can be a benefit, in that some software, like games, had graphics optimized for them, letting the blur "blend" pixels together, which is why old emulators often have some form of CRT video artifact emulation. But in a world where software can be designed around a crisp, sharp display, I'd rather have the sharpness. The blurriness also wasn't always even, especially on flat-screen CRTs; it tended to be worse in the corners. And you could only get the resolution so high.
- There were scanlines; brightness wasn't even.
- You could get color fringing.
- Sony Trinitrons (rather nice late CRT computer displays) had a faint horizontal line crossing the screen where a thin wire was placed to stabilize the aperture grille.
- They didn't deal so well with higher aspect ratios (well, if you wanted a flat display, anyway). For movies and such, we're better off with wider displays.
- Analog signalling meant that as cables got longer, the image got blurrier.
sverit@lemmy.ml 5 months ago
I’m sure he misses you too, buddy
figaro@lemdro.id 5 months ago
It’s ok, if anyone wants them back the smash brothers melee community has them all in the back of their car
ssj2marx@lemmy.ml 5 months ago
love me tower, love me CRT, simple as
PriorityMotif@lemmy.world 5 months ago
Wider fov
HEXN3T@lemmy.blahaj.zone 5 months ago
The amount of CRTs I own has actually been increasing lately.
jaybone@lemmy.world 5 months ago
Guy on the left has this maniacal smile. Like he just got an A on the midterm at Clown College for Advanced Villainry.
Kolanaki@yiffit.net 5 months ago
I cared enough that my first flat screen monitor was a modern IPS display; I still had a CRT well after the iPhone.
r00ty@kbin.life 5 months ago
I think most people that were gaming held onto their CRTs as long as possible. The main reason being, the first generation of LCD panels took the analogue RGB input, and had to present that onto the digital panel. They were generally ONLY 60hz, and you often had to reset their settings when you changed resolution. Even then, the picture was generally worse than a comparable, good quality CRT.
People upgraded mainly because of the reduced space usage and that they looked aesthetically better. Where I worked, we only had an LCD panel on the reception desk, for example. Everyone else kept using CRTs for some years.
CRTs on the other hand often had much better refresh rates available, especially at lower resolutions. This is why it was very common for competitive FPS players to use resolutions like 800x600 when their monitor supported up to 1280x960 or similar. The 800x600 resolution would often allow 120 or 150hz refresh.
When LCD screens with a fully digital interface became common, even though they were pretty much all 60hz locked, they started to offer higher resolutions and in general comparable or better picture quality in a smaller form factor. So people moved over to the LCD screens.
Fast-forward to today, and now we have LCD (LED/OLED/Whatever) screens that are capable of 120/144/240/360/Whatever refresh rates. And all the age-old discussions about our eyes/brain not being able to use more than x refresh rate have resurfaced.
It's all just a little bit of history repeating.
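The refresh-vs-resolution tradeoff described above falls out of the horizontal scan (HSync) limit. A sketch, assuming a hypothetical monitor with a 95 kHz horizontal scan ceiling and roughly 5% vertical blanking overhead (both numbers are assumptions for illustration):

```python
# Max vertical refresh a CRT can manage at a given resolution, capped by
# its horizontal scan rate. The 5% blanking overhead is an assumption.
def max_refresh_hz(hsync_khz: float, visible_lines: int,
                   blanking_overhead: float = 0.05) -> float:
    total_lines = visible_lines * (1 + blanking_overhead)
    return hsync_khz * 1000.0 / total_lines

# Dropping from 1280x960 to 800x600 frees up refresh headroom:
print(f"800x600:  {max_refresh_hz(95, 600):.0f} Hz")   # roughly 150 Hz
print(f"1280x960: {max_refresh_hz(95, 960):.0f} Hz")   # roughly 94 Hz
```

Fewer lines per frame means the same horizontal scan budget covers more frames per second, which is exactly why competitive players dropped to 800x600.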
Liz@midwest.social 5 months ago
I like how no one mentions that CRT pixels bleed into each other.
RememberTheApollo_@lemmy.world 5 months ago
And it worked as AA wasn’t as important in that “fuzzier” screen when graphics aren’t as good as they are today.
r00ty@kbin.life 5 months ago
Are you sure it was CRT technology? Because bear in mind, colour CRTs had to focus the beam so accurately that it only hit the specific "pixel" for the colour being lit at that time. What there was, was blur from bad focus settings, age and phosphor persistence (which is still a thing in LCD to an extent).
What DID cause blur was the act of merging the image, the colour and the synchronisation into a composite signal. All the mainstream systems (PAL, SECAM and NTSC) would cause a blurring effect. Games on 80s/90s consoles generally used this to their advantage, and you can see the dithering effects clearly on emulators of systems from that period. Very specifically, the colour signal sharing spectrum with the luminance signal would lead to a softening of the image which would appear like blurring. Most consoles from the time only output either an RF signal for a TV or if you were lucky a composite output.
Good computer monitors (not TVs) of the time were extremely crisp when fed a suitable RGB signal.
ColdWater@lemmy.ca 5 months ago
I thought that dude was a woman with their hair tied up
RememberTheApollo_@lemmy.world 5 months ago
I had a 20-odd inch CRT with the flat tube. Best CRT I ever had, last one I had before going to LCD. Still miss that thing, the picture was great! Weighed a ton, though.
pewgar_seemsimandroid@lemmy.blahaj.zone 5 months ago
plasma TV?
RightHandOfIkaros@lemmy.world 5 months ago
People did care, which is why people who played games competitively continued to use CRT monitors well into the crappy LCD days.
julianh@lemm.ee 5 months ago
Idk if it’s just me but I have pretty good hearing, so I can hear the high pitch tone CRTs make and it drives me crazy.
RightHandOfIkaros@lemmy.world 5 months ago
This only happens with TVs or very low quality monitors. The flyback transformer vibrates at a frequency of ~15.7 kHz, which is audible to the human ear. However, most PC CRT monitors have a flyback transformer that vibrates at ~32 kHz, which is beyond the human hearing range. So if you are hearing the high-frequency noise some CRTs make, it is most likely not coming from a PC monitor.
It's a sound that's part of the experience, and your brain tunes it out pretty quickly after repeated exposure. If the TV is playing sound such as game audio or music, it becomes almost undetectable. Unless there is a problem with the flyback transformer circuit, which causes the volume to be higher than it's supposed to be.
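The whine pitch is just the horizontal line rate, since the flyback drives the horizontal deflection. A sketch using two real standard timings (NTSC and VGA 640x480@60, both of which happen to use 525 total lines per frame):

```python
# Flyback whine pitch = horizontal scan rate = total lines x frame rate.
def line_rate_khz(total_lines: int, frames_per_sec: float) -> float:
    return total_lines * frames_per_sec / 1000.0

# NTSC TV: 525 lines at ~29.97 frames/s -> well inside human hearing.
print(f"NTSC TV:     {line_rate_khz(525, 29.97):.2f} kHz")  # ~15.73 kHz
# VGA 640x480@60: 525 total lines at ~59.94 Hz -> above ~20 kHz hearing limit.
print(f"VGA monitor: {line_rate_khz(525, 59.94):.2f} kHz")  # ~31.47 kHz
```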
SkaveRat@discuss.tchncs.de 5 months ago
the solution is to be in your mid-30s
systemglitch@lemmy.world 5 months ago
Ditto, lcd’s were a godsend.
nadiaraven@lemmy.world 5 months ago
eeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee
(me too)
SpaceCowboy@lemmy.ca 5 months ago
For me it was the refresh. If a CRT was at 60Hz, I could see it flashing when I wasn’t looking directly at it. I had to have it set to at least 75 Hz (>80 Hz preferably) or it would give me a headache.
seliaste@lemmy.blahaj.zone 5 months ago
This and the scanlines actually make it feel so weird to look at for me, I hate crts with a passion
SpikesOtherDog@ani.social 5 months ago
You beat me to the punch.
We were absolutely considering output delay and hoarding our CRT monitors.
Some of us were also initially concerned about input delay from early USB, until we were shown that, while it is slower, the difference was unnoticeable.
Protoknuckles@lemmy.world 5 months ago
I forgot that! PS/2 mice and keyboards were hoarded.
A_Very_Big_Fan@lemmy.world 5 months ago
🐝
HackerJoe@sh.itjust.works 5 months ago
They do hum and buzz.
sanosuke001@lemmynsfw.com 5 months ago
I bought a Sun Microsystems 24" widescreen CRT for $400 on eBay back in 2003ish? iirc. It was 100lbs and delivered on a pallet lol. There’s a reason why they didn’t get very big and were mostly 4:3. 1920x1200 and like 30" deep! But, they did exist!
RightHandOfIkaros@lemmy.world 5 months ago
I know it. I myself have 2, a 1981 JVC UHF/VHF radio 4.5", and a Sylvania 27" with a DVD/VHS combo unit built into it.
They even made a curved ultrawide CRT once, surprisingly. Though it cost a fortune when it came out.
SpaceCowboy@lemmy.ca 5 months ago
Yeah, right?
The fact that we know about this decades later is because people actually did care about it.
When LCDs (then later LEDs) improved this concern kinda faded away. Which makes sense.
rickyrigatoni@lemm.ee 5 months ago
On this note, anybody know where I can get a 1600x1200 crt these days?
HackerJoe@sh.itjust.works 5 months ago
eBay? If you can get an IBM P77 or Sony G220 (they are the same) in good working condition you should be golden. Those are awesome. They go up to 170 Hz, and 75 Hz at 1600x1200. They can even do 2048x1536, although that would be out of spec and only 60 Hz (barely usable but fucking impressive).
InFerNo@lemmy.ml 5 months ago
It even took on some weird proportions, where "pro" gamers set their game to display 4:3 on their widescreen LCDs.
Habits die hard.