The static on old CRT TVs with rabbit ears was the cosmic microwave background. No one born in the last 25 years has ever seen it.
Well, not really. The cosmic microwave background radiation was a tiny fraction of that noise. What everyone saw was mostly thermal noise generated by the amplifier circuit inside the TV.
Do you think CRTs just magically disappeared after the turn of the millennium?
Don’t you still see this when using an OTA ATSC tuner on a newer LCD display? I thought this was a function of the signal generation and not the display technologies.
It actually was a pretty rapid switch where all the CRTs disappeared.
Cheap LED TVs were like 1/5 the cost of analog TVs. The digital switchover really finished them off too.
Really it's the size/price that did it though. My buddy paid I think $3k for a maybe 40” Trinitron in '99-2000. It probably weighed 200 lbs. Looked amazing at the time, but it was probably only months before big LEDs came out. Plasma might have been a thing then, but they were like $10k+.
They lied to us. The real Y2K was the CRT rapture.
I think they’re more likely to have been scrapped than other old tech.
They’re bulky, and mine was too heavy to get out in the attic. I still have my ZX Spectrum and Amiga, but the CRT needed for lightgun games is long gone.
Well to be fair at some point most/all CRTs showed a blue screen instead of static. So it’s possible someone born in 2000 never saw the snowy display.
As someone born in 2000, I’ve personally seen it and I think most people around me did. Maybe someone didn’t, though.
No, I just couldn't remember exactly when. And as another commenter pointed out, what I should have said was analog TVs.
People born before 2000 think older technology just evaporated the minute the millennium ticked over.
Like when the black and white world suddenly got colorized! My grandpap told me about them old days - when the lawn, the sidewalk and the sky were just different shades of gray.
Grandpa was telling you about 50 shades of grey?
Grandpa knew things. Apparently so did grandma.
It is entirely possible for people born after 2000 to have grown up with CRTs.
It is, but those late model CRTs often had a lot of digital circuitry that displayed a solid color on channels with nothing on them. Unless there was a much older CRT around, they never would have seen it.
Most of the CRTs are going to be older.
2001 here, literally grew up with CRT static. You have your years a bit off there.
I was about to say, I think we had a CRT till about 2010. My grandma still has one upstairs, so even my youngest cousins still grew up with it.
By the way, the picture illustrating the post isn’t actually displaying the real thing - the noise in it is too squarish and has no grey tones.
TV static in recent movies and shows that are set in the past almost always instantly pull me out of the narrative because no one seems to be able to get it right and some are just stunningly bad. It’s usually very subtle, so much so that I’m not sure I could even describe what’s wrong. Makes me feel old to notice it.
I think the problem is that CRT displays didn't have pixels, so the uniform noise that is static was not only uniformly spread in distribution and intensity (i.e. greyscale level) but also had “dots” of all sizes.
Another possible thing that's off is the speed at which the noise changes: was it the 25fps refresh rate of a CRT monitor, related to that rate but not necessarily at that rate, or did the noise itself have more persistent and less persistent parts?
The noise is basically the product of radio waves at all frequencies and various (though all low) intensities, with only the ones that could pass the bandpass filter of the TV tuner coming through (and being boosted in intensity by the automatic gain control) and being painted onto a phosphor screen (hence no pixels) as the beam drew the screen line by line, 25 times per second. So to get that effect right you probably have to simulate it mathematically from a starting point of random radio noise, and it can't go through things with pixels (such as 3D textures) on the way to being shown; it probably needs some kind of procedural shader.
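Something like this rough sketch is what I have in mind (a toy model with made-up parameters, not a physically exact simulation): generate broadband noise per scanline, band-limit it to a guessed video bandwidth, and paint it line by line so the blobs come out with a range of sizes and grey levels instead of square pixels.

```python
import numpy as np

def analog_snow_frame(lines=576, width=1024, video_bandwidth_frac=0.3, rng=None):
    """One frame of simulated analog 'snow' (illustrative only)."""
    rng = rng or np.random.default_rng()
    frame = np.empty((lines, width))
    for y in range(lines):
        # White noise for this scanline, low-passed to mimic the tuner/video bandwidth,
        # which is what smears the noise horizontally into blobs of varying size.
        raw = rng.normal(size=width)
        spectrum = np.fft.rfft(raw)
        cutoff = int(len(spectrum) * video_bandwidth_frac)
        spectrum[cutoff:] = 0
        frame[y] = np.fft.irfft(spectrum, n=width)
    # Normalize to 0..1 grey levels; the result has a full range of greys,
    # unlike per-pixel uniform noise.
    frame -= frame.min()
    frame /= frame.max()
    return frame

# frame = analog_snow_frame()
# Regenerate 25 (PAL) or 30 (NTSC) times a second for the temporal flicker,
# or feed successive frames into a shader as a texture.
```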
I bought a plasma in 2009 that would show static if I turned it to cable channels without cable plugged in. Plasmas were susceptible to burn in and since I would game a lot I could see health bars etc start to burn in after a while. Whenever that would happen I would turn it to the static screen - making each pixel flip from one end of the spectrum to the other rapidly like that would actually help remove the burn in.
Tube TVs remained in common service well into the 2010s. The changeover from analog to fully digital TV transmission did not happen until 2009, with many delays in between, and the government ultimately had to give away digital-to-analog tuner boxes because so many people still refused to let go of their old CRTs.
Millions of analog TVs are still languishing in basements and attics in perfect working order to this very day, still able to show you the cosmic background, if only anyone would dust them off and plug them in. Plenty are still in retro gaming nerds' setups, too. I have one, and it'll show me static any time I ask. (I used it to make this gif, for instance.)
In fact, with no one transmitting analog television anymore (with some very small-scale hobbyist exceptions), the cosmic background radiation is all they can show you now if you're not inputting video from some other device. Or unless you have one of those dopey models that detects a no-signal situation and shows a blue screen instead. Those are lame.
Amateur radio operators are indeed allowed to transmit analog NTSC television in the UHF band. It's most commonly done on the 70cm (440MHz) band, and a normal everyday '90s television is all you need to receive the signals. You'd tune to what would have been cable channels 57 through 61. The use cases for this have decreased in recent years; for example, you used to see hams using amateur television to send video signals from RC aircraft or model rockets, but now that's done with compressed digital video over something like Wi-Fi and doesn't require a license. Still, it's legal for hams to do.
I think my mom still uses the last CRT TV that I had. Gave it to her when I bought my first 720p HD TV, as the old CRT was better than her old TV. Later on I also gave her that HD TV but she still has the CRT too.
It really isn’t though. It is thermal noise.
Random radio sources, but a small part of the signal is CMB. I wasn’t sure what you even meant by thermal noise but I believe it’s a phenomenon of flatscreens. I found something that said it was “similar to snow on analog TVs” - so apparently there’s a difference.
Funnily, Google AI says, “In the 1940s, people could detect the CMB at home by tuning their TVs to channel 03 and measuring the remaining static after removing other sources. This allowed them to prove the Big Bang before scientists did.” So they had that going for 'em, which is nice.
“Thermal noise” is a phenomenon where everything above absolute zero makes EM noise, just from thermal energy.
If you were to put such a TV in a Faraday cage with an RF termination, you would still see something similar, because noise is inherently part of the circuitry and amplifiers.
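For a rough sense of scale, here's a back-of-the-envelope Johnson-Nyquist (thermal) noise calculation at a tuner's input; the temperature, impedance, and bandwidth values are assumptions, not measurements of any particular set:

```python
import math

# Johnson-Nyquist (thermal) noise: V_rms = sqrt(4 * k_B * T * R * B)
k_B = 1.380649e-23   # Boltzmann constant, J/K
T   = 300.0          # room temperature, K (assumed)
R   = 75.0           # typical antenna/coax input impedance, ohms (assumed)
B   = 6e6            # analog TV channel bandwidth, ~6 MHz (assumed)

v_rms = math.sqrt(4 * k_B * T * R * B)
p_dbm = 10 * math.log10(k_B * T * B / 1e-3)  # available noise power in dBm

print(f"Thermal noise at the tuner input: {v_rms * 1e6:.2f} uV rms, {p_dbm:.1f} dBm")
# Roughly 2.7 uV rms and about -106 dBm: tiny, but with no station present the
# automatic gain control cranks the gain up, and that noise is what gets painted as snow.
```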
Could it not be both?
Last time I thought about static I wondered why colour TV didn’t show colour static.
Turns out the colour signal was on very specific frequencies, and if it wasn't present, the set would assume it was a black-and-white signal and turn off the colour circuit.
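Conceptually the “colour killer” boils down to something like this toy sketch (the real sets did it with an analogue burst gate and oscillator, not code, and the threshold here is made up): check whether there's energy at the colourburst frequency during the back porch of each scanline, and only enable the chroma decoder if there is.

```python
import numpy as np

def colour_killer(back_porch, fs, f_sc=3.579545e6, threshold=0.1):
    """Return True if a colourburst is present (enable chroma decoding).

    back_porch: samples taken during the back-porch interval of one scanline
    fs:         sample rate in Hz
    f_sc:       subcarrier frequency (NTSC ~3.58 MHz; PAL would be ~4.43 MHz)
    threshold:  relative burst amplitude below which chroma stays muted
    """
    t = np.arange(len(back_porch)) / fs
    # Correlate against the subcarrier (I and Q) to estimate burst amplitude.
    i = np.mean(back_porch * np.cos(2 * np.pi * f_sc * t))
    q = np.mean(back_porch * np.sin(2 * np.pi * f_sc * t))
    burst_amplitude = 2 * np.hypot(i, q)
    return burst_amplitude > threshold

# With pure noise on the input, the estimated burst amplitude stays near zero,
# the function returns False, and the set falls back to black-and-white snow.
```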
#CHSHSHSHSHSHSHSHSHSHSH
Hair stands up
Dude, I was born after 2000 and this is firmly planted in my memories. Maybe people born after 2010 haven't, but 2000?
2002 here, we still had such a TV. For quite a while actually, since we never upgraded and just started using phones and computers instead. It became my console monitor.
Yeah, OP is full of shit. My three sons, all born after 2000, have seen this. Hell, my flat screen will show snow if I turn it to antenna and there's no signal to pick up. I also have a console TV for our old gaming systems, so they've seen that as well.
They also know how a VCR works and what a payphone is. We are not that far removed from that technology. Hell, my middle son (17) has a record collection and CDs. We even have the cassette audiobook version of Stephen King's Dolores Claiborne.
Modern TVs project fake static when there is no signal because of familiarity. OTA broadcasts are all digital; either you get a signal or you don't.
Some TVs may project fake static.
Just because OTA broadcasts are digital doesn't mean you are stuck with all or nothing. You can definitely have poor signal and see or hear something other than what was intended. It doesn't manifest as analog static, but depending on your decoding and error correction schemes, you can have cut audio, frozen frames, I-frame inconsistencies, and stuttering.
No, digital is all or nothing. What you are describing is some digital packets making it through; the algorithm is designed to accept some packet loss and has error correction. It's more complicated than I make it out to be, but that's the gist of it.
It is nothing like analog that's being drowned out by background radiation.
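A toy model of that digital “cliff effect” (made-up block size and correction capability, nothing like the actual ATSC Reed-Solomon/trellis coding): each block decodes perfectly until the error rate passes what the code can correct, then whole blocks fail and the decoder freezes or drops out.

```python
import random

def block_survives(byte_error_rate, block_len=200, correctable=10):
    """True if this block has few enough byte errors for the FEC to fix (toy model)."""
    errors = sum(random.random() < byte_error_rate for _ in range(block_len))
    return errors <= correctable

for ber in (0.01, 0.03, 0.05, 0.08, 0.12):
    ok = sum(block_survives(ber) for _ in range(2000)) / 2000
    print(f"byte error rate {ber:.2f}: {ok:6.1%} of blocks decode cleanly")

# The survival rate goes from essentially 100% to essentially 0% over a narrow
# range of error rates: clean picture, then brief glitches, then nothing at all.
```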
The sky above the port was the colour of television, tuned to a dead channel…
My family had several TVs that did this until around 2013.