Dear God,
I hope they sack this “journalist” quickly.
It really shows how bad the marketing of these higher resolutions is. We always advertised the vertical resolution (720p, 1080p) and then we switched to the horizontal resolution (4K).
You can’t expect a video game journalist to understand basic display principles. EDIT: /s
I get not wanting to call it “2160p” because that’s a lot of syllables. But you’re right, it was really dumb to switch which dimension we are referring to. I’m sure someone could have come up with a better name. Even UHD was better imo.
The thing that really irks me is people who call 2560x1440 “2K”. I have always known 2K to mean 2048x1080, but the other usage has picked up so much traction that the term has pretty much been redefined at this point.
2K is the term I refuse to use in my lingo. I’ll take QHD, or 1440p, but not 2K. 2560 doesn’t even round to 2000 in the thousands place.
2K is supposed to refer to a 2048x2048 square 1:1 aspect image, same with 4K being a 4096x4096 image. This term is correctly used a lot when referring to texture sizes. A 4K texture is 4096x4096 texels.
I think the term started getting mixed up when people discussed which resolutions benefit from texture size increases. Generally, if you are running, say, 4K textures, you would only fully benefit from that if you have a 2160p screen, just because lower resolutions don’t have the definition to actually display those texels. So people started interchanging “4K screen” and “4K-benefitting screen” and we end up where we are now.
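Rough numbers, just to put that in perspective (back-of-the-envelope, and it assumes the texture fills the whole screen, which it usually doesn’t):

```python
# Back-of-the-envelope texel vs pixel counts. Purely illustrative.
texture_4k = 4096 * 4096  # 16,777,216 texels in a "4K" texture
screens = {
    "1080p (1920x1080)": 1920 * 1080,  # 2,073,600 pixels
    "1440p (2560x1440)": 2560 * 1440,  # 3,686,400 pixels
    "2160p (3840x2160)": 3840 * 2160,  # 8,294,400 pixels
}
for name, pixels in screens.items():
    print(f"{name}: {pixels:,} pixels -> {texture_4k / pixels:.1f} texels per on-screen pixel")
```

Even a 2160p screen only has about half as many pixels as a 4K texture has texels, so “4K texture” and “4K screen” were never the same thing to begin with.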
I don’t expect a journalist to know, I expect an editor or fact checker to at least Google “4k resolution”.
Or how about “red dead redemption xbox” to see what the BC version runs at…
Pro-tip: Xbox One S / Series S - 1440p
Xbox One X / Series X - Native 4K
https://www.eurogamer.net/digitalfoundry-2018-red-dead-redemption-4k-xbox-one-x-analysis
It’s pretty confusing
“UHD features a 16:9 aspect ratio and is twice the resolution of full HD. In other words, two times 1080p, two times 1920 x 1080 pixels, that is 3840 x 2160 pixels. Having the same 16:9 aspect ratio means it is backward compatible with other HD derivates. However, both 4K and UHD can be shortened to 2160p to match the HD standard and therefore, companies use the terms interchangeably.”
“If you think 4K and UHD are one and the same, I don’t blame you. I blame the companies that LOVE to use them interchangeably all the time. You pick up a Blu-Ray movie disc of a 4K movie and you will most definitely see an Ultra HD label on it. 4K is actually not a consumer display and broadcast standard but UHD is. 4K displays are used in professional production and digital cinemas and feature 4096 x 2160 pixels”
UHD features a 16:9 aspect ratio and is twice the resolution of full HD
Heh, no. 4k is exactly four times the resolution of 1080p.
1920 x 2 = 3840 (4K UHD)
That’s what he’s talking about.
Yeah but that would only be an increase in the horizontal resolution… you’d have 3840 x 1080.
So you gotta double the vertical resolution too, which means you’ve now doubled both horizontal and vertical resolutions, which is equal to 4 times the initial resolution
It is double the resolution, because resolution is expressed as an x,y pair. It is 4 times the pixel density for the same screen size.
Actually, display resolution refers to exactly what you call pixel density, and NOT the pixel dimensions. This error is so common that the term resolution has practically been redefined outside of the professional (science and engineering) space, but technically, display resolution and pixel density are the same thing.
UHD is 4x Full HD resolution. The person who wrote that can’t even do math. That’s like saying a 2m x 2m square is only twice the area of a 1m x 1m square because each side is twice as long.
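Quick sanity check on the numbers, for anyone who wants it spelled out:

```python
# Doubling both dimensions quadruples the pixel count.
fhd = 1920 * 1080  # 2,073,600 pixels (Full HD)
uhd = 3840 * 2160  # 8,294,400 pixels (UHD)
print(uhd / fhd)   # 4.0
```

Each dimension is doubled, but the total pixel count goes up by 2 x 2 = 4.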
It’s obviously talking about horizontal lines, not pixels
No, they specifically say:
UHD features a 16:9 aspect ratio and is twice the resolution of full HD.
According to Wikipedia, resolution is:
The display resolution or display modes of a digital television, computer monitor or display device is the number of distinct pixels in each dimension that can be displayed
https://en.wikipedia.org/wiki/Display_resolution
Resolution is the number of pixels in both dimensions, so they are wrong
I think the point is that it’s ducking hard to talk about lmao 🤣
How is it obvious that they are talking about horizontal when they also include vertical in the same calculation?
They just don’t know the difference between pixels and lines.
It’s pixels. Why do you think QHD (Quad HD) is called that? Because it’s 4x the pixels of HD (720p).
You can’t talk about only horizontal resolution, because that opens the door to ultrawides and deceptive marketing, such as AMD using “8K” to show off their new GPUs when in fact they intentionally used an ultrawide and marketed it as 8K.
I don’t know that it’s THAT confusing, since by definition we’re talking consumer grade products, not professional grade.
And that’s a distinction consumers have been making for years.
https://www.caranddriver.com/news/a36125131/2022-hyundai-santa-cruz-pickup-revealed/
I mean, yeah, technically it’s classified as a pickup truck… but nobody will ever confuse it for:
I thought the term “basic” would hint at the sarcasm, but I failed.
It really isn’t that hard to grasp, unless you are trying to frame your article a certain way.
You can’t expect a video game journalist to understand basic display principles.
Yes you can.
Wtf is sarcasm?
Sarcasm is my wife.
AI-generated article, or pure incompetence. Or both.
Dexerto, Gamerant, and similar sites are all just AI-generated. You can see these kinds of mistakes so often.
I was looking up and down for any explanation as to why 4k is different from 2160p. Shows that this one has no clue what they’re talking about.
It’s nitpicking; whether it runs at 3840x2160 or 4096x2160 does not matter. Same goes for calling it 4K or UHD, even when one is technically incorrect.
If even Sony calls their 3840x2160 Blu-rays “4K UHD”, I’m fine with the average person using the terms interchangeably.
I had to go digging but 3840x2160 is both 2160p AND 4k UHD. 4096x2160 is something called 4K DCI which is more of a camera or film industry thing and is rarely used for things like TVs or video games.
1080p, 1080i, and 720p (i.e. the i/p suffix) denote a SMPTE resolution and timing.
HD/FHD/UHD (720,1080,2160 respectively) also denote SMPTE resolutions and timings.
These are SMPTE ST 2036-1 standards, which are 16:9 and have defined (but not arbitrary) frame rates up to 120fps.
4K DCI is still a SMPTE timing, but used for cinema and is generally 24fps (though it can be 48fps for 2K DCI). It’s SMPTE ST 428-1. There are other “4K” standards, but not nearly as common.
Arbitrary resolutions or timings outside of the SMPTE standards generally fall into VESA standard resolutions/timings or custom EDID resolutions/timings.
Chances are your computer is actually running 1920x1080@60 CVT-RB rather than 1080p60. Whilst 1080p60 and 1920x1080@60 seem like they should be the same, some displays (and devices) might only support SMPTE timings or VESA timings.
So, although a display is 1920x1080 it might expect SMPTE, but the device can only output VESA.
Wow. This was very informative. Thanks!
No problem.
Displays, resolutions, framerates, and EDIDs are all very complex. And marketing muddies the water!
I’ve encountered this issue before when using BlackMagic equipment.
What I was plugging into was described to me as “1080p”.
Laptop directly into it would work, and it looked like 1080p in windows display management.
Going through BlackMagic SDI converters (SDI is a SMPTE standard protocol, so these boxes went hdmi->sdi, sdi cable, sdi->hdmi, and would only support SMPTE resolutions/timings), the display wouldn’t work.
Because the display was VESA only.
I then read a lot about SMPTE, VESA, and EDIDs!
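Rough sketch of the gist, for anyone curious (purely illustrative, not a real EDID tool, and the timing figures are the commonly quoted approximate ones):

```python
# Same 1920x1080 active picture, two different timing standards.
# Totals and pixel clocks below are the commonly quoted values; treat them as approximate.
modes = {
    "SMPTE/CEA 1080p60":        {"total": (2200, 1125), "pixel_clock_mhz": 148.5},
    "VESA CVT-RB 1920x1080@60": {"total": (2080, 1111), "pixel_clock_mhz": 138.5},
}
for name, m in modes.items():
    w, h = m["total"]
    refresh = m["pixel_clock_mhz"] * 1e6 / (w * h)
    print(f"{name}: total raster {w}x{h}, {refresh:.2f} Hz")
```

Same active pixels, but different blanking and pixel clock, so a display or converter that only speaks one set of timings can refuse the other even though both are nominally “1920x1080@60”.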
Correct, but both can be called 2160p just because of their vertical resolution. Overall both terms don’t matter in gaming because aspect ratio can be changed on the fly (on PC) depending on the output device. Haven’t touched a console in years but I assume they are stuck with a 16:9 aspect ratio no matter what they are playing on?
Same goes for calling it 4K or UHD, even when one is technically incorrect.
Why is it incorrect? 4K isn’t a formal standard. It just means you have approximately 4,000 horizontal pixels.
Calling 3840x2160 “4k” makes sense since 3840 is so close.
On a different note, sometimes I’ve heard people call 2560x1440 “2K”, but neither 2560 nor 1440 is close to 2K, so that makes little sense to me.
1920x1080 is closer to 2K if anything.
Yep, Full HD is a 2K resolution: https://en.wikipedia.org/wiki/2K_resolution
Heh, I never knew this and I am tech savvy. TIL.
I think people call 1440p 2k because they know 4k exists and associate 1080p with 1k.
The logic of some people goes that anything under 4000 horizontal pixels is not “real” 4k. But as mentioned, I don’t care and also call 3840x2160 “4k” simply because it’s shorter than “2160p”.
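If you go purely by the “round the horizontal pixel count” logic, it looks something like this (just a toy illustration):

```python
# Toy naming rule: horizontal pixel count rounded to the nearest thousand.
def k_label(width: int) -> str:
    return f"{round(width / 1000)}K"

for width, name in [(1920, "Full HD"), (2048, "DCI 2K"), (2560, "QHD"), (3840, "UHD"), (4096, "DCI 4K")]:
    print(f"{name} ({width} px wide) -> {k_label(width)}")
```

By that rule 1920 and 2048 both land on 2K, 3840 and 4096 both land on 4K, and 2560 would actually come out as “3K”, which is why calling 1440p “2K” never made much sense.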
4K is definitely a formal standard
Ok, can you formally define it or link me to it?
And I don’t want a definition for “4k DCI” or “4k UHD” … just a formally accepted definition of “4k” (in the context of a display resolution). We can all agree that it colloquially means the number 4000, I hope.
There is not one definition. If you hear “4K” you can use the context of the conversation to determine whether they’re talking about the consumer 4K UHD format or cinematic 4K, neither of which has a horizontal resolution of exactly 4000px. UHD standards are maintained by the ITU; DCI standards were developed by the DCI group and are now maintained by SMPTE.
There is not one definition. If you hear “4K” you can use the context of the conversation to determine whether they’re talking about the consumer 4K UHD format or cinematic 4K
I agree. But then it’s not a formal standard.
Huh TIL
I thought it was because 4k has 4x the pixel count of 1080p
4K UHD (along with 8K UHD and 16K UHD) are the consumer format standards for 3840x2160 image formats, which includes Blu-ray. Full 4K or True 4K or DCI 4K is the cinematic 4K standard shown at 4096x2160, which many TVs support via slight letterboxing.
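The numbers behind that letterboxing, roughly (assuming the DCI picture is scaled to fit the width of a 3840x2160 panel):

```python
# DCI 4K (4096x2160, ~1.90:1) fitted onto a UHD panel (3840x2160, 16:9).
dci_w, dci_h = 4096, 2160
uhd_w, uhd_h = 3840, 2160
scale = uhd_w / dci_w            # 0.9375
scaled_h = round(dci_h * scale)  # 2025 active lines
letterbox = uhd_h - scaled_h     # 135 lines of black, split top and bottom
print(f"DCI aspect {dci_w/dci_h:.2f}:1 vs UHD {uhd_w/uhd_h:.2f}:1 -> {letterbox} letterboxed lines")
```

So the bars are pretty thin, roughly 67-68 lines top and bottom, which is why most people never notice.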
I’ve found this. Personally I would not say the difference is worth having another name, except maybe for the sake of differentiating between the aspect ratios.
But it seems that indeed 4K is not 2160p 🤷
Lol. On the PS4 Pro or PS5, Red Dead Redemption can even achieve 4K resolution. In contrast, the game runs at 2160p on Series X.
Just saw the Digital Foundry review. It’s sad that Rockstar didn’t even bother to upgrade the UI elements; almost every UI asset is still 720p.
Damn
The fact it’s STILL 30fps is “hard pass” territory for me on its own, but like. Wow. Lazy.
Probably because they didn’t want to rewrite the physics that are based on 30fps max. Lazy port.
They didn’t need to. The Xbox 360 version can run at 60 FPS in an emulator with a really simple patch
But on the PC version we don’t have a frame cap. My guess would be that a fixed 30fps feels better than fluctuating between 25 and 40.
There’s a PC version of RDR outside of emulation?
deleted by creator
This thread reminds me of that time on the bodybuilding forum when they argued about how many days there are in a week.
9? Are they fucking stupid or something?
Saturday, Sunday, Monday, Tuesday, Wednesday, Thursday, Friday, Saturday and Sunday.
Fucking idiots.