Hello /r/NintendoSwitch!
I have read a lot of false claims about a few technical things that got heavily upvoted, and people are even insulting others who disagree (720p vs 1080p display, for example).
If you are interested in this topic you should read on, even though it's quite long.
So I want to clear some things up.
I have read comments like "a 1080p display would skyrocket the price and the battery life would decrease drastically" or "if you increase the number of pixels you increase the energy consumption - logic". Comments like this get upvoted even though they are wrong.
Now we have to look at a few things separately.
If you want to talk about the price, you have to look at the industry standard. Displays become cheap over time, as you can see with all the cheap 4k TVs. 1080p became the standard a few years ago. These panels are nothing special; you can find them literally everywhere. I bought a Nexus 7 three years ago, which was arguably one of the strongest 7-inch tablets on the market. It had a 1080p screen and I paid 180€. You can buy an Nvidia Shield tablet with a Tegra processor for 200€, and it has an 8-inch screen.
You can already find 1440p screens in 5.5-inch smartphones, so 1080p has already become the "old standard" and it cannot be THAT expensive to "skyrocket" the price.
The next thing is that people claim the battery life would suffer from a 1080p screen.
There are two factors you have to look at: how much power does the screen itself use, and how much power does the GPU use to render the image?
What power does the screen itself use? You have to remember the law of physics that energy cannot be lost. It only changes its form. So if you power a screen with electrical energy, you turn that energy into light and heat. You want light and you don't want heat. We are talking about "efficiency" here, and there is no real difference in efficiency between 720p and 4k screens. If you think a 4k screen has a higher energy consumption, you are really saying that it is either much hotter or much brighter. And that's only the case if you let it be. A 4k screen CAN be much brighter, but just because it has the potential to do so doesn't mean you have to use that feature.
"But more Pixel = more energy?".
That's not exactly true. Look at it like this: you have one light bulb rated at 100 watts and three rated at 25 watts each. Which setup draws more power? The single bulb does, even though it's only one against three. That's just an example to show you that "more of something" doesn't automatically mean "more energy". As I said before, a higher resolution screen CAN be brighter and therefore use more energy, but you can regulate it to be only as bright as a lower resolution screen.
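Here is that light bulb comparison as a tiny sketch (the wattages are just the made-up numbers from my example):

```python
# Tiny sketch of the light bulb example above (made-up numbers).
single_bulb_watts = 100      # one 100 W bulb
three_bulbs_watts = 3 * 25   # three 25 W bulbs

print(single_bulb_watts)     # 100 W
print(three_bulbs_watts)     # 75 W -> more bulbs, but less power
```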
The most important factors for a display's energy consumption are its size and its brightness.
Besides the panel technology, of course, but that's not part of this discussion.
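To make that concrete, here is a back-of-the-envelope sketch of how I would estimate panel power. The function, the 6.2-inch size, the 300 nits and the watts-per-area constant are all my own made-up assumptions, not real datasheet values; the point is only that resolution never shows up as an input:

```python
# Toy model (my own assumption, not a datasheet formula):
# backlight power scales with screen area and brightness, not with resolution.
def panel_power_estimate(diagonal_inches, brightness_nits, watts_per_in2_per_100nits=0.02):
    # 16:9 screen area in square inches: area = diagonal^2 * (16*9) / (16^2 + 9^2)
    area_in2 = (diagonal_inches ** 2) * (16 * 9) / 337
    return area_in2 * (brightness_nits / 100) * watts_per_in2_per_100nits

# Same size and same brightness give the same estimate,
# no matter whether the panel behind it is 720p or 1080p.
print(round(panel_power_estimate(6.2, 300), 2))  # ~1 W in this toy model
```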
PROOF THAT A HIGHER RESOLUTION SCREEN DOES NOT HAVE A HIGHER ENERGY CONSUMPTION
If you look at TVs for example:
The Samsung UE55J6350 is a 1080p TV and it consumes 83 watts. The Samsung UE55JU6450 is essentially the same TV but with a 4k panel, and it consumes 85 watts.
You can see there is almost no difference.
Sources (in German; "power consumption" is "Energieverbrauch"):
http://www.zambullo.de/fernseher/datenblatt/UE55J6350#Energieverbrauch
http://www.zambullo.de/fernseher/datenblatt/UE55JU6450#Energieverbrauch
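To put a number on it, here is the difference calculated from the figures in those two spec sheets:

```python
# Power consumption figures from the two Samsung spec sheets linked above
watts_1080p = 83   # UE55J6350 (1080p)
watts_4k    = 85   # UE55JU6450 (4k)

extra_watts   = watts_4k - watts_1080p
extra_percent = extra_watts / watts_1080p * 100
print(extra_watts)                 # 2 W
print(round(extra_percent, 1))     # ~2.4% more power for 4x the pixels
```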
"But the GPU needs more work to output a higher resolution. That means the GPU has a higher energy consumption and the battery life will be shorten"
Thats only really true if you are talking about idle. The Switch is a gaming device and its build to run under full load.
Let me explain. If you want to talk about the performance of hardware, you talk about FLOPS, which basically means "calculations per second". The PS4 can do 1.8 trillion calculations per second. The Xbox One can do 1.3 trillion. Let's just assume the Switch can do 1 trillion in undocked mode.
The Switch's display is only relevant in undocked mode, so we don't have to consider a potential overclock in docked mode.
The goal of every developer is to hit ~100% GPU load. If you have finished your game and you are sitting at an average of 60% GPU load, you can use the unused power to make the game look prettier by increasing texture quality, field of view, anti-aliasing et cetera until you hit ~100%. This is called "optimization".
Now, if you render an image in 1080p, you need the full performance of your GPU, which is 1 trillion calculations per second. If you want to render the exact same image in 720p, you only need roughly 0.6 trillion calculations per second (the exact factor depends on the game, since not all GPU work scales with the pixel count), and now you are sitting at around 60% GPU load. So what now? You can increase the framerate to hit 100%, or you can do other stuff. But you don't want to sit at 60% GPU load, because you want to make your game as pretty as possible. Literally every game ever made tries to hit ~100% GPU load. And it really doesn't matter if you render an image in 1080p or 480p if you hit 100% load. The power consumption will be the same.
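For reference, here is the raw pixel math behind that example. Pure pixel scaling would actually give an even bigger gap than the 0.6 I assumed above, because not all rendering work shrinks with resolution:

```python
# Raw pixel math behind the 1080p vs 720p example above.
# Assumes the cost scales with pixel count, which is a simplification.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_720p  = 1280 * 720    #   921,600 pixels

# If rendering at 1080p uses the whole GPU budget, the same frame at 720p
# only needs this fraction of it:
print(round(pixels_720p / pixels_1080p, 2))  # ~0.44

# A developer spends the leftover budget on effects or framerate until the GPU
# is back at ~100% load, so the power draw ends up the same either way.
```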
I have tested it with my GPU as you can see here: http://imgur.com/a/oTPfe
The power consumption fluctuated between 97% and 100% whenever the GPU load hit 100%.
"yes maybe but the dev can just lower the graphics on a 720p screen to lower the GPU load"
I have literally never heard that a dev intentionally misses a huge part of the hardware performance just to keep the device cool and increase the battery life. And you know, you can do the same with a 1080p screen.
The current gen consoles PS4 and Xbox Ones often render games at a lower resolution than 1080p even tho they are used with 1080p screens.
"Smartphone manufacturers like HTC said they wont use 1440p screens in their devices because its will lower the battery life"
Yes, that's true but a Smartphone runs in idle most of the time. It's a difference if you render an image in "idle" between 720 and 1440p because the GPU has room to scale. E.g. a 720p Smartphone runs at 7% GPU load in idle. A 1440p Smartphone runs at 15% load in idle.