I am definitely seeing more

Posted: Jon Peddie 11.08.17
Am I doing more?

Anyone who knows me has most likely heard me say: the more you can see, the more you can do. That rule applies to monitors and to almost any vehicle you can think of, where “doing” in a vehicle translates into staying out of danger. In monitors it translates into productivity.

Most people have seen, or at least have heard of, the landmark study A.J. Thadhani did at IBM in 1981 [1] about how a computer’s response time affects productivity, where he showed that productivity dropped over 50% if the user’s interactivity was interrupted by just one second. The point the study made (which has been reproduced and verified dozens of times since) was that when you have to interrupt your thinking process to tend to a computer’s needs, or wait for it to respond to you, you have to go back to the beginning of your thought process and recapture the train of thought you were pursuing. Therefore, if those interruptions could be avoided, your productivity would increase.

When you are dealing with a single monitor, and you have to flip windows and scroll up and down to get or register data, you are being interrupted. If, however, all those windows were open in a big matrix/montage that you could easily and quickly mouse over, you wouldn’t lose your concentration, and your productivity would go up; it’s as simple as that.

However, to have a large montage of apps all open at the same time, you need two things besides just a couple of monitors: a large screen and a lot of resolution. Part of seeing more is being able to actually see more, and that means super high resolution. And be cautious. You may have heard about points or pixels per inch (or mm)—PPI is the common term. Higher PPI only matters when comparing screens of the same size. For example, you can get amazingly high PPI on a 6-inch smartphone, but you will do far more (and better) work on a 30-inch monitor with half the PPI. A corollary to PPI is megapixels (MP). Here again, you can have a lot of MP on a small screen and be frustrated if trying to do serious interactive work.
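The PPI-versus-size point is easy to make concrete with a little arithmetic. A quick Python sketch (the specific panel resolutions below are illustrative, not tied to any particular product):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    diagonal_px = math.sqrt(width_px**2 + height_px**2)
    return diagonal_px / diagonal_in

# A 6-inch 2560 x 1440 smartphone vs. a 30-inch 2560 x 1600 monitor:
phone_ppi = ppi(2560, 1440, 6.0)      # roughly 490 PPI
monitor_ppi = ppi(2560, 1600, 30.0)   # roughly 100 PPI

# The phone wins on PPI by almost 5x, yet the monitor has nearly the
# same pixel count spread over far more usable area. PPI alone says
# nothing about how much you can see at once.
```

The takeaway: for productivity you want total addressable pixels on a desk-sized surface, not density on a pocket-sized one.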

Adding multiple monitors to your system is easy. All modern graphics add-in boards (AIBs), and even most integrated graphics GPUs, offer multiple display outputs, typically two or three DisplayPort sockets and an HDMI. All one has to do is plug in the cables, open the Windows Display Settings applet, and click Extend on the new monitors. The monitors are relatively inexpensive, ranging from about $300 to $500 each. A monitor will easily last five years, making the cost 50 cents a day or less. The bigger problem is space; not everyone’s desk has room for three 17- to 32-inch monitors, though you can stack them if that’s a problem.

All those ‘K’s’

I have been using three 31.5-inch 4K screens for a while. That has provided me with 25 MP and an average 140 PPI across 85 inches of screen, but since I have them angled the actual span is just 72 inches.

Dell brought out a new 31.5-inch, 8K screen—7680 × 4320, or 33.2 MP; by comparison, three 4K monitors give you 24.8 MP. My plan was to replace the center 4K monitor with the 8K Dell UP3218K. If I did that, I’d have 49.8 MP, which for easy conversation one could call 50 MP, and in fact I planned on saying that every chance I got.
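The megapixel arithmetic behind that plan is straightforward to verify; a quick sketch:

```python
def megapixels(w, h):
    """Pixel count in millions."""
    return w * h / 1e6

mp_4k = megapixels(3840, 2160)    # ~8.29 MP per 4K monitor
mp_8k = megapixels(7680, 4320)    # ~33.18 MP for the 8K

three_4k = 3 * mp_4k              # ~24.9 MP: the existing setup
planned = 2 * mp_4k + mp_8k       # ~49.8 MP: 4K + 8K + 4K
```

Rounding 49.8 up to “50 MP” is about a 0.5% exaggeration, which seems a fair price for a better anecdote.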

I first tried the 8K UP3218K on one of our gaming machines, with a secondary 27-inch, 2560 × 1440 monitor, all driven by an Nvidia GTX 1080 Ti AIB. Nvidia AIBs don’t like mixed-resolution monitors, but in spite of that prejudice, it ran fine. Well, sort of fine; it had a little trouble with the Windows taskbar, but it drove the 8K Dell at full resolution and the secondary monitor at its max resolution too.

What did I see?

Fallout 4’s visuals were greatly improved, because the game features high-resolution texture maps and various lighting tricks. It looked good on a wide-screen 3440 × 1440, it looked good on the 4K, and it looked great on the 8K, in 10-bit color. I can’t say I’d recommend a $3.9K monitor for the average gamer. But for the lunatic fringe like me, and a few even more extreme friends, hell yeah. However, there was a bit of The Emperor’s New Clothes involved. The models scaled perfectly. The maps were pretty good, but some didn’t look any different than on any other monitor. So after a few weeks of that, I decided it was time to move it to the work machine and test it for the purpose it was built for: productivity.

The work machine has a Vega 64 in it, and as mentioned, it was running three 4Ks just fine. AMD can drive up to six mixed-resolution (and mixed-orientation) monitors; it’s one of their unique and strong features. They can even blend them into one large surface, which they call Eyefinity. Nvidia can also span and merge up to three monitors, provided they are all the same; that can be a handicap, because gamers accumulate a hodge-podge of parts over time.

How many colors?

Dell introduced the world’s first 31.5-inch 8K 10-bit monitor. The 1300:1 contrast ratio display comes with Dell’s Premier Color, which Dell says yields 1.07 billion colors—that is 2³⁰, and you get that number because the monitor is a 10-bit-per-primary panel. It’s slightly amazing until you consider that the human eye can detect a luminance range of 10¹⁴ to one, or one hundred trillion (100,000,000,000,000)—about 46.5 f-stops in camera terms.

Again, pause for a moment and think about this. A 24-bit 4K monitor requires you to drive 199 million bits every 33 ms (30 Hz) or 16 ms (60 Hz). An 8K, 30-bit monitor requires you to drive 995 million bits per frame—5× as many. The numbers are staggering, really hard to grasp. What that translates to is 100% Adobe RGB, 100% sRGB, 100% Rec. 709, 98% DCI-P3, and 400 nits with a 1300:1 contrast ratio; not quite the HDR spec, but damn close. This is the monitor that all studios, all high-end rendering shops, and super CAD and GIS users are going to want, and the price won’t faze those folks, because this monitor equals productivity, which is where we started. But, . . .
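Those per-frame figures can be verified directly. A sketch of the raw numbers (the aggregate Gbit/s line is my own extrapolation at 60 Hz, not a figure from Dell, and ignores blanking and any link-layer overhead):

```python
def bits_per_frame(w, h, bits_per_pixel):
    """Raw bits needed to describe one uncompressed frame."""
    return w * h * bits_per_pixel

frame_4k = bits_per_frame(3840, 2160, 24)   # ~199 million bits, 24-bit 4K
frame_8k = bits_per_frame(7680, 4320, 30)   # ~995 million bits, 30-bit 8K

ratio = frame_8k / frame_4k                 # exactly 5x

# At 60 Hz (a new frame every ~16 ms), the raw 8K pixel stream is
# roughly 60 Gbit/s before any encoding or blanking overhead:
gbps_8k_60 = frame_8k * 60 / 1e9
```

Four times the pixels times 30/24 the bits per pixel gives the clean 5× factor.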

As Robert Burns wrote in his 1785 poem To a Mouse (in the familiar English paraphrase), "The best-laid plans of mice and men often go awry." When we hooked up the three monitors (4K, 8K, and 4K) to the Vega-based machine, we couldn’t hit the high notes. The max resolution we could drive the Dell UP3218K at was 5120 × 2880 (commonly called 5K), or 14.75 MP. And here’s where the numbers and those Ks make you a little crazy. An 8K monitor is 33.2 MP, and 5K is only 14.8? It’s less than half an 8K? Damn you pixels.
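The seeming paradox (two-thirds of the width, but well under half of the pixels) is just the square law at work, since the “K” figures are linear measures while megapixels are areas; a quick sketch:

```python
def megapixels(w, h):
    return w * h / 1e6

mp_8k = megapixels(7680, 4320)   # ~33.2 MP
mp_5k = megapixels(5120, 2880)   # ~14.7 MP, the capped 5120 x 2880 mode

# 5120/7680 = 2/3 of the linear resolution, but pixel count scales
# with the square: (2/3)^2 = 4/9 of the pixels.
area_ratio = mp_5k / mp_8k       # ~0.444
```

So halving-and-then-some of the megapixels from only a one-third linear cut is exactly what the arithmetic predicts, however unfair it feels.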

One other small point, pun intended, is the bezel. The picture doesn’t show it well, but the Dell UP3218K (in the center) has a thin 9.7 mm (~3/8-inch) bezel, compared to the 25.4 mm (1-inch) bezel of the others. So, ironically, the 8K is physically smaller than the 4Ks.

Suffer, suffer and pain

That’s what the super mutants in FO4 say when they beat the daylights out of you, which in my case they seem to do a lot.

Here I had hoped to up my productivity by moving from a lowly 25 MP to a lofty 50 MP, and instead got stopped at 31.3 MP—a 25% increase, and for $4K. OK, yes, I can see colors I never saw before, and yes, I can see one hell of a lot more pixels, but, well, damn it—I wanted to say 50 MP.

I asked AMD: if Eyefinity is supposed to be so great, and I have been a big and vocal fan of it for years, where is it when I need it? They told me I had exceeded the bandwidth limit of a single Vega AIB.

Plan B: I decided to try two AIBs, one to drive the 8K and one to drive the two 4Ks. That would have been a good plan, but when AMD introduced the Vega, they said it would run CrossFire—maybe someday.

Well, as luck would have it, while I was whining to my pals at AMD about not being able to get to 50 MP, the company announced a new driver, 17.10.1. And although not officially stated, it enabled CrossFire.

Cables. To run an 8K monitor, you have to use two DisplayPort connections. The Vega has three, plus an HDMI. So I plugged in an extra Vega, connected two DP cables to the 8K, one to the left 4K, and an HDMI to the right 4K, and voilà—I had 50 megapixels shining at me. Fifty of them, and most of them (66.7%) were 30-bit pixels. It feels a little ironic; I remember when I had HD monitors as my sideboard ancillary monitors and a big 30-inch Dell 2560 × 1600 center monitor. Pah—a mere 6.2 MP. Now I had eight times that.
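The final tally, and the two-thirds share of 30-bit pixels, can be double-checked with the same megapixel arithmetic:

```python
mp_4k = 3840 * 2160 / 1e6        # ~8.29 MP per 4K side monitor
mp_8k = 7680 * 4320 / 1e6        # ~33.18 MP for the 8K center

total = 2 * mp_4k + mp_8k        # ~49.8 MP, rounded up to "50 MP"

# Only the 8K panel is 10-bit-per-primary (30-bit) here, so the
# share of 30-bit pixels is the 8K's fraction of the total:
share_30bit = mp_8k / total      # ~0.667, i.e., 66.7%
```

That 66.7% is exact in the limit, since 33.2 MP is precisely twice the combined 16.6 MP of the two 4K flankers.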


There are two problems testing a state-of-the-art monitor like the Dell UP3218K: content and content. The first problem is finding 8K images to evaluate. The second is finding 10-bit 8K images. We’re still working on that. The images are out there, but they’re locked up with copyrights and other anti-theft chains and padlocks. And a third problem is how you show any of it to an audience that will be viewing on HD 8-bit monitors. This is a thankless and frustrating job sometimes.

A 4K monitor is absolutely stunning, and yet it still isn’t nearly enough for perfect text at large display sizes. An 8K monitor is advantageous to pro/enterprise users, and is probably the final resolution for monitors.

If you look at the action games people are playing at 1920 × 1080 and 2560 × 1600, they’re still often resolution-limited. This is definitely the case with ranged battles in Fortnite and PlayerUnknown’s Battlegrounds (PUBG). However, they will definitely move to 4K. This doesn’t mean we’ll incur 4× the cost of 1920 × 1080. Rather, the game engines are already rendering some scene elements at 2× or 4× resolution, and some below 1× resolution, and the engines will just shift those tradeoffs around as displays and GPUs improve.

It’s unclear whether games will hit 8K in the mainstream consumer market, given the questionable need at typical TV and monitor sizes and viewing distances. It might be that AR glasses take over before we hit that point. AR has an insatiable demand for pixels; as Abrash’s analysis noted, you won’t hit physiological limits until around 24K per eye with a 160-degree FOV.

Tim Sweeney told me, “I feel like the cumulative effect of hardware gains has hit a point that's more than incrementally beyond what current apps & games are targeting. These M.2 SSDs are insanely fast; 18-core CPUs are available to consumers; 8K is widely available; Xpoint is coming; and cloud compute economics are insanely better than a few years ago. To me, it feels like the point around 1996 where the Pentium, Pentium Pro, and Voodoo1 came out and began to radically change the PC platform, bringing what was once wildly expensive enterprise tech to the masses. Give it a few years to percolate throughout industry economics, and I think PC is in for a major resurgence.”

What’s next? Well, obviously three 8K monitors. That will get me to 100 MP. And I’ll still be chasing more—in CG too much is never enough, and the more you can see, the more you can do.


[1] Thadhani, A. J. 1981. Interactive user productivity. IBM Systems Journal 20(4), 407–423.