Jon Peddie Back Pages - It's all about the pixels

Made any 4K videos lately?

Posted by Jon Peddie on October 8th 2014 | Discuss
Tags: 4k

Do you remember when YouTube first hit the scene? Do you remember people saying, "Who the hell wants that? Why would I want to look at some dopey home video? How will they ever make money?" and all the other usual tripe that narrow-minded people spout when confronted with something new and different? Today, of course, they're wishing they had bought shares in the company then. Since then it has gone from an $11 million startup to a company valued at $40 billion, and in the process it has made a lot of smart moves and bets. One of them is its 4K channel…

Are graphics worth it, do they matter?

Posted by Jon Peddie on September 23rd 2014 | Discuss
Tags: nvidia gpu amd apple samsung movie cg

How would you measure it? In my travels I've been in various discussions of late about the value of graphics—are they important? The short answer: it depends a lot on the content. The example I use in my university lectures is the beautiful Aki in Final Fantasy: The Spirits Within (2001). The graphics rendering in the first full-length CG film broke new ground in realism, and the characters looked fantastic—even more so given it was 13 years ago. But the graphics couldn't save the movie because it had no story, or at least no story anyone cared about. The movie industry…

CG and CV are black holes for processing power

Posted by Jon Peddie on September 9th 2014 | Discuss
Tags: peddie flops mip black holes cg dreamworks

Too much, good enough—nonsense

Some of you reading this may know I've postulated a few axioms over the years, all of them about scale in one way or another. One of my favorites is my first: "In computer graphics, too much is not enough" (1981). It was true then, and it's true now. It's also why I get so tired of the question, "But haven't integrated graphics caught up?" No. There is no catchup. You can't catch up. You'll never catch up. A friend of mine more famous than I, Jim Blinn, also has an axiom, Blinn's Law: As technology advances,…

The PC isn’t dead—I told you so

Posted by Jon Peddie on August 27th 2014 | Discuss
Tags:

What's next?

In terms of economic recovery and overall growth trends, the PC market is a lot healthier than many others—automobiles, for example. However, although the PC market recovered faster than autos, it got gob-smacked by the impact of tablets. Seeing the downturn in sales, the sharpshooters on Wall Street drew a straight line and cleverly predicted the PC would be dead and gone by 2020. Those were the same bright folks who created the derivatives that sent the world economy into a nosedive; definitely the folks we want to be listening to. Fortunately, the world beyond Wall Street wasn't paying attention…

We’ve crossed the line, and there’s no turning back

Posted by Jon Peddie on August 12th 2014 | Discuss
Tags: peddie augmented reality computer graphics ar cg

What's in a name?

The Holy Grail in computer graphics is the suspension of disbelief—to tell such a convincing story with pixels that the viewer not only totally believes it but thinks he or she is in it, a participant, voluntary or not. We've had such experiences in the cinema for a long time. A story is told, and we become so engrossed in it that when there is a shocking moment, like the Alien popping out of an unexpected place or a FedEx airplane falling apart, we duck, scream, or worse. The images stay with us for decades like the…

Work anywhere any time

Posted by Jon Peddie on July 30th 2014 | Discuss
Tags: nvidia amd intel hp dell ibm professional graphics engineers

Share your GPU, use your colleagues'

We truly have entered the era of ubiquitous computing. It started in the 1960s with time-share computers, expanded in the late '70s with the commercialization of the ARPANET into the Internet, and developed further in the mid-2000s as the concept of the cloud became universal. The final step was virtualization, and specifically virtualization of the GPU. GPU virtualization has had many fathers. The first commercially available example was 3Dlabs' virtualization of code space in the GPU in 2004, in the Wildcat VP. Also in 2004, Imagination Technologies enabled a single core to…

Learning how to count

Posted by Jon Peddie on July 16th 2014 | Discuss
Tags: apple arm q1 pcs

It would be a lot easier if you weren't such a skeptic

In mid-June, Intel said that, due to stronger-than-expected demand for business PCs, it expects second-quarter revenue to be $13.7 billion, plus or minus $300 million, compared to the previous range of $13.0 billion, plus or minus $500 million. Intel now expects some revenue growth for the year, compared to the previous outlook of approximately flat, driven mostly by strong demand for business PCs. The company will provide additional commentary on all business segments when it reports second-quarter earnings on July 15. That, no doubt, helped Gartner…

Trending up

Posted by Kathleen Maher on July 15th 2014 | Discuss
Tags: peddie maher jpr techwatch gdc q2

I'm so glad we started doing these quarterly issues, because they give us the ability to look at trends over a three-month period—and they also help explain why our travel budgets are so high. What becomes most clear in this issue is that companies are looking for new opportunities and staking out territory. The flip side is that companies are acting like they're ashamed of their traditional businesses—PCs, notebooks, even plain old smartphones. At Nvidia's GTC conference, when an analyst dared to ask Jen-Hsun Huang how he planned to manage the difficult dynamics of the chip business, he…

The virtual Internet of thingies

Posted by Jon Peddie on July 1st 2014 | Discuss
Tags: ibm linux salesforce softlayer virtual internet dropbox

When everything is connected, and all of the data those thingies collect is somehow stored, all the fabs in the world won't be able to make enough memory to hold all of the bits the sensors sense. Fortunately, there are a bunch of really smart people who can see this inevitability and are worried about it. Better than that, they've got ideas about how to manage the hydra that is growing across the cloud, computers, devices, and thingies. That's both the good news and the bad. It's good because there are some really exciting and clever solutions being suggested, studied, and…

Evolving graphics APIs

Posted by Jon Peddie on June 17th 2014 | Discuss
Tags: nvidia amd directx opengl api mantle

Waiting for the dust to settle

All hell is going to break loose, and the PC industry is going to go up in flames, if another API war is launched. AMD, Apple, Khronos, Microsoft, EA, Crytek, and others are threatening to unravel all the work done over the past 15 years to stabilize the industry. Or are they? Those of you who were around in the mid-to-late 1990s will recall we had several APIs to use. Glide was one of the most popular for gaming, Microsoft was improving DirectDraw, OpenGL was available and used in some games, and most…