Ryan Shrout
Sep 7 · 5 min read

We have seen a lot of interest and feedback from our IFA Real World Performance event. Staying true to open and honest communication with our audience, I felt that a short reply to some of the continued discussion stemming from the event would be a great way to showcase that initiative.

The Cinebench Question

Much continues to be made about our stance on Cinebench, a benchmark from Maxon, makers of the popular Cinema 4D rendering tool. I wanted to make it very clear how my position was framed during the IFA event (and all previous events where this discussion was had).

I continue to believe that using Cinebench or any other rendering application to measure performance on platforms where ray-traced rendering applications are not typically run is a disservice to the reader and consumer. I consider this especially true when the weighting and value of those results are exaggerated. To Intel, using Cinebench to measure the performance of U-series, thin-and-light notebooks will only tell you how those systems handle ray-traced rendering; it doesn’t match the applications that consumers actually use on those platforms.

The statistics I gave during the IFA presentation back this up. While 82% of the top mobile tech reviewers that we worked with at our last workshop used Cinebench for competitive comparisons, 0.22% of the 10.8 million notebook and 2-in-1 systems we have data from have run Cinema 4D, the application workload that Cinebench measures.

I did not, and have not, said that I think Cinebench has no value for benchmarking. In fact, I backed its use for testing X-series and W-series products that fit in the prosumer and workstation segments where its workload is more common. Here is the transcript of what I said during the relevant portion of the IFA event:

Again, reiterating the point, that when you’re selecting your benchmarks, selecting the ways to evaluate performance, make them relevant to the segment you are testing against. If you want to look at Cinebench on X-series and workstation series, got it, totally makes sense. On an ultra-thin notebook? Not a good representation of the workloads that those guys are doing.

Going forward, I think it is fair to stop using the phrase “real world” on the slides when having this discussion about Cinebench. The ray-tracing workload it measures is indeed used in a real application and is not a synthetic test. The intent behind “real world” was to mean ‘the applications that users in this segment actually run,’ but I can see that the title alone could cause confusion.

Maxon is a great partner of Intel’s and was on-stage with us at the recent Siggraph CREATE event, talking about the continued partnership and technology engagements we share. My intent is not to discredit the Cinebench benchmark, but rather to encourage accurate usage of it for evaluation of processors.

Slides and topic transitions

I have also seen some chatter about other data points and messages from our IFA event. Something that I simply didn’t think about before the slides for the event were posted publicly was how much of the discussion, context, and transitions between topics were handled through voice-over. Press who weren’t in attendance (and enthusiasts who looked at the slides separately) didn’t have that advantage, and as such, some of the ideas could be misconstrued or taken out of context.

For example, back-to-back slides go from our performance-per-dollar preview of the upcoming Cascade Lake-X processors immediately to a slide showing “82%” with an image of Cinebench. Without context, I can easily see how you might assume we are talking about the same product families and topics, but in reality, this was where I moved from our short desktop introduction to talking about mobile.


Specifically, here is the transcript of the event:

[CLX-X slide on screen]

“But we’re excited about what this means for the high-end desktop brand. We’re committed to it. We are committed to having leadership products. And this is an example of us making the moves necessary to make that happen.”

[Move to “82%” slide]

“Alright, so that was enough of a divergence on the desktop so let’s go into mobile for a little bit. Some of you might have already seen this type of data before. This 82% represents the percentage of notebook reviewers that came to our Ice Lake SDS performance preview out in Santa Clara and decided to run Cinebench.”

Clearly you can see that in context, and with the voice over of me presenting, the move away from desktop discussion to notebook discussion is very clear. Without the voice over…not so much. Consider this one of my personal learning experiences for making slides and presenting!

Windows on Arm comparison

Another area that lacked context with only the slides was my section comparing a “day in the life” of using a Windows on Arm notebook and an Intel-based notebook using the Core i7-8500Y. The goal of this demo was to show what I consider basic applications, from a VPN to Photoshop Elements to Dropbox to a game, that were incompatible with the Windows on Arm platform.

With the slides alone you only see images of applications not working, not loading, unable to install, and so on. At the event itself you were able to see those same applications launching and running on the Core i7-based laptop as you would expect from a “Windows without compromise” experience. (You also missed me wearing a set of R2-D2 Mickey Mouse ears, but that’s a story for another time…)

Here is another context-setting statement from the event:

We also have a broad set of applications…the reality is, there are always a couple of usages that I may use, that you may use, and you’re going to be in a position where you have to change the way you work or change the way you play if you want to have that experience. And if you’re willing to make that compromise, that’s your decision.

Usage of press information

Finally, I wanted to apologize to Roman Hartung (Der8auer) as well as the authors quoted from Guru3D and PC Perspective for not personally giving them a heads up on our plan to use their data in our presentation.

I hope this helps settle some of the questions and concerns floating around the hardware community about our presentation deck and the messaging within it. It’s clear that the context of these presentations is critical to properly explaining our point of view, so I will definitely be paying attention to that going forward. I welcome any other questions or feedback and I’ll try to respond to as many as I can. Happy Friday!

Performance at Intel

Intel’s blog to share timely and candid information specific to the performance of Intel technologies and products.

Written by

Ryan Shrout

Chief Performance Strategist @ Intel
