Last week, we published an article documenting some of the experiences we’ve had with the Galaxy Note 7, sourced from four different units belonging to different XDA members. Shortly after our article went live, The Verge published a somewhat dismissive response.
We were surprised by the spread of our article. In retrospect, the fact that we used it as a means to specifically call out the lack of documentation or exposure of these performance issues surely played a significant role in its reach. The article went on to be shared hundreds of times and to receive thousands of comments, as well as to provoke reactions from both Samsung supporters and detractors. We wrote that article not to throw Samsung or any media outlet under the bus, but to cover an issue that we felt had not gathered enough attention from the media at large. This is a pattern that many Note enthusiasts should recognize by now: year after year, many reviews tout the Note’s hardware and performance while reality slaps credulous new customers in the face.
After pointing out the performance issues of the Note 7, we noticed an unreasonable amount of contempt directed at the device, but also plenty of apologia and damage control in comment sections. We also saw some publications issue editorials defending the device’s performance and overall value with arguments that we believe missed the point of our article. One such example is an article written by The Verge’s Vlad Savov, titled “Stop fretting about the Note 7’s performance,” in which the author argues that the Note 7 offers comparable (if not mostly equal) performance to other Snapdragon 820 devices. The article also makes an argument in favor of the Note 7’s overall value, which we do not disagree with.
Before tackling some of these arguments, a few things should be made clear. We wrote this article with XDA’s demographics in mind. Vlad called us “the performance obsessives at XDA”, a label that we welcome and cannot refute — we take performance very seriously, more seriously than most mainstream publications. Anyone following our change of direction over the past year likely noticed that we do in-depth performance-over-time analysis for new devices, and that our reviews’ performance sections alone are longer than some other sites’ full reviews. We do this because we are performance obsessives, and because our readers are savvier than average, often willing to risk bricking their devices to squeeze more performance – or a few more years – out of their smartphones.
We focus on analyzing performance through various means, including but not limited to benchmarks; we also use Qualcomm’s Trepn, Discomark, Gamebench, and GPU Profiling among various other tools to measure real-world performance, of real applications, and to estimate or quantify the impact of background processes, the efficiency of OEM software, and the prowess of the silicon itself. We focus on the real-world aspect of performance as much as we focus on the theoretical side, because we know benchmarks don’t hold all the answers. With the Note 7, we used tools to confirm the data from our senses, not to guide our perception. And even when we do focus on benchmarks, we feature those that emulate real-world workloads like PCMark, and relegate the heavy and discrete simulations to measuring performance-over-time in order to quantify throttling in worst-case scenarios.
The Verge went on to describe “benchmarks and measurements” as being “only analogous” to real-world use — for most benchmarks, we can concede that. However, much of what we listed in that specific article was not “synthetic or simulated tests” — rather, it was measurements of application opening times and, most importantly to us, device fluidity. Vlad rightly points out that the difference between the HTC 10 and the Note 7 while opening Chrome, as we listed, is only one of about 200ms. He says that the difference is not noticeable, and it is absolutely plausible that it is not for him. But looking at it proportionally, the HTC 10’s launch time for that app in that particular sample is around 60% of the Note 7’s. You might not notice the difference in isolated tests side by side, or consciously while using the device, but over long periods of time – say, tens of times a day, hundreds of days a year – this difference not only adds up, but becomes ingrained into your expectations, and possibly your perception.
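For readers who want to reproduce this kind of launch-time comparison themselves, Android’s activity manager reports a `TotalTime` value (in milliseconds) when an app is cold-started with `adb shell am start -W`. The sketch below parses that output and computes the proportional gap described above; the sample output text and the 300ms/500ms figures are illustrative assumptions, not measurements from our test units.

```python
import re

# Hypothetical sample of `adb shell am start -W <package>/<activity>` output.
# Real launch times vary per device and per run.
SAMPLE_OUTPUT = """\
Starting: Intent { act=android.intent.action.MAIN cmp=com.android.chrome/com.google.android.apps.chrome.Main }
Status: ok
Activity: com.android.chrome/com.google.android.apps.chrome.Main
ThisTime: 312
TotalTime: 312
WaitTime: 340
Complete
"""

def parse_total_time(am_output: str) -> int:
    """Extract the TotalTime value (milliseconds) from `am start -W` output."""
    match = re.search(r"^TotalTime:\s*(\d+)", am_output, re.MULTILINE)
    if match is None:
        raise ValueError("No TotalTime line found in am output")
    return int(match.group(1))

def launch_time_ratio(faster_ms: int, slower_ms: int) -> float:
    """Ratio of the faster device's launch time to the slower one's."""
    return faster_ms / slower_ms

print(parse_total_time(SAMPLE_OUTPUT))   # 312
# Illustrative numbers: a ~300ms vs ~500ms launch is "only" a 200ms gap,
# yet the faster device takes just 60% as long.
print(launch_time_ratio(300, 500))       # 0.6
```

Averaging several runs (and rebooting between them) gives a more stable picture than any single launch, since caching and background activity can skew individual samples.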
This is why, for example, many Galaxy Note owners have marveled at the sheer speed or smoothness of Nexus devices. We can recall how mind-blowingly fast the Nexus 5 seemed in comparison to the Galaxy Note 3 back in 2013 — these two devices shared the same processor, while the Note 3 was the first device to pack 3GB of RAM. System optimization and the differences in “software heft” clearly played a role there, and they still do today, but while the Galaxy Note 3 felt fast, the Nexus 5 felt undeniably faster. This is analogous to today’s Note 7 situation, because as we noted (and exemplified) in the article, nearly every interaction is measurably slower on the Note 7, and at times the difference is extremely perceptible. Vlad noted that he did not notice “anything close to a substantial difference between the speed of the HTC 10, Note 7, or OnePlus 3” and again, that is a plausible claim for his subjective experience. But whether you notice it or not does not mean there isn’t a relatively large difference in performance, be it slower app launch speeds or more dropped frames.
This is precisely why we used tools like GPU Profiling to measure fluidity — at 60 frames per second, our perception is typically not sharp enough to notice an odd missed frame here or there. But whether we notice it or not, the frame was dropped and the device stuttered. Vlad stated “I can’t say that I’ve yet encountered an Android phone that is [perfect across all performance metrics]”, which is a non-argument that does not invalidate any claims we’ve made. The relative inefficiency of the Note 7 means a higher delta when paired up against the best performers than any other flagship we’ve tested this year. When we used tools like GPU Profiling, we noticed that the percentage of dropped frames under the same workload is significantly higher on the Note 7 than it is on other devices — the Note 7 sometimes even manages to outright lock up for significant fractions of a second, something that other devices seldom suffer. It’s precisely the frequency and volatility of the Note 7’s performance issues that annoyed us the most, given that the device randomly goes on stutter sprees a few seconds long during regular usage.
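This kind of dropped-frame measurement is not exotic: Android itself tallies “janky” frames (frames that missed their render deadline) per app, and `adb shell dumpsys gfxinfo <package>` exposes the totals. The sketch below computes the janky-frame percentage from that report; the excerpt and its numbers are hypothetical, chosen only to illustrate the calculation.

```python
import re

# Hypothetical excerpt of `adb shell dumpsys gfxinfo <package>` output.
# The figures are illustrative, not measurements from our test units.
SAMPLE_GFXINFO = """\
Stats since: 1259084902ns
Total frames rendered: 11354
Janky frames: 1420 (12.51%)
90th percentile: 19ms
95th percentile: 35ms
99th percentile: 119ms
"""

def janky_frame_percentage(gfxinfo: str) -> float:
    """Compute the share of frames that missed their render deadline."""
    total = int(re.search(r"Total frames rendered:\s*(\d+)", gfxinfo).group(1))
    janky = int(re.search(r"Janky frames:\s*(\d+)", gfxinfo).group(1))
    return 100.0 * janky / total

print(f"{janky_frame_percentage(SAMPLE_GFXINFO):.2f}% of frames were janky")
```

Comparing this percentage across devices running the same workload is exactly the kind of apples-to-apples fluidity comparison that a subjective side-by-side glance cannot provide.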
And this is, perhaps, the biggest point we should be making here. We did not base our claims on benchmarks, but on real-world usage. Truth be told, benchmarks show that the Note 7 is mostly equal to other Snapdragon 820 devices in peak performance. We wrote that article specifically about observable performance because the device felt slow. It was written about real-world usage, not benchmarks — it’s right there in the title. Moreover, we measured the real-world performance with tools that produce results that are easy to grasp, and gave plenty of visual examples to demonstrate the issues we’ve seen across our Snapdragon 820 devices. Said tools merely expand our senses; whether they conform to our or others’ expectations is irrelevant. We noted the extent to which the sub-par performance spreads across the OS. And while we could have gone even deeper (and we will, in our full review), what we found after a few days of regular usage was, in our opinion, enough to demonstrate that the Note 7 is outpaced by competitors with similar (or sometimes even inferior) hardware, despite its premium price and top specifications.
It is paramount to us to speak about these issues because of our demographics, as we mentioned above. Not just because our userbase is somewhat savvier, but also because it is more diverse than other sites’ in that we have users from all over the world, with all kinds of budgets and, consequently, all kinds of devices. For example, the largest plurality of devices browsing our site is the Nexus 6P, yet it accounts for only ~3% of our total readers. We’ve seen low-end and mid-range devices, as well as affordable flagships sold in emerging markets, reach relatively large percentages too. XDA users are also known for squeezing the most out of their handsets, in some cases enough to last them years, and a big aspect of that is performance (we see this in the abnormally large percentage of OnePlus One users that browse our site). We look for good canvases, often in hardware potential, to sate our thirst for performance and battery life. And we care about the little details, too, especially regarding these key aspects of our devices.
Finally, it must be stated that we generally think well of The Verge and that we understand our sites appeal to different users. But whether their editors notice a difference or not, such a difference exists. We cannot wrap our heads around the fact that the Nexus 6P, HTC 10, OnePlus 3 and Galaxy Note 7 were all given a 9 out of 10 in the performance breakdown of The Verge’s reviews — clearly, and as Vlad himself noted, one of these is not quite like the others. Even if the speed and fluidity delta were a conservatively estimated 20%, giving all devices the same performance score is misleading, particularly when the Galaxy Note 7 is but the latest heir in a legacy of sub-par performance. If all devices receive the same performance score yet some are clearly superior or inferior, there is really no point to scoring performance on a number line at all.
The Note 7 is a great device nevertheless — we first listed all the aspects that can justify the price difference between the Note 7 and other devices, and after our performance article, we noted just how remarkable a phone it is for daily life. We agree with The Verge’s conclusion in this sense — performance is “not a huge problem for a phone that has the Note 7’s design, camera, display, battery and waterproofing”. We wholeheartedly believe this too, and we agree that phones are ultimately more than the sum of their specs. But none of this changes the fact that its performance is sub-par; nobody’s opinion changes objective reality. We’ve heard such ridiculous rebuttals elsewhere, with arguments such as “the phone sells a lot, therefore this isn’t the case” and “well, I don’t notice it, therefore you are lying or your device is faulty”. And to these arguments, we say this: your anecdote, opinion, or the purchasing habits of the masses do not shape objective reality; it doesn’t matter how many units Samsung sells, the device will still drop more frames per second on average than other devices with the same hardware at this particular point in time.
Whether you perceive it or not, whether your particular unit was blessed with a “higher-binned” chipset, and whether your particular usage pattern lends itself to sub-par performance scenarios or not — none of it changes the fact that Samsung could do a lot better with performance. Every year we wish they would, and every year so far we’ve received marginal upgrades relative to the huge strides other phones have made in the same period of time. None of this makes the phone unusable, and we still believe it can easily command the price it asks — demand already confirms that the device is a success regardless of UI lag. But when examining that aspect in particular, we are completely justified in expecting more out of a company that has dropped the ball on this for so many years in a row (whether “by design” or through carelessness), at a time when devices asking half the price are breaking new records.
One last thing: we noticed Vlad mentioned he was using a Note 7 in Europe. Given his location, we hope that he did not base his article on experience with an Exynos Note 7; if he did, his claims would hold no merit in relation to our piece, as our findings were explicitly limited to Snapdragon variants. The two experiences would be incommensurable and the comparison invalid.