A group of computer vision researchers from ETH Zurich want to do their bit to enhance AI development on smartphones. To wit: They’ve created a benchmark system for assessing the performance of several major neural network architectures used for common AI tasks.
They’re hoping it will be useful not only to other AI researchers but also to chipmakers (by giving them competitive insights); to Android developers (by letting them see how fast their AI models will run on different devices); and, well, to phone nerds, such as by showing whether or not a particular device contains the necessary drivers for AI accelerators (and, therefore, whether or not they should believe a company’s marketing messages).
The app, called AI Benchmark, is available for download on Google Play and can run on any device with Android 4.1 or higher — generating a score the researchers describe as a “final verdict” of the device’s AI performance.
AI tasks assessed by the benchmark include image classification, face recognition, image deblurring, image super-resolution, photo enhancement and image segmentation.
They are even testing some algorithms used in autonomous driving systems, though there’s not really any practical purpose for doing that at this point. Not yet anyway. (Looking down the road, the researchers say it’s not clear what hardware platform will be used for autonomous driving — and they suggest it’s “quite possible” mobile processors will, in future, become fast enough to be used for this task. So they’re at least prepped for that possibility.)
The app also includes visualizations of the algorithms’ output to help users assess the results and get a feel for the current state-of-the-art in various AI fields.
The researchers hope their score will become a universally accepted metric, similar to DxOMark, which is used for evaluating camera performance. All the algorithms included in the benchmark are open source, and the current ranking of different smartphones and mobile processors is available on the project’s webpage.
The benchmark system and app took around three months to develop, says AI researcher and developer Andrey Ignatov.
He explains that the displayed score reflects two main aspects: the SoC’s speed and the available RAM.
“Let’s consider two devices: one with a score of 6000 and one with a score of 200. If some AI algorithm runs on the first device for 5 seconds, then on the second device it will take about 30 times longer, i.e. almost 2.5 minutes. And if we are thinking about applications like face recognition, this is not just about the speed but about the applicability of the approach: Nobody will wait 10 seconds for their phone to recognize them.
“The same goes for memory: The larger the network or input image, the more RAM is needed to process it. If the phone has only a small amount of RAM, e.g. enough to enhance a 0.3MP photo, then this enhancement will be clearly useless, but if it can do the same job for Full HD images, this opens up much wider possibilities. So, basically, the higher the score, the more complex the algorithms that can be used, the larger the images that can be processed, and the less time it will take to do this.”
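To make that rule of thumb concrete, here is a minimal Kotlin sketch assuming, as Ignatov’s example implies, that runtime scales roughly inversely with the benchmark score; the function name and the numbers are illustrative, not part of the AI Benchmark app:

```kotlin
// Rough rule of thumb from Ignatov's example: runtime scales roughly
// inversely with the AI Benchmark score (an assumption, not an exact model).
fun estimateRuntimeSeconds(
    knownRuntimeSeconds: Double, // measured runtime on a reference device
    referenceScore: Double,      // AI Benchmark score of the reference device
    targetScore: Double          // AI Benchmark score of the target device
): Double = knownRuntimeSeconds * (referenceScore / targetScore)

fun main() {
    // Ignatov's example: 5 s on a device scoring 6000 becomes ~150 s
    // (almost 2.5 minutes) on a device scoring 200, i.e. roughly 30x slower.
    val slowDeviceRuntime = estimateRuntimeSeconds(5.0, 6000.0, 200.0)
    println("Estimated runtime on the slower device: $slowDeviceRuntime s")
}
```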
Discussing the idea for the benchmark, Ignatov says the lab is “tightly bound” to both research and industry — so “at some point we became curious about what are the limitations of running the recent AI algorithms on smartphones”.
“Since there was no information about this (currently, all AI algorithms are running remotely on the servers, not on your device, except for some built-in apps integrated in phone’s firmware), we decided to develop our own tool that will clearly show the performance and capabilities of each device,” he adds.
“We can say that we are quite satisfied with the obtained results — despite all current problems, the industry is clearly moving towards using AI on smartphones, and we also hope that our efforts will help to accelerate this movement and give some useful information for other members participating in this development.”
After building the benchmarking system and collating scores on a bunch of Android devices, Ignatov sums up the current situation of AI on smartphones as “both interesting and absurd”.
For example, the team found that devices running Qualcomm chips weren’t the clear winners they’d imagined based on the company’s promotional materials touting the Snapdragon 845’s AI capabilities and 8x performance acceleration.
“It turned out that this acceleration is available only for ‘quantized’ networks that currently cannot be deployed on the phones, thus for ‘normal’ networks you won’t get any acceleration at all,” he says. “The saddest thing is that actually they can theoretically provide acceleration for the latter networks too, but they just haven’t implemented the appropriate drivers yet, and the only possible way to get this acceleration now is to use Snapdragon’s proprietary SDK available for their own processors only. As a result — if you are developing an app that is using AI, you won’t get any acceleration on Snapdragon’s SoCs, unless you are developing it for their processors only.”
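For context: on stock Android, the usual route to this kind of acceleration is to have a framework such as TensorFlow Lite delegate work to the Neural Networks API (NNAPI), and whether that delegation actually speeds anything up depends on the vendor drivers Ignatov is describing. A minimal, illustrative Kotlin sketch of that request follows; the helper function names and the model file name are hypothetical, and the API shown is TensorFlow Lite’s, not part of the AI Benchmark app:

```kotlin
import android.content.Context
import org.tensorflow.lite.Interpreter
import java.io.FileInputStream
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel

// Memory-map a TF Lite model bundled in the app's assets.
// "model.tflite" is a hypothetical file name used only for illustration.
fun loadModel(context: Context, assetName: String): MappedByteBuffer {
    val fd = context.assets.openFd(assetName)
    FileInputStream(fd.fileDescriptor).use { stream ->
        return stream.channel.map(
            FileChannel.MapMode.READ_ONLY, fd.startOffset, fd.declaredLength
        )
    }
}

fun createInterpreter(context: Context): Interpreter {
    val options = Interpreter.Options()
        // Ask TF Lite to delegate execution to Android's NNAPI. Whether this
        // yields any speed-up depends on the SoC vendor's NNAPI drivers; with
        // no driver support, the model simply falls back to the CPU.
        .setUseNNAPI(true)
    return Interpreter(loadModel(context, "model.tflite"), options)
}
```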
Whereas the researchers found that Huawei’s Kirin 970, whose CPU is technically even slower than the Snapdragon 636’s, offered surprisingly strong performance.
“Their integrated NPU gives almost 10x acceleration for Neural Networks, and thus even the most powerful phone CPUs and GPUs can’t compete with it,” says Ignatov. “Additionally, Huawei P20/P20 Pro are the only smartphones on the market running Android 8.1 that are currently providing AI acceleration, all other phones will get this support only in Android 9 or later.”
It’s not all great news for Huawei phone owners, though, as Ignatov says the NPU doesn’t provide acceleration for ‘quantized’ networks (though he notes the company has promised to add this support by the end of this year); and it also uses its own RAM, which is “quite limited” in size, so you “can’t process large images with it”…
“We would say that if they solve these two issues — most likely nobody will be able to compete with them within the following year(s),” he suggests, though he also emphasizes that this assessment refers only to this particular SoC, noting that Huawei’s other processors don’t have the NPU module.
For Samsung processors, the researchers flag that all the company’s devices are still running Android 8.0, while AI acceleration is only available from Android 8.1 onwards. Natch.
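One concrete consequence: the Neural Networks API only shipped with Android 8.1 (API level 27), so an app that wants hardware acceleration typically has to gate it on the OS version. A minimal, illustrative check in Kotlin:

```kotlin
import android.os.Build

// NNAPI arrived in Android 8.1 (API level 27), so a device still on
// Android 8.0 cannot offer NNAPI-based acceleration regardless of its SoC.
fun nnapiAvailable(): Boolean =
    Build.VERSION.SDK_INT >= Build.VERSION_CODES.O_MR1
```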
They also found CPU performance could “vary quite significantly” (by up to 50% on the same Samsung device) because of throttling and power optimization logic, which in turn has a knock-on impact on AI performance.
For Mediatek, the researchers found the chipmaker is providing acceleration for both ‘quantized’ and ‘normal’ networks — which means it can reach the performance of “top CPUs”.
But, on the flip side, Ignatov calls out the company’s slogan — that it’s “Leading the Edge-AI Technology Revolution” — dubbing it “nothing more than their dream”, and adding: “Even the aforementioned Samsung’s latest Exynos CPU can slightly outperform it without using any acceleration at all, not to mention Huawei with its Kirin’s 970 NPU.”
“In summary: Snapdragon — can theoretically provide good results, but are lacking the drivers; Huawei — quite outstanding results now and most probably in the nearest future; Samsung — no acceleration support now (most likely this will change soon since they are now developing their own AI Chip), but powerful CPUs; Mediatek — good results for mid-range devices, but definitely no breakthrough.”
It’s also worth noting that some of the results were obtained on prototype samples, rather than shipped smartphones, so haven’t yet been included in the benchmark table on the team’s website.
“We will wait till the devices with final firmware will come to the market since some changes might still be introduced,” he adds.