Test Shows Arc GPU vs M1 Max Video Decoding Performance, Intel Superior To Apple

OSTN Staff


Since the release of the Apple M1 Max laptops, performance figures comparing the chip with NVIDIA, AMD, and Intel hardware have shown it is not as ideal as the company claims. Now, Twitter user David Huang has compared the video codec performance of the Apple M1 Max SoC with Intel's Arc A380 GPU, showing that Apple does not dominate the competition in every category.

Intel Arc GPU outperforms Apple M1 Max SOC in video decoder performance tests

Readers should note that Apple shines in CPU performance, where it holds a lead, though only by a slim margin. It does, however, take a hit in graphical performance; for example, NVIDIA's newer RTX 30 series can render in close to a third less time than the Apple M1 Max. The two significant parts of the Twitter thread between David Huang and user @Adol_Christin are below.

The translation provided by Twitter shows the performance test results from Huang’s benchmark between the Intel Arc A380 GPU and the Apple M1 Max SOC.

The performance of the M1 Max video decoding unit seems to be inferior to the DG2 (1299 yuan)… In extreme cases (ultra-high bit rate), the decoding performance is only about half, while the gap at low bit rates is relatively small.

In addition, the M1 Max has two media units like the DG2 (Intel Arc A380), so decoding two or more video streams in parallel can double the frame throughput, which is convenient for video editing and playback.

Other formats such as HEVC 422/444 or ProRes are excluded.

Apple still has a big advantage, the biggest one is…they’re really selling the M1 Pro/Max

Joking aside, Intel DG2 still has very big problems with drivers and ISV adaptation. Some content creators complain that DaVinci and other software produce glitched frames and other artifacts when exporting videos. Apple's software support is much better by comparison.

And although Intel's playback performance may be slightly stronger, the lead is still relatively limited; its advantage over Apple is not as large as its advantage over AMD's VCN.

Video decoder performance of several systems on the Jellyfish and bilibili video tests. Source: David Huang (@hjc4869 on Twitter)

In the Jellyfish 4K HEVC (30fps, 400Mbps) test, the Apple M1 Max decodes at 61 frames per second where the Intel Arc A380 GPU sits at 118 frames per second. The bilibili 4K HEVC HDR (120fps, 14Mbps) video stream offers a similar view, with Apple reaching 417 frames per second to Intel's 542.

However, Huang points out that the M1 Max does offer better software support. Still, the argument could also be made that Apple operates within its own tightly controlled ecosystem, whereas Intel, AMD, and NVIDIA hardware must support many configurations outside of Apple's environment.

The post Test Shows Arc GPU vs M1 Max Video Decoding Performance, Intel Superior To Apple by Jason R. Wilson appeared first on Wccftech.
