Intel and Nvidia deny the rumor that they ganged up on AMD

Intel and Nvidia probably aren't working together to keep AMD out of gaming laptops

Screenshot: Nvidia

If you’ve visited the AMD subreddit, the Linus Tech Tips forums, or elsewhere in the last few months, you may have come across a conspiracy theory that Intel and Nvidia reached a secret agreement to keep next-generation GPUs out of AMD Ryzen 4000-series laptops. If you look at the list of AMD laptops released last year, you might believe it. The Asus ROG Zephyrus G14, Lenovo Legion 5, and others came with an AMD processor, but nothing bigger than an RTX 2060. Conspiracy theories are tempting, but this one seems to be nothing more than a product of the Intel/AMD/Nvidia fan wars. It doesn’t help that unfounded claims on blogs and news sites all over the world continue to push the same narrative. All it takes is a little digging to see that there is no juicy scandal here – just the complicated web of how processors and GPUs work together.


In April 2020, Frank Azor, AMD’s chief architect of gaming solutions and marketing, answered a Twitter user’s question about the lack of top-end GPUs in AMD laptops, saying, “You need to ask this question to your favorite OEMs and PC builders.” Around this time the conspiracy theory began to take shape, but Azor was right. Laptop configurations are decided by OEMs, not chip makers. And those configurations are usually driven by cost, but they also have to make sense. An underpowered processor paired with an overpowered GPU is not a good combination, and that’s the kind of trap a Ryzen 9 4900HS or lower could fall into.

Azor even sat down with The Full Nerd in May 2020 to address the issue again, specifically talking about OEMs’ confidence in Ryzen processors. “I think the Ryzen 4000 exceeded everyone’s expectations, but for the most part everyone gave us a head start. Because of this, it was hard to imagine a world where we were the fastest mobile processor,” Azor said. “I think when you were planning your notebook portfolio as an OEM and you hadn’t done it yet – and remember, all that planning for these notebooks was done last year – you leaned a little bit into AMD.”

In essence, the OEMs’ confidence that AMD had a fast mobile processor simply wasn’t there. So why pair a high-end mobile GPU with a processor they thought would be inferior? Hitting the midpoint, the “meat of the market” as Azor put it, meant laptops with an RTX 2060 or lower. Yet even with this reasonable explanation, the rumor persisted, with people combing through processor specifications for answers.

Gizmodo contacted Intel and Nvidia about these rumors, which both companies vehemently denied. An Nvidia spokesman told Gizmodo: “The statement is not true. OEMs decide on their system configurations, selecting GPUs and then the CPUs to pair with them. We support both Intel and AMD CPUs in our product stack.”

An Intel spokesman shared the same sentiment: “These allegations are false and there is no such agreement. Intel is committed to doing business with integrity and uncompromising professionalism.”

The firm denials from Nvidia and Intel certainly suggest this theory holds little to no water, though I don’t think you even need the denials to know the theory is bunk. The fact is that the Ryzen 4000 series was never a strong contender for the highest-end mobile gaming.


There are three aspects of AMD’s Ryzen 4000 series that probably factored into OEMs’ decisions not to pair it with a top-end graphics card: PCIe limitations, CPU cache, and, most obviously, single-core performance.

Games rely more on single-core performance than multi-core performance, and Intel usually has better single-core performance. That was true both historically and for Intel’s 10th-gen chips compared with AMD’s Ryzen 4000 series. Heck, the 10th-gen Core i9-10900K’s gaming benchmarks are even on par with the newer AMD Ryzen 9 5950X when both are paired with an RTX 3080.

In our previous laptop testing, AMD’s Ryzen 9 4900HS in Asus’ ROG Zephyrus G14 had weaker single-core performance than the Intel Core i7-10875H in MSI’s Creator 15. The Core i7-10875H is not at the top of Intel’s 10th-gen mobile line, but the Ryzen 9 4900HS is at the top of AMD’s. Yet with nearly the same GPU (RTX 2060 Max-Q in the G14, RTX 2060 in the Creator 15), the Intel system still averaged 1-3 fps higher (1080p, ultra settings). Pairing a more powerful GPU with the Ryzen 9 4900HS would most likely have bottlenecked some games because of that single-core performance.

That would lead to less-than-stellar performance compared to Intel’s offerings – especially when combined with the Ryzen 4000 series’ wimpy L3 cache. It has only 8MB of L3, half of what Intel offers, so it misses in cache more often and the average time needed to pull data from main memory is longer than on an Intel mobile processor.

The Ryzen 4000 series’ PCIe limitations could also have contributed to OEMs’ reluctance, though this idea is a bit shakier. A blog post from igor’sLAB explained that because Ryzen 4000 CPUs dedicate only eight PCIe 3.0 lanes to a discrete GPU, they could create a bottleneck if paired with something more powerful than an RTX 2060. Each PCIe device needs a certain number of lanes to run at full capacity, and both Nvidia and AMD GPUs want 16. Because 10th-gen Intel processors support 16 lanes, that made them better suited for the RTX 2070 and higher GPUs in last year’s lineup of gaming laptops.
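For a rough sense of what those lane counts mean in raw bandwidth, here is a minimal back-of-the-envelope sketch (assuming PCIe 3.0’s spec values of 8 GT/s per lane and 128b/130b encoding; these are theoretical per-direction maximums, not measured throughput):

```python
# Rough theoretical bandwidth of a PCIe 3.0 link, per direction.
# Assumptions: 8 GT/s per lane and 128b/130b encoding (PCIe 3.0 spec values).
GT_PER_S = 8.0            # giga-transfers per second, per lane
ENCODING = 128 / 130      # 128b/130b line-encoding efficiency

def pcie3_bandwidth_gbps(lanes: int) -> float:
    """Theoretical one-way bandwidth in GB/s for a PCIe 3.0 link."""
    bits_per_second = lanes * GT_PER_S * 1e9 * ENCODING
    return bits_per_second / 8 / 1e9  # bits -> bytes -> GB

if __name__ == "__main__":
    for lanes in (8, 16):
        print(f"PCIe 3.0 x{lanes}: ~{pcie3_bandwidth_gbps(lanes):.1f} GB/s")
    # PCIe 3.0 x8:  ~7.9 GB/s
    # PCIe 3.0 x16: ~15.8 GB/s
```

In practice a GPU rarely saturates even the x8 link during gameplay, which helps explain why, as the tests below show, halving the lanes barely moves frame rates.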

However, many people on Reddit and other online forums pointed out that the drop in performance from pairing a Ryzen 4000 processor with an RTX 2070 or higher would be very small, if visible at all, so to them the explanation made no sense. (More fuel for the conspiracy theory.) Naturally, I had to test this myself to see whether there really is a performance drop going from 16 lanes to eight.

My own tests confirmed that 16 lanes really do offer better performance with top-range GPUs, but also that the difference can be pretty negligible. I used a much more powerful processor than the Ryzen 9 4900HS, so it could handle something bigger than an RTX 2060 no matter how many PCIe lanes were available.

My test PC was configured with an Intel Core i9-10900K, an Asus ROG Maximus XII Extreme motherboard, 16GB (8GB x 2) of G.Skill Trident Z Royal DDR4-3600 RAM, a Samsung 970 Evo 500GB M.2 PCIe SSD, a Seasonic 1000W power supply, and a Corsair H150i Pro RGB 360mm AIO for cooling.
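For anyone trying to reproduce this kind of test, one way to confirm the card is actually negotiating the intended link width is to query nvidia-smi before and during a run. This is just a sketch, assuming an Nvidia GPU with nvidia-smi available on the PATH:

```python
# Minimal sketch: check the GPU's current PCIe generation and link width
# via nvidia-smi. Assumes an Nvidia GPU and nvidia-smi on the PATH.
import subprocess

def current_pcie_link() -> str:
    """Return 'name, gen, width' per GPU, e.g. 'GeForce RTX 2070 SUPER, 3, 8'."""
    result = subprocess.run(
        [
            "nvidia-smi",
            "--query-gpu=name,pcie.link.gen.current,pcie.link.width.current",
            "--format=csv,noheader",
        ],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout.strip()

if __name__ == "__main__":
    print(current_pcie_link())
```

One caveat: GPUs downshift their PCIe link at idle to save power, so the reading is most meaningful while the card is under load.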

Game performance barely changed when I switched the PCIe configuration from 16 lanes to eight, but the difference was visible in the synthetic benchmarks. Comparing an RTX 2060 with an RTX 2070 Super (the closest GPU I had on hand to an RTX 2070), I ran benchmarks in Geekbench 5, 3DMark, PCMark 10, Shadow of the Tomb Raider, and Metro Exodus, some of which are part of our regular test suite.

Frame rates increased by a maximum of only 4 fps, with the most notable difference in Shadow of the Tomb Raider at 1080p. This supports what many have said: game performance isn’t substantially affected by cutting the number of PCIe lanes to the GPU until you get to something as powerful as an RTX 2080 Ti.

The synthetic benchmarks didn’t change much going from eight lanes to 16 with the RTX 2060, but the difference in scores was more pronounced with the RTX 2070 Super, suggesting there is a measurable difference that could matter in other applications. The RTX 2070 Super’s Geekbench score increased by 3,000 when all 16 lanes were available to the GPU. Time Spy scores were in line with the gaming benchmarks, and oddly enough, the RTX 2060 saw a bigger boost in the PCMark test than the 2070 Super did.

Of course, synthetic benchmarks aren’t a measure of real-world performance, and PCIe bandwidth is hardly the main thing that will slow a system down. But because many reviewers use these scores to paint a picture of a laptop or desktop, any AMD 4000-series processor paired with something bigger than an RTX 2060 would have posted lower-than-usual numbers. For top-end GPUs that sell on performance, every extra point and every extra frame matters, especially when so many OEMs are competing for a spot on your desk.

All of this suggests that, yes, OEMs will favor the “better” processor, even if the better CPU is only marginally better. The lack of AMD 4000-series processors paired with top-end Nvidia graphics could be the result of OEMs underestimating how many consumers wanted that kind of laptop configuration last year, but it’s more likely tied to the 4000 series’ smaller L3 cache and slower single-core speeds. Sure, an RTX 2070 or higher can work fine on PCIe x8, but if the CPU doesn’t have the juice to feed the GPU, none of that matters.


There is one last point against this theory. If Intel and Nvidia had colluded to shut out AMD, why are more OEMs wholeheartedly embracing the AMD/Nvidia combo this time around? Many of their AMD Ryzen 5000-series laptops will come with up to an RTX 3070 or 3080; the latest AMD Ryzen mobile processors have 20MB of combined L3 and L2 cache and support up to 24 PCIe Gen 4 lanes (16 of them dedicated to a discrete GPU) – exactly what’s needed to pair nicely with something bigger than a mid-range card.

Companies regularly engage in all kinds of shady activities that pad their bottom line while hurting consumers and shaping the choices we make every time we walk into a Best Buy with money burning a hole in our pockets. But no, Intel and Nvidia probably aren’t to blame for OEMs’ slow adoption of AMD processors. AMD has had to spend the last few years rebuilding its reputation and building processors that can genuinely compete with Intel in the mobile space and support the powerful GPUs Nvidia makes for laptops.

The Ryzen 4000 series was very good, but it wasn’t quite ready to compete in the areas that matter most to gamers and OEMs building gaming laptops. The Ryzen 5000 series, if OEM adoption is any indication, will be a different beast entirely. And it will probably show up in all the big gaming laptops the 4000 series never did. Nvidia and Intel have nothing to do with it.
