With another rumor about Intel (NASDAQ:INTC) killing off a large chunk of AXG being posted by a known leaker, I wanted to put my pen to the proverbial paper and render my stance in stone: calling for the divestiture of AXG (Accelerated Computing Systems and Graphics) after barely one attempt and a mere $3.5 billion in spend is (apologies for the informal language) the silliest thing I have heard in a while.
Before we begin, I will note that I have reached out to several high-ranking sources within Intel and they have categorically denied these recent rumors. Arc discrete is definitely not cancelled *right now*, and they have even gone as far as to call the rumors 'FUD'. Intel's Tom Petersen and CEO Pat Gelsinger have also both reiterated firm support for AXG and repeated that 'we aren't going anywhere'. My sources did say, however, that Intel will be actively reassessing the volume released in the coming quarters - as will NVIDIA and AMD - in the wake of the crypto demand collapse and the market becoming flooded with used GPUs.
Unfortunately, where there is smoke, there is a non-zero chance of fire, and since there has been at least one call by an extremely influential analyst for Intel to divest AXG and exit the discrete GPU market, it seems prudent for the other side of the argument to reach Intel shareholders as well.
Developing AXG and Arc discrete GPUs isn't just another investment or another way to make money for Intel; it's about survival
Almost 12 years ago, Blockbuster declared bankruptcy, utterly disrupted by the digital video industry. In retrospect, and with perfect hindsight, shareholders of the once-behemoth would have gladly invested the vast majority of its revenue back then to be part of the digital now. Industries change all the time, and while it isn't always easy to see upcoming disruption, managing change is a core part of management's responsibility to shareholders. This usually involves risk - either financial risk or the risk of disruption - and depending on the estimated size of the disruption, you always want to err on the side of financial risk to stave off potential extinction.
Back when AMD's Zen was an upcoming architecture and the stock was hovering around $1, I repeatedly emphasized the disruption it would cause in the x86 CPU market and even publicly warned Intel against the threat in several editorials (the last of which can be read here). Years later, the full Zen-effect has been felt by Intel shareholders. I will take this opportunity, now, to publicly reaffirm support for AXG on the off chance that any Intel executives were even *considering* acting on these rumors.
So what qualifies as 'cancelling' Arc or divesting AXG?
This brings us to a slippery problem. Companies very rarely exit deeply entrenched positions overnight; a slow winding down is usually the modus operandi - so before we can discuss cancellation or its impacts, we need to define what it is. Since the original rumor has given us a foundation to work off of, let's use that. Wrapping up AXG to be sold off to a third party obviously counts as a cancellation. Cancelling the commercial Arc lineup (making the A770 and its cohort the first and last commercial discrete GPUs) to focus only on data-center products is a cancellation.
Limiting future generations (Battlemage and Celestial) to a mobile-only product lineup also counts as a cancellation of the discrete lineup. It is worth noting that the learning curve required to build a data-center GPU has significant overlap with the commercial side, so a cancellation of one will almost always be followed by an inevitable cancellation of the other.
On the other hand, what doesn't count as a cancellation - and something that Intel should undoubtedly do - is reassessing the volume planned for Battlemage and even Alchemist GPUs. With markets flooded with used GPUs after the cryptocurrency market collapse and Ethereum's shift to Proof of Stake, even NVIDIA and AMD will be significantly reassessing volume. That said, this volume should not fall below what is required to achieve a complete and holistic product launch with a sufficient user base (in other words, DG2 levels of volume do not count).
Answering the unintelligent thesis of 'Arc has been delayed, and Intel has failed to capture any market share, all the while burning $3.5 billion'
Let's stick to commercial discrete graphics (aka Arc). Anyone who expected Intel to capture market share right out of the gate was either grossly unaware of the complexities involved in building a commercial GPU, deliberately being obtuse, or both. AMD and NVIDIA are sitting on decades of experience in not only discrete GPU chip-building but also the software stack that harnesses the magic of nanometer-scale silicon gates. While Intel is probably the best-positioned company in the world (in terms of vertical integration, tacit knowledge and tangential core competencies) with a shot at turning the GPU duopoly into a three-way race, it would be impossible to achieve that in just one generation (or even two).
To answer a common clap-back: no, DG1/DG2 did not achieve sufficient volume for me to consider it a 'shot' - although a lot of people would tell you that it was one. Sufficient volume of a commercial product is an important consideration, not because of the optics, but because it results in an enormous amount of free QA testing by end customers, allowing companies to detect and fix critical issues in the driver stack that are practically impossible to spot in the lab. Without this 'public beta testing', no company can have a truly competitive silicon product right out of the gate. Most vendors build future generations on the success of previous software stacks, and those stacks represent some of their most valued IP. So Intel has essentially not even had one true shot at a commercial discrete GPU product yet.
Now, yes, Arc was delayed, but so were Intel's CPUs. While calls for Intel to pick up the slack are more than warranted, calls for Intel to divest what is probably its single most important diversification after the foundry business essentially ask the company to take on a massive amount of disruption risk (valued in major fractions of its market capitalization) just to avoid a small financial risk of $3.5 billion. Considering Intel has made dozens of investments (and divestitures) in the past valued at tens of billions of dollars, not taking a relatively small financial risk on the one investment that could truly transform its business outlook is an exercise in sheer folly. Anything less than three high-volume commercial generations and at least a decade of building up the software stack is simply not enough data for stakeholders to even calculate Intel's chances of success in this market.
It isn't all doom and gloom either. AXG has worked at least one miracle - namely XeSS - a critical AI-upscaling feature designed to offload some of the graphics load to AI cores and allow GPUs to punch above their weight class. XeSS has been shown to be just as good as (if not better than) competing solutions from NVIDIA and AMD. The hardware also clearly has a lot of potential, and, like AMD of old, continued driver updates will slowly unlock that potential over time.
The GPGPU movement has been incredibly transformative for the commercial computing industry and is in some ways more valuable than the CPU itself. There could very well come a point where a CPU is used only for niche instruction sets and managing the OS while the GPU handles most of the actual work. The increasingly parallel world of computation (driven by a world hungry for AI and machine learning) has a solid foundation in GPGPU, and if Intel wants to remain competitive with AMD and NVIDIA in all aspects and does not want to risk being left by the wayside, divesting or even slowing investment in discrete GPUs is simply not something it can consider. The question shouldn't be whether AXG is worth the ROI against a spend of $3.5 billion (over five years); it is whether Intel wants to continue building its shot at the inevitable AI-, ML- and GPGPU-rich future - because AXG is the way.
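To make the 'GPU handles most of the actual work' idea a little more concrete, here is a minimal, hypothetical sketch of what offloading a data-parallel job looks like in SYCL, the open standard behind Intel's oneAPI stack. The array names, sizes and the trivial vector-add kernel are illustrative assumptions on my part, not anything taken from Intel's products:

```cpp
// Minimal GPGPU sketch in SYCL (C++): add two large arrays on the GPU.
// Assumes a SYCL 2020 toolchain such as Intel's oneAPI DPC++ compiler.
#include <sycl/sycl.hpp>
#include <cstdio>

int main() {
    constexpr size_t n = 1 << 20;

    // Pick a default device; with a discrete GPU and its runtime installed,
    // this will typically select the GPU.
    sycl::queue q;

    // Unified shared memory: visible to both the host CPU and the device.
    float *a = sycl::malloc_shared<float>(n, q);
    float *b = sycl::malloc_shared<float>(n, q);
    float *c = sycl::malloc_shared<float>(n, q);

    for (size_t i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // The data-parallel part: one lightweight work-item per element,
    // dispatched to the GPU instead of looping on the CPU.
    q.parallel_for(sycl::range<1>(n), [=](sycl::id<1> i) {
        c[i] = a[i] + b[i];
    }).wait();

    std::printf("c[0] = %.1f, c[n-1] = %.1f\n", c[0], c[n - 1]);

    sycl::free(a, q);
    sycl::free(b, q);
    sycl::free(c, q);
    return 0;
}
```

The point of the sketch is the division of labor: the host CPU does little more than allocate memory and launch the kernel, while the million element-wise additions run as lightweight work-items on the GPU - exactly the kind of workload split that makes a competitive GPGPU stack so valuable.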