UNDERSTANDING AI HYPE<br />
A framework for investing in the AI opportunity set<br />
perspectives. That said, a potentially useful investment framework that helps to<br />
define near- and long-term opportunities is set out below.<br />
<strong>In</strong> 2023, we observed a significant<br />
acceleration in the future value creation<br />
from artificial intelligence, a trend that<br />
was reflected in the share prices of<br />
companies potentially set to benefit<br />
from this multi-generational opportunity.<br />
To navigate this landscape, it's crucial<br />
to separate short-term hype from the<br />
longer-term investment opportunity.<br />
As AI continues to gain momentum,<br />
developing an investment framework<br />
to clarify AI opportunities over different<br />
time horizons is essential. At present,<br />
such a framework could include<br />
compute, infrastructure, models and<br />
applications, and beneficiaries. We firmly<br />
believe that a research-based approach<br />
is the key to identifying potential winners<br />
and avoiding the losers in this ever-evolving<br />
field.<br />
COMPUTE:<br />
Semiconductors are the brains behind AI,<br />
which is compute-intensive at both the<br />
training and inference stages. Although<br />
semiconductors' growth remains cyclical,<br />
the long-term growth trajectory for<br />
this sector remains exponential, and<br />
the market could almost double from<br />
approximately US $500 billion in 2022 to<br />
more than US $1 trillion by the end of the<br />
decade⁴. A significant amount of this will<br />
likely be driven by increasing computing<br />
demands from AI.<br />
INFRASTRUCTURE:<br />
If semiconductors are the fundamental<br />
building blocks of AI, then companies<br />
providing the infrastructure are the<br />
'plumbing.' This includes public cloud<br />
hyperscalers (such as Microsoft's Azure),<br />
which allow companies to outsource<br />
computing to the cloud through huge<br />
data centers. The advantage of this is<br />
that customers have on-demand,<br />
pay-per-use access to the most advanced<br />
and powerful computing services and do<br />
not have to run them on-premise.<br />
[Figure: the AI ‘stack’ − chip designers &amp; providers, foundries and manufacturing equipment (compute); cloud hyperscalers, datacentres and networking (infrastructure). Source: Capital Group]<br />
<strong>In</strong>frastructure also includes companies<br />
providing hardware such as networking<br />
components and switchgear, as well as<br />
software that makes cloud computing<br />
more efficient, given AI's high speed and<br />
bandwidth requirements.<br />
MODELS AND APPLICATIONS:<br />
Much hype surrounding AI is<br />
concentrated on companies 'creating'<br />
AI models. These include names such<br />
as OpenAI, which has garnered plenty of<br />
interest given the success of ChatGPT.<br />
Looking at model developers, we are<br />
wary of potential commoditization<br />
given a large and growing open-source<br />
AI community advocating the 'AI for<br />
Humanity' concept. Data possession<br />
is likely to prove the most important<br />
criterion for identifying ultimate winners<br />
in this space, which naturally favors<br />
owners of large, unique, proprietary<br />
datasets, such as the tech incumbents.<br />
Making a state-of-the-art general-purpose<br />
foundational model also takes billions of<br />
dollars and talent from a scarce pool.<br />
While many start-ups are allocating vast<br />
sums of capital to these models, we only<br />
expect a small number will be able to<br />
compete sustainably due to the scale<br />
requirements and high barriers to entry −<br />
and therefore, predict a small handful of<br />
massive winners in the AI model area.<br />
[Figure: AI investment framework − Compute; Infrastructure; Models: foundational models, platforms, ‘Big Data’ owners; Applications: software, IT services, physical applications; Beneficiaries: potentially limitless]<br />
Moving to applications, analysts believe<br />
software companies ‘productizing’ AI<br />
could benefit meaningfully and fast; those<br />
winners will have a direct monetization<br />
lever by raising prices substantially. The<br />
opportunity for developers to provide<br />
consumer or enterprise-grade software<br />
incorporating AI functionality is clear:<br />
consider how a company like Microsoft<br />
can add AI to its 365 suite, including<br />
Outlook, Word, Excel, and PowerPoint,<br />
and charge a substantial recurring<br />
premium. We expect this segment of the<br />
value chain to evolve profoundly over the<br />
next decade, based on experience with<br />
previous paradigms. In the early years<br />
of smartphones, for example, few could<br />
have predicted that applications such<br />
as Uber or Airbnb would emerge and<br />
become everyday services. With AI, our<br />
current imagination of what applications<br />
could be possible is based on our limited<br />
understanding of this nascent technology.<br />
BENEFICIARIES:<br />
Finally, underneath this investment<br />
framework sits the real-life and<br />
end-industry beneficiaries of AI, which<br />
could ultimately be limitless in scope and<br />
play out over multiple generations. Again,<br />
however, it is important to remember AI<br />
is still at an early stage of development:<br />
it remains uncertain what the technology<br />
could look like in 10 years, how long it<br />
might take for consumers to build trust,<br />
and how interwoven in our everyday<br />
lives AI applications could become. We<br />
remain focused on opportunities that<br />
may come out of AI and believe a deep<br />
research-based approach will become<br />
even more critical to identifying potential<br />
winners and avoiding losers.<br />
⁴ Data as at 31 December 2022. Source: ASML