GPUs are driving the future of computing – ShareCafe

by Hamish Chamberlayne – Head of Global Sustainable Equities and Richard Clode – Technology Equities Portfolio Manager

Graphics Processing Units (GPUs) are driving the future of computing. Designed for parallel processing, a GPU is a specialized electronic circuit that works alongside the computer’s brain, the central processing unit (CPU), to improve computing performance. If you’re reading this article on an electronic device, chances are a GPU is powering your screen’s display.

While GPUs were initially used in computer graphics and image processing for personal and professional computing, the use case has expanded significantly as the technology has evolved. Moore’s Law – the observation that the number of transistors in an integrated circuit doubles every two years while the cost of computing halves – has democratized the use of GPUs by making them cheaper and more readily available, transforming GPU adoption across multiple industries. Today, high-performance GPUs are at the heart of many different technologies and will form the basis of the next generation of computing platforms.

GPUs are designed to run a large number of workloads at the same time, increasing computing efficiency and improving overall performance. While this is beneficial for end users such as gamers, who appreciate high-quality real-time computer graphics, it can also be applied to more serious use cases.

The ability of GPUs to process large blocks of data in parallel makes them well suited to training artificial intelligence (AI) and deep learning models, which require intense parallel processing across hundreds of thousands of neural network operations simultaneously. The applications of deep learning are wide-ranging, from powering web services to advancing self-driving vehicles and medical research.
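To illustrate why this kind of workload benefits from parallel hardware, the minimal sketch below compares an element-wise loop with the same calculation expressed as a single matrix-vector product, the data-parallel form that GPUs accelerate. This is a CPU-side NumPy illustration only; the layer sizes and function names are our own assumptions, not from the article.

```python
import numpy as np

def forward_loop(weights, inputs):
    """Compute one dense neural-network layer one element at a time
    (a scalar loop, the style a single CPU core would execute)."""
    out = np.zeros(weights.shape[0])
    for i in range(weights.shape[0]):
        for j in range(weights.shape[1]):
            out[i] += weights[i, j] * inputs[j]
    return out

def forward_vectorized(weights, inputs):
    """The same computation as one matrix-vector product, which
    parallel hardware such as a GPU can spread across thousands of
    multiply-accumulate units at once."""
    return weights @ inputs

rng = np.random.default_rng(0)
w = rng.standard_normal((256, 128))  # illustrative layer weights
x = rng.standard_normal(128)         # illustrative input vector

# Both formulations give the same result; only the execution style differs.
assert np.allclose(forward_loop(w, x), forward_vectorized(w, x))
```

The point of the sketch is that every output element is independent of the others, so the work can be done simultaneously rather than sequentially; deep learning training repeats operations like this billions of times.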

While GPUs have already had a positive impact on real-world challenges, the opportunity to shape innovation across industries has yet to be fully explored. The application of AI and deep learning is key to creating a successful digital future, and this is already becoming a reality as the digitalization trend grows. It is important to recognize that this trend is impacting all industries and as such, effective and powerful technology capabilities are essential as businesses embark on their digital transformation.

Regarding its impact, we believe that digitalization plays a positive role in economic development and social empowerment, and we also see a close alignment between digitalization and decarbonization. Digitalization opens up traditionally analog functions, promoting data transparency and equipping businesses and individuals with the right insights to make informed consumption, production and reduction decisions based on their current behaviors. For example, ambitious goals of reducing carbon emissions and meeting climate targets can benefit from data mining, transformation and analysis to determine the best course of action.

We have already started to see digitalization penetrating and advancing traditional practices: manufacturers are integrating technology into industrial processes to optimize production, building managers are using smart technology and data analytics to ensure that energy is only consumed when needed, and intelligent transportation systems analyze traffic data to reduce congestion, fuel consumption and emissions. Elsewhere, many digital services have started to replace traditional methods that often require more intense energy consumption, such as online meetings that reduce business travel, cutting the carbon footprint on a global scale.

One of the critical challenges of a global digital transformation is the significant energy required for high-performance computing. It is important to understand the actual energy cost of the technology and what can be done to reduce overall energy consumption.

There is a misconception that an increase in data center utilization equals an increase in energy demand. According to the International Energy Agency (IEA), data center power consumption has actually remained stable despite an explosion in data center demand and internet traffic (Chart 1). This disparity is due to effective systems and processes: GPUs minimize the high power load of high-performance computing in data centers. For AI applications, some GPUs can be up to 42 times more energy efficient than traditional CPUs. Meanwhile, some GPU-based hyperscale data centers use only 2% of the rack space of CPU-based systems, making them far more space-efficient.2 In short, GPUs pack a punch. By enabling smarter use of energy, they help keep energy consumption to a minimum.

Chart 1: Data center energy consumption remains stable

Source: IEA, Global Trends in Internet Traffic, Data Center Workloads and Data Center Energy Consumption, 2010-2020, IEA, Paris https://www.iea.org/data-and-statistics/charts/global-trends-in-datacenter-internet-traffic-datacenter-workloads-and-energy-consumption-2010-2020

Like all industries, technology will need to do its part to combat global climate change and reduce its own environmental footprint, with the goal of achieving net zero emissions. In 2020, the International Energy Agency (IEA) released its annual Tracking Clean Energy Progress report, which covers the key energy technologies and sectors that are critical to slowing global warming. Out of 46 sectors, the IEA named data centers and data transmission networks as one of only six sectors on track to meet its sustainability scenario. However, the increase in global internet usage during COVID-19, driven by increased video and conference streaming, online gaming and social media, saw this rating slip to “more efforts needed” in the 2021 report.3

Despite this setback, we believe that a focus on continued improvements in data center infrastructure efficiency is essential to achieving net zero goals, thereby reinforcing the role GPUs play in creating a sustainable digital world.

While the general use case of AI has many benefits, greater adoption of the technology comes with significant underlying ethical risks.

In cases where AI is cheaper, faster and smarter than human labor, it can be used to replace the existing workforce: chatbots have replaced call center staff thanks to AI’s natural language processing capability, many factory workers have been replaced by automated machinery, and robotaxis may soon replace human drivers. We recognize the impact this could have on employment, particularly in concentrated areas, and believe it is essential to consider the long-term consequences for society in these cases. However, we also see a benefit in delegating some monotonous roles to AI. By freeing up human capital, it gives individuals the opportunity to take on more fulfilling roles that AI cannot perform, such as personal training, creative design and teaching. In doing so, we believe that society could be enriched for the better.

It is also important to recognize the potentially sinister uses to which the technology could be put. The US government recently acted to restrict the export of high-end NVIDIA-produced GPU chips to China, to prevent some Chinese companies from purchasing GPUs to enable mass surveillance, including in the case of Uyghur Muslims. We fully welcome any restrictions aimed at reducing potential ethical threats to society.

Some companies, including NVIDIA, have also used ethical frameworks to implement “trustworthy AI” principles within their product ecosystems. We consider it very important to place ethical principles at the heart of product design and development to drive positive change and transparency in AI development.

Digitalization is the cornerstone of our future. From humble beginnings, the GPU has become one of the most critical enablers of innovation and digital transformation for society. We also believe that the next generation of computing is essential to achieving the global Sustainable Development Goals. When analyzing individual companies, we believe that shifting to a low-carbon business model is a marker of long-term success, and we look to technology to enable this change.

Footnotes

1 Nvidia Blog, ‘World-record-setting DNA sequencing technique helps clinicians quickly diagnose critical care patients’, 2022

2 Nvidia, Corporate Social Responsibility Report, 2021

3 International Energy Agency, Tracking Clean Energy Progress Report, 2022

Definitions

Deep learning: a machine learning technique in which a computer system is fed large amounts of data, which it can use to make decisions about other data. The data is processed through neural networks – logical constructs that ask a series of binary true/false questions of, or extract a numeric value from, all the data that passes through them, and classify it based on the answers received.
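The definition above can be made concrete with a minimal sketch: data passes through small layers of weights, a final numeric value is extracted, and the input is classified true/false from that value. The layer sizes, weights and threshold here are illustrative assumptions, not a trained model.

```python
import numpy as np

def relu(z):
    """Non-linearity applied between layers."""
    return np.maximum(z, 0.0)

def classify(x, w1, w2, threshold=0.5):
    """Pass the input through two layers of weights, extract one
    numeric value, and classify as True/False against a threshold."""
    hidden = relu(w1 @ x)                          # first layer extracts features
    score = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))  # squash final value to 0..1
    return score >= threshold

rng = np.random.default_rng(1)
w1 = rng.standard_normal((4, 3))  # illustrative first-layer weights
w2 = rng.standard_normal(4)       # illustrative output weights
x = rng.standard_normal(3)        # one input sample

print(classify(x, w1, w2))  # True or False, depending on the weights
```

In real deep learning the weights are not random but are adjusted over many passes of training data, and networks have many more layers and parameters; the GPU’s role is to run those layer computations in parallel.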
