Espace Presse – What is the future of the cloud in the face of energy and resource constraints?

How would you sum up the issue of resource frugality in cloud use today?

Today, the development of the cloud has reached a stage comparable to that of electricity in the time of Nikola Tesla (the late nineteenth and early twentieth centuries): the key element has been identified and the first infrastructures are being built – to continue the analogy with electricity, the equivalent of the first power lines and the first electrified homes. The cloud already reaches homes, for example through photo sharing or the public clouds offered by web giants such as Google, Apple, Facebook (Meta), Amazon, Microsoft or the Chinese company Baidu. Electricity, more than a century later, is now widespread and available everywhere, at any time, on demand.
For the cloud, the trajectory will be the same: it has become an on-demand computing and storage utility – a "commodity", to use the English term. The question then arises: how can we avoid over-using and wasting this resource, especially now that we are beginning to grasp how much energy the cloud consumes?

© Fotolia

What levers can we pull to move towards a more frugal cloud?

The first major lever is the design of the hardware and software components. Thanks to the experience gained in developing processors and memories for embedded computing (small footprint, low power consumption), highly frugal and efficient components can be created. For example, the ARM processor, whose architecture has been developed since 1983, has a simpler design and therefore lower power consumption than traditional processor families; it first conquered phones, giving them good battery life. It is now increasingly integrated into cloud computing and storage servers, together with hardware accelerators that speed up the most power-hungry parts of a computation (e.g. the circuits of Kalray, a CEA spin-off). CEA is very active in this area, proposing new architectures and accelerators aimed at drastically reducing the power required to compute on big data, in particular through in-memory computing solutions that reduce data movement and therefore energy cost.

CEA is an expert in software engineering

© Adobe Stock

In addition to optimizing processors and memory, the application itself – the program – must also be improved. At CEA, we have long experience in software engineering. Working closely with industry, our teams have, for example, designed tools for programming critical systems, which must be responsive and specifically optimized for their environments. They are now adapting these tools to the constraints of computing and storage in the cloud, for example to better exploit the match between algorithms and the electronic components they run on.

All this work is essential, but we can also act on the system as a whole.

Software improvement

© Fotolia

What do you mean by that?

Keep in mind that the cloud is actually several clouds. If a cloud is a set of machines networked in a data center, the cloud as a whole is in fact a set of data centers that "talk to each other". But these data centers are often located far from the applications they serve – for two people holding a video conference a few kilometers apart, the data may pass through data centers located several hundred kilometers away. Hence the emergence of the concept of the "edge cloud": smaller clouds closer to the applications, i.e. where the data is produced or processed. This is a revolution under way. Taking a step back, what is appearing is a kind of galaxy of clouds.

Is it really more economical to have a galaxy of clouds rather than a few huge data centers?

Decentralization and edge clouds provide capacity better suited to local needs, and many businesses are seeking to optimize this local use. In addition, they process data close to where it is produced, so it no longer needs to travel thousands of kilometers, which reduces energy consumption. Finally, whereas global clouds must remain general-purpose, local clouds can specialize thanks to their knowledge of the type of data to be processed, increasing their efficiency while avoiding the consumption associated with long-distance transmission. Of course, the scale of these savings has to be analyzed scientifically, case by case. And not everything can be done locally, especially when data must be shared between users, as in video conferencing, or when the volume of data requires very high computing power, as in a digital aircraft simulation.

The question then arises of how to distribute computations between those that can run on these edge clouds and those that require large data centers with different computing capabilities. Orchestrating this distribution of computations is a real frugality issue: if we get it wrong, we will overload edge clouds by asking them for computations beyond their capacity or, conversely, build data centers that run idle because they have not been assigned the appropriate tasks. Scheduling and regulation algorithms are therefore fundamental issues for decentralized clouds.
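To make the trade-off concrete, here is a toy sketch – not a CEA algorithm, and with entirely invented coefficients – of a greedy placement heuristic that assigns each task to an edge cloud or a central data center by comparing a simplified energy estimate (compute cost plus the cost of shipping data over long distances), while respecting a limited edge capacity:

```python
# Toy illustration of edge-vs-central task placement. The Task fields,
# energy coefficients and capacity figures are hypothetical placeholders
# chosen only to make the trade-off visible.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    compute_units: float   # abstract amount of computation required
    data_gb: float         # input data volume to move if run centrally

# Assumed (hypothetical) energy model, in arbitrary "energy units".
EDGE_PER_UNIT = 1.2      # edge hardware: less efficient per compute unit...
CENTRAL_PER_UNIT = 0.8   # ...than a large, optimized data center
WAN_PER_GB = 2.0         # cost of long-distance data transmission
EDGE_CAPACITY = 10.0     # compute units an edge site can absorb

def place_tasks(tasks):
    """Greedily assign each task to 'edge' or 'central', picking the
    cheaper estimated option within the edge capacity budget."""
    remaining_edge = EDGE_CAPACITY
    placement = {}
    for t in tasks:
        edge_cost = t.compute_units * EDGE_PER_UNIT
        central_cost = (t.compute_units * CENTRAL_PER_UNIT
                        + t.data_gb * WAN_PER_GB)
        if edge_cost <= central_cost and t.compute_units <= remaining_edge:
            placement[t.name] = "edge"
            remaining_edge -= t.compute_units
        else:
            placement[t.name] = "central"
    return placement

tasks = [
    Task("sensor-analytics", compute_units=1.0, data_gb=5.0),  # data-heavy, light compute
    Task("aircraft-sim", compute_units=50.0, data_gb=0.5),     # compute-heavy
]
print(place_tasks(tasks))  # data-heavy task stays local, heavy simulation goes central
```

With these made-up numbers, the data-heavy analytics task is cheaper to keep at the edge (no long-distance transfer), while the compute-heavy simulation exceeds edge capacity and is cheaper in an efficient central data center – exactly the kind of assignment a real regulation algorithm must get right at scale.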

Finally, this question of assigning computations raises that of the guarantees needed to ensure a computation is done correctly: if there are errors, the computations must be redone, effort is duplicated, and more energy is consumed.

Levers for a more frugal cloud

© Pixabay

Are there other levers for moving toward a more frugal cloud?

The last lever is human awareness: how do we ensure that both the operators and the users of these systems know how to save resources? One route is to visualize our cloud usage, revealing the hot and cold spots of our digital footprint. From there, an energy balance can be displayed, and users can be offered ways to reduce their consumption, and therefore their carbon footprint – as already exists for electricity. At the same time, once we have a better view of our cloud consumption, we will need tools to monitor and adapt it "on the fly" in order to improve energy efficiency. These are true "digital twins" of cloud infrastructures, and building them will require extensive research in modeling, simulation and optimization.
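As a purely illustrative sketch of such a visualization, the snippet below turns rough per-user cloud usage figures into an energy and carbon estimate, in the spirit of the electricity bills mentioned above. All categories, energy intensities and the grid carbon factor are invented placeholders, not measured values:

```python
# Toy per-user cloud energy report. Every coefficient below is a
# hypothetical placeholder chosen for illustration, not a real measurement.

USAGE = {                    # hypothetical monthly usage for one user
    "storage_gb": 200.0,
    "streaming_hours": 30.0,
    "compute_hours": 5.0,
}

KWH_PER_UNIT = {             # assumed energy intensity per unit of usage
    "storage_gb": 0.01,
    "streaming_hours": 0.08,
    "compute_hours": 0.5,
}

GRID_KG_CO2_PER_KWH = 0.06   # hypothetical low-carbon grid factor

def energy_report(usage):
    """Return (total kWh, kg CO2, per-category kWh breakdown)."""
    breakdown = {k: v * KWH_PER_UNIT[k] for k, v in usage.items()}
    total_kwh = sum(breakdown.values())
    return total_kwh, total_kwh * GRID_KG_CO2_PER_KWH, breakdown

kwh, co2, detail = energy_report(USAGE)
for category, e in sorted(detail.items(), key=lambda kv: -kv[1]):
    print(f"{category:16s} {e:6.2f} kWh")   # the "hot spots" of this user's usage
print(f"total: {kwh:.2f} kWh = {co2:.2f} kg CO2")
```

A real digital twin would of course replace these static coefficients with live measurements from the infrastructure, but even a crude breakdown like this shows a user where their consumption concentrates.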

Energy consumption of the cloud and data centers in a few figures

  • Globally, data centers today use about 3% of the electricity produced, and this share is expected to grow significantly in the coming years. For 2020, their consumption was estimated at 650 TWh – more than that of France.
  • Annually, a medium-sized data center uses 600,000 cubic meters of water for cooling
  • According to a report published by the European Commission at the end of 2020, European data center consumption rose from 53.9 TWh/year in 2010 to 76.8 TWh/year in 2018.
