Author: Tronserve admin
Friday 30th July 2021 03:10 AM
From 5G to Climate Change and Back Again
SHENZHEN — At the 2019 Global CEO Summit, no single theme emerged; executives talked about 5G, design technology, the Internet of Things, sensors, virtual reality, ubiquitous communications, and — of course — AI. That said, there was clearly a common context for all of it: each of these technology trends is contributing to a massive increase in data, and the industry is finally learning how to handle that increase constructively. According to one executive, it's about time, because it is literally an issue of survival.
One executive after another indicated that it is no longer enough to simply keep up with this flood of data (largely by adding capacity), and it is no longer enough to manage it (storing and processing it); the industry must exploit this torrent by synthesizing and analyzing it, then using the results to do things that were not possible before. In fact, it is beginning to do so.
Wei Shaojun, president of the IC design branch of the Chinese Semiconductor Industry Association, used his opening keynote to build a meticulous case for the extent to which he believes 5G technology will be transformative.
He noted that global gross domestic product (GDP) began to grow at an increasingly accelerating rate starting in 2001. “This is not a coincidence,” he said through an interpreter. “It’s the rise of mobile internet, with the introduction of broadband wireless.” He presented graphs that showed the introduction of each successive generation of broadband wireless businesses coinciding with an acceleration in the growth of global GDP.
5G, Wei said, is beginning to be deployed, and it is likely to transform our lives, revolutionize industry and agriculture, and create new industries. China occupies roughly 960 million hectares. The data from one sensor per hectare (at $1 a sensor) could lead to a new level of precision in agriculture, he noted. “5G is not just mobile communications,” he said. “This will affect the entire infrastructure of the nation.”
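The scale Wei is describing can be sketched with a quick back-of-envelope calculation. Only the hectare count and the $1 unit cost come from the talk; the per-sensor daily data rate below is a hypothetical assumption added for illustration:

```python
# Back-of-envelope for the one-sensor-per-hectare scenario.
# HECTARES and COST_PER_SENSOR_USD come from the keynote;
# BYTES_PER_SENSOR_PER_DAY is an assumed figure for illustration only.
HECTARES = 960_000_000             # ~960 million hectares in China
COST_PER_SENSOR_USD = 1            # $1 per sensor, per the keynote
BYTES_PER_SENSOR_PER_DAY = 1_000   # assumed: ~1 KB of readings per day

sensor_outlay_usd = HECTARES * COST_PER_SENSOR_USD
daily_data_gb = HECTARES * BYTES_PER_SENSOR_PER_DAY / 1e9

print(f"Sensor outlay: ${sensor_outlay_usd:,}")   # $960,000,000
print(f"Daily data: {daily_data_gb:,.0f} GB")     # 960 GB per day
```

Even at a modest assumed kilobyte per sensor per day, a nationwide deployment would generate nearly a terabyte of agricultural data daily — the kind of torrent the executives were describing.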
Tony Hemmelgarn, president and CEO of Siemens Digital Industries Software opened his keynote saying, “Some people say that the best way of dealing with complexity is to try to limit complexity. That’s not going to happen. Customers want to move faster than the competition. They want to lower product development costs. They want to create new business models. Customers can leverage complexity as a competitive advantage.” That certainly includes the design process, he said.
“Complexity,” in the context of the design process, is a synonym for “data” – specifically, virtual models of both the electronic and mechanical subsystems of larger products. Siemens and its Mentor Graphics operation have been talking about “digital twins” for some time. The basic concept is the ability to create virtual models: Siemens excels at virtual modeling of mechanical systems, while Mentor excels at virtual modeling of electronic systems.
Two years after Siemens purchased Mentor, the two now have a design platform that bridges virtual models of electronics with virtual models of mechanical systems. IC designers, for example, can change their designs and quickly discover how those changes will affect the mechanical systems that the electronics control — even while everything is still in the virtual domain.
That the closed-loop “digital twin” approach is now practical at the system level is new, but what’s more important is that it transforms the design process. Engineers can now detect problems early in the design process, versus later in prototyping or, worse yet, during system integration. The data available is finally being synthesized, analyzed, and turned into information that can be used to not just accelerate the process, but to improve overall performance.
Mentor rival Synopsys has been moving in the same direction for some time, though the company tends to refer to “virtual prototyping” rather than “digital twins.” Both, by the way, use the phrase “shift-left,” a reference to the design-process timeline: if you reduce the amount of time it takes to design a product, the line on the timeline that indicates progress shifts to the left.
Synopsys chairman and co-CEO Aart de Geus was also at the Global CEO Summit, and in his keynote de Geus referred to it as “the techonomics of exponentials.” That tongue twister gets at the same phenomenon that Hemmelgarn was describing: using the flood of data coming in and not just adding it to get the sum of its value, but multiplying it to extract additional value — the ability to not just detect, but to act and react.
He, too, talked about what that means for the design process, and spoke about similar benefits for the industry. “You can virtually prototype real systems, predict failure, and fix the problem before it happens,” he observed. But the significance of that capability goes far beyond design and manufacturing. The ability to extract exponential value out of data has global significance — quite literally.
He invoked the most impressive model he said he had ever seen: the models for climate change, which he noted grew in complexity from the 1970s through the ’80s, ’90s, and so on, to date. The models have become more refined and more accurate over the years. “In all cases, of course, we see the direction indicates a disaster for the world,” he said. “We have global warming and it impacts many places. Our behavior right now should be a massive shift-left in order — not to fix the problem — you cannot fix the problem fast enough — but at least to delay the consequences,” de Geus said.
“Why am I telling you all this? Because this group is one of the most intelligent and educated groups of people in the world. I always say, he who has the brains should also have the heart and the courage to help. We have to take on this problem. Not only by all the new technologies we can build, but by being conscious of how much power our existing products consume. Engineers design for performance and for efficiency, and that will become more and more important as we collect and process more and more data. The amount of power drawn by existing cameras alone equals the output of several nuclear power plants,” he said.
“We heard today that there will be multiple IoTs. The sum of all the IoTs multiplied by 5G transmission capabilities is an enormous amount of data that we want to do machine learning on, all of course with massive cloud computation. The result of this is that machine learning itself will cause a massive amount of energy consumption.” That's why what the industry designs, and just as importantly how it designs those things, matters, he explained, closing the loop and bringing the subject back home for his audience full of engineers.