Computer-aided engineering (CAE) has come a long way. Ever since its inception in the 80s, steady improvements have made it faster and easier to use computers to support R&D.
With the growing interest and development in cloud scaling, the speed and accuracy of these simulations will improve drastically. Finally, with quantum computers just around the corner, the R&D process and CAE as we know them will be transformed forever.
This article will first rewind the clock to see where we came from to eventually give you a glimpse of what is yet to come.
Better hold on to your seats — this might get rowdy.
CAE has its roots in the 80s, when engineers could reserve time slots on a supercomputer to get help with their computational needs.
As desktop computers became more powerful, companies began developing software to move those computations onto personal machines.
Back in the day, these computers were slow, of course, so the server rooms still served their purpose; in fact, they still do.
Cloud computing allows access from your desktop to a cluster of cloud-based computers that can solve the simulation problems in parallel, speeding up the solving process immensely.
Yet, ask modern engineers, and most are still likely limited by their desktop's processing power.
The issue is that software based on code written in the last century does not scale well in the cloud. For the most part, getting the old software to truly take advantage of the parallel nature of cloud scaling is an insurmountable challenge.
When most of today's simulation software was written, shared-memory desktop computers were essentially all there was.
Porting all of their features to run efficiently on a distributed-memory cloud infrastructure would now require a tremendous development effort. Vendors are therefore limited to parallelizing only some sections of their software in the cloud, using only a limited subset of the existing scaling algorithms.
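To make the shared-memory versus distributed-memory distinction concrete, here is a minimal Python sketch; this is not actual CAE code, and `relax` is a hypothetical stand-in for one local solver step. A solver written to sweep a single shared array cannot simply be handed to a cluster: the mesh first has to be decomposed into subdomains that separate workers, potentially on separate machines, can process independently.

```python
# Hypothetical sketch: the shared-memory assumption in legacy CAE code
# versus a decomposition that can scale out across cloud nodes.
from multiprocessing import Pool


def relax(cell_values):
    # Stand-in for one local solver step on a subdomain.
    # (A real solver would also exchange boundary values with neighbours.)
    return [v * 0.5 for v in cell_values]


def solve_shared(mesh):
    # Legacy style: one process walks the whole mesh in shared memory.
    return relax(mesh)


def solve_distributed(mesh, n_parts):
    # Cloud style: split the mesh into subdomains and solve them in
    # parallel. Each worker process could live on a different machine.
    # Assumes len(mesh) is divisible by n_parts, for simplicity.
    size = len(mesh) // n_parts
    parts = [mesh[i * size:(i + 1) * size] for i in range(n_parts)]
    with Pool(n_parts) as pool:
        results = pool.map(relax, parts)
    # Stitch the subdomain results back together.
    return [v for part in results for v in part]
```

In a real solver, the subdomains would also need to exchange boundary values with their neighbours on every iteration, and that communication is precisely the part that is hard to retrofit into code that assumes all data lives in one address space.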
For some companies, uploading confidential designs to cloud servers might feel risky. But this really shouldn't be a deal-breaker for any company already discussing and sharing those designs on Teams, Slack, or any other cloud-based software, since the information security is held to the same high standard.
The cloud revolution has been going strong for the last decade, with new software and new technologies making it possible to achieve these massive speed-ups by harnessing the potential of cloud scaling.
We at Quanscient are at the forefront of this development, with impressive results already rolling in.
Quantum computing is the next major milestone.
Before the full-blown fault-tolerant era, in which quantum computers are robust enough to reliably provide a practical advantage, we are in the so-called NISQ (noisy intermediate-scale quantum) era.
Even during this transition phase, we can expect to see speed-ups, and Quanscient's quantum team is working on precisely this as we speak.
Once the full potential of a quantum computer is harnessed, engineers will be able to create digital twins, perfect digital replicas of their designs, and move the majority of their R&D into the digital realm.
Furthermore, once VR technology has matured enough, you will literally be able to enter the metaspace to inspect the simulations yourself.
Just imagine what can be done with virtually unlimited computational power and algorithms that can precisely model any physical conditions.
The year is 2032. You are an engineer.
You have created a digital twin of the prototype you are designing and are now ready to test it out.
You enter the metaspace to inspect the simulation you made for the new design. You tweak variables and parameters and see the changes in real time.
After some iterations and a few more simulations, you are satisfied with the results. No physical prototypes needed.
Artificial intelligence is also one of the most promising fields in which quantum computing will have a significant impact.
This, too, would have a major impact on R&D, as machine learning models could be trained to understand designs and optimize them.
Power consumption, aerodynamics, weight, stability, and friction are just a small sample of the variables that machine learning algorithms could grasp in context, modifying the design to optimize for them.
Furthermore, ML might be able to skip irrelevant parts of the simulations, providing further speed-ups.
Needless to say, the possibilities here, too, are no less than stunning.
If you share our vision for this future, sign up for our newsletter or get in touch with us directly here.