Encryption is taking on ever greater importance, and it is not unusual to read about developments in the area that promise a significant breakthrough. I was recently sent a short news item about optical computing, a great example of solving a problem caused by not thinking the previous solution through sufficiently.
The need to encrypt data is obvious, but the data still has to be decrypted if it’s to be computed on. As cloud solutions become more prevalent, individual devices are sending more and more calculations to cloud servers, increasing the risk that the information will be compromised. Fully homomorphic encryption (FHE) enables computation to take place directly on encrypted data. This means you can keep the key, send encrypted information to the cloud, have it processed and receive the result back without exposing anything.
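Production FHE schemes are built on lattice mathematics and are considerably more involved, but the core idea, a server computing on ciphertexts it cannot read, can be illustrated with the much older Paillier cryptosystem, which is only partially homomorphic: it supports addition of encrypted values. The sketch below is a toy, with parameters far too small to be secure.

```python
# A minimal sketch of homomorphic encryption using the Paillier cryptosystem.
# Paillier supports addition on ciphertexts; fully homomorphic schemes go much
# further, but the principle is the same: the server computes on encrypted data
# and never sees the plaintext or the private key.
# The hard-coded primes below are for illustration only and offer no security.
import math
import secrets

p, q = 2357, 2551                # toy primes; real keys use much larger primes
n = p * q
n_sq = n * n
g = n + 1                        # standard choice of generator
lam = math.lcm(p - 1, q - 1)     # private value lambda(n)
mu = pow(lam, -1, n)             # private value used during decryption

def encrypt(m: int) -> int:
    """Encrypt an integer 0 <= m < n with fresh randomness."""
    r = secrets.randbelow(n - 1) + 1
    while math.gcd(r, n) != 1:
        r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    """Recover the plaintext using the private values lam and mu."""
    L = (pow(c, lam, n_sq) - 1) // n
    return (L * mu) % n

# The "cloud" adds two encrypted values by multiplying the ciphertexts,
# without ever decrypting them or holding the key.
c1, c2 = encrypt(1234), encrypt(4321)
c_sum = (c1 * c2) % n_sq
assert decrypt(c_sum) == 5555    # only the key holder learns the result
```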
But FHE is computationally expensive. Optical computing works by encoding data in beams of light rather than electrical currents, exploiting the speed with which optics can carry out linear-algebra operations to massively speed up processing.
This is the encryption arms race in action. We’re constantly solving problems created by the solution to the last problem we dealt with. In such situations, it is often wise to step back, take a breath and ask why a particular course of action is being taken. Too many technologists are reaching conclusions without understanding the consequences. That’s a bad approach to managing risk: in essence, it means addressing one problem while creating new ones in the wake of the solution.
Cryptography starts off being very basic. You shift the alphabet along by a fixed number of places, an approach known as the Caesar cipher, where, for example, a Q could represent an A. Systems evolved to take a more random approach and to rely on a key. This then developed into exploiting relationships between very large integers, the basis of the modern public-key encryption used to protect data transfer on the internet.
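As a minimal sketch of that shift idea (the shift value is the entire key, and trying all 25 possible shifts breaks it instantly):

```python
# A toy Caesar (shift) cipher: every letter moves a fixed number of places
# along the alphabet; a negative shift reverses the operation.
import string

ALPHABET = string.ascii_uppercase

def caesar(text: str, shift: int) -> str:
    """Shift each letter of `text` by `shift` places; non-letters pass through."""
    return "".join(
        ALPHABET[(ALPHABET.index(ch) + shift) % 26] if ch in ALPHABET else ch
        for ch in text.upper()
    )

ciphertext = caesar("ATTACK AT DAWN", 3)      # -> "DWWDFN DW GDZQ"
assert caesar(ciphertext, -3) == "ATTACK AT DAWN"
```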
Systems that originated in the late 1970s, such as Diffie-Hellman and RSA, are still in use. They have shown their sustainability and longevity, but they are now coming under attack from new developments in technology, such as quantum computers, optical computers and even computers that solve problems using DNA processors. We went from mechanical computers, such as cash registers, to electronic computers, and now we are starting to move towards molecular electronics, using DNA and artificial neural networks to try to solve equations and problems.
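As a rough illustration of those integer relationships, here is a toy Diffie-Hellman exchange: two parties derive a shared secret without ever transmitting it, relying on the difficulty of recovering the private exponents from the public values. The parameters below are illustrative only; real deployments use much larger, carefully chosen groups or elliptic curves.

```python
# A toy Diffie-Hellman key exchange. Public parameters (p, g) are agreed in
# advance; each side keeps its exponent secret and publishes only g^x mod p.
import secrets

p = 2**127 - 1        # a Mersenne prime, fine for a demo, far too small for real use
g = 3

a = secrets.randbelow(p - 2) + 1      # Alice's private exponent
b = secrets.randbelow(p - 2) + 1      # Bob's private exponent
A = pow(g, a, p)                      # Alice publishes A
B = pow(g, b, p)                      # Bob publishes B

# Both sides arrive at the same shared secret g^(a*b) mod p.
assert pow(B, a, p) == pow(A, b, p)
```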
The National Institute of Standards and Technology (NIST) publishes the Federal Information Processing Standards (FIPS), and the standard covering cryptographic modules is now entering its third generation, FIPS publication 140-3. This defines the latest baseline for validating the effectiveness of cryptographic hardware and software. One of the considerations is quantum-resistant cryptography, which in essence means cryptographic systems that can resist attacks by quantum computers.
Encryption is important for financial services, but the repercussions of information leaking go far wider. When you start talking about critical infrastructure protection, the implications of any breaches or failures are potentially world-ending. There are some things in life you don’t want the average person to be able to access cheaply or for free, such as quantum, optical or DNA processors and computers.
Considering that humanity is on the verge of realizing technologies such as quantum, optical and DNA processors and computers, it’s important to understand that the science, from a mathematical-model perspective, can pave the path. The tests are whether we have the technology to build these solutions, whether we have fully considered the consequences and, ultimately, where that might lead us.