22 November, 2022

Quantum secure encryption and data storage.

Recently SiliconANGLE reported a flaw in "Microsoft 365 message encryption". The Electronic Codebook (ECB) mode used to encrypt mail is a simple encryption method that converts message blocks into separate, independently encrypted ciphertext blocks, rendering the text unreadable. The problem with such a simple form of encryption is that identical message blocks always produce identical ciphertext blocks, so hackers who get their hands on enough of these encrypted mails can infer patterns and effectively reconstruct the codebook, destroying the security of Microsoft 365 message encryption.
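
To make the weakness concrete, here is a minimal sketch in Python, assuming the third-party 'cryptography' package (my choice; the article names no tooling). It encrypts the same 16-byte block twice under AES in ECB mode and shows that the two ciphertext blocks come out byte-for-byte identical, which is exactly the pattern leakage an attacker exploits.

```python
# Minimal sketch (assumes the 'cryptography' package): ECB leaks structure
# because identical plaintext blocks encrypt to identical ciphertext blocks.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)                       # random 256-bit AES key
block = b"ATTACK AT DAWN!!"                # exactly one 16-byte AES block
plaintext = block * 2                      # the same block repeated twice

encryptor = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
ciphertext = encryptor.update(plaintext) + encryptor.finalize()

# With ECB, the two ciphertext blocks are identical:
print(ciphertext[:16] == ciphertext[16:32])   # True
```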

Flemish Origin

ECB is first-generation block encryption: the most basic mode of operation for a block cipher such as the Advanced Encryption Standard (AES). Considering how minimalistic it is, it's astonishing Microsoft would still use it. A more advanced mode is Cipher Block Chaining (CBC), in which each ciphertext block depends on all the plaintext blocks processed up to that point. In other words, a kind of blockchain that makes decryption and codebook re-engineering much more cumbersome.
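
A companion sketch under the same assumptions as above illustrates the chaining: each plaintext block is XORed with the previous ciphertext block (starting from a random initialisation vector) before being encrypted, so the same repeated plaintext no longer yields repeated ciphertext.

```python
# Same assumed 'cryptography' package: in CBC mode, repeated plaintext
# blocks no longer produce repeated ciphertext blocks.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)
iv = os.urandom(16)                        # random initialisation vector
plaintext = b"ATTACK AT DAWN!!" * 2        # the same 16-byte block twice

encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
ciphertext = encryptor.update(plaintext) + encryptor.finalize()

# Unlike ECB, the two ciphertext blocks now differ (with overwhelming probability):
print(ciphertext[:16] == ciphertext[16:32])   # False
```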

AES stems from the work of two Flemish cryptographers, Joan Daemen and Vincent Rijmen, who, under the name 'Rijndael', developed the specification for this standard in 1999, basing their method on a family of ciphers with different key and block sizes. For the AES standard, NIST selected three members of this Rijndael family, each with a block size of 128 bits but with different key lengths: 128, 192 and 256 bits.

The Future of AES

AES was released by NIST in 2001 and incorporated into the ISO/IEC 18033-3 standard. The US government adopted this encryption standard in 2002, and it remains the first and only publicly accessible cipher approved by the National Security Agency (NSA) for top-secret information when used in NSA-approved cryptographic modules. The clever thing about AES is that it is built as a substitution-permutation network: several substitution and permutation steps ultimately yield an encrypted block, making it inherently less susceptible to the kind of cryptanalysis described above than the more straightforward Electronic Codebook mode.
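
As a purely illustrative toy, not real AES, the sketch below runs a few substitution-permutation rounds over a 16-byte block: substitute every byte through an S-box, permute byte positions, mix in round-key material, and repeat. The S-box and permutation here are invented for the example; AES's actual round uses SubBytes, ShiftRows, MixColumns and AddRoundKey over a 4x4 byte state.

```python
# A toy substitution-permutation cipher (NOT real AES) to illustrate the
# round structure: substitution, permutation, key mixing, repeated per round.
import os
import random

# An invented, fixed S-box for illustration; AES uses a carefully designed one.
SBOX = list(range(256))
random.Random(42).shuffle(SBOX)

def toy_round(state: bytes, round_key: bytes) -> bytes:
    substituted = bytes(SBOX[b] for b in state)        # substitution step
    permuted = substituted[1:] + substituted[:1]       # permutation step (byte rotation)
    return bytes(p ^ k for p, k in zip(permuted, round_key))  # key mixing

state = b"ATTACK AT DAWN!!"                            # one 16-byte block
for _ in range(10):                                    # 10 rounds, as in AES-128
    state = toy_round(state, os.urandom(16))
print(state.hex())
```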

The flexibility in key lengths contributes to that security: 10 rounds for 128-bit keys, 12 rounds for 192-bit keys and 14 rounds for 256-bit keys. After all, unknown variation requires extra computing power to decipher. In a previous blog, 'Quantum safe,' I reported that NIST had chosen four new encryption algorithms this year in anticipation of the arrival of enormously powerful quantum computers. NIST's new quantum-resistant cryptographic standard is expected to be finalised within about two years: an important milestone in the world of encryption for the continued protection of the growing amount of data and information being generated. Fortunately, over the past few years an increasing number of quantum-resistant algorithms have been developed that are difficult or impossible for quantum computers to break and that can safeguard our information security in the post-quantum period.
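
Those round counts are not arbitrary: FIPS-197 derives them from the key length (the number of rounds is the key length in 32-bit words plus six), as this small sketch shows.

```python
# The AES round counts quoted above follow the FIPS-197 rule:
# rounds = (key length in 32-bit words) + 6.
for key_bits in (128, 192, 256):
    rounds = key_bits // 32 + 6
    print(f"AES-{key_bits}: {rounds} rounds")   # 10, 12, 14
```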

Quantum security for data storage.

The above algorithms focus on general encryption, designed to protect information exchanged over public networks, and on digital signatures used for identity verification. Separate protection is needed for data at rest, stored on a storage platform: this is what we mean by quantum secure storage. Of course, an encrypted message remains difficult to decrypt even if it is stolen. But the performance of quantum computers will grow rapidly in the future, so it is wise also to make 'the search' for those stored files more difficult.

The principle of quantum secure storage is to cut the data into pieces of varying size, encrypt each piece differently and then distribute the pieces across different places in the network. First, this makes the search extremely difficult because, even if someone finds such an encrypted block, it is only an unknown fragment of the entire message. In addition, by cutting up and distributing the data not via an algorithm but via relatively simple 'human logic', any computer, even one with super quantum properties, can do very little with it. And that is the charm of this new way of storage: how can you find the needle in the haystack if you don't know and understand what a needle is in the first place?
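
A minimal sketch of that principle, with all names hypothetical and again assuming the 'cryptography' package: cut the data at irregular, humanly chosen points, encrypt each fragment under its own key, and hand each ciphertext to a different storage location. A real system would additionally need a secure record of which fragment sits where and under which key.

```python
# Hypothetical sketch of quantum secure storage: cut the data into irregular
# pieces, encrypt each under its own key, and spread the ciphertexts over
# different storage nodes. All names are invented for illustration.
from cryptography.fernet import Fernet

def shard_and_encrypt(data: bytes, cut_points):
    """Split 'data' at the given offsets; encrypt each fragment separately."""
    fragments, start = [], 0
    for end in [*cut_points, len(data)]:
        key = Fernet.generate_key()                    # a fresh key per fragment
        fragments.append((key, Fernet(key).encrypt(data[start:end])))
        start = end
    return fragments

data = b"the quick brown fox jumps over the lazy dog"
# Irregular, humanly chosen cut points rather than a fixed algorithmic block size:
for node, (key, blob) in enumerate(shard_and_encrypt(data, [7, 19, 30])):
    print(f"fragment -> storage node {node}: {len(blob)} bytes")
```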

Grid Computing.

A distributed grid network is ideally suited to the distributed storage of the shredded file. Decentralized networks, seen as Web 3.0 infrastructures, are emerging to support new Web3-based initiatives. Data and identity protection in these increasingly pervasive virtual worlds is one of the major challenges of our time. In a previous blog, 'Protection of digital identity and privacy,' I discussed the European standards being developed to work securely with these new digital identities within Europe: a kind of new GDPR, not for privacy but for identity protection.
IDC studies show that by 2025, global data will grow to 175 ZB, a fivefold increase from the 33 ZB created in 2018. With the advent of outsourcing and later the cloud, data is being stored in increasingly centralized ways; by 2025, IDC expects 49% of this data to reside in public clouds. While these public providers have taken their security to an increasingly professional level, centralization at the same time creates a single point of failure. A single security breach, virus or hack threatens the entire database, a risk that can translate into millions, even billions, of stolen files.

Data Decentralization.

While the pendulum swings between decentralized and centralized, with the cloud – as the mainframe of the internet – greatly reinforcing the centralizing movement, we see that edge and grid computing create a decentralizing countermovement.

Due in part to the 'Internet of Things & People', greater processing power and local storage are needed at the edges and nodes of networks to store and process data locally: a balance between central Big Data and local Fast Data. I pointed this out years ago; see my earlier blogs on 'VM-world 2018' and 'the attractiveness of data'.
Data centralisation isn't just risky; it's also unfair. While highly effective as a solution, it often leads to an unequal distribution of assets and resources. The terms and conditions attached to these systems protect big data companies, not individuals or small businesses. Meanwhile, the users of these centralized data systems have no choice but to accept the terms or be excluded from participating. For that reason too, data decentralization is a sound proposition, provided the data can be stored and retained just as securely – and even more securely without a single point of failure.
Fortunately, with modern decentralized, quantum secure storage, this has come within reach. DigiThree Labs, part of the DigiCorp Labs family, is developing enterprise solutions as part of its secure, decentralized, metaverse-based services, and is aiming to develop solutions that use quantum secure storage.

Article by Hans Timmerman
