In an age of heightened security risks, securing data has become more important than ever. The ingenuity of cyber-criminals and the rising incidence of threats from insiders with access privileges have all but rendered obsolete the conventional approach of installing perimeter rings of defense, such as firewalls, around the data.
The new approach to security focuses on the data itself rather than on the device or the network: it aims to protect the data directly instead of building protective barriers around it. This takes the form of encryption, tokenization, encoding, hashing, storing data in multiple silos, and more.
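Of the techniques listed above, hashing is the simplest to illustrate. A minimal sketch using Python's standard `hashlib` shows the idea: a one-way digest can be stored and compared, but cannot be reversed to recover the original value (the salt name and function are illustrative, not from any particular product):

```python
# One-way hashing: the stored digest cannot be reversed to the original
# value, so a stolen table of hashes does not directly expose the data.
import hashlib

def hash_value(value: str, salt: str) -> str:
    """Salted SHA-256 digest; the same input and salt always yield the same digest."""
    return hashlib.sha256((salt + value).encode()).hexdigest()

# The same input hashes to the same digest, so values can be verified
# later by re-hashing and comparing -- without ever storing the original.
digest = hash_value("4111111111111111", salt="s3cr3t-salt")
```

In practice, purpose-built password-hashing functions (with per-record salts and deliberate slowness) are preferred over a bare SHA-256, but the one-way principle is the same.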
Data encryption is, in essence, the conversion of data into ciphertext. The key to decode the ciphertext is passed to the intended recipient separately, preferably over another channel. Without the key, the encrypted data remains gibberish, and worthless.
Ciphertext in its basic form entails substituting numbers for letters, rotating letters, or scrambling the content. Complex encryption in computing makes use of sophisticated algorithms that rearrange the data bits of digital signals, making it near-impossible for prying eyes to decode the data without the key. Several encryption schemes, such as 3DES and AES, are in vogue today, and the strength of each scheme varies.
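The "rotating letters" idea mentioned above can be sketched as a classic Caesar cipher. This toy example only illustrates the concept of ciphertext and a shared key; real systems use vetted algorithms such as AES:

```python
# Toy Caesar cipher: each letter is rotated by a fixed shift (the "key").
# Illustrative only -- modern encryption uses vetted algorithms like AES.

def caesar_encrypt(plaintext: str, shift: int) -> str:
    """Rotate each alphabetic character by `shift` positions."""
    out = []
    for ch in plaintext:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)  # leave spaces and punctuation untouched
    return ''.join(out)

def caesar_decrypt(ciphertext: str, shift: int) -> str:
    """Reverse the rotation; without the shift, the text stays scrambled."""
    return caesar_encrypt(ciphertext, -shift)
```

For example, `caesar_encrypt("Attack at dawn", 3)` yields `"Dwwdfn dw gdzq"`, which is meaningless to anyone who does not hold the shift value.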
The success of encryption, however, depends on using a robust scheme, managing the decoding keys effectively, rotating or changing the keys periodically, and, more importantly, making sure that the keys themselves are secure.
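The key-rotation practice described above can be sketched as a versioned key store: each ciphertext records which key version produced it, so older data stays decryptable after a rotation while new data uses the latest key. The class and method names here are illustrative, not from any particular key-management product:

```python
# Minimal sketch of versioned key management: rotating creates a fresh
# key without discarding older versions still needed for existing data.
import secrets

class KeyStore:
    def __init__(self):
        self._keys = {}           # key version -> raw key bytes
        self.current_version = 0
        self.rotate()             # create the first key on startup

    def rotate(self) -> int:
        """Generate a fresh 256-bit key and make it the active version."""
        self.current_version += 1
        self._keys[self.current_version] = secrets.token_bytes(32)
        return self.current_version

    def get(self, version: int) -> bytes:
        """Fetch a specific key version, e.g. to decrypt older records."""
        return self._keys[version]
```

In a real deployment the key store itself would sit behind strict access controls, since, as noted above, the scheme is only as strong as the secrecy of the keys.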
Tokenization substitutes tokens for the actual data, and has gained particular traction among merchants accepting electronic payments. Using tokenization, retailers convert credit card data into meaningless tokens for use in the merchant's systems. The actual credit card details and their token numbers are stored locally or in a hyper-secure off-site data vault. With this, even if attackers hack the merchant's system, they end up with only useless tokens, not the actual credit card numbers or personal information.
One big advantage of tokenization is that it requires no significant investment or changes to the existing IT architecture.
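The vault arrangement described above can be sketched in a few lines: random tokens stand in for card numbers throughout the merchant's systems, while the real values live only in the vault. This is a minimal sketch with illustrative names, not a payment-industry implementation:

```python
# Sketch of a tokenization vault: card numbers are swapped for random
# tokens; only the vault can map a token back to the real value.
import secrets

class TokenVault:
    def __init__(self):
        self._vault = {}  # token -> real card number

    def tokenize(self, card_number: str) -> str:
        """Return a random token that stands in for the card number."""
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = card_number
        return token

    def detokenize(self, token: str) -> str:
        """Recover the real value -- only the vault holds this mapping."""
        return self._vault[token]
```

An attacker who steals the merchant's order database gets only `tok_…` strings; without access to the vault, they are worthless.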
The newly emerging field of quantum cryptography is widely regarded as the ultimate in data protection. Where encryption and tokenization depend on mathematics, quantum cryptography depends on quantum physics, replacing mathematical algorithms with the creation and transmission of photons from an LED source.
Storing Data in Multiple Silos
Very often, cyber criminals breach network servers and make off with the data inside. One way to mitigate the loss is to store only incomplete information on each server. Spreading the data across multiple servers, so that the data on any one server is incoherent by itself, would foil the hacker's attempt to benefit from the attack: the hacker would have to compromise all the servers simultaneously to obtain the complete data.
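One simple way to realize this "incoherent on any one server" property is XOR secret-splitting, sketched below: each share is random noise on its own, and the original data reappears only when every share is combined. This is an illustrative sketch, not the article's specific product:

```python
# Sketch of splitting data across N "servers" with XOR shares: any one
# share alone is indistinguishable from random noise; combining all the
# shares with XOR reconstructs the original bytes.
import secrets

def split(data: bytes, n_shares: int) -> list[bytes]:
    """Produce n_shares byte strings that XOR together to give `data`."""
    shares = [secrets.token_bytes(len(data)) for _ in range(n_shares - 1)]
    last = bytearray(data)
    for share in shares:
        for i, b in enumerate(share):
            last[i] ^= b          # fold each random share into the last one
    shares.append(bytes(last))
    return shares

def combine(shares: list[bytes]) -> bytes:
    """XOR all shares back together to recover the original data."""
    out = bytearray(len(shares[0]))
    for share in shares:
        for i, b in enumerate(share):
            out[i] ^= b
    return bytes(out)
```

Each share would be stored on a different server; a breach of any single server yields only random bytes.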
A variant of this concept is placing the corporate internal network that handles sensitive data on a separate network segment from routine operations, and firewalling these segments away from the database servers. This way, even if attackers manage to infiltrate a network, they remain hard-pressed to access any sensitive data.
Making Data Invisible
Technology is constantly evolving. With the passage of time, new and better systems emerge; some become widely accepted, and a few become the norm.
Among the latest developments is a cluster-based covert channel that makes data invisible even as it resides, unencrypted, on the hard disk. When data is invisible, hackers may miss it altogether. The software used for the purpose fragments the data and scatters the individual fragments around the hard drive; only the key brings the widely scattered pieces back together.
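The fragment-and-scatter idea can be sketched with a keyed permutation: the data is cut into fixed-size chunks and shuffled with a secret seed, and only the same seed restores the original order. This is a toy illustration of the concept, not the actual covert-channel software described above:

```python
# Sketch of fragment-and-scatter: data is cut into fixed-size chunks and
# shuffled with a keyed permutation. Without the key (the seed), the
# fragments cannot be put back in order.
import random

CHUNK = 4  # bytes per fragment (illustrative size)

def scatter(data: bytes, key: int) -> list[bytes]:
    """Split `data` into chunks and return them in a key-dependent order."""
    chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
    order = list(range(len(chunks)))
    random.Random(key).shuffle(order)      # deterministic, key-driven shuffle
    return [chunks[i] for i in order]

def gather(fragments: list[bytes], key: int) -> bytes:
    """Recompute the same permutation from the key and undo it."""
    order = list(range(len(fragments)))
    random.Random(key).shuffle(order)
    restored = [b""] * len(fragments)
    for position, fragment in zip(order, fragments):
        restored[position] = fragment
    return b"".join(restored)
```

A real system would scatter fragments across disk clusters rather than a Python list, but the principle is the same: the layout itself is the secret.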