In Part 5, we will talk about the different data formats that commonly exist and that need to be protected in use.
Files can be stored in many places and should be protected wherever they are; the protection should follow the data.
Files are often self-contained and can be processed by an application even when the computer is offline.
Many of the common off-the-shelf applications for consumers and enterprises work with files.
Backward compatibility with these applications should be maintained.
Data in databases is usually protected and managed by IT administrators, and a lot of online infrastructure is needed in order to interact with the data residing in databases. For example, many databases sit in the back end of larger web applications, which require the databases to be highly available.
These web applications are often custom-built in house, and protection should follow the data in the individual database cells wherever it goes.
SQL injection is a huge risk when sensitive data is stored in a database and accessed by vulnerable web applications. If protection follows the data, attackers will only get ciphertext as a result of dumping a database using SQL injection; without the key, attackers will not be able to decrypt the database cells.
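The idea can be sketched with a toy example. The XOR "cipher", table layout, and field names here are illustrative placeholders, not a real design; the point is simply that only ciphertext ever reaches the database cell, so a dump of the table reveals nothing useful.

```python
import sqlite3
import secrets

# Toy XOR "cipher" as a stand-in for real encryption -- illustrative only.
# A real deployment would use an authenticated cipher (e.g. AES-GCM) from
# a vetted cryptography library.
KEY = secrets.token_bytes(16)

def toy_encrypt(plaintext: str) -> bytes:
    return bytes(b ^ KEY[i % len(KEY)] for i, b in enumerate(plaintext.encode()))

def toy_decrypt(ciphertext: bytes) -> str:
    return bytes(b ^ KEY[i % len(KEY)] for i, b in enumerate(ciphertext)).decode()

# The application encrypts each sensitive value before it is inserted.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, ssn BLOB)")
conn.execute("INSERT INTO customers VALUES (?, ?)", (1, toy_encrypt("123-45-6789")))

# An attacker dumping the table (e.g. via SQL injection) sees only these bytes.
(dumped,) = conn.execute("SELECT ssn FROM customers").fetchone()
print(dumped)               # opaque ciphertext, not the SSN
print(toy_decrypt(dumped))  # only the authorized app, holding the key, recovers it
```

Because the key lives in the application, not the database, the injected query has nothing to decrypt with.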
Military and entertainment applications often require live streaming data to be shared.
Common ways of consuming streaming content include web browsers.
If protection follows the data stream,
it would help prevent stream rebroadcasting, such as when people pirate sensitive or copyrighted content and re-stream it to their friends on Facebook.
In the financial services and payment card industries, tokenization allows substituting sensitive data with a non-sensitive replacement, with no cryptographic correlation between the sensitive data and the token. The correlation is maintained by a tokenization service that maps the token back to the sensitive data.
But this approach only shifts the problem rather than solving it,
because tokens must be de-tokenized at some point. In addition, it makes the tokenization service a central and high-value target.
Many parts of the tokenization service must be online and require a lot of infrastructure as well as customized applications and hardware.
Since tokens must be de-tokenized at some point in the process, the de-tokenization step should be carefully protected.
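A minimal sketch of such a service follows; the class and method names are hypothetical, and the in-memory dictionary stands in for the service's protected vault. Tokens are generated randomly, so nothing about the sensitive value can be derived from the token itself, which also shows why the vault holding the mapping becomes the high-value target.

```python
import secrets

class TokenizationService:
    """Toy tokenization service: random tokens, no cryptographic link to the data."""

    def __init__(self):
        self._vault = {}  # token -> sensitive value; the mapping lives only here

    def tokenize(self, pan: str) -> str:
        # Random token: cannot be computed from the PAN, or vice versa.
        token = secrets.token_hex(8)
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # The sensitive step the transcript warns about: this lookup must be
        # tightly protected, since it reverses the substitution.
        return self._vault[token]

svc = TokenizationService()
token = svc.tokenize("4111111111111111")
print(token)                  # random hex, reveals nothing about the card number
print(svc.detokenize(token))  # the vault maps it back to the original PAN
```

A stolen token is useless on its own; an attacker needs access to the de-tokenization path, which is why that path, and the data it handles in use, must be defended.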
RAM-scraping malware, like that found at Target, can still attack the de-tokenization process if sensitive data in use is not protected.