Only a few people have the ability to make others stop in their tracks and really listen to what they have to say. Not just hearing the words they speak, but truly absorbing what they say, is probably the best possible use of one's time. Dan Geer is one of those people.

Marcus Ranum interviewed Geer on his "Rear Guard" Security Podcast, and a few of the points made stood out to me. One point Geer made is that somewhere in the past decade, it became far cheaper to keep data than to delete it selectively. A direct consequence of keeping more and more data is that it becomes nearly impossible to categorize it, and we increasingly rely on search to find that one bit of information we are looking for.

So, if a lack of selective deletion of data leads to the (partial)
disappearance of information classification, and if we rely on search
to find what we are looking for, then a skilled adversary has an
advantage that he can leverage through a disinformation strategy. In other words,
if we only see the things that we look for, a skilled adversary can
either influence those search results to make us see what he wants us
to see, or he can hide his tracks and we will never know about his
presence in the first place.

And that is scary.

Thoughts like this must set off alarm bells for security
professionals, especially in the context of service-oriented
computing, software as a service, cloud computing, and the like,
where our visibility into what happens to our data may be even more
limited than when we own and operate the entire information
technology stack.

The current information security paradigm teaches us that in order
to protect data, we need to categorize it first and then understand
the processes that interact with it. Yet if we cannot directly observe
(and, as a consequence, fully control) the processes that manipulate our
data, and instead only see the connection points between different
processes (in the form of services), we need a fundamental
paradigm shift.

Needless to say, we are not there yet.