Well, perhaps not entirely surprisingly, my recent blog on data classification resulted in a discussion with Boldon James, a QinetiQ company and a significant competitor to TITUS, the company mentioned in that blog. It is good that there is one: with anything as potentially disruptive to the business as most aspects of security, and as important to the business when done right, I'd hate to be choosing suppliers in a field of one.
Anyway, Boldon James has some interesting products, such as its whole range of Classifier tools, and a good provenance in military data classification, although (as Martin Sugden, Boldon James' CEO, with whom I was talking, agreed) commercial requirements aren't the same as military ones. For a start, civilians can and will grumble and complain if a tool stops them doing business (not so much a problem with classification itself as with linking it to data loss prevention, because of the 'false positives' associated with technology-centric implementations), whereas a soldier probably can't, although both groups can probably sabotage your security rules if they are sufficiently annoyed by them.
Our discussion was really about good practice for implementing data classification, especially if you want to link it to rule-based data loss prevention tools. A technology-centric solution (with some technician fresh from reading spy stories assigning classifications like 'eyes only') usually leads to tears if you use technology to enforce access controls, as no one really understands what the classifications mean and everything gradually moves up the classification scale 'just to be on the safe side'. After a while, automated security begins to impact people's ability to work effectively, and security then gets watered down in a fairly ad hoc way. What you should do is classify documents at a business level that makes sense to end users (which implies that you give them security awareness training and encourage them to take ownership of the classification) and then, when this is working, implement data loss prevention rules that make sense to the business. Installing a data loss prevention tool in a panic, after some incident has attracted the CEO's attention, and thinking about business impact and data classification after the fact, is not a recipe for success (and doubtless TITUS would agree with this too).
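To make the point concrete, here is a minimal sketch of the kind of rule that becomes possible once documents carry business-level labels. Everything in it is an illustrative assumption on my part (the label names, the ordering, the one rule), not how any vendor's product actually works; the value is that the rule is simple enough for the business to understand and sign off on.

```python
# Hypothetical sketch: business-level classification labels driving a
# simple, explainable data loss prevention rule. Label names, ordering,
# and the rule itself are illustrative assumptions, not any vendor's
# actual product behaviour.

# Ordered business-level labels: a higher index means more sensitive.
LABELS = ["Public", "Internal", "Confidential", "Restricted"]

def may_send_externally(label: str, recipient_domain: str,
                        internal_domain: str = "example.com") -> bool:
    """Allow a labelled document to leave the organisation only if its
    classification is 'Public', or the recipient is internal."""
    if recipient_domain == internal_domain:
        return True  # this rule does not block internal mail
    # External recipients may only receive 'Public' material.
    return LABELS.index(label) <= LABELS.index("Public")
```

Because the rule refers to labels that end users themselves assigned and understand, a blocked email is explicable ("this document is Restricted") rather than a mysterious false positive, which is exactly the property a technology-centric implementation tends to lack.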
Probably the most interesting thing I learned from talking to Martin was that Boldon James has a customer, Allianz Ireland plc, that not only set a baseline before implementing data classification but also measured its success, and is prepared to talk about this in public. Well, how else would one do it? In fact, such an approach is the mark of a truly mature company, and less common than you might think. Many companies still have a 'blame culture', in which it is much easier to claim an unqualified success (and keep your job) if you don't have a baseline to measure your success (or failure) against. There wouldn't be much point in a baseline anyway, because the reason for measuring success is to find out what works and what doesn't and to implement a continuous improvement plan, and that is impossible in a blame culture that punishes anyone who points out that there's room for improvement. As for talking about it in public, any form of 'security by obscurity' is a dead duck: it encourages complacency, and you should assume that criminals can find out your secrets anyway (it's what criminals do, sometimes using very unpleasant methods). If you do it right, you can talk about what you do (without posting your encryption keys and suchlike on the web, of course) without compromising security, and this may improve staff morale and encourage your staff to take security seriously. The most thankless task I ever had, in a previous life, was making people take security 'good practice' seriously when the bank I worked for said, as a matter of policy (not fact), that it had never had a successful security breach in its iron-clad perimeter.
I don't have the space to analyse the Allianz experience in full in this blog (you can read about it for yourself here), but it does make for interesting reading. The key paragraph for me is the one listing benefits (because I think that a successful data classification programme must be delivered as a business benefit, not as a technology cost of doing business): "Allianz Ireland has already seen huge improvements in their Gap Audit score against the Group Security Policy and Standards in relation to Data Classification. They have also found that there is much greater awareness of Data Classification and security throughout the organisation. Employees are more aware of the type of information they are dealing with and their obligations to protect this data and it is hoped that this will be instrumental in preventing data loss in the future". The second part of this is nicely people-focussed, but what isn't made clear (and should be) is that, as I understand it, the improvement metrics were measured against a genuine set of pre-implementation baseline metrics. This is the only way to do it if the goal is continuous improvement, which is the only practical approach to implementing any kind of governance effectively.
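What measuring against a baseline means in practice can be sketched in a few lines. To be clear, the control names and scores below are entirely invented for illustration; the case study does not publish Allianz Ireland's actual audit figures.

```python
# Illustrative sketch only: reporting gap-audit scores against a
# pre-implementation baseline so that improvement (or regression) per
# control area is visible. All names and numbers here are invented.

def improvement_report(baseline: dict, current: dict) -> dict:
    """Return the change in each audit score relative to the baseline."""
    return {control: current[control] - baseline[control]
            for control in baseline}

# Hypothetical scores out of 100, captured before and after the
# data classification programme.
baseline = {"data classification": 40, "security awareness": 35}
current = {"data classification": 75, "security awareness": 60}
report = improvement_report(baseline, current)
```

The point of the exercise is the delta, not the absolute number: without the baseline capture, the 'after' scores on their own tell you nothing about whether the programme worked, which is exactly the trap the blame-culture companies fall into.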