Cleanroom Techniques for Office Geeks

My portable Linux desktop for sysadmins lives on GitHub if that's all you came for.

Medical and health metaphors are ubiquitous in everyday IT security. Computers get infected, cured and healed. And to a degree, they work. Computer viruses do spread through networks in a way that resembles a biological virus. Anti-virus products do "disinfect" machines.

But this overlooks two very important properties of computer "infections": their flash-fire spread and the certainty of re-infection if the original security hole is left unfixed. "Viral" security threats are just sequences of zeros and ones which trip a flaw in a machine's programming and take over. In no way will a "stronger" computer beat an "infection"; these "infections" are instantaneous, and a "virus" will spread to every vulnerable machine as quickly as the wires will carry it. A healthy machine reintroduced to an infected network will always be re-infected if it has the targeted vulnerability. This is maths, not immunology. (In 2003, one internet-borne worm infected 90% of vulnerable machines, over 75,000 servers globally, within 10 minutes!)
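
To see why it really is maths, here is a back-of-the-envelope sketch (in Python, my own illustration, not the real worm's behaviour) of a random-scanning worm. The vulnerable population matches the 2003 incident mentioned above, but the scan rate is a rough assumption, and expected values stand in for real randomness:

```python
# Back-of-the-envelope model of a random-scanning worm. The population size
# matches the 2003 incident described above; the scan rate is a rough
# assumption, and expected values stand in for real randomness.
ADDRESS_SPACE = 2 ** 32        # IPv4 addresses being probed at random
VULNERABLE = 75_000            # unpatched, reachable hosts
SCANS_PER_SECOND = 4_000       # probes sent by each infected host, per second

infected = 1.0                 # patient zero
seconds = 0
while infected < 0.9 * VULNERABLE:
    # Chance that a single probe lands on a vulnerable, not-yet-infected host.
    p_hit = (VULNERABLE - infected) / ADDRESS_SPACE
    infected = min(VULNERABLE, infected + infected * SCANS_PER_SECOND * p_hit)
    seconds += 1

print(f"90% of vulnerable hosts infected after ~{seconds} s (~{seconds / 60:.1f} minutes)")
```

Even with these modest numbers the model crosses 90% coverage in a few minutes, which is why "stronger immune systems" simply don't come into it.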

No-one fights viral infections in whizzy Hollywood-rendered cyberspace battles; we unplug everything, carefully erase the security compromise from each machine (often by wiping and re-installing everything), patch them so that they can no longer be infected (at least not in the same way) and turn them back on, one by one. Recovering a compromised network to a trustworthy state is slow and expensive. Sysadmins don't heroically stop infections as they happen; sometimes the whole network is gone before anyone even notices. Unfortunately, anti-virus solutions perform abysmally against newly released viruses. (And they always will; recognising viruses is a hard problem.)

So I would propose a second metaphor, the industrial cleanroom: a contamination-free sealed area with controlled access [air locks, cleaning areas, air showers, etc] to protect it. Should someone walk in with muddy shoes and throw open a window, the entire cleanroom has to be shut down and decontaminated. Cleanrooms are often nested within one another, with an outer cleanroom enclosing an even cleaner room for more critical work.

I think this is a useful way to think of system integrity. Running a malware detector and installing anti-virus on a virus-ridden home PC will not necessarily make it a trustworthy environment again. You need to start with a clean installation and move data back into it.

(Both the "virus" and the "cleanroom" metaphors are flawed in not highlighting the full unpleasantness of having a computer body-snatched by something from the interwebs. Read 'Meet the men who spy on women through their webcams' if you want to unsettle yourself!)

This poses the question of what we might consider a "room" to be. The design of modern mainstream operating systems [e.g. POSIX] more or less answers this for us. Mainstream operating systems date from when a dozen academics shared a single minicomputer and wrote their own programs. The operating system protected itself from the users and the users from each other. Unfortunately, on a modern desktop we have one user, and everything of value is within that single user's account. (Windows has made some progress in beginning to fix this.) Any piece of software loaded onto your machine can take control of everything you value (see, for example, the confused deputy problem). So barring a wholesale shift from Windows (and Unix and MacOS X and others), our basic potential cleanroom is an entire operating system installation. Chunky, but still workable.
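
To make the single-user-account problem concrete, here is a trivial sketch (mine, not from any real malware) of what any program you choose to run can already do with your authority:

```python
import os
from pathlib import Path

# A tiny "application" running under your account: it needs no exploit to
# enumerate everything you own, because it simply inherits your full authority.
home = Path.home()
total = sum(len(files) for _root, _dirs, files in os.walk(home))
print(f"A throwaway script running as you can see {total} files under {home}")
```

There is no vulnerability here; the process just has everything your account can touch, which is why the whole operating system installation has to be the unit of cleanliness.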

Industrial units that use cleanrooms typically define levels of cleanroom, with the most sensitive processes restricted to the cleanest environments. They also lay down rules for the transfer of materials and half-finished products between different levels of cleanroom.

So if we were to apply this thinking to the typical IT set-up of a bank or stockbroker, we might end up with something like this.

| Level | Name | Typical Functions | Linkages |
|-------|------|-------------------|----------|
| 0 | Control | Direct access to systems. Releasing new software. Overriding normal controls. | Manual input and vetted files only. |
| 1 | Secret | Handling confidential client information. Key business control functions. | Data links to business partners; audited manual data entry. Publishes sanitised aggregate and de-personalised data to Business. |
| 2 | Business | Normal office functions for private but non-confidential client information. Office administration. HR tasks, etc. | Accepts data published from the Secret layer. Emails and more to various ad-hoc 3rd parties. |
| 3 | Playpen | Removing workers' motivation to subvert other computers' security. Quake 3 games. Facebook. Personal emails. Tasks that require unapproved software. | Completely segregated. Files can be introduced from normal systems with permission from a sysadmin. |
The playpen is the most important of these levels. In practice, it is impossible to stop someone in possession of a computer from removing restrictions on it. (I have heard of individuals imaging secure laptops to VMs and running them on other machines!) Playpen computers take that motivation (and the excuses) away from users. Reliable separation requires a place for the dirty things!

This might sound impractical, since each layer involves buying extra (or dual-booting) computers, but I would argue it is still a good idea. Reliable IT is very economical: hardware is cheap; maintenance is expensive. (The netbook I am writing on cost £90!) Few system administrators (at least the good ones, working in sensibly run firms) spend a significant part of their day logged into critical control functions. Most call centre staff will spend their working day dealing with confidential client information without needing Excel or Word. Line managers will only spend a modest amount of time looking directly at client information (and should never be emailing it around in bulk). (And, of course, in no office I have worked in would more than a handful of people want to browse Facebook or play Angry Birds on company time :-) )

Physical computers are cheap, and standard solutions exist to share monitors & keyboards between them. It isn't worth risking security incidents simply because one executive needs to install their pet poker game, or a junior sales admin their bouncing-sheep mouse pointer (yes, this is something I have seen, long ago...).

Our time as IT guys is too valuable to spend disentangling dancing pigs from critical systems after things go wrong. (And obviously the positive experience of our users is too valuable to force them to choose ;-) ) Separating them at boot-up is a crude solution, but all the better options are still in academia.

Returning to our hypothetical broker-bank office: having defined some hygiene rules, here are some suggestions for a real set-up.

Brightly coloured playpen laptops from Alienware sit in the office kitchen and staff lounge. They connect to the firm's open-access wireless, which extends over the whole street (cheap advertising & plausible deniability if employees pirate music), and the USB ports can be temporarily enabled by IT. Every machine has Tomb Raider, The Sims and Super Fun Photobooth Party. Stickers inform employees that they have an expectation of privacy for their personal e-mail, etc. Resetting a playpen laptop cleans everything back to the original installation. (Deep Freeze, amongst other products, lets you do this.)

Business users have Windows 7 desktop computers with a decent selection of applications [Office, Outlook, Photoshop, Crystal Reports, etc, etc] and no ability to install anything else. The internet is strictly filtered (through something like ZScaler): no Facebook, no file lockers, no instant messenger, etc. Windows is configured to the US government secure standard (available free online!) with the usual anti-virus and security tools. Anyone's line manager can read their work email; confidential information is generally only referred to, and the only link to Secret is the sanitised feed.
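
For a flavour of what "strictly filtered" means, here is a deliberately crude sketch of a domain blocklist check (my own illustration; real filtering proxies such as ZScaler work from large, centrally maintained category databases, and the domains below are only examples):

```python
# Crude illustration of the decision a filtering proxy makes for each request.
BLOCKED_SUFFIXES = (".facebook.com", ".dropbox.com", ".mega.nz")

def allowed(host: str) -> bool:
    """Return True if the host is not covered by the blocklist."""
    host = "." + host.lower().rstrip(".")
    return not any(host.endswith(suffix) for suffix in BLOCKED_SUFFIXES)

print(allowed("intranet.example-bank.com"))  # True
print(allowed("www.facebook.com"))           # False: social networking
print(allowed("files.dropbox.com"))          # False: file locker
```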

For the Secret stuff, in our case the usual call-centre customer management software, one machine in a locked room runs it for the whole office. Every user has a dumb terminal box under their desk that they can switch their monitor and keyboard over to. The "Secret Server" is not directly attached to the internet or the normal office network; normal users can only copy data from it by re-typing.

The Control laptops are thin, dull laptops with restricted Linux desktops and two-factor security tokens. Only a few sysadmins have them and most staff wouldn't recognise them. They mostly sit in drawers and get pulled out for releasing new software or dissecting servers. (Day-to-day functions like adding normal users and checking logs can be done from a Business PC.) Since Control users are IT-savvy and the software they run is so limited, they have almost no restrictions on their internet use or how they can connect to things.

Obviously anyone trying to impose this on a real company is going to be very brave! As far as I know, only the CIA and GCHQ issue multiple computers as a security measure. Big firms are known to pull all their customers' addresses into Microsoft Word to send routine form letters. But hardware costs and the danger of malware mean people are already moving in this direction. Most of us aren't worrying about hackers from certain countries, but dancing-pig applets waste enough time and effort on their own to justify changes.

And with the industrialisation of online wire fraud, every company should at least be isolating their online banking.

And if they start by scattering some pink gaming laptops around for people to play Tomb Raider on, maybe those changes could be tolerated by "the business" :-)

[Image: a pink laptop]

At the other end of the spectrum from pink laptops, I'd like to present my practical contribution: Cleanroom, a minimal Linux desktop for system administrators. Just build it to an SD card and you have a safe, portable admin environment. It boots on almost any modern laptop and is light enough for netbooks (easy to carry on pager duty).
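
If you're wondering what "build it to an SD card" looks like, it's nothing more exotic than streaming the built image onto the card. A minimal sketch (the image name and device path are placeholders; check the repository's own instructions, and be very sure of the device name, because it gets overwritten):

```python
import shutil

IMAGE = "cleanroom.img"   # placeholder: whatever the build produces
DEVICE = "/dev/sdX"       # placeholder: your SD card, e.g. /dev/sdb (will be overwritten!)

# Equivalent of `dd if=cleanroom.img of=/dev/sdX bs=4M`: copy the image onto
# the raw device in 4 MiB chunks. Needs root.
with open(IMAGE, "rb") as src, open(DEVICE, "wb") as dst:
    shutil.copyfileobj(src, dst, length=4 * 1024 * 1024)
```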

If you want to try it, go over to my GitHub here.