So that happened.

I’ve, of course, considered such services for a long time. My first serious identity theft episode (besides credit cards) was about 15 years ago, when I was informed by my mortgage loan officer that I would not be getting that top-tier rate we had previously discussed.

There were items sent to collections I had never heard of. Addresses reported where I had not lived. There was an unscrupulous collections agency that took my report of fraud, attached to its record the full correct contact info it required me to give, and submitted the item again to the credit bureaus as valid.

Among other things, the thieves signed up for local telephone service. But the phone company had No Earthly Idea where they might be located and apologized that they would be unable to help me on that issue, Thank You And Have A Nice Day. A police department in a state I never lived in refused to accept a report except in person. I couldn’t get anyone to tell me if the driver’s license number on one of the credit applications meant someone applied for one in my name. My own state and local authorities wanted nothing to do with it, because the visible crime happened elsewhere. “You could try calling the FBI, but they are only interested in cases over a million dollars.”

At one point, when I was having a rather convoluted “discussion” with one of the credit bureaus, I offered to come to their office with paper copies of documents supporting my request to remove the fraudulent items. The main corporate office was ten minutes’ walk from my workplace. They offered to call the police if I explored that possibility.

This took several years to fully clean up, continuing even after I moved to California. I still have to assume that my personal information is sitting out there, waiting for someone else to abuse it. For all practical purposes, I have a lifetime subscription to credit reports on demand.

So let’s just say I’ve gotten pretty good at this. It’s a giant pain in the ass, but not enough to pay someone a monthly fee for the rest of my life (and probably after.) Particularly when the services available consisted of little more than automated credit report checking. Once in a while something happens, I spend a few weeks arguing about it with various companies, and then it goes away. (Until next time.)

So what changed?

Well, you might have noticed I know a thing or two about computers. Keeping them safe and secure, to the best of my abilities and time available. You would not be surprised to learn that I like backups. Backups! Backups as far as the eye can see! Backups that run hourly. Backups that are swapped out whenever something has the slightest suggestion of a hardware blip. Backups that live in my travel bag. Backups that live at my mother’s house. And backups that live in my car.

My usual “offsite backup” stays in the car glovebox. Every so often (I aim for at least monthly) I take it inside and refresh it. We do have a storage unit where I could keep it, but it’s far less convenient. That means it would be updated less often, and monthly is already not that great.

My laptop backup is encrypted, as are all of my USB hard drives if possible. My server backup is one of those that is not, because the OS version is too old. So my glovebox backup is one USB drive with two volumes, one encrypted and one not.

The unencrypted server backup always concerns me a bit. If someone knowledgeable got it, it has all the information necessary to royally screw with my server. That’s a problem. But eventually that server will be going away, replaced with something better. And it’s a basic machine that runs a few websites and processes my outbound email. (I haven’t hosted my own inbox in years.) Yeah, having some archived files of ancient email released would not be fun. But that’s the extent of anything that would impact my actual personal life.

I’d rather not have my backup drive stolen out of the car, sure. It would be annoying, both for the car and having to lock down my server. But it wouldn’t be the end of the world.

So that’s not it, what else? (I’m guessing, at this point, you have some idea that there will be a car chapter to this story.)

A few weeks ago, my spouse decided that this offsite backup thing wasn’t such a bad idea. The thought of having to use it, because the house burned down or all our stuff was stolen, is not pretty. But it’s better to have something in that situation than have nothing. And it’s not that difficult to remember to update and put back once in a while. So he did.

Given that he’s the inspiration for the “tinfoil hat” userpic I have on one of my social media accounts, I presumed it was encrypted. He has many years’ experience in professional system administration and is far, far more paranoid than I am. Nothing with a name or address is discarded intact. He insists the shredding goes to a facility where he can watch it being shredded. When I moved to California, he would not use the cheap 900 MHz cordless phone I brought with me because it was insecure. He doesn’t like my passwords because sometimes I have to choose ones that are capable of being manually typed within two or three tries.

Guess what. Oops.

A few days ago, someone broke into our car and ransacked the glovebox. The only things taken were a small bag of charging cables and two hard drives, mainly because there was nearly nothing else to be had. (This is, by far, not my first rodeo.) Car documents, paper napkins, and some random receipts were scattered about.

One of those hard drives is my spouse’s unencrypted laptop backup.

First I dealt with the immediate problem of filing a police report, which took about 20 minutes on the phone. The process is at least highly efficient, if almost certainly useless for getting our stuff back or even identifying a suspect. But to be able to discuss this with my insurance company, it needed to be done.

Then came the discussion on what, exactly, was on that hard drive: it’s a copy of his user directory. So it didn’t contain system passwords, but that was about the only good thing that could be said. He uses a password manager for many things, but it’s not possible to protect everything that way. Years of email, confidential documents, client project details, credit card statements, tax returns, the medical documents I needed him to scan for me while I was out of town. All there. I handle most of the household finances, so a great many more items are instead on my machine. But sometimes you have to share, and things get passed around.

It’s almost certain that the thief didn’t care about the data. But wherever those drives get dumped, or whoever they are sold to, somebody very easily could. Names, addresses past and present, names and addresses of family members, birth dates, social security numbers, financial account numbers, everything necessary to utterly ruin our financial lives.

I’ll have more to say in other posts: which service I chose, what happened with the car, and how this story develops. But that explains why, after many years of not being impressed with paid monitoring services, I have now forked over my money for one.

I did say that this blog would avoid getting into political issues and stick to practical concerns. But the events of the past week with Apple and the FBI are pretty disturbing and I want to talk about why.

First, nothing about the technical matters involved in the conversation (with one exception) is anything that you or I or any other private individual can do anything about. It’s all taking place in the rarefied air of law enforcement vs. public policy, among those who believe they know what is good for us and have the power to change how others are allowed to access our personal data. We can lobby our elected officials and hope somebody can get past the fear mongering enough to listen.

Next, there is one technical thing you can do to protect your personal device: choose a strong passcode. I’m going to assume you already use a passcode, but the default four- or six-digit number isn’t going to stand up to a brute force attempt to break it. Make it longer. Make it numbers and letters if you can stand to (it’s a real pain to enter, I know.) Do it not because you “have something to hide” but because you will always have something you don’t want shared, and it’s not possible to know in advance what, when, or how that might come about. Make protecting your personal data a normal activity. The longer and more complicated your passcode, the more effort it will take to guess. As long as we continue to use passcodes this will be true, and the goalpost is always moving.
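To see why length and character variety matter so much, here’s a rough back-of-the-envelope calculation. The guess rate is an illustrative assumption (real devices throttle attempts in hardware, which only makes things slower for the attacker):

```python
# Rough brute-force time estimates for different passcode styles.
# The guesses-per-second rate is an illustrative assumption, not a
# measured figure for any real device.

GUESSES_PER_SECOND = 12.5  # assumed rate (~80 ms per attempt)

def worst_case_days(alphabet_size: int, length: int) -> float:
    """Days to exhaust the full keyspace at the assumed rate."""
    keyspace = alphabet_size ** length
    return keyspace / GUESSES_PER_SECOND / 86_400

# 4-digit PIN: 10^4 combinations -- minutes, not days
print(f"4-digit PIN:         {worst_case_days(10, 4):12.4f} days")
# 6-digit PIN: 10^6 combinations -- about a day
print(f"6-digit PIN:         {worst_case_days(10, 6):12.4f} days")
# 8 chars of lowercase letters and digits: 36^8 -- measured in millennia
print(f"8-char alphanumeric: {worst_case_days(36, 8):12.1f} days")
```

Each extra character multiplies the attacker’s work by the whole alphabet size, which is why a modestly longer alphanumeric passcode buys so much more than another digit.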

Now, on with the real subject of this post. Get comfortable, this will take a while.

Folks who have followed this issue know that Apple (along with other companies) has routinely responded to search warrants and other official requests for customer data. From a practical standpoint, they have to. But they have also been redesigning parts of their systems to make it less feasible. (It’s important to note that recovering data from a locked device is not the same as unlocking it.) Not only is it now more difficult for people outside Apple to access private data on iOS devices, it’s also more difficult for Apple itself to do.

Discussion of the two current court cases, with detail on what is possible to recover from a locked device for various iOS versions
No, Apple Has Not Unlocked 70 iPhones For Law Enforcement

Court order requiring Apple to comply

The reason for this has many parts, and one very important part is of course to make their product more attractive to customers. Apple is in the business of selling equipment; that’s what they do. When it came out that we tinfoil hats hadn’t just been making up the stuff we suspected about NSA snooping (the reality far exceeded our speculations), US companies suddenly had a huge problem: international customers. Foreign organizations, businesses and governments alike, were none too keen to have confirmed in excruciating detail the extent to which the US government was spying on everyone. If US companies want to continue to sell products, they have to be able to convince security-conscious customers that they aren’t just a lapdog for the NSA.

When somebody says “Apple is only doing this because of marketing” consider what that means. People don’t buy your product without “marketing.” Unless you have somehow managed a sweet exclusive deal that can never be taken away, your company depends on marketing for its continued existence. And your marketing and the products it promotes have to appeal to customers. All over the world, more and more people are saying “You know, I don’t much like the idea that the US Government could walk in and see my stuff.”

Strong cryptography is not just for protecting business interests. Livelihoods, and sometimes lives, also depend on the ability to keep private things private. For years people have claimed that products built in China are untrustworthy because the Chinese government can force their makers to provide a way in to protected data. It’s better to buy from trusted companies in enlightened countries where that won’t happen. Who is left on that list?

And what about terrorism? Of course, the things terrorists have done are awful. Nobody is contesting that. But opening up everyone to risk so governments have the ability to sneak up and overhear a potential terror plot doesn’t change how threats are discovered. The intelligence agencies already have more data than they are able to handle, it’s the process that’s broken and not that they suddenly have nothing to look at. There have been multiple cases where pronouncements of “This would have never happened without encryption” have been quickly followed by the discovery that perpetrators were using basic non-encrypted communications that were not intercepted or correctly analyzed. “Collect more data because we can” is not a rational proposal to improve the intelligence process, even if the abuse of privacy could be constitutionally justified.

There is no such thing as a magic key that only authorized users are permitted to use and all others will be kept out forever. If there’s a way in, someone will find it. Nothing is perfect, a defect will eventually be found, maybe even those authorized users will slip up and open the door. Also, state actors are hardly trustworthy when they say these powers will only be used to fight the most egregious terror threats and everybody else will be left alone. Even if they could prevent backdoors from being used without authorization, their own histories belie their claims.

The dangers of having “secret” entry enforced only by policy to not give out the key
TSA Doesn’t Care That Its Luggage Locks Have Been Hacked

Intelligence agencies claim encryption is the reason they can’t identify terror plots, when the far larger problem is that mass surveillance generates vast quantities of data they don’t have the ability to use effectively
5 Myths Regarding the Paris Terror Attacks

Officials investigating the San Bernardino attack report the terrorists used encrypted communication, but the Senate briefing said they didn’t
Clueless Press Being Played To Suggest Encryption Played A Role In San Bernardino Attacks

What the expanded “Sneak-and-Peek” secret investigatory powers of the Patriot Act, claimed to be necessary because of terrorism, are actually being used for
Surprise! Controversial Patriot Act power now overwhelmingly used in drug investigations

TSA ordered searches of cars valet parked at airports
TSA Is Making Airport Valets Search Your Trunk

What is being asked of Apple in this case?

Not to unlock the phone, because everyone agrees that’s not technically possible. Not to provide data recoverable from the locked device by Apple in their own labs, which they could do for previous iOS versions but not now. What the court order actually says is they must create a special version of the operating system that prevents data from being wiped after 10 incorrect passcodes, the means to rapidly try new passcodes in an automated fashion, and the ability to install this software on the target device (that will only accept OS updates via the Apple-authorized standard mechanism.)

What would happen if Apple did this?

The government says Apple would be shown to be a fine, upstanding corporate citizen, this one single solitary special case would be “solved,” and we all go on with our lives content to know that justice was served. Apple can even delete the software they created when they are done. The FBI claims the contents of this employer-owned phone are required to know if the terrorists were communicating with other terrorists in coordinated actions. No other evidence has suggested this happened, so it must be hidden on that particular phone (and not, for example, on the non-work phones that were destroyed or in any of the data on Apple’s servers that they did provide.)

How the law enforcement community is reacting to the prospect of the FBI winning this case
FBI Says Apple Court Order Is Narrow, But Other Law Enforcers Hungry to Exploit It

Apple would, first and foremost, be compelled to spend considerable effort on creating a tool to be used by the government. Not just “we’ll hack at it and see what we find” but a testable piece of software that can stand up to being verified at a level sufficient to survive court challenges of its accuracy and reliability. Because if the FBI did find evidence they wanted to use to accuse someone else, that party’s legal team will absolutely question how it was acquired. If that can’t be done, all this effort is wasted.

A discussion of the many, many requirements for building and maintaining a tool suitable for use as a source of evidence in criminal proceedings.
Apple, FBI, and the Burden of Forensic Methodology

Next, the probability of this software escaping the confines of Apple’s labs is high. The 3rd-party testing necessary to make the results admissible in court, at absolute minimum, gives anyone in physical possession of a test device access to reverse-engineer the contents. If the FBI has the target device, it too can give it to their security researchers to evaluate. Many people will need access to the software during the course of the investigation.

Finally, everyone in the world would know that Apple, who said they had no way to do this thing, now does. And now that it does, more people will want it. Other governments would love to have this capability and Apple, as a global company, will be pressured to give in. What would that pressure be? In the US, it’s standard for the government to threaten crushing fines or imprisonment of corporate officers for defying the courts. Governments can forbid Apple to sell products in their countries or assess punitive import taxes. Any of these can destroy a company.

Non-technical people often decry security folks as eggheads crying wolf over petty concerns when there are so many more important things to discuss. That’s fine, and our role as professionals includes the responsibility to educate and explain how what we do impacts others.

I encourage you to consider this issue for yourself and what it would mean if you were the person at the other end of that search warrant. Ubiquitous communication and data collection have fundamentally changed how much of the world lives and works, and there are plenty of governments far less principled than our own who want access to private data of mobile phone users. We should not be encouraging them by saying it’s just fine, no problem, go ahead, just hand over the (cryptographic) keys and everything will be ok.

Recently some folks with Tor, the open source project behind the global decentralized anonymizing network, released a beta version of a new chat client. It’s designed to be secure by default, but usable by normal people. This is something that has escaped many previous efforts, so it’s a welcome development.

It encrypts messages with OTR (so only you and the person you are chatting with can see them) and sends them via the Tor network (to hide where you are on the Internet.) These are very, very good things and I’m happy to see user-friendly applications building on the excellent work Tor has been doing.

The difficulty for me is how it fits into the way I use chat, specifically that it’s impossible to save chat transcripts. While that has a benefit for the purest high-impact security (what doesn’t exist can’t be compromised), it is exactly the opposite of how I use chat.

It seems that many people use instant messaging only for one-off communications. I treat it like email and constantly go back to reference something I’ve sent or information I received. This is a major reason I’m still using Apple’s Messages client, because it makes searching chats trivially easy.

But despite Messages allowing you to use a whole collection of different chat services, it doesn’t provide encryption for anything other than Apple’s own service. (Which I don’t use for reasons too long to go into right now.) I’ve tried other clients, but haven’t been thrilled. Even without getting into if or how they use encryption, I’ve found them clunky. And, most importantly, it’s hard to reference old messages. The best of them, Adium, has a custom viewer only usable from inside the app, and its archived chats use a tiny fixed-size font that can’t be changed. That makes it useless for me.

Between encryption by default and using the Tor network, I really really want to like Tor Messenger. I dug around and with some help from the Tor folks figured out how to re-enable chat logs, but the results were not usable for several reasons:

First, it creates files in JSON format, something designed to be easily readable by computers. While it’s true that JSON contains text, it isn’t in a human-readable format by any rational definition because it contains a bunch of required formatting and other control structures that get in the way of human understanding.
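To make that concrete, converting such a log back into something a human can read and search takes a small script. The field names here ("time", "who", "text") are hypothetical; Tor Messenger’s actual log schema may well differ:

```python
# Converting a machine-oriented JSON chat log into a readable
# transcript. The sample format is an assumption for illustration,
# not the real Tor Messenger log schema.
import json

sample_log = '''[
  {"time": "2016-03-01T09:14:02", "who": "alice", "text": "meeting at 3?"},
  {"time": "2016-03-01T09:14:40", "who": "bob", "text": "works for me"}
]'''

def to_transcript(raw: str) -> str:
    """Render a JSON message list as plain, searchable text."""
    lines = []
    for msg in json.loads(raw):
        lines.append(f'[{msg["time"]}] {msg["who"]}: {msg["text"]}')
    return "\n".join(lines)

print(to_transcript(sample_log))
```

Which is exactly the kind of extra step a normal person should never have to take just to reread a conversation.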

Next, that file is overwritten every time the program starts. Unless you have your own way to save the contents automatically (and this is a far more difficult problem than it sounds) you lose your history anyway.

Finally, it’s located deep inside the app’s install directory. This is not a problem for me, but would certainly be an issue for anyone not very familiar with technical aspects of OS X. And that also means it’s excluded from Spotlight, Apple’s disk searching tool.

I still have hope, because it’s early and also because it’s open source. When they are able to release the Mac build instructions, I can just go change whatever annoys me. (And if I’m going to choose an open source project to work on, I’m thinking I might prefer the more security-focused Tor over Adium. Sorry Adium friends.)

But for the moment, unless I’m willing to forge onward into the wilderness of creating my own custom version of something, I’m still stuck with the choice between secure and annoying, or insecure but fits into how I work.

Yesterday I decided to finally take a look at the iOS app E*Trade has been telling me all those wonderful things about. I’d been kinda skeptical about managing my brokerage account from my phone, but sometimes it’s nice to check on stuff. (Like if I actually transferred that money from savings to cover a check.)

Other reviewers can discuss the features (which seem a little clunky and definitely overly complex) but what I wanted to investigate is how the app secures data over the network.

The info about it from the App Store says it’s all wonderful and secure and stuff, because data is stored on the server and never on the device. That’s nice. And the website is all about how secure it is. Spiffy. How, exactly, is data protected as it goes from here to there? No Comment. Not even marketing copy about “Industry-Standard 938,842-bit Encryption.”

When I started up the app, the first thing I got was a giant agreement to read and accept. It was clearly written by lawyers, because there is an entire paragraph where they disclaim any and all liability for network data security. The user is responsible for ensuring the device’s connection to the Internet is reliable and secure, blah blah blah. (I tried to find a copy of this online, but haven’t yet.) As far as I can tell, they can send everything absolutely in the clear and according to the user agreement it would be just fine.

So I did what any self-respecting, security-aware user would do (no, not fire up Wireshark, or at least not yet): I called them up and asked.

The mobile trading support guy said “Of course everything is encrypted.” Ok, good. My comment about SSL was answered with “whatever that is.” Ok, he’s not a developer. I mentioned it would be nice if the description of the app actually said something about the encryption standards used, and he agreed.

What I got out of this exercise is that E*Trade almost certainly contracted out the development of their mobile apps (which is normal) and their customer-facing support staff doesn’t know much about the details of data protection for them (which is disconcerting.) I know enough iOS developers to believe the people who built the app were probably not so careless as to ignore data security, but there was a breakdown in communication between them and the online documentation. I hope my feedback actually gets to someone who knows what SSL is.

In the meantime, if I absolutely must do something while away from my computer, I’ll turn on the VPN connection and at least keep it from being sniffed over the air. And look for an app update with a full description of how the app protects my data in transit.

I’ve been using FileVault to encrypt the drive on my travel machine for a while now, but I’ve only recently enabled it on my everyday machine. I rarely take it anywhere, but since I bought a nice new tiny laptop that will change. (I’ll discuss the mechanics of enabling it another time.)

Mostly this hasn’t made any difference in how I use my laptop, but here and there I run into something. (If you don’t already have your Mac configured to require a password, FileVault will enable that. Many people do, and many corporate IT policies require it.)

I had to take the new machine in for repair this week, and as part of the routine intake process they ask for the login and password. Uh, no. That’s pretty normal for me, I usually wipe a machine before I hand it over but I didn’t have time. Now if the service issue were software, this obviously wouldn’t work. But so far I’ve not needed to take something in overnight for anything other than broken hardware.

I dutifully informed them of the encrypted state of the drive, and they will get back to me if there is a problem. But what exactly does that mean for them? FileVault 2 is full-disk encryption (unlike the original FileVault) so when you start up the machine you immediately get a login screen. If you don’t log in, it won’t even finish booting. After replacing my logic board, the repair tech will have to attach another bootable volume, use the Option key on startup to select where to boot from, and test it that way.

When I travel, I always shut down the machine before I pack it away. Not only does that mean it can’t accidentally wake up in transit (risking your hard drive if you have the old spinning kind, or your battery either way) but if anybody steals it there is no chance someone is getting into my hard drive. Now if they had my password-protected laptop and it were only sleeping, technically it would be “easier” to gain access. But by that I mean if a skilled and determined attacker were interested, there might be weaknesses in the OS or other things that could be compromised to allow unauthorized access. Might. If you are being tracked by a government agency and your laptop gets taken off in a black helicopter, perhaps you have some concern. The sketchy dude who lifted your MacBook Air from Starbucks? Unlikely.

Now one thing Sketchy Dude is likely to do is open it up to see if it works. If your laptop is able to connect to a wireless network and you have some kind of location tracking program enabled, then you might be able to find out where Sketchy Dude is. That wouldn’t happen if the machine were shut down. (It also wouldn’t happen if he wipes the drive before connecting it to a network, which thieves who know anything about computers will do.)

I haven’t enabled Find My Mac because if someone has taken off with my laptop, I’m not counting on getting it back. (It’s fully backed up, after all.) It also means it’s not constantly reporting its location, and there’s one less source of information about me to exist in somebody’s giant database. (I do use it on my phone, as that’s a different story.)

So enabling disk encryption hasn’t changed anything for me, but that might not be the same for someone else. If you really hate entering a password, you aren’t going to like FileVault.

Update:

Well, I did find one thing: Safe Boot doesn’t work with FileVault (see the link in the comments.) When I was having migration problems, the Apple tech recommended I restart with Safe Boot but I couldn’t. Unfortunately she also didn’t know that was on purpose. (Fortunately, for FV2 anyway, the migration issue didn’t seem to be related to encryption.)

Resources:
Complete guide to FileVault 2 in Lion

Rich Trouton’s blog posts about FileVault 2 (for hardcore IT folks)