When Cookies Fail

Mark Cornick, Former Viget

Since installing the Safari 3.1 update the other day, I've repeatedly run into a weird problem: the "Accept cookies" preference keeps being mysteriously reset to "Never." This isn't a tech support forum (and I'm sure "just use Firefox" would be a popular response if it were); rather, I'm here to talk about what happens when I try to use Safari after this bug crops up.

Just about every significant web application uses cookies. Despite the paranoia that accompanied their introduction over a decade ago, cookies are an essential part of the modern web application's diet.

One of the most common uses of a cookie is to persist an authentication state. For instance, when I log in to an application, it'll send a cookie with a key like "user_authentication" and some value. As long as this cookie gets sent back with each subsequent HTTP request, we know the user is authenticated. It's a common pattern, and one that works very well 99% of the time. That other 1%? That's what's happened to me since this Safari bug cropped up.

If cookies aren't enabled in a user's browser, authentication doesn't work. The application never gets the cookie it expects to receive, and so assumes the user isn't logged in. So what does your application do in this case? How does it recover?

A well-behaved application would detect that it's missing the desired cookie and present some sort of helpful error message. Gmail, for instance, redirects you to a page stating that Gmail requires cookies and that you should check the browser's preferences to make sure cookies are enabled. You don't get where you want to go, but at least you're not in the dark.

Then there are the not-so-well-behaved applications. Twitter, for instance, doesn't provide any useful feedback to indicate that it's missing its authentication cookie. It just kicks you back to the same login page, without the courtesy of telling you what's going on. (This is how I discovered something was going on with my Safari. I tried logging in, which failed. I tried again. Still no dice. It wasn't until I tried Gmail that I got the clue.)

So, fellow web developer... what does your application do in this case? Try authenticating with cookies turned off. Do you inform the user of the problem, or do you leave the user in the dark? I'm sorry to say that at least one of my applications left the user in the dark. (Consider my wrist slapped.)

It's very easy for web developers to take cookies for granted. We rely on the built-in session persistence in frameworks like Rails, which invariably depend, in some way, on cookies. We expect that users will have cookies enabled, and that expectation is probably valid in most cases. But when it's not, we need to keep the user's experience positive by helping them understand what's going on. Otherwise, you've got frustrated users who can't understand why they can't get to their previously favorite application, and it's all downhill from there.
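To make the well-behaved pattern concrete, here's a minimal sketch of one way to do the check in Rails, the framework mentioned above. It's an illustration under assumptions, not code from Gmail, Twitter, or any app of mine: the controller, the "cookie_test" cookie name, the require_cookies filter, and the cookies_required_path route are all hypothetical names made up for the example.

    class SessionsController < ApplicationController
      # Drop a throwaway test cookie when the login form is rendered...
      before_action :set_test_cookie, only: :new
      # ...and make sure it came back before processing the login.
      before_action :require_cookies, only: :create

      def new
        # Render the login form as usual.
      end

      def create
        # Normal authentication goes here. By this point we know the
        # browser is actually returning cookies, so a failed login means
        # bad credentials, not a cookie problem.
      end

      private

      def set_test_cookie
        cookies[:cookie_test] = "1"
      end

      def require_cookies
        return if cookies[:cookie_test].present?
        # The Gmail approach: say what's wrong instead of silently
        # bouncing the user back to the login form.
        redirect_to cookies_required_path,
          alert: "This application requires cookies. Please enable them and try again."
      end
    end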
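The same two-step idea works in any framework: a single request can never tell you whether the browser will return cookies, so you set one, wait for the next request, and only then decide whether to proceed or to explain the problem.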
