I had a thought and decided to write it down. Welcome to the rantings of someone who decided to write down his thoughts on mysticism, politics, anthropology, science, and art.

Saturday, February 19, 2011

Security = Coercion

Many of us are familiar with reCAPTCHAs, although most of us don't know what they are called. They are those warped words we type in to create accounts or make payments online. They exist to prevent computer bots from creating accounts or stealing people's identities.

ReCAPTCHA was a project started by Luis von Ahn at Carnegie Mellon to digitize, well, everything. Most of the text we see in reCAPTCHAs is taken from old manuscripts (which is why some words look like real words but seem rather odd: they could be Middle English or a proper noun).

The human mind easily recognizes the kinds of patterns it is naturally wired for. In fact, this natural software is so good that we tend to project faces or animals onto ambiguous patterns and shapes, such as clouds or pieces of toast. Computers, on the other hand, can only do this if they have been told how. Still, computational recognition is fairly good: with the app Google Goggles you can snap a picture of Loose Lucy's logo, a fairly warped logo, and it will tell you exactly what it says and where the closest store is. So reCAPTCHAs randomly warp the text or add odd features, such as a semi-transparent shape over the text or an irregular line through it. Scanned old print and archaic fonts are generally enough to keep a typical bot from recognizing the text, and a little extra warping stops even the most advanced bots (even Google Goggles... try it).
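
Just to make the idea concrete, here is a rough sketch of what that kind of warping might look like in code. This is not how reCAPTCHA actually does it; it is only an illustration using the Pillow imaging library, with the word, sizes, and distortion amounts invented for the example.

    import math
    from PIL import Image, ImageDraw, ImageFont

    def warp_word(word, width=240, height=80):
        # Render the word in black on a white background.
        img = Image.new("L", (width, height), 255)
        draw = ImageDraw.Draw(img)
        draw.text((20, 30), word, fill=0, font=ImageFont.load_default())

        # Shift each one-pixel column up or down along a sine wave,
        # the kind of gentle warping that throws off simple OCR.
        warped = Image.new("L", (width, height), 255)
        for x in range(width):
            offset = int(10 * math.sin(2 * math.pi * x / 60))
            warped.paste(img.crop((x, 0, x + 1, height)), (x, offset))

        # Add an irregular line through the text for good measure.
        ImageDraw.Draw(warped).line(
            [(0, 50), (width // 2, 30), (width, 55)], fill=0, width=2)
        return warped

    if __name__ == "__main__":
        warp_word("oftenest").save("warped.png")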

Since humans can decipher these warped texts with fairly good accuracy while computers cannot, they make a reliable security measure, so that a bot cannot hack my savings account and transfer the funds to an offshore account. Many of us remember that these originally contained only one word; in recent years they have contained two. Most of us don't really care that a second word was added. I, on the other hand, am very over-analytical (acquaintances of mine tend to give me grief for this), and I was curious why a second word was added.

Luis von Ahn is also responsible for another idea that relies on human recognition rather than computational recognition (once again, something computers aren't that good at): the ESP Game. Essentially, in a search engine we can type a word such as "rabbit" and get images of rabbits. But a computer cannot actually find pictures of rabbits from the images alone, so people were needed to look at images and label them. When multiple people use the same descriptive words for an image, those words are likely an accurate description, and they become metadata for search engines (this is why if you type in "rabbit hole" you might get a few images of "butt holes"). Thing is, people don't like doing this work for free, at least not willingly. Von Ahn originally packaged it as a game, hence the ESP Game, but it wasn't very fun.
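
The core of that idea, agreement between strangers, is simple enough to sketch. The snippet below is my own toy version, not the actual ESP Game code: it just keeps the labels that at least two independent players typed for the same image.

    from collections import Counter

    def agreed_labels(player_labels, min_agreement=2):
        # Count each word at most once per player, then keep the words
        # that enough independent players agreed on.
        counts = Counter(
            word.lower()
            for labels in player_labels
            for word in set(labels)
        )
        return [word for word, n in counts.items() if n >= min_agreement]

    # Made-up labels from three players looking at the same picture.
    players = [
        ["rabbit", "grass", "ears"],
        ["bunny", "rabbit", "grass"],
        ["rabbit", "field"],
    ]
    print(agreed_labels(players))  # ['rabbit', 'grass']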

So how did Carnegie Mellon, Google, and other digitization projects get people to willingly provide this sort of data for free? By adding a second word to reCAPTCHA texts in online transactions and account creations. The first word is used to make sure you are a human and not a bot, while the second word is added into reCAPTCHA's and Google's databases for image metadata. (I'm noticing right now that I can put labels on my own blog for Google to use.)
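
In other words, each challenge works roughly like this. The sketch below uses invented names and storage, not ReCAPTCHA's actual code: the server only knows the answer to one of the two words, checks you against that one, and quietly records your answer to the other as one more human vote.

    from collections import Counter, defaultdict

    # votes[word_id] counts what different humans typed for the word
    # whose answer the system does not yet know.
    votes = defaultdict(Counter)

    def check_captcha(control_answer, control_truth, unknown_answer, unknown_id):
        # The control word proves you are human; a wrong answer means
        # the unknown-word answer is discarded too.
        if control_answer.strip().lower() != control_truth.lower():
            return False
        votes[unknown_id][unknown_answer.strip().lower()] += 1
        return True

    def transcription(unknown_id, min_votes=3):
        # Once enough humans agree, the most common answer becomes
        # the accepted digitization of that scanned word.
        if not votes[unknown_id]:
            return None
        word, count = votes[unknown_id].most_common(1)[0]
        return word if count >= min_votes else None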

So it's official: security and coercion have fused. Security, a capitalistic idea we conjured up to prevent things like coercion (as well as theft, rape, murder, assault, etc.), has now been infiltrated by the very thing it was built to prevent. Are these intangible constructs faulty because the people who invented them for a specific purpose can now use them as a gateway for their own antagonism? It is something like the movie The Net, in which a computer security program is used to steal identities, erase identities, and ultimately kill its victims... except this is so low-key, so under the radar, so practically negligible and convoluted, that it is difficult to tell whether it is ethical. Even worse, most people don't seem to care.
