I know two people who have Google Glass, the new wearable computer from the company that used to be interested in search. One is a friend and the other a work colleague, and both were awarded their eyewear by virtue of writing popular blogs. They won’t be the last, I know; the triumph of the gaze-screen is inevitable.

Although most of the attention paid to Google Glass has focused on the bad-science-fiction-movie appearance of this first-generation hardware, it’s only a matter of design to tuck these quinoa-grain-sized lenses into fashionable frames and hide the reflective prism behind some tinting.

At the recent Google I/O conference in San Francisco, Google demonstrated the quality of its latest facial-recognition technology. Google Glass is the best-known hardware, but there are already competitors, and as soon as people see a demo of a “Name Reminder” app for the heads-up display, it’s all over. What’s the embarrassment of looking like a Borg compared to the universal dread of talking to someone whose name you really, really ought to know?

At first there will be resistance: Some bars have already instituted a “no Glass” rule. The first person who forgets themselves and absently gazes at a child for too long is going to get punched in the nose. As for bathrooms, can we all agree on a zero-tolerance protocol?

Regrettably, bad people already have access to surreptitious devices and plan their activities elaborately. Or, at least, some do. The younger of the Tsarnaev brothers wore his baseball cap backwards on his way to the Boston Marathon bombings. Perhaps at the trial we will learn whether he never expected to escape, never expected to be caught, or simply didn’t think about being photographed. Either way, the immediate aftermath of the bombings showed one of the less commendable consequences of a hyper-connected world: the rapidity with which a fringe theory can be amplified.

On Reddit, what seemed like a commendable application of the axiom “many eyes make all bugs shallow” gave rise to two ideas: one that singled out two young fellows carrying duffel bags, and another that focused on some guys wearing what looked like tactical gear. In both cases, preconceived notions of responsibility were projected onto photos and various pieces of circumstantial “evidence”: “analysis” of eye gazes, “Where’d the backpack go?!” scrawls, collages with red circles and lines worthy of the best cop-show corkboard. In one case, this led to a photo of innocent teenagers appearing in a New York newspaper under the accusatory headline “Bag Men”; in the other, it fueled the now-inevitable crackpot theory that the bombing was “another” false-flag attack carried out by the international fascist shadow government.

The philosopher Jeremy Bentham coined the term “panopticon” for an institution in which subjects are continuously visible. Although he envisioned its use in hospitals and schools, the major thrust was for prisons. Bentham was perfectly aware of the consequences: “[T]he more constantly the persons to be inspected are under the eyes of the persons who should inspect them, the more perfectly will the purpose X of the establishment have been attained. Ideal perfection, if that were the object, would require that each person should actually be in that predicament, during every instant of time. This being impossible, the next thing to be wished for is that, at every instant, seeing reason to believe as much, and not being able to satisfy himself to the contrary, he should conceive himself to be so.”

This is a fine description of the state in which we find ourselves, surrounded by minute audio/video recording devices and unable to determine when they’re being used as such. In some places, laws already require digital cameras to make a shutter-snapping sound, and I expect similar laws requiring gaze-cameras to show a recording light. Such laws are useless, easily bypassed by the indignant or the ill-intentioned.

The bloody lesson of the 20th century is that the “purpose X” of the State can descend from civilized to barbaric in short order. We all have a moral duty to be watchful, and I don’t want to dismiss the worst possibilities. (Like many technologists, I think we could do with a little more fervor over the 4th Amendment to the Constitution.)

But far more common will be the application of a million “purpose X’s,” not of institutions per se, but of any group with an interest strong enough to drive effort. Those groups include all manner of advertisers, activists, nut-jobs and creeps. As a start, we must demand a facial-recognition “do not track” capability from our social networks and cloud providers.
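What might such a capability look like to a developer? Here is a minimal sketch, assuming a hypothetical recognition service with an opt-out list; every name in it is invented for illustration and none refers to a real product or API:

    # Hypothetical sketch only: a recognition step that honors a
    # "do not track" registry. All names are invented for illustration.

    def identify(face_embedding, known_faces, opted_out):
        """Return the best-matching person ID, unless that person opted out.

        known_faces maps person IDs to embeddings; the nearest-neighbor
        match below stands in for a real recognizer.
        """
        best_id, best_dist = None, float("inf")
        for person_id, embedding in known_faces.items():
            dist = sum((a - b) ** 2 for a, b in zip(face_embedding, embedding))
            if dist < best_dist:
                best_id, best_dist = person_id, dist
        if best_id is None or best_id in opted_out:
            return None  # treat an opted-out person as an unknown face
        return best_id

    # Example: Alice has opted out, so her face comes back as unidentified.
    known = {"alice": [0.1, 0.9], "bob": [0.8, 0.2]}
    print(identify([0.12, 0.88], known, opted_out={"alice"}))  # -> None
    print(identify([0.79, 0.21], known, opted_out={"alice"}))  # -> "bob"

Of course, a check like this only means anything if the providers who hold the photos and the match databases enforce it on their side, which is why it has to be demanded of them rather than bolted on by users.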

It’s easy to get people worked up over funny-looking eyeglasses and flying robots, and much harder to get them worked up over recognition algorithms, cloud computing, and petabyte tracking databases. But if we cannot escape continuous recording, we must at least satisfy ourselves that we have some respite from continuous monitoring.

Larry O’Brien is a developer evangelist/advocate for Xamarin. Read his blog at www.knowing.net.