

I originally wrote this article in 1997 and posted it to my “blog” (back then I called it a 'zine) as the first entry. As such the links are horribly out of date. Fortunately, the OPS system described here died of neglect. But I’m sure it will come back in one form or another. — Kee Hinckley, February 26, 2017

When does a privacy enhancement become a privacy distribution mechanism?

In the guise of providing greater user privacy, Netscape, Microsoft and Firefly have greatly increased the consumer information that will be available to websites.

A few months ago, Netscape, Microsoft and Firefly together announced a new initiative, the Open Profiling Standard (OPS), aimed at quelling user fears over privacy invasions on the internet. It was a great success (from a PR standpoint at least; implementation lags announcement, as usual). The press picked it up and reported on it widely, but nowhere did anyone seem to examine what this will really mean when it is deployed.

Original link was to http://www.firefly.net/OPS/index.html; the site has since reorganized.

The Open Profiling Standard (OPS) is a proposed standard which enables the trusted exchange of profile information between individuals and Web sites, with built in privacy safeguards. Firefly, Netscape, and Microsoft will work together on the OPS proposal during the remainder of the standards review process of the World Wide Web Consortium (W3C).

OPS is designed to enable personalized electronic commerce, content and communication while providing a framework for the individual’s privacy. OPS gives each person complete control over the exchange and usage of their personal information across the Web and also saves them valuable time since they only have to enter their information once.

OPS offers Web sites a greater understanding of their audiences therefore dramatically improving personalized online content, marketing and commerce.

“OPS brings us one step closer to market-based solutions for privacy protection,” says Christine Varney, former commissioner of the Federal Trade Commission (FTC)—Internet Magazine

But before we get into that, let’s step back for a moment and look at the whole issue of privacy on the internet. This is an area fraught with emotion, and greatly lacking in hard analysis.

When the web began, no one was thinking much about privacy. The HTTP protocol provided a way for a browser to specify the identity of the user, and many browsers sent that information, either in the form of an email address, or just the initial account name. The server happily collected the information and logged it in the log files. Early web servers even had code which could be used to connect back to the sender’s computer and (depending on the type of computer and the software running there) verify the actual identity of the user (IDENTD). These features were primarily used for tracking how many users (as opposed to browser “hits”) had visited a site, and for contacting someone who was apparently having trouble (lots of hits to misspelled pages or some such) and helping them out. Those were the innocent days.

As web use increased, some people started realizing that they didn’t really want every site they browsed to know who they were. People complained, and the browser authors stopped sending the user identity. The log files stopped receiving that information (although the empty identity field still resides there--filled in only if the user provides a username and password for a secure site).
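For the curious, here is roughly where that identity information ended up. This is a sketch only, using an invented log line in the common log format that most servers of the era wrote: the second field is the IDENTD answer and the third is the authenticated username, and these days both are normally just a “-” placeholder.

    # A minimal sketch (not any particular server's code) that pulls the two
    # identity fields out of a common-log-format entry.  The second field is
    # the IDENTD result, the third the authenticated username; both are
    # usually "-" unless the user logged in with a username and password.

    sample = '10.0.0.5 - kee [26/Feb/1997:10:04:12 -0500] "GET /zine/ HTTP/1.0" 200 5120'

    def identity_fields(log_line):
        remote_host, identd_user, auth_user = log_line.split()[:3]
        return {
            "remote_host": remote_host,
            "identd_user": None if identd_user == "-" else identd_user,
            "auth_user": None if auth_user == "-" else auth_user,
        }

    print(identity_fields(sample))
    # {'remote_host': '10.0.0.5', 'identd_user': None, 'auth_user': 'kee'}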

The key component of the proposed standard is the ability for users to manage how much information their browser gives out and to whom particular information is given.
—Information Week

Some time thereafter, two new information sources became available to web site developers. Some browsers began sending a “referer” field (a piece of information that indicates the URL that the user was viewing prior to reaching the current web page), and the Netscape browser (followed by others) began allowing sites to stash a small “cookie” that would be remembered for a specified period of time and retrieved any time the same site asked for it. Although cookies get all the press, the referer field is actually the only feature capable of exposing personal information that you’d rather not reveal. But this whole issue has everything to do with emotion, and very little to do with facts. Let’s look at the two features.
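To make the two concrete, here is a rough sketch, in Python and with invented site names, of what a request carrying both pieces of information looks like from the server’s side.

    # Illustrative only (the site names are invented): the raw text of a
    # request a browser might send to www.example-store.com after the user
    # follows a link there from a page on www.example-news.com.  The Referer
    # line says where the user just came from; the Cookie line returns
    # whatever example-store stashed in the browser on an earlier visit.
    request = "\r\n".join([
        "GET /catalog/index.html HTTP/1.0",
        "Referer: http://www.example-news.com/reviews.html",
        "Cookie: SESSION_ID=a83ff29b",
        "User-Agent: Mozilla/4.0",
        "",
        "",
    ])
    print(request)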

Cookies

What you may not know is that the industry already has an excellent solution: the “Open Profiling Standard” (OPS).
—Jesse Berst/AnchorDesk

A “cookie” is a computer term for a small piece of information that gets tucked away somewhere by a program for future retrieval. Sometimes they are called “magic cookies”. The name implies an informal storage mechanism, and typically cookies aren’t explicitly stored by the user; they generally contain internal information that the program needs. Programs use them all the time. When you restart a program and all the windows come up in the same place as the last time you ran it, or when you bring up a search dialog in your word processor and the text of the last item you searched for is sitting there pre-selected--those are all examples of a program stashing away a cookie with some information in it. It didn’t ask you if you wanted to save that information; it just stored it for convenience’s sake. We don’t tend to think of those as privacy risks (although if the last search you did was for “big fat boss”, and the next person to use your computer is the aforementioned boss, you might think otherwise).
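As an illustration (the file name and fields below are invented, not any real program’s), here is that desktop habit in a few lines of Python: a little state quietly saved on exit and restored on the next launch, with the user never asked.

    import json, os

    PREFS_FILE = os.path.expanduser("~/.myeditor_prefs")   # hypothetical program

    def save_prefs(window_pos, last_search):
        # Stash a little state away for next time -- the user is never asked.
        with open(PREFS_FILE, "w") as f:
            json.dump({"window_pos": window_pos, "last_search": last_search}, f)

    def load_prefs():
        try:
            with open(PREFS_FILE) as f:
                return json.load(f)
        except FileNotFoundError:
            return {"window_pos": (100, 100), "last_search": ""}

    save_prefs(window_pos=(300, 200), last_search="quarterly report")
    print(load_prefs())   # next launch: same window position, same search text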

“OPS is a great first step,” Gaddis says. “It raises consumer awareness and allows consumers to protect themselves against the few bad eggs that are present with any transaction.”
—Computer Shopper

The cookies stored by your browser are no different. When you go to a web site, it has the option of asking your browser to store some information about your session so that it can access it at some future date. That information is usually a session identifier, or some other data that will enable the site to recognize you when you return. The site may use it to remember your login information, or pre-fill that complaint form so you don’t have to do it again, or just track the happy fact that you have returned to the site. The cookie does not, and cannot, contain any information that you haven’t already provided to the site. It also cannot be passed to any other site, so the information you enter on one site cannot be snarfed by some other site.
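Here is a rough sketch of that exchange using Python’s standard http.cookies module; the cookie name, value, and domain are invented. The point to notice is that the only thing stored in the browser is an opaque session identifier, scoped so that it is returned only to the site that set it.

    from http.cookies import SimpleCookie

    # --- On the server, when you first visit the (hypothetical) site ---
    outgoing = SimpleCookie()
    outgoing["SESSION_ID"] = "a83ff29b"          # opaque token, not personal data
    outgoing["SESSION_ID"]["domain"] = "www.example-store.com"  # only sent back here
    outgoing["SESSION_ID"]["path"] = "/"
    outgoing["SESSION_ID"]["max-age"] = 60 * 60 * 24 * 30       # remember for 30 days
    print(outgoing.output())
    # Set-Cookie: SESSION_ID=a83ff29b; Domain=www.example-store.com; Max-Age=2592000; Path=/

    # --- On a later visit, the server reads back what it stored ---
    incoming = SimpleCookie("SESSION_ID=a83ff29b")
    print(incoming["SESSION_ID"].value)   # 'a83ff29b' -- the site looks this value
                                          # up in its own records to recognize you

Everything the site actually knows about you lives in its own database; the cookie is just the claim ticket.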

Referer Fields

Referer fields are slightly different. What they tell a site is how you got there. Within a site they are often used for tracking your movement, so that the user interface designers can look at how people are using a site and modify the interface to give people better access to sections that aren’t being visited. However, what is usually of more interest is the site that you were on before you came to this one. That gives site owners an idea of which remote links are most useful and/or cost effective. The catch is that browsers don’t just pass the referer field when you click on a link; they also often pass it when you type in a URL. So it is possible that sites will pick up the fact that the previous site you were visiting was, shall we say, not one that you might like the world to know you were visiting. It’s rather like stepping out of the adult bookstore and bumping into your next door neighbor.
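As a sketch of how little effort this takes on the receiving end (the server below is invented for illustration, built on Python’s standard http.server), a handful of lines is enough to record the referer of every visitor:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class ReferralLogger(BaseHTTPRequestHandler):
        def do_GET(self):
            # The browser volunteers this header; the server just has to look.
            came_from = self.headers.get("Referer", "(no referer sent)")
            print(f"{self.client_address[0]} arrived at {self.path} from {came_from}")
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"Welcome!\n")

    if __name__ == "__main__":
        HTTPServer(("", 8000), ReferralLogger).serve_forever()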

Oddly enough, though, the referer fields have never really caught on as a “privacy risk” in the press. So be it.

Selling Yourself

As you travel from one site to another on the web, you may be amazed at how much is being given away for free. Research reports, news, travel directions… the list goes on and on. And it’s all free! Sites that charge money for access are few and far between.

Appearances can be deceiving. In fact there are many, many sites on the web that are charging for access; it’s just that the currency isn’t what you are used to. Instead of cash, the currency is personal information. Information about your age, your sex, your marital status, your wealth. Some sites are subtle: Lucent’s MapsOnUs lets you use the site several times before it asks for some information about you (couldn’t do that without cookies :-). Other sites barely let you past the front page before insisting that you register. Others tempt you with a contest of some sort. But the end result is the same: you’ve sold some part of your electronic soul for access to the site. You’ve exchanged one sort of information for another.

But what will those people do with that information? Will they sell it to a mailing list? Will it be picked up by spammers? Will tons of junk paper mail start arriving at work? These questions got the privacy experts questioning the whole process, although in practice this is no different than filling out a magazine’s bingo card (and usually far more rewarding). In stepped Netscape, Microsoft, Firefly and others with the OPS, a combination of two technologies and a business practice aimed at giving users more control over their privacy--at least in theory.

The technologies are the vCard standard from the Internet Mail Consortium, and Digital Certificates (also known as X.509), an ITU standard profiled for internet use by the IETF (Internet Engineering Task Force). The vCard standard specifies a format for storing and exchanging personal information (typically the type found on a business card, but it can cover just about anything). Digital Certificates provide a mechanism for secure storage and transmission of identification information--the driver’s license of the internet.
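For concreteness, here is the sort of record the vCard half of the proposal deals in, along with a trivial way to read it. The name, address, and other values below are invented, and the actual OPS profile presumably carried more than a business card’s worth of fields; treat this as a sketch of the container, not of the standard itself.

    # An illustrative vCard -- the container OPS proposed for profile data.
    VCARD = """\
    BEGIN:VCARD
    VERSION:2.1
    FN:Pat Example
    N:Example;Pat
    EMAIL;INTERNET:pat@example.com
    TEL;WORK:+1-617-555-0100
    ADR;HOME:;;123 Any Street;Sometown;MA;02139;USA
    END:VCARD
    """

    def vcard_fields(text):
        # Crude parse: one "NAME;PARAMS:value" property per line.
        fields = {}
        for line in text.splitlines():
            line = line.strip()
            if ":" in line and line not in ("BEGIN:VCARD", "END:VCARD"):
                name, _, value = line.partition(":")
                fields[name.split(";")[0]] = value
        return fields

    print(vcard_fields(VCARD)["EMAIL"])   # pat@example.com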

The business process that ties these together is a promise from companies signing up for this standard that they will adhere to certain privacy guidelines.

Web sites that adopt OPS are strongly encouraged to adopt a recognized privacy assurance program that includes third-party auditing, and to clearly post their privacy policies on the site where visitors can see them. In addition, consumers are cautioned not to release their Personal Profile to any site that does not post its privacy policies and submit to third-party auditing.

—Firefly

As business practices go, that one is pretty weak, and nothing that couldn’t have been done without all this new technology. So what does the new technology provide to enhance privacy?

Frankly, nothing. What the OPS does is let you enter your personal information only once, so that when a site asks for your information, it becomes incredibly easy to provide it. Where before you might have had to fill out a form with home and work addresses, sex, marital status, income and the like, now you can just hit the “Okay” button on your browser and have all that information automatically sent to the remote system. Where before you might have skipped the non-mandatory fields in a form, now you’ll send them anyway; it’s not any harder.

In sum, the OPS is really a mechanism to make it easier for consumers to tell vendors information about themselves. It provides no more control over personal information than the current “fill out the form” mechanism, and is far more likely to increase the distribution of personal information to multiple companies. It’s not a “bad” technology in any sense, but the PR that it has gotten is deceptive--OPS does nothing to enhance privacy.