Privacy, Centralization and Databases

A fascinating article on the potential problems associated with privacy on a large government-run database was recently posted at ModernMechanix. The article appears to be in response to an effort to build a centralized data center that would contain personal records on US citizens. The interesting part is that the article appeared in The Atlantic in 1967. Reading it today makes it clear that the author, Arthur R. Miller, laid bare the fundamental issues surrounding centralized, government-managed data repositories, and that those issues have neither changed nor been addressed since. Unfortunately, the article is probably far more relevant today.

The author is concerned that:
With its insatiable appetite for information, its inability to forget anything that has been put into it, a central computer might become the heart of a government surveillance system that would lay bare our finances, our associations, or our mental and physical health to government inquisitors or even to casual observers.
As we discuss a national database of health records and a national identity card, and as we already have centralized employment reporting, mandatory bank transaction reporting, and centralized credit agencies that, though nominally privately run, are shielded from any reasonable privacy rules or laws by unseen forces in DC, we can pretty much declare that, except for cleaning up a few loose ends, we already have what Arthur Miller feared in 1967.

This sounds familiar:
The great bulk of the information likely to find its way into the center will be gathered and processed by relatively unskilled and unimaginative people who lack discrimination and sensitivity.
Show me a governmental agency where some clerk hasn't snooped around in other people's tax or health records. We know it happens. And occasionally we even hear about it in the media.

And of course, once you are in the database, how easy is it to get inaccurate records corrected?
An untested, impersonal, and erroneous computer entry such as “associates with known criminals” has marked him, and he is helpless to rectify the situation. Indeed, it is likely that he would not even be aware that the entry existed.
Does anyone actually think that the No Fly list is accurate? Or that expunged records are expunged?

I'm still waiting for this to happen:
To ensure the accuracy of the center’s files, an individual should have an opportunity to correct errors in information concerning him. Perhaps a print-out of his computer file should be sent to him once a year.
Let me try that this weekend. I'll write a letter to every state and federal agency that has ever had any records on me and ask them for a copy. I'm sure that will work. While I'm at it, I'll also ask for every photo from every traffic camera I ever drove through.

Who hasn't heard this idea:
One solution may be to store information according to its sensitivity or its accessibility, or both.
Wow. I just paid a $300/hr consultant to tell me that the new enterprise security best practices will require me to classify our data and secure it according to its sensitivity. I could have read a 40-year-old Atlantic for a dollar instead. (That was sarcasm. Data classification is obvious and self-evident.)
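
For what it's worth, here is a minimal Python sketch of what "classify data and secure it according to its sensitivity" looks like in practice. The labels and handling rules are hypothetical examples, not any particular standard or product:

```python
# A minimal sketch of sensitivity-based data classification.
# The labels and the handling rules are hypothetical examples.
from enum import IntEnum


class Sensitivity(IntEnum):
    PUBLIC = 0
    INTERNAL = 1
    CONFIDENTIAL = 2
    RESTRICTED = 3


# Hypothetical handling requirements keyed by classification level.
HANDLING = {
    Sensitivity.PUBLIC:       {"encrypt_at_rest": False, "access_audit": False},
    Sensitivity.INTERNAL:     {"encrypt_at_rest": False, "access_audit": True},
    Sensitivity.CONFIDENTIAL: {"encrypt_at_rest": True,  "access_audit": True},
    Sensitivity.RESTRICTED:   {"encrypt_at_rest": True,  "access_audit": True},
}


def controls_for(level: Sensitivity) -> dict:
    """Return the handling rules required for a given classification."""
    return HANDLING[level]


if __name__ == "__main__":
    # Example: look up what a confidential record requires.
    print(controls_for(Sensitivity.CONFIDENTIAL))
```

The hard part, of course, is not the table; it's getting anyone to label the data in the first place.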

And:
It probably will also be necessary to audit the programs controlling the manipulation of the files and access to the system to make sure that no one has inserted a secret “door” or a password permitting entry to the data by unauthorized personnel.
Fascinating. Code audits, account audits, integrity checking, intrusion detection.
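
As a small illustration of the "integrity checking" piece, here is a minimal Python sketch that hashes files, compares them against a stored baseline, and reports anything that changed. The paths and the baseline file name are hypothetical; real tools do the same thing with far more rigor:

```python
# A minimal sketch of file integrity checking: hash files, compare
# against a stored baseline, and report anything that changed.
import hashlib
import json
import pathlib


def sha256_of(path: pathlib.Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def build_baseline(root: str) -> dict:
    """Hash every file under root and return {path: digest}."""
    return {str(p): sha256_of(p)
            for p in pathlib.Path(root).rglob("*") if p.is_file()}


def changed_files(root: str, baseline_file: str = "baseline.json") -> list:
    """Return paths whose current hash differs from the saved baseline."""
    baseline = json.loads(pathlib.Path(baseline_file).read_text())
    current = build_baseline(root)
    return [p for p, digest in current.items()
            if baseline.get(p) != digest]
```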

If this is even the slightest bit interesting to you, then page through this Harvard Law Review article (PDF) as well. The arguments are reiterated in greater detail. The notes in the margin are also very interesting.

I'm rather disappointed that my generation, the one that took computing from the early 1980s through today, seems to have neither come up with any significant new privacy issues, nor solved any longstanding privacy problems.

What's even more chilling is that the use of organized, automated data indexing and storage for nefarious purposes has an extraordinary precedent. Edwin Black has concluded that the efficiency of Hollerith punch cards and tabulating machines made possible the extremely "...efficient asset confiscation, ghettoization, deportation, enslaved labor, and, ultimately, annihilation..." of a large group of people that a particular political party found to be undesirable.

History repeats itself. We need to assume that the events of the first half of the twentieth century will recur someday, somewhere, probably with greater efficiency.

What are we doing to protect our future?

