Discussion Transcripts

Text Transcript for Session 3: Theory and Practice of Freedom of Expression

One of the thorniest facets of the free speech debate is the tension between free expression as an abstract principle and kinds of speech that harm, such as hate speech, incitement to violence, or uses of information that can cause economic damage or threaten security or privacy. And technologies change how information can move, and harm. This week we put a historian of the earliest post-printing-press debates over free speech in dialogue with a historian of the information practices of hate groups in America.

  • Kathleen Belew (use of technologies by modern US hate groups)
  • David Copeland (history and origins of free speech debates)
  • Kate Klonick (internet law)
  • Cory Doctorow (digital information policy)
  • Plus series hosts Ada Palmer and Adrian Johns
 

[Ada Palmer] Welcome all to our third Censorship and Information Control During Information Revolutions discussion. Today we’re going to look at the extreme edges of free speech and free press. And I’d like to ask my fellow panelists to introduce themselves, starting at the far end.

[Kate Klonick] Hi, I am Kate Klonick, I am a professor of law at St. John’s University in New York. And I write about the First Amendment and online speech and private governance.

[Cory Doctorow] I’m Cory Doctorow, I’m a science-fiction writer, I’m an activist with the Electronic Frontier Foundation and I have academic affiliations at MIT and UNC and the Open University in the U.K.

[Kathleen Belew] Hi, I’m Kathleen Belew, I teach here in the History department. I work on the white power movement, so extreme right-wing mobilizations from the Vietnam War to the Oklahoma City bombing, and I do some public scholarship around the role of far-right groups in the public sphere today.

[David Copeland] Hi, my name’s David Copeland, I’m a media historian. I look at the history of media from the 17th century up to the 20th, focusing primarily on the 18th century in the United States and the role it’s played in disseminating information to society. I teach at Elon University.

[Adrian Johns] I’m Adrian Johns, you’ll know who I am because I’ve been at previous ones, hi. So I’m in the history department here and I teach history of science and history of the book.

[Ada Palmer] Ada Palmer, I’m also here at the University of Chicago and I teach in the history department. I look at the Renaissance and the Enlightenment and the movement of radical and forbidden ideas in societies that are hostile to them. So I wanted to invite everyone to just kick off today. Maybe actually let’s start a little bit with Kathleen talking about some of the things I’ve heard you talk about: uses of, sort of, what we now think of as marginal and semi-forgotten information technologies in the 20th century, and their impact on the hate groups that you look at. And then we can dive into pre-modern versions of the same stuff.

[Kathleen Belew] So what I’m looking at is the way that different kinds of hate groups came together into a social movement after the Vietnam War. So groups like the Ku Klux Klan, neo-Nazis, radical tax resisters and others came into common cause around one set of technologies having to do with the paramilitary aftermath of violence. But the way that they sort of organized themselves as a social movement was also through information technology. So this starts with a very old-fashioned kind of thing, you guys aren’t even gonna know, but mimeographs, has anybody ever seen a mimeographed page? We still had them in high school, it’s how you would duplicate information. So they’re blue and they smell funny and you would run them, you have to have a machine, not everybody would have a machine. So they started with mimeographed newsletters and circulars, which are badly preserved in the archive, difficult to make and reproduce. They get worse after the first few copies.
And then the grocery store photocopier came along and these groups went bananas. They found clip art, they found ways to personalize stuff, and you can tell that the production just went crazy. So you didn’t then have to have a printing press to make a broadsheet, a newspaper or something, although they were doing that too. Anybody could make their own newsletter or their quarterly or something. So then you start seeing, instead of just the major movement periodicals, a whole bunch more people getting access to the information. Women’s groups, particularly, put out a lot of stuff this way. So there’s one called Christian Patriot Women that was just made on a grocery store photocopier, it seems to me, with like clip art roses and butterflies and things to kind of soften the message of what they were talking about, which was apocalyptic race war.
And then the other thing that was influential in the technological development of this movement was of course the early internet. So this is widely misreported and misunderstood, and I understand some of you are editing Wikipedia and you might want to take a look if this pops up anywhere, but people usually talk about the advent of the internet in the white power movement and alt-right circles as Stormfront, which is a website that went online after the Oklahoma City bombing around 1995. In fact, they were using kind of the proto-internet in 1983, ’84. They had a thing called Liberty Net, which is a series of computer message boards that were password protected, which by the way took the FBI like two to three years to crack, so these were private spaces. And they weren’t just for nefarious purposes; these included everything from like personal ads, so that you could find love within the movement, to assassination plans and hit lists, and things like that. And they used that to coordinate cell-style terrorism, so the technology actually came hand-in-hand with revolution and strategy within the movement. One way to think about that for the purposes of this class is that this is the kind of social network activism that we’re all familiar with now from things like Facebook, right, but 1983, ’84, okay. So they were doing this before most of us had in fact heard of that and before the internet proper was even online. I think that’s important for purposes of reckoning with the scope and capacity of this kind of activism.

[Ada Palmer] So in our own society, that kind of speech, not only hate speech, but radically, actively destructive, let’s-plan-to-kill-people hate speech, is often the forefront of where we draw lines of, yeah, maybe that should be censored, or at least where we start to slide toward, shouldn’t there be something done about that? So it’s a useful entry point to get at the historical question of how people have framed free press, and what people have thought should fall within the scope of free press. Would anyone else like to plunge in and carry this in a history of free press direction?

[David Copeland] Well, you know, you can… So I study the rise of the use of the press with religion, and then how that developed into the idea of free press. And if you want to go back in time, what you will see is the same kind of ideas. When religious groups realized that they were contrary to whatever the state church was, they needed a means to get their message out, to let people know what they believed, so they advocated for liberty of conscience, which came to them from rational thought. But the means to that end would be the printing press. And since the printing presses were controlled by governments, they weren’t allowed to use those. But the idea was liberty of conscience, finding people that would let them print the things that they wanted, so that this information could then do what they wanted it to do, which was to get their message out to like-minded people. So even though we’re talking about the early 1600s, we have the same kind of concept happening with people in Europe, which will then translate to America, the same kind of ideas which would then slowly morph into the ideas of not just religious freedom, which would have happened in the 1600s and the early 1700s, but of our political freedom. And then we will use a network in the same way, she’s talking about the mimeograph, but it will be the printing press, and we’ll end up eventually creating something in the 1770 era called the Journal of Occurrences, which was created by Samuel Adams and a couple of other printers in Boston. The idea was to create a weekly news sheet that told about the atrocities of the British within Boston, but that could then be sent to printers all over America who would reprint that information, and so the ideas of why they were doing what they were doing, and the horrors of Britain, and why we should no longer be British subjects, could spread. It’s a very, very similar kind of tie-in.

[Cory Doctorow] I think it’s worth reiterating something I said last week, this idea of John Gilmore’s that the reason you see early technology adoption among fringe groups is not that there’s something especially mimeograph-adjacent about being a racist, but that if you pay a high cost to communicate using the channels that everybody else gets to use, then the additional cost of figuring out how to use a mimeograph machine might actually be lower than the cost of evading postal interception, or the risk of being prosecuted for sending unlawful materials by mail, or any of the other costs. Which is why pornographers are early technology adopters, and also racists, and also people who have revolutionary, radical left-wing views, and lots of others are just more apt to take this stuff on.

[Ada Palmer] You mean costs both in the sense of money, but also in the sense of your own time and effort?

[Cory Doctorow] Yeah, opportunity costs.

[Ada Palmer] How much effort you have to put into learning the new technology versus how much more effective it is, when you risk jail time or censure or the destruction of your materials if you’re using the well-established technology.

[Cory Doctorow] One gloss that occurs to me is that, since if you have a heterodox view and you have the aptitude for adopting a new technology your views spread further, maybe there is a selection function in the early years of a new technology, where the most successful racists are the ones who are technologically adept or have the capacity to develop technological ability. And people who, for whatever reason, just can’t muster any kind of, you know, can’t conceptualize how, say, a BBS might work, will never find themselves on Liberty Net. And so maybe, in that sense, there are early adopters with heterodox views, but the causal arrow goes the other way; it’s a selection function.

[Kathleen Belew] I think that’s true in some ways, but one thing to think about is that the movement I study is using sort of a century of opportunistic strategy taken from earlier versions of the KKK, in which the idea has always been to figure out how to use whatever is available for their own ends. And there’s a lot of reasons for that; it has to do with the way that people interpret social threats as very dire and apocalyptic, right. So for people that I study, immigration isn’t just about diluting some ideal, right. They experience it as, like, the end of the race is coming, right, or at least they say they do. So one of the things that they’re doing in this time period is using strategies that are available to them, the same way that the Klan in the 1920s is sort of classically anti-Black in the South, but is also anti-Mexican on the U.S.-Mexico border, anti-labor in the Pacific Northwest where there’s a lot of unions, anti-immigrant in the Northeast where there are a lot of immigrants, right? That’s the way the Klan has kind of always worked, in the 20th century forward. But also, because they have this sense of apocalyptic fear, I think there’s actually an impetus to work together to distribute the technology. So there’s one Klan leader who actually goes around in the ’80s teaching, well, first of all there is a terror group that robs armored cars and successfully nets millions of dollars from illegal activities, including that and counterfeiting. They distribute that money so that everybody can buy an Apple mini computer, which is not mini at that time. Maxi computer. And this guy goes around the country teaching them how to get on Liberty Net. So the way the ideology is constructed is pushing people together, too.

[Cory Doctorow] That’s really interesting.

[Adrian Johns] So, let me get to, I’ll get to what I was gonna say as my opening thing in a second, but I’m just struck by that, because it rings a bell with the history of a very different group, which Chris Kelty, the anthropologist, has written about. He wrote about the origins of Debian, the open-source operating system. And that’s a community which grew up and is very technically adept, and it has a hierarchy; for all that it’s open and accessible, it’s very technocratic. So there are high-ups in the Debian community who are licensed to do more to the operating system than hoi polloi members of the community. And the way this anthropologist tells it, the way that that’s sustained is essentially by face-to-face contact. So after a while, when they realize that the organization has become complex enough that they don’t just automatically know each other, there has to be a sort of entrance procedure to get into it, and the entrance procedure involves at a certain point actually having face-to-face contact with somebody who’s already accepted into the network. And there’s a moment of crisis when it turns out that one of the very first members of the network, who’s been in it the longest, has actually never seen anybody else. And so they wonder, who is this person? Does he actually exist? And they have to get on a plane and fly out to the middle of America somewhere and knock on his door just to make sure they’ve actually seen this person. And obviously Debian is very different from your extreme right groups, but I’m wondering to what extent something that in Debian is actually a social mechanism for upholding something like quality and integrity, in an extreme group might become a social mechanism for security or, you know, defensiveness, keeping it together to make sure that it’s not infiltrated. I don’t know?

[Kathleen Belew] That’s hugely important. I mean, the reason that they need Liberty Net is that it’s part of a series of strategies that are designed to insulate them from prosecution. And part of that comes out of a big problem with what historians have called the Third Era Klan, the KKK that tried to undermine the Civil Rights Movement in the ’50s and ’60s, which was the sort of ubiquity of FBI and other government informants getting into the groups and causing a lot of problems, especially before the end of, do you guys know COINTELPRO, which is the FBI counterintelligence program that authorized agents to sort of overthrow and mess up group activities? Side note, that was overwhelmingly directed against Black activists on the left, but a little bit of it did go towards the Klan, and they did get frustrated by this. So Liberty Net, and then another strategy called Leaderless Resistance, which is effectively cell-style terror, it’s just the idea that a cell will work without direct orders from anybody, so that if you infiltrate a cell you only get six people instead of the whole movement. And these strategies were designed to sort of obscure the ties between white power activists that people could prosecute. But I think the bigger kind of takeaway is the sense of disappearance of the movement itself, right. The other thing it does is make it very difficult for people on the outside to see how a movement is organized and connected. Anyhow, I think that’s part of that story, but wait, do you?

[Kate Klonick] Yeah, I think that what’s interesting here is that, you were talking, I loved the mimeograph example and this idea that it became easier and cheaper, not just in terms of money but time, effort, cost. The one thing that I would also add, like, this is the march of technology, is that the internet has now made it so that it has, like, just erased all geographical boundaries. And it has centralized decentralized groups, like the ones that you study. And so this has been this entire moment where all of a sudden these people, you can live in Alaska and you can live in Ukraine and you can live in Florida, and you can all be online at the exact same time talking about the same crappy stuff and sharing your ideas, and it costs you, like, nothing, it costs you nothing and it’s instant. And it’s also kind of, speaking of mimeograph, I think that the etymology of memes comes from mimeograph and that’s like, it doesn’t, no?

[Cory Doctorow] No, it’s genes. Richard Dawkins and genes.

[Kate Klonick] Oh, it’s memetic, you’re right. Yes, yes, yes, you’re right, it’s memetic. But it is kind of this idea of, kind of, like, replication and things growing on each other very, very, very quickly, and that’s part of, like, the step forward, I think, from meatspace technology, like a mimeograph or a copy machine, to, like, the internet. You just allow things to grow and spread at a scale that they were never capable of spreading at before.

[Cory Doctorow] One other parallel, maybe, between Debian and the KKK, although I want to stress that it’s a formal parallel, not an ideological one, is that Debian was the first software project with a constitution that articulated a moral purpose. So they had this structure, and one of the reasons that they needed to meet people and see why they were all there is that they needed to make sure they were all working on the project for the same reason.

[Adrian Johns] The other thing that struck me reading some of the texts for this week was, if you like, the sheer mundaneness of it all, and this comes through in different ways in Kathleen’s work on the extreme right and Bob Darnton’s work on 18th century France, where Darnton is going through the extraordinary sort of bureaucratic record of French administrators as they’re going through backstreets in Paris or provincial towns and coming upon, not sort of Voltaire and Rousseau and Helvétius and all of those people, maybe Helvétius a little bit, but these kind of now anonymous, you know, poverty-stricken, fly-by-night colporteurs who are trying to scrape a living distributing, selling what we call livres philosophiques, which doesn’t mean philosophical books in our sense but, sort of, banned books. And there’s something I think that deserves to be remembered about that, the everyday-ness of it. You know, people have lives. When we go back and we talk about issues of censorship and information management and so forth, we tend to isolate out the parts of people’s lives where they are heroic or villainous, and we forget that they’re often actually very, very mundane. And the police are very mundane as well. And there’s a kind of cultural symbiosis sometimes between the two of them, which Darnton picks up on very well, to the extent that one of his police officers actually ends up sort of turning, and dies in a dungeon because he’s been captured doing what the low-life booksellers were supposed to be doing. The other thing that I wanted to, the other bell that rang in my mind, was that it’s actually a notorious problem among historians that everyday life is very hard to document, because it tends to be the kind of thing that we just do, we wander around, we use our cell phones, whatever it is, and it doesn’t leave a lot of traces, and most of the traces that it does leave are indirect. And in Darnton’s case, he’s brilliant at getting at sort of the street life of literature, the street life of knowledge or enlightenment, but he’s doing it through police files, typically. And I wonder whether we want to think about what the implications of that are. Whether, if you move to different cultures, so with Kathleen’s people, at least I think, I don’t want to impute things to you, but my sense from Kathleen’s stuff is that you wouldn’t quite be able to do what Darnton did with modern police files, because the police actually don’t know a surprising amount, or their knowledge is surprisingly partial, in both senses of the word partial. And I think that there’s a moral component, as it were, to the historical investigation of censorship and information management that sort of behooves us to think through what the sources are that we’re using when we try to get at it at that kind of close anthropological level that Darnton does.

[Ada Palmer] Well, you’re reminding me of when we were putting together the exhibit that accompanied this project. One of the sections of the exhibit is the banned bookcase tour of the continents, where we selected ten sample items from each continent to look at the different kinds of censorship that were happening in each of those spaces. But it was fascinating to me comparing, and these were selected by students, it was fascinating comparing what was selected for each region, because for North America and for Europe, especially North America, the things that were selected were very similar to each other. They were all sort of, this is the thing that was on your high school syllabus, and then parents called in and demanded, no, my x-year-old daughter may not read x thing. They were almost identical cases, and they were almost all works that professed liberal values of some sort, this is depicting a multiracial romance or something. So there was a political spectrum ideology to defending this; choosing that as the example was clearly trying to demonstrate how censorship is the enemy of liberalism in a sense, not that the curators necessarily consciously thought that, but that was in the unconscious space. But when we did Australia/New Zealand, where there are fewer of these big, sensational cases to look at, both because there isn’t quite an equivalent of our American Library Association Office for Intellectual Freedom, which does big publicity tracking on classroom stuff, but also just ’cause it’s a smaller population so there are fewer cases, they had to reach much more broadly, and we had a lot of mundane things, or things that are far from what you would think of as the political face of censorship. So, like, nudist colony newsletters about nudism and whether in nature man should wear no clothes. And we have a wonderful New Zealand nudist magazine which was running an article about, is it immoral for mankind to go to space, because in space you have to wear clothes, because there’s vacuum and you’ll die. And since humans can never be naked in space, does that mean we shouldn’t go to space? And this is so far from anyone’s political spectrum, and lots of the materials were like that from Australia/New Zealand, not because those materials don’t get targeted in the U.S., but because we have a very specific hero/villain narrative of what we look for in the history of censorship, which makes us much less likely to notice the mundane cases: the censoring of an exhibit catalog of a mediocre art exhibit in which one out of 300 pieces had a penis in it, and so it got targeted for censorship, but none of the people involved are particularly famous; or, you know, a photo book of sexy sailors that got censored in New Zealand. All these materials are much less in alignment with a particular framing of the heroic free speech advocate or the heroic author who is advancing a liberal agenda, and much more just, all kinds of things get censored for all kinds of reasons in these spaces.

[Cory Doctorow] My colleague Charles Stross, a science-fiction writer, calls the social media era the beginning of history, because it’s the first era in which we document the mundane for very large groups, large pluralities of the population, the moment-to-moment mundane updates.

[Ada Palmer] It is true, as someone who spends time trying to figure out what people ate, boy are we gonna know what people ate…

[Cory Doctorow] Oh yeah.

[Ada Palmer] …in the era of Instagram. You know, if we can comb through this data we really know. We will not, as my colleagues from my own period do, desperately look at paintings of The Last Supper to see what’s on a table because that’s our best clue as to what people ate.

[Cory Doctorow] We’re gonna remember 2018 as the era of activated charcoal black ice cream.

[Ada Palmer] Yeah.

[Kathleen Belew] If the information survives.

[Cory Doctorow] If the information survives.

[Ada Palmer] Yes, and if the information is sortable by any means and isn’t drowned in itself.

[Adrian Johns] There’s another aspect to the mundaneness of it which is not quite brought out in Darnton, but it’s really something that I meant to mention in the first two sessions and forgot, which is, I think, about why that, to us, kind of extraordinary baroque apparatus developed in 18th century France, where you have, you know, licenses, and tacit permissions, and tolerances, and all of these things that are sort of recorded and sort of not officially. One of the reasons why all of that, and the whole police apparatus, is accepted is that to an extent those kinds of things are familiar to anybody who lives in 18th century France, or for that matter 17th or 16th century France, because they’re just how you maintain well-ordered industries in general. So a lot of the mechanisms like licensing, and this is true in England as well, that we tend to see as having big, deep moral problems, because we associate them with a kind of linear, what am I gonna say, a longitudinal history that leads into our notion of censorship, they would see latitudinally, as it were. They would see them at the same time, in parallel to the licensing of things like breweries, or bakeries, or butcher shops, or goldsmiths. And just as in those domains you need to have, well, this is the assumption, you need to have some kind of oversight by guilds or something to maintain good quality beer, so you need to have some kind of licensing and oversight and policing to maintain good quality books. Now it’s not quite one-to-one, because it actually does develop in its own way, so that 18th century thing of tolerances and tacit permissions and so forth doesn’t really exist in these other domains. But at root, the kind of base assumptions of it are shared across artisanal worlds, and I think it’s important to remember, when you think about the mundane, everyday, routinized character of this, that maybe one of the reasons why people didn’t go up in arms about it is that every time they go into some craft domain they’re dealing with something that’s very like this, something where the grounding assumptions are the same. And if you do the economic history of this period you see that phrase “well-ordered trade” everywhere in English, I mean, it appears, and it has counterparts in different languages, but it’s something that was often twinned with freedom. So you’ll see, a trade should be free and it should be well-ordered. And well-ordered involves this kind of mechanism.

[Ada Palmer] And you’re making me think of Inquisition cases again on the mundanity question, where even when they’re censoring a thing that is to us a major central work of literature, where we feel outraged that x was censored, the censorship itself is often incredibly mundane. So we looked for our exhibit at the censorship of Don Quixote and the censorship of Milton’s Paradise Lost, both of which were gone through by authorities, Catholic authorities in the case of Don Quixote, and so professional censors would sit down and read these through and say, is it okay, and does anything in it need to be changed? And Don Quixote is a long, long book. And what is the Inquisition’s answer? We want to change one sentence, one sentence in which there is a side comment that acts of charity and virtue done without good intent, without, you know, true conscience behind them, don’t mean anything. And this felt to the censors as if it might be endorsing Protestant sola fide. So this one sentence, and you have just paid someone for the man-hours of reading all of Don Quixote and filling out paperwork about all of Don Quixote, to do just that. And Milton himself, who is, you know, in the Areopagitica, the great articulator that people who write histories of free press point at as the first articulation of free press, when it’s time to censor Paradise Lost, again it’s one sentence, in which there’s a metaphor about an eclipse. An eclipse is said to sort of cause a thing to happen, and they say, no, you must say that it presages the thing, not causes it, because if you say cause then you might be endorsing the idea that astrology is more powerful than God, and therefore, you know, that the planets are so inextricably powerful that they can foretell deaths of kings and so on, and we’re anxious about eclipses right now, so that sentence must change. And so, you know, Milton talks so much about how having any licensing process, knowing that your beautiful epic poem is gonna go before a board and get read and then censored, will utterly transform, as he says, the relationship of author and reader, utterly transform what it means to be writing. You’re now writing for the licenser, for the censor, instead of otherwise. And yet in Paradise Lost, which is full of ferocious political rhetoric, in which the main character is Satan, the one thing they’re worried about is eclipses. You know, that again gets across the mundanity, and often, another way to put it is the orthogonality, of what any individual period is censoring versus what we imagine they should be censoring, right. What should they be censoring if you are the Inquisition? You should be censoring atheists, right. What should you be censoring if it’s the Enlightenment? You should be censoring Voltaire. No, they don’t care about censoring Voltaire nearly as much as they care about censoring Jansenist reiterations of Calvinist positions on predestination. That’s scary to them. So, again, you get this strange mundanity, this orthogonality, from what we expect in our political imagination of censorship’s history.

[Kathleen Belew] This is also, just to riff on this a little bit, this is also in the archive itself, right. So I work a lot, not with police records, because mostly the people monitoring white power activism were the FBI, the Bureau of Alcohol, Tobacco, and Firearms, and I think the CIA, although I’ve had zero luck getting anything from the CIA, which addressed its letter to Mr. Belew also, which I’m still irritated about.

[Adrian Johns] Clearly they have you under close surveillance.

[Kathleen Belew] Like, I have information they should want. Anyway, so you get the documents back through the Freedom of Information request from, say, the FBI or the Marshals Service, and they’re redacted, right, which is part of the law. Redacted meaning they take black marker and they black out all of the personal information. But this is a very imperfect technology also. The old ones, when it was done by hand, sometimes if you hold it to the light you can still read it. Good for me, bad for the law, I suppose. And the new ones are a little box made by, like, Adobe Acrobat, so it’s much more thorough, but it’s still done by people, so there’s huge amounts of human error in the redaction process. I’ve gotten Social Security numbers. I’ve gotten people’s fingerprints. I’ve gotten all kinds of things, like, they won’t list the person but they’ll say, like, this guy was 120 pounds and was 5’8″, and I’m like, I know who that is. Sometimes they give you enough defining detail that you can figure it out. And sometimes the documents adjacent will list the name so that you can make inferences. So it’s all super imperfect. But thinking about police records is so interesting, because the kind of detail that gets retained by a police record, of course, also changes with the institution of the police. So, of course, police in France in that period are not the same thing as police of any stripe in the modern age. And, you know, the historical context and politics of the moment run through the whole project.

[Cory Doctorow] If any of you are interested in learning more about working with the Freedom of Information Act, MuckRock is an automated service that will let you file a certain number of free FOIA requests and, for a small fee, will do more. They also track which agencies respond and how they respond. And in the early history of redaction, digital redaction, a lot of it was drawing black boxes on Word documents that retained the full text in the document, so there’d just be a bitmap of a black box over text that was still there, and the same with PDFs, where they would do this as well.
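
A minimal sketch of the failure mode described here, assuming a hypothetical file naively_redacted.pdf whose “redaction” was done by drawing an opaque rectangle over the text, and using the pypdf library: an ordinary text extraction ignores drawn shapes and recovers the words underneath.

    # Sketch: naive black-box "redaction" leaves the text in the PDF.
    # Assumes a hypothetical file naively_redacted.pdf; requires pypdf
    # (pip install pypdf).
    from pypdf import PdfReader

    reader = PdfReader("naively_redacted.pdf")
    for page_number, page in enumerate(reader.pages, start=1):
        # extract_text() reads the text objects in the content stream;
        # a rectangle drawn on top is just another graphic and is ignored.
        print(f"--- page {page_number} ---")
        print(page.extract_text())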

[Kathleen Belew] Yeah.

[Cory Doctorow] And you could see it. Sometimes we’d get Word docs that had the whole revision history, including, we’d get legislative documents that would show that they were authored by lobbyists if you looked in the metadata for the Word doc.

[Kathleen Belew] Whoa!

[Cory Doctorow] And you could see that, you know, this copyright law had literally been written by a Motion Picture Association of America staffer and then introduced by Orrin Hatch, you know.

[Kathleen Belew] Oh la la.

[Cory Doctorow and Kate Klonick] Yeah.
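
The metadata leak described above can be checked in a few lines. A minimal sketch, assuming a hypothetical file draft_bill.docx and the python-docx library: it prints the core properties (author, last-modified-by, revision count) that travel inside a .docx file unless they are deliberately scrubbed.

    # Sketch: inspect the metadata that ships inside a Word document.
    # Assumes a hypothetical file draft_bill.docx; requires python-docx
    # (pip install python-docx).
    from docx import Document

    doc = Document("draft_bill.docx")
    props = doc.core_properties  # the document's built-in metadata block
    print("author:          ", props.author)
    print("last modified by:", props.last_modified_by)
    print("revision:        ", props.revision)
    print("created:         ", props.created)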

[Kathleen Belew] It’s all really fun, it’s fun to work with them. I encourage you, too, if you haven’t spent time with them. It takes some time, but one way to work with them as an undergraduate, or if you’re on a schedule, is that the FBI has a list of, like, frequently requested records. So if there’s something on there that appeals to you, it’ll be a partial file, but you can start working with some redacted documents and look at what they have. There’s a lot of stuff on there.

[Cory Doctorow] MIT Press just published a MuckRock book of FOIAed journalists’ files that I wrote the introduction for.

[Ada Palmer] And in a couple weeks we’re having Jonathan Craze in, who works on redaction in the Guantanamo Bay documents and Iraq War-related documents, and we’re pairing him with Nicholas Davidson, who works on Inquisition trial records, to look deeper into that question of what agencies choose to record, of what they record what they choose to keep and what moves forward through multiple versions, and then, when they choose to try to remove information from a record, what gets removed versus kept. And, you know, we chuckle about the PDF retaining the old material, but new adoption of a technology has often been very visible in exactly that way in the historical record. So you mentioned that, and I think of the secret archive of Venice, the Republic of Venice, which had a famous secret archive with a huge building that says secret archive on it, with a coat of arms, and everyone knows that they have a secret archive, the purpose of which is to make you intimidated. What do they know in the secret archive? Right, they want you to think that they know everything. And you can have tours of the secret archive and all the big, shiny volumes of the secret archive. But when you open up the actual volumes, they have the rough copy and the clean copy. And this is in the 1300s and the 1400s and the 1500s, which is the period during which paper has arrived in Europe. It’s been in Europe since a little after 800 A.D., but people are still kind of nervous about paper as a serious thing. How long is paper gonna turn out to actually last? People trust parchment much better, because parchment has been in use since Ancient Rome and very, very old parchments are around. So they’re making their permanent record versions of things on parchment after doing the rough draft versions on paper. And when you go into the archive you always see the big files of the fat rough copy, in the actual handwriting of the random secretary who was in the room, with random pieces of paper pasted in, documents that people submitted in the course of the proceedings; and then the clean, beautiful vellum copy, beautifully bound in red leather with the giant coat of arms of the secret archive on the front of it, will leave out a lot of that material, which you can see in the paper version, which they thought would fall apart and not really be retained in the same way. They thought of this as the retention copy and the disposable copy. And there’s a lot of very different material in there. So, for example, when you look at Galileo’s review as a professor at the University of Padua, which is the equivalent of his tenure review, like, should we continue to employ this guy Galileo? His own file explaining what he’s done and why the Venetian Republic should care, his own sort of tenure application package, only survives in the rough, disintegratey paper copy. Lucky for us, it turned out their paper was very good. It did last a long time, but they didn’t know.
The clean copy only has the sort of final decision: yes, we have decided to retain this man. And it’s in the original where you get the embarrassing, you know, well, this telescope thing is really sketchy, I don’t think it’ll actually be useful, but he made these important mathematical and geometric advances, and he’s very good at drawing diagrams, so we’ll keep him around, and make him teach math, and tell him to ignore the telescope part, ’cause that’s not of interest to our students. But we only get this because what they thought would be the disposable, impermanent medium turns out to be much more permanent than they imagined. Just like Sharpie redaction being less destructive than intended, or PDF documents retaining much more in them than casual users can anticipate.

[Kathleen Belew] This is the same thing with the recent past, because there’s this divot, I don’t know, there’s a lapse between, okay, you guys have probably all used ProQuest, right? For your historical research, where you can do a keyword search through newspapers? It used to be that you’d have to go to a bound edition, which is like a hardcover thing for each year, and you’d have to look up by keyword, right, in the index, and make a list, and then go to a microfilm to find the specific issue to see if it was an article that was relevant. It took forever. And the microfilm machine would make me motion sick, it’s the worst, right. But it’s a very good technology, and any library archivist will be, like, apoplectic with worry about having it on the internet instead of on the microfilm, because we know the microfilm is gonna be okay and we have no idea what will happen to the internet, right. There’s a divot, though, between the end of really good record keeping with bound editions and microfilm and everything, usually around ’87, and then most newspapers started digitizing their archives only in, like, ’92, right. So it’s a small window, but there’s this thing where nobody has records from the end of the ’80s and the beginning of the ’90s. It’s just, like, wild, wild west out there. And sometimes it’ll be entire court cases that don’t count as part of the historical archive yet, but also aren’t yet digitized, so there’s, like, a snatch of stuff you can’t get at all. I had to ask somebody to let me sit on the floor once to look at one in the historical archives center, because it wasn’t technically historical yet. Like I said, I don’t agree.

[Ada Palmer] How old does something have to be to be historical?

[Kathleen Belew] Well, apparently like ’87 is not historical for them.

[Ada Palmer] Good to know.

[Kathleen Belew] I have a more slight, kind of a…

[Cory Doctorow] Recap.

[Kate Klonick] So, I’m just gonna sum up for a second here something I kind of think is super interesting. I feel like we’re talking about censors in three different kinds of ways. I think we’re talking about censors as, like, the actual government, which is I think the traditional way in which people think of censorship, and the way we have worried about censorship in a world of free speech. That was a dyadic model. You had the government on the one hand and you had the people on the other, and the main concern, one of the main concerns of the First Amendment, was to keep, like, the boot of the state off the neck of, like, the people, right, so that you could have expression, have these ideas out there. And that’s not the world we live in anymore, because now we can route around government censorship, for the most part, through these third-party platforms on the Internet, right. But then they themselves are censoring us, because they are deciding what stays up on the platform and what comes down, and one of the things that is happening right now are things like, in Europe, the right to be forgotten. And there are, I mean, I was just mentioning to Cory, PACER, which is, like, you know, how you decide to find the documents, how you decide to store them, the paywalls you put them behind, those all become shadow bans. There are all of these various types of creative ways that people have started to find to shape what is or is not in your newsfeed. And an interesting thing that that reminds me of is, it is somewhat like, as was mentioned before, is the press censorship? The press decides what we’re going to talk about, and what we’re going to be thinking about, and what we’re made aware of and what we’re not. You could call that a type of censorship. So, too, could you call a librarian a type of censor. They decide what books are good enough to stay in a library, and what are worthy of, like, being in a library and being talked about, and everything else just kind of, like, gets washed into oblivion. And so I think that this is kind of this interesting theme that keeps coming up, but they do seem to be distinct ideas: kind of a government FOIA redaction, right, and the idea of an expert, an elitist expert, that we trust to curate for us as a society.

[Cory Doctorow] I think you can bridge the two, maybe, by saying that there’s a difference between free speech and compelled publication. But where that breaks down is where state inaction or, you know, deliberate policy choices lead to monopolies or near-monopolies over channels of communication. So, if the government has sanctioned or taken steps that create a monopolistic conduit for publication, like Facebook being allowed to buy all of its competitors, say, and then Facebook is also allowed to treat its platform, and the choices that it makes about censoring a publication, as a purely private matter, then we move from the idea that the people have an interest in the state not censoring them, but also an interest in the state not telling them what to say, to an area where we might say: well, Facebook, if you’re gonna take the king’s shilling in the form of a state sanction on buying all your competitors and crushing anyone who might give people a place to talk that’s not Facebook, then your choices about who gets to speak and how you order their speech take on free speech qualities that the public has a legitimate interest in discussing, and maybe even inveighing against you over. Maybe the remedy isn’t to force the government to come in and tell you how your algorithms work, but maybe the remedy is to have the government come in and force you to sell off all of those divisions that you bought to consolidate all the speech in one platform, so that we can have a diversity of places where decisions are being made about who can speak and how that speech is ordered.

[David Copeland] So, a very similar kind of thing happened in America in the 1770s, when we’re looking at the idea that it’s perhaps time we separate ourselves from Great Britain. And we have two groups of people, we have Patriots and Loyalists, just to put it in simplistic terms, and newspapers are printing things from both sides. But you get to a point where those who are Patriots decide that it’s more important for their ideas, of freedom and independence and the tyranny of Great Britain, to be heard, and to be heard solely, versus those who say, perhaps there’s more going on here, and being a part of Great Britain is a great advantage for those of us in the American colonies. The Patriots have the most power and decide that it’s best, with a bit of coercion, to shut down any Loyalist voice. Then it’s the same kind of idea that’s happening, but the government is not dealing in this; we have a sense of censorship taking place when those who are in the position of power can use that position of power to subjugate the minority voice. And you have to remember, when we’re talking about 1770s America, we only have one option of technology for sharing information, and that’s the printing press, and if you control the one voice, or the one technology, and are not allowing the separate voices to be heard, then you’ve essentially shut down any kind of discussion, discourse, and you’ve censored what society can hear.

[Adrian Johns] I’d like to ask, I actually want to ask you a question about this. So, I’m out of date with where the field is on this, but it used to be that there was a view that in the mid-18th century, one of the things that sets the American newspaper press apart from the European ones, and the British ones in particular, is that in Britain you have, by say 1720, 1730, quite a diversified newspaper landscape, in London especially, and actually out in the provinces as well, where individual newspapers are partisan. But in colonial America the idea was, there’s certainly partisanship, but with individual newspapers, each paper is like a field of battle where people go in and they fight within the pages of that newspaper.

[David Copeland] Correct.

[Adrian Johns] And that changes later, right?

[David Copeland] Yes.

[Adrian Johns] So is this still, is this actually right?

[David Copeland] Yes, you’re exactly right. The idea was, so Benjamin Franklin basically wrote that his paper was open to all sides, because we all needed to hear both sides, therefore the debate could happen within his paper, The Pennsylvania Gazette. After the French and Indian War, about 1763, when Britain decides that it can tax newspapers with a stamp tax on paper, the idea begins to change, and partisanship begins to grow, and it happens from that point on. The papers become a collective: you’ll have a small group of papers that might be considered Loyalist papers, and you’ll have a larger group of papers that will be considered Patriot papers, and they won’t fight against one another. I think the most interesting case, if you want to talk about what would be considered biased news, was a newspaper called the New York Gazetteer, which was obviously in New York City. It had a huge reach; its actual subtitle claimed it served from Quebec all the way down to Maryland. After the Battles of Lexington and Concord, the paper ran a report about what happened there. It was a four-column front page. The first two columns were affidavits of colonials who were at the battles, talking about what happened. The right two columns were a speech by a colonial governor from New Jersey, saying, well, we need to slow down and look at this in a different way. The paper was considered to be biased because it had that in the paper. The paper’s editor, or printer, was arrested. He was thrown in jail, and eventually all of his printing materials, his press and all, were destroyed, because that was considered to be biased publication. That’s what I’m talking about when I say the Patriots came to the point where they actually controlled what was gonna be said, and alternative voices couldn’t be heard. So it was working exactly as you said.

[Ada Palmer] Some of Kate’s comments made me think, you know, when we’re conceptualizing censorship or information control, or the blurred space between them, and the difference between what we’re looking at when we’re looking at the state redacting documents versus Facebook controlling what appears on its platform. You know, I thought for a moment you were gonna go in the direction of talking about how powerful the axis of us-them construction is when you’re trying to think about something that you’re labeling as censorship, because what you think of as the us in a situation versus what you think of as the them can be fluid from case to case, depending on how people are framing things. If we sit here discussing how the Bush administration doctored documents about the inspection of Iraq’s Weapons of Mass Destruction Program, the us versus them in that discussion is the government, slash that administration, versus us; and within that us is we sitting in the room, also within that us is the press, also within that us is the tech industry, also within that us are things that are not America. If we’re talking about Facebook, suddenly maybe the government is in the us group and Facebook is the them group, because it’s the government that we’re hoping might be able to break this up. If we’re talking about, suddenly, you know, the press giving us fake news, the press, which was in the us group in both of those earlier conversations, is now in the them group as we construct that particular axis of conflict. And I think often, as different people are framing discussions of censorship or information control, especially accusatory ones where they’re trying to say x is guilty of inappropriate conduct, you want to think about how they’re constructing an us versus them. When the Nazi press under Hitler uses the accusation of Lügenpresse, lying press, against the press, they’re trying to construct a way of thinking in which Hitler, the Nazi Party, and you, the reader, are us, and the press is the them. It’s a very powerful tool, where and how manipulable that line is between what’s in the us group and what’s in the them group.

[Kate Klonick] No, I completely agree. So one of the conversations that comes up when I speak with these tech companies about their speech policies and everything that’s going into this is, like, who is the us? Who is the public? Like, what are you trying to create this content moderation system for? And jurisdictionally, there are no boundaries. If you are sitting in India, it is anathema to you that there are two men kissing and that Facebook won’t take it down. And meanwhile, if you’re in France, you think that it is ridiculous that they can’t show women’s breasts on Facebook. And so there are all of these different cultural norms, yet the internet seems to really kind of expose to us that the things that we want to see or don’t want to see are not necessarily in line with where we just kismetically landed when we, like, fell out of, you know, wherever, and just decided to live somewhere, be somewhere, grow up somewhere. And so they have to use one standard, Facebook or Twitter, because it is impossible for jurisdictional reasons to decide which standards or which rules to enforce, so they use one standard of things to take down or keep up. But that is constantly changing, and one of the things that you’re talking about with the us versus them is that, I think most of it is, the them is the tech companies and the us is everything else, and that line is just now, in my opinion, people are just starting to see it slightly differently. And I don’t know why that is, but after doing this for, like, four or five years, there’s just suddenly been a change in the narrative. But it also reminds me, you were talking about how, you know, they used to censor, you know, black liberation movements or black revolutionaries in the U.S., and now, with time and history, we have the post hoc realization that that was a terrible idea.

[Kathleen Belew] But they’re still doing that.

[Kate Klonick] Oh yeah, but we, you know.

[Kathleen Belew] I mean, not COINTELPRO, but there’s still the designation at the FBI of, like, black identity extremism.

[Kate Klonick] And likewise, we probably think that there should have been more censorship of KKK-type stuff, or, like, early Nazi propaganda, or things like that. But I’m curious if, like, the post hoc rationalizations, or the post hoc realizations, of some of this stuff, and the slow march and change in norms, make it easy to look back and think that some of these censorship decisions should have been made one way or another.

[Kathleen Belew] I actually don’t. I personally am not in the camp of censoring Nazi and KKK propaganda.

[Kate Klonick] Me too.

[Kathleen Belew] Mainly because I think that, as I’ve said before, this is a movement that’s tried to disguise the ways that it was a movement.

[Kate Klonick] Yes.

[Kathleen Belew] And it is most commonly depicted as being nonsensical and as isolated, lone wolf violence, right, single-actor events, even in mass casualty events that involve a whole lot more actors. And I think the propaganda is one of the ways you see the social movement. And I think seeing the social movement is how we get more effective criminal prosecutions and civil prosecutions, and that’s what I would like.

[Cory Doctorow] So there’s a, sorry I think you were done.

[Kathleen Belew] Oh no, go ahead.

[Cory Doctorow] There’s a correlate to this in the early German debate over censoring child pornography and images of the sexual abuse of children using a national firewall; Germany was one of the first Western countries to put up a national firewall. And survivors of sexual abuse in Germany, or at least a subset of them, argued against it, because they argued that there was an out-of-sight, out-of-mind problem. They said, you know, Germans are leaving Germany to go to Thailand to abuse children, and if we block those images, then no one in Germany will take an interest in prosecuting those Germans who do it. People will evade the blocks; the people who are dedicated to looking at the abuse of children will find ways to get around the blocks. What won’t happen is the kind of public realization that our countrymen are engaged in this bad conduct. And, you know, where that kind of plays out today is this tension between, do we censor extremism on online platforms, or do the police use it as a base for surveilling and disrupting activities that they want to undermine? And this is the great fight, I think, within security services, and between security services and online platforms. And, you know, when the FBI argues against allowing the use of working cryptography by civilians, they say, well, if we allow bad guys to use cryptography, then they will go dark. The going dark problem. We won’t be able to see them anymore. But, you know, the opponents of this always point out there is no going dark problem that we can see. What we actually have is not going dark. We have all the worst people in the world standing on the tallest platforms they can find, shouting as loud as they can, waving their arms and identifying themselves for law enforcement. Where is the going dark problem? You know, and are we all going to lose the ability to protect our bank balances with working cryptography because you’re worried about a hypothetical future in which these megalomaniacs who can’t shut up suddenly realize that crypto is how they’re gonna talk to each other?

[Kate Klonick] And also just to add to that, I had similar concerns, and one of the things that's fascinating to me, though, as a counterpoint, is that empirically it looks like it does have a major effect to take extremist Muslim recruiting, and extremist, kind of, KKK and white supremacist recruiting, off of Facebook.

[Kathleen Belew] Yeah, well wait, yeah.

[Kate Klonick] And, like, it actually does stop and slow recruiting and has that effect. It does also raise this going dark type of thing, but, yes.

[Kathleen Belew] I think recruiting is different from censoring the entirety of the speech though, right. Like, I, so in my archive there’s a whole variety of kinds of things that one might censor, right. It’s everything from cartoons and posters and outreach campaigns, like, in the 1980s they were already doing things like white student unions coming onto campuses to try to stir things up and generate recruits in that way. And then there is a violent underground that’s, like, assassination lists, right. That doesn’t, I don’t think that falls under free speech, right, because it’s already…

[Kate Klonick] Incitement.

[Kathleen Belew] It’s incitement, thank you. Sorry, I’m not a legal scholar. There’s other kinds of speech, like, the people that I look at have even tried things like seditious conspiracy, but that’s also not about speech so much as it’s about the actual criminal thing they were planning to do, right. I think it’s the criminal prosecution, and, oh, yeah, the thing about the empirical evidence showing, the thing that I think is super effective are criminal trials and civil trials that have had the byproduct of things like consent decrees that stop people from communicating with one another. Like, breaking the social movement is the thing that I think is effective. And I think that might be happening…

[Kate Klonick] Yeah, for specifically what I was speaking to.

[Kathleen Belew] Yeah, yeah, yeah. So like that’s, but that’s the thing, I wouldn’t, I mean I don’t.

[Kate Klonick] You put in more friction. You create friction: Facebook is pretty frictionless; cryptography, the dark net, has a lot more friction in it.

[Kathleen Belew] Yeah.

[Kate Klonick] It is more difficult to sign on and make messages and communicate.

[Kathleen Belew] Yeah.

[Kate Klonick] And so, like, making some of these frictionless services not available for this type of networking and this type of communication and planning is like what people have found to be kind of, like,

[Kathleen Belew] Yeah.

[Kate Klonick] What makes a big difference.

[Kathleen Belew] Yeah, but so is that really about speech or is that association?

[Kate Klonick] That is, I actually, I think that that’s a great question.

[Cory Doctorow] They’re both the First Amendment, right?

[Kate Klonick] Yeah, I mean, they’re both. Yeah, I mean, they’re both First Amendment, neither of which, I mean, unless if the government is involved here in some way, neither of which matter so, like, but they are protected, they’re ideas that we kind of want to protect.

[Cory Doctorow] Can I push back on that just a little? You're right that it's not a First Amendment issue if the government's not involved, but the First Amendment upholds the principle of free association and free speech. The reason that we value it is not because it's in the Constitution; it's in the Constitution because we value it. And so if something negatively impacts free speech and free association, it does matter, it just doesn't matter as a constitutional matter. It matters as an ethical matter that is coterminous with the Constitution.

[Kate Klonick] Oh, completely. I mean, I would say that that would be, I would say that that’s like a free speech value or a free association value, or, like, a First Amendment value, but it’s not a First Amendment issue. I was being a little, sorry I was being, a little, like, kind of pedantic.

[Cory Doctorow] Right.

[Kate Klonick] Like, just as a First, you know, but…

[Kathleen Belew] Kate, would you talk a little bit more about, so I was briefly after my book came out, consulted by Facebook about how to regulate hate speech on Facebook, which is one of the weirder conversations I think I’ve had as a historian. I’m like, I don’t know things after 1995.

[Cory Doctorow] Take away all the modems.

[Kathleen Belew] Like is there an off switch? No, but I, the impression that I had from the conversation that I had with this team of people.

[Kate Klonick] Which year is this?

[Kathleen Belew] This is this year.

[Kate Klonick] Okay.

[Kathleen Belew] Was that it is a very small and embattled group of people who have been tasked with the entirety of Facebook, and their job is to identify and stop all white supremacist hate speech on Facebook, and it’s like, it was like, I don’t know, like 15 people or something.

[Kate Klonick] Yeah, yeah.

[Kathleen Belew] It’s like a tiny, but it was very, very human, like it wasn’t an algorithm, it wasn’t, like, it was like a thing gets identified and then it gets escalated and then it has to get double-checked and then it has to meet a standard of criteria. But it was complicated and they were like, I mean you should talk about this. But it struck me that there were many, many points of individual decision-making where something could go dramatically one way or the other based on who was looking at it that day and how tired they were and, like, how overworked.

[Kate Klonick] Completely, so just to kind of, this is something a lot of people don't know, mostly because it wasn't public until very, very recently. I'll just speak about Facebook, 'cause it has been more or less the most transparent about this. Content moderation, deciding what stayed up or got taken down, was not a huge priority for Facebook until about 2008. Until 2008, Facebook had a group of about 15 college-aged kids sitting in a room, and when something was reported or flagged they would decide; literally the standard was, if it makes you feel bad, take it down. Which was a functionally fine thing when the users of Facebook were mostly American college-aged students and they were being moderated by American college-aged kids, right. So this is kind of fine. When it starts going global and everything, this quickly falls apart, and there's a team of, it was 10 people, that were tasked with basically writing all of the rules of what was going to happen. Except it turns out that, like, Michelangelo's David is a naked body, but also it is art. And so people want it to stay up, but how do you explain that to content moderators in India? Basically, when you flag something, you check a box on why you think it breaks their standards and rules, it goes through a queue, and it shows up, with privacy information stripped, on a computer of someone in Dublin, or Warsaw, or Hyderabad, or the Philippines. And these are people who sit at computers for many hours a day and just, like, click through stuff that other people think is violent, or extremist, or nudity, or pornography, or hate speech. So when you said it's, like, not algorithmic at all: it is not algorithmic at all. This is still how it's done. It is like 1% algorithmic, and we can talk about the parts of that that are kind of algorithmic later, but it's a completely human process. And how does someone who doesn't maybe know Italian art know that David is art and not something else? How do you train someone? They basically had to create a set of norms and cultural values, and for a long time people said, oh, they're creating one normative, objective set of cultural values. No, they're just creating Facebook's. And then they're training everyone on that, and that's how they're kind of doing this. And so when you say that they have a team that looks first through this content, I've talked to those people extensively. They used to have these crazy days. This is just like a fun story, kind of. On 4/20, Hitler's birthday, they would have what was called Blitzkrieg. They normally only respond to stuff that people reported, because it is just too large a task to go and take down stuff proactively; that's changed in recent years around terrorism and things like that. But they used to go, one day a year, on Hitler's birthday, and just, like, search out and try to find all of the extremists, all of the extremist content, all of the white supremacy…

[Cory Doctorow] It’s spring cleaning.

[Kate Klonick] And all of the, yeah. They called it Blitzkrieg, and they just, like, took down all of the Neo-Nazis and just deleted their accounts and pages. And it's also a very dark world; the people that do this have, like, extreme PTSD. Basically, whatever terrible incident you can think of, whether it was, like, the Newtown massacre, or the Boston Marathon bombing, that is ten times their worst day, because they are also seeing the horrendous stuff that people are saying in response to it, at, like, full blast, and taking it down constantly.
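To make the flagging-and-review flow just described concrete, here is a minimal, purely illustrative sketch; the queue, criteria, and verdicts are invented for the example and are not Facebook's actual system:

```python
# A minimal, purely illustrative sketch (not Facebook's actual system) of
# the human-in-the-loop flow described above: a user flags content with a
# reason, identifying details are stripped, and a reviewer applies written
# criteria, escalating anything that is not clearly covered.
from collections import deque

review_queue = deque()

def flag(content, reason, poster_id):
    """A user reports content; privacy info is stripped before review."""
    review_queue.append({"content": content, "reason": reason})  # poster_id dropped

def review(item, criteria):
    """First-pass human decision; unclear cases get a second look."""
    check = criteria.get(item["reason"])
    if check is None:
        return "escalate"                 # no written rule for this flag type
    verdict = check(item["content"])      # reviewer applies the written rule
    return verdict if verdict in ("remove", "allow") else "escalate"

# Example: a hypothetical written rule for one flag type.
criteria = {"nudity": lambda text: "allow" if "statue" in text else "remove"}
flag("photo of a marble statue", "nudity", poster_id="user123")
print(review(review_queue.popleft(), criteria))   # -> "allow"
```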

[Cory Doctorow] So one element of realpolitik about what Facebook's standard ends up being, and what all these platforms' standards end up being, is where there is an enforcement nexus. Early on in the growth of the platforms, when their investors started to demand really big growth and when they started to worry specifically about the quote next billion, the next billion internet users, there was a real rush to put advertising sales offices in all the countries where they thought they might experience growth. And an advertising sales office is a bank account and some people who can go to jail if you don't follow court orders in a country. So as soon as you start setting up advertising offices and bank accounts in former Soviet republics and Asia, or in sub-Saharan nations with autocratic rulers, then you are also jurisdictionally tied to them, especially in the early days before we had a lot of national firewalls. It was really common that a company like Twitter, for example, might get sales calls from overseas; someone might call up Twitter's office in New York or San Francisco and say, I would like to buy some ads, and I am in this country where the things that you do are illegal, but my users, you know, my customers will see them. So Twitter could still get the business, Facebook could still get the business, but as soon as a couple of them started going to those countries and hiring local sales staff, they got the lion's share of the business. And so they all chased each other into these markets where they had a jurisdictional nexus where they could be enforced against, and this is what has caused all of the standards to converge on a kind of race to the bottom of the most restrictive possible regime. It reminds me a little of Ada talking about how the Dutch booksellers would get the Vatican's list of banned books and just print those, which was a thing they could only do so long as they didn't have a branch office in the Vatican, at which point it would be a lot harder.

[Ada Palmer] Yeah, or the localization of the abstraction that is the standard of obscenity. A lot of modern censorship law, as we've discussed many times, often it's pornography that is the edge on which legal precedent is set, and in places like New Zealand and Australia, and indeed the U.S., the standard will be obscenity. What is obscenity? It's something that is indecent. What is indecent? You know, to quote the actual Supreme Court case in the U.S., "I know it when I see it." But there is no strict definition of it. You look at it and you say, this is obscene. Which is on the one hand trying to solve that problem of is Michelangelo's David going to be left up on Facebook or taken down, and if your standard is, as it was for that Facebook room of college-aged kids, does this make you really squicky, photos of Michelangelo's David are gonna make you really squicky if you're from the correct kind of society. And indeed when the David was first put up in Florence in the Renaissance, it made everyone really squicky, and they covered it with a belt of leaves made out of bronze, and those were taken down eventually. But the oldest photographs we have of it, from the later 19th century, have a fig leaf on it, though we don't know when that was put on or actually even when it was taken off. And then a copy of it was given to the Victoria and Albert Museum in London, and the Queen visited and she didn't like it, so they made a fig leaf to put on it there. So, just because it's art: the standard of what art should be changes over time, the standard of whether this makes you uncomfortable. And having that vague definition of what is obscene, it is a thing that makes you uncomfortable, it is a thing that hurts you, it is a thing that causes you internally to experience a kind of negative experience and distress, is on the one hand absolutely impossible to rationally define, but on the other hand flexible, which is both good and bad. Good in that it can be flexible, bad in that it can be exploitable.

[Kate Klonick] Well, it’s interesting ’cause you want to know how they ended up defining art, which is I think fascinating, which is, like, oh is it a naked body? Okay, is it made out of wood? Is it made out of stone? Is it made out of metal? Is it made out of mud? Is it made, it’s like…

[Kathleen Belew] So it can’t be flushed.

[Kate Klonick] So they have to literally be able to see, you have to be able to see whatever it is that you're looking at, because there can't be a standard, just for the reasons Ada explained. There can't be a standard because there is no standard; one will never exist and we will never be able to lock it down. So instead they have to look, and the answer to that is that sometimes it comes out wrong, and when it comes out wrong, it comes out pretty wrong. Some of the big controversies that we see, like the napalm girl controversy, the Terror of War photo, which was a photo of a, I think nine-year-old, girl running after a napalm bomb attack on a village in Vietnam. It's a very famous war photo, and a famous author in Norway posted it on his Facebook feed and it was reported and taken down, and there, you know, the context of that is all that matters, but you can't see that in the photo. It's not visible in the photo itself, and so what they struggle to do is base things on what is visible in the photos themselves. Graphic violence is defined as: can you see the outsides on the insides, or the insides on the outside. And like, there's like, I mean you can like…

[Ada Palmer] The insides of the body.

[Kate Klonick] Like, yeah, like that’s gore. Like we don’t, or they don’t allow that.

[Kathleen Belew] But, like, strangulation is fine?

[Kate Klonick] Yeah.

[Cory Doctorow] But dental surgery not so much.

[Kate Klonick] Yeah, it’s exactly right. And so there’s all of these, like, there’s all of these exceptions, and, like, and so there’s rules about, I won’t get into all of them, but I think that it’s interesting because they do recognize that this is a completely Sisyphean task that they can never actually win or fight, or pin down, but they’ve done it, and, like, as they said, the goal is not, like they’re never going to get it completely right, but they want to get it the most, the least amount wrong that they can possibly get it.

[Kathleen Belew] That’s, the conversation that I had with them was about whether or not there can be a peaceful ethnic cleansing. That’s what they wanted to know. Because the standard doesn’t prohibit the speech of what these groups would self-define as white nationalism, and I would push back on this quite proactively in many cases, but it does prohibit violence, so the question was when somebody says ethnic cleansing or peaceful ethnic cleansing can that ever not be a violent idea, which is getting, I mean that’s getting into many levels of rhetorical gymnastics to define.

[Kate Klonick] And who are the people doing this? They are literally people who are, like, 25 or 30 years old and, like, grappling with these questions, 15 people that just randomly happen to work there, and they're struggling.

[Kathleen Belew] And who also thought like, hey, let’s get a historian to decide, like, ’cause I have no idea what, like, why they chose me to answer that question, or if I wasn’t really…

[Kate Klonick] They’re probably talking, they talk to tons,

[Kathleen Belew and Kate Klonick] They talk to a lot of people.

[Kate Klonick] But yes.

[Kathleen Belew] Okay, that’s good ’cause I, like I said, I can tell you what that meant between 1979 and 1995, but I really have a historian face when they ask me to go forward in time from there. You know, I have like a thing.

[Cory Doctorow] In the early 2000s, the website I co-own, Boing Boing, was blocked by a censorware company, one of the companies that provides censorship for corporations and schools in Bahrain and the UAE, called SmartFilter. And they blocked us as a nudity site because we had Michelangelo's David. And it was kind of a homeopathic standard: one drop of Michelangelo's David's penis on a website with tens of thousands of pages classed the whole site as a nudity site. But, you know, it was very ironic, because the company itself was on the one hand staffed by people who had kind of fringy interests, so you know, the CTO's only online footprint was his discussions on Usenet of his sexual proclivity involving diapers and how he liked to wear a diaper and so on. And the CEO was on probation for sexually interfering with a minor. So these people were making these decisions; they weren't callow youths who were traumatized by looking at atrocity photos all day, they were, you know, weirdy, pervy old dudes who were deciding whether or not Michelangelo's penis was safe for the kingdom of Bahrain and corporate America, and their answer was no. It was a genuinely surreal time in the history of the publication as we were fighting with these guys.

[Ada Palmer] That also reminds me of how objects of art and literature that are already in a position of cultural privilege are very frequently used as the tools to fight back against ineptly drafted censorship material. So, you know, you're trying to argue this is a bad system: look, it censored Michelangelo's David. The new EU standards are a bad system: look, they would censor Shakespeare if you put Shakespeare into the database. There's a case in the 1930s, or maybe 1920s, in New Zealand, where New Zealand had passed a new, more restrictive law, and the journalists who were working to try to take it down said, okay, what do we do, let's bring suit against Boccaccio's Decameron, because Boccaccio's Decameron is something that this law will censor. And this went up to the highest courts there, and they managed to get the law, you know, lifted, because they're defending The Decameron. But you're using works that are in a position of historical privilege, which is why they're so effective. It means, you know, if Boing Boing had been put on that list because you put up a photo from India of an important nude statue, equally nude and equally old and of the same cultural weight for a different culture, it wouldn't have had the same power as a story for you to tell as the censorship of Michelangelo's David does. Which is one of the reasons it therefore becomes much easier to renew protections on the culture of power's works of art than it is to renew protections on less empowered cultures' works of art.

[Cory Doctorow] And the interesting, like, denouement to that story about SmartFilter is that they said that they would unblock us if we created a category called nudity. So boingboing.net/nudity and we put all the nude pages there, and then everything else could be in the not nude half of the website. And it was again, like, a really interesting well we can’t figure out when you have nudity so if you just tell us which pages are nude, we’ll just censor those ones.

[Kathleen Belew] Wow.

[Adrian Johns] Well, one of the things that I'm struck by about all this, because you see it a lot, I think, in talk about censorship practices, is that, as far as one can tell from the last 15 minutes, there seems to be no sort of empirical investigation whatsoever of the actual act of reading or viewing things, right. The stuff goes out into the world, or it's put up online, and communities, people, look at it and respond in certain ways. How they respond is an empirical question. It's as though we've, like, completely forgotten what actually does exist, which is decades of psychological, sociological, historical research on the practice of reading: what it actually does, what it does to us and what we do to ourselves when we read or view something. It's as though none of that exists whatsoever, and yet it ought perhaps to be absolutely central to any kind of question about whether you should, you know, stick a fig leaf on a David statue or something like that. I think the strategic neglect of that question is kind of interesting.

[Ada Palmer] Or the strategic exploitation. You're making me think of the most famous work along those lines that I can think of, which is Fredric Wertham's Seduction of the Innocent, which is a book from 1954 filled with statistics and studies and graphs of how looking at violent and sexually explicit comic books causes juvenile delinquency. And it was made up. And once it was discredited, it discredited the idea of studying that kind of material at all, so the appeal to data in those frameworks has also in turn been exploited in a way that makes it even more difficult to tap. Let's continue this discussion after our coffee break. Thank you all so far and we will have more. Alright, welcome back. We will begin the second half of our discussion. Does anyone have any lingering thoughts that you wanted to share about what we were talking about right before the break?

[Cory Doctorow] I just wanted to mention a local connection with redaction. There was a Chicago activist named Aaron Swartz, who was one of the founders of the website called Reddit that you're probably familiar with. And Aaron did a lot of interesting data things, but one of them involved a service that was mentioned in passing called PACER. So the federal courts put their court records behind a paywall called PACER. It dates back to the Clinton era, and it was created to do cost recovery, so they charge a dime per page to look at these public domain court records. It's supposed to be a break-even system, but they actually make $150 million a year that they use to run one computer full of PDFs. And because these documents are public domain, once you have them you can share them freely. So there's another project called RECAP, which is PACER with the letters scrambled around, where if you have it installed on your browser, every time you go to get a PACER page, it first checks to see whether anyone else has already got it, and if they have, it pulls it from a repository, and if not, once you buy it, it adds it to the repository. So, Aaron figured out that public libraries had free access to PACER, and he figured out what the most cited case law in the U.S. was, and he and colleagues used thumb drives and scripts on library computers to retrieve a million and a half dollars worth of PACER documents and put them in RECAP. And one of the things he discovered when he did this, which wasn't obvious because it was all behind the paywall, is that the documents weren't correctly redacted. Court records are supposed to be redacted to remove things like Social Security numbers, the home addresses of women seeking restraining orders against their abusive former spouses, the names and addresses of minors, trade secrets that are revealed in court and then sealed. And the court clerks were not redacting this material. And this was a thing that was understood by, like, stalkers and people who conducted industrial espionage and, you know, people who did corporate intelligence work, but not by the people who were in the court records. So when Aaron did this, the courts freaked out and the FBI freaked out, and they came after him because he blew the whistle on this improper redaction, on this lack of redaction. So it's kind of a double whammy story there with a Chicago connection, 'cause everyone's from Chicago.
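As a rough sketch of the RECAP idea just described, check a shared free repository first and only pay PACER on a miss, then donate the purchased copy back; the archive and pacer objects here are hypothetical stand-ins, not RECAP's real code:

```python
# Rough sketch of the RECAP caching idea under stated assumptions: the
# archive and pacer interfaces are hypothetical stand-ins. Consult the
# shared free repository first; on a miss, buy the document from PACER
# once and share it so no one has to pay the fee for it again.
def get_court_document(doc_id, archive, pacer):
    cached = archive.lookup(doc_id)       # has anyone already bought this page?
    if cached is not None:
        return cached                     # free public-domain copy
    document = pacer.purchase(doc_id)     # pay the per-page fee once
    archive.upload(doc_id, document)      # donate the copy back to the pool
    return document
```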

[Kathleen Belew] Just to illustrate what the problem is with charging 10 cents a page for something, so if you want to go and find, I had to do this with a trial that I work with, which is before PACER because, again, the divot of the 1980s. If you’re looking at a trial not for just the outcome and the decision of the trial but actually, like, who said what in testimony, what the evidence list is and stuff like that, we’re talking about many thousands of dollars worth of paper. There’s no finding aid like you would have in an archive, so if you want to see it you have to, like, figure out who will pay to get it duplicated and housed. So that time I had to do it through a library grant.

[Kate Klonick] Yeah, we’re not concise, lawyers.

[Cory Doctorow] Yeah.

[Kathleen Belew] No, well, and, I mean, for cultural historians, thanks for that, because the judge's banter makes it into the book and it livens my day. But no, it's just a huge amount of paper. So 10 cents a page is a huge fee in many cases.

[Cory Doctorow] And this $150 million a year web server doesn’t have a search function.

[Kate Klonick[ Yeah, there’s no control f.

[Ada Palmer] So we’re touching there, I think, on, in terms of, you know, redaction, redacting home addresses of women who are filing for court orders against abuse. We’re really on the there is no way to deny that this is harmful if it’s information that spreads. We’re not in the gray zone of does this incredibly gross image of a violent bombing victim aftermath photo, or does this extremely repugnant piece of pornography harm the viewer question. The, you know, I think, all of these are in this space of when we want to restrict some kind of information from moving, what is the bull’s eye of the kind of information we want to restrict from moving? And, you know, that and how to make cheap massly reproducible nuclear weapons are probably in the center of that bull’s eye. I would love to hear anybody who works on any time period here start to comment on, you know, what is your own sphere’s version of that and different ways that people have tried to solve that, not necessarily in the abstract drafting a principle, but in the more practical coming up with a applied how do we look out for things, how do we move them, how do we try to draw a line between information that should in some way be restricted and information that shouldn’t. Not on the free speech is good, censorship is bad principle axis, or the censorship is good, we want to protect Catholicism against wicked Protestantism axis, or whatever your local axis may be, but in the actual applied, applied to technology, applied to production of books, applied to online stuff.

[Adrian Johns] Well that the, I don’t know, maybe the best known investigation of this is a documentary film by Peter Galison at Harvard on the practice of, what’s it called, where government documents in Washington, D.C. are rendered different levels of secret…

[Cory Doctorow] Classification.

[Adrian Johns] Classification, thank you, sorry. I'm getting old now, and my brain doesn't work sometimes. And Galison found, first of all, which I still think is a vertiginous thing to have found out, that that assumption that I think we all have, that the open world of documents circulating around, the world that gets stored up in things like the Library of Congress and the British Library and the Bibliothèque Nationale and so forth, we all assume that that's, as it were, the real world of information, and the world of classified information is going to be some smaller thing off to one side. But as far as one can tell from the reports that are made every year to Congress about how many acts of classification are carried out, it's actually the other way around: the world of information that's stored in things like the BL and the Library of Congress is actually much smaller than the world of classified information. And it's not only that it's much smaller, but that they're diverging, so that the amount of secret information in the world is increasing much faster than the amount of open information, and it's, like, you know, squared: not only is it increasing much faster, but the rate at which the increase is increasing is also faster. And then he actually dug up what turned out to be the instruction manuals for training people in how to classify things, and to cut a long story short, it seems that since about the '60s, when I think these things were introduced, what's been sort of cemented in place in the D.C. administrative culture is a very antiquated philosophy of science, like positivism: that there are such things as facts and there are such things as, you know, theories and experiments and observations, and these are radically separable. And you can isolate out the facts and render those classified. And so, you know, something like that is actually what's expected of these staff, and it happens routinely. And it's a kind of weird thing, because it's something that philosophers of science pretty much abandoned maybe half a century ago, and yet it's real by virtue of the fact that lots and lots of people are constructing this enormous universe of classified knowledge based on it anyway. So that's the point that comes to mind. You can find this, I think the film is openly available if you Google it; Peter Galison's a historian of science at Harvard. I forget the name of the film, but I think you can find it readily enough.

[Cory Doctorow] I have an example, if no one else is gonna go, then, a contemporary one, which is the reporting of defects in software. So under normal circumstances a fact about a defect in software would be lawful to disclose. Facts can't be copyrighted, the First Amendment protects your right to speak the truth, and so on. There are real ethical questions about disclosing defects in software. There's a kind of presumption that the best way to do disclosure is what's called coordinated disclosure, which is when you tell the manufacturer in advance, you give the manufacturer an opportunity to make and push out a patch to all of their users, and then you reveal, by the way, there's a defect in these pacemakers that lets you attack them wirelessly and kill the people who have them, which is an actual defect that a security researcher did in fact discover. The problem is that firms have a strong dispreference for the disclosure of defects in their products ever, and they dress up this dispreference as a set of responsible disclosure or coordinated disclosure criteria, which is basically, like, well, you know, you have to give us more time. We have to spend more time to see if we can replicate this defect. We don't know if we can ever patch this defect, so you can't ever tell people about it. You violated our terms of service by even looking for this defect, and so on. And there is, I mean, a really good reason to want coordinated disclosure, right. Like, if you are someone whose car can be driven remotely over the internet, as 1.25 million Jeeps could be when they were recalled in 2015: Chrysler had built a car with a wifi hotspot in it that you could pay to turn on for 24 hours so your kids could watch Netflix on long drives, and their security assumption was that because it used a Sprint SIM, and since no one had Sprint, no one would be on the same network as them, and as a result it would be secure, and so therefore it would be okay to connect the CAN bus, which controls all the car's electronics, to the entertainment system. And so over the internet you could drive Jeeps, right. And, you know, so if you're someone whose car has got this defect, or whose pacemaker has this defect, or whose phone has this defect, or whose home camera, your burglar camera that's in your bedroom, has a defect that allows third parties to switch it on without any outward indication and watch you in your bedroom, that sort of thing, you don't want that just put on the internet before you have a chance to patch the bug. But the tension is that if companies get the right to decide who can disclose a bug, then by and large they just say no one can disclose a bug, and then the bug doesn't get disclosed until it's so widely exploited, because it's been independently rediscovered by researchers who want to use it to attack people and not researchers who just want to warn people that it exists or get the credit for having discovered it, that finally the manufacturer has to disclose it. And so manufacturers have kind of back-formed a set of legal tools for suppressing defect disclosure. One is by invoking a Clinton era law called the Digital Millennium Copyright Act, section 1201 of which prohibits bypassing access controls for copyrighted works.
So this was originally to stop you from reprogramming your DVD player to play out-of-region DVDs, but now it's used to stop people from bypassing a lock that would allow you to audit the code of a device and figure out whether it has defects. The other is the Computer Fraud and Abuse Act, which is a Reagan era law passed after Ronald Reagan saw the movie War Games and was worried about cyber crime. Seriously, not making that up. And it's very broadly worded, but it says that you can't exceed your authorization on a computer that doesn't belong to you, which means that if you log into AT&T's billing system to see whether or not it will show you other people's billing data besides your own, and the license agreement you clicked through says you're not allowed to investigate this system for security defects, then investigating for security defects is a problem. And so, notoriously, a Neo-Nazi named Weev, who's also a security researcher, one day went to AT&T's billing service and found it had a long url that ended with a number, and that number was his customer identifier, and that if he changed that number by one, he was looking at a different customer's records, with all of their financial information and address and so on. And when he came forward with this, he was convicted criminally of violating the Computer Fraud and Abuse Act, mostly 'cause the judge didn't like him, 'cause he was a Nazi. But he was put in jail, and we ended up running his defense, not because we like Nazis but because we don't want people who aren't Nazis to go to jail for warning millions of people that AT&T is letting their financial information hang out there. So here's this information that there is, like, a legitimate reason to maybe want not to have disclosed, maybe in the case of medical implants we want the FDA to coordinate disclosure, but which we definitely don't want manufacturers to coordinate disclosure of. Manufacturers have arrogated that ability to themselves. And the worst part of it all is that in the face of manufacturer threats, the new sexiness of how you disclose defects is you just anonymously stick them on Pastebin, right. So this is the new coordinated disclosure: I just dumped all the defects on Pastebin. No one can trace them to me. I just felt you should know. The manufacturer doesn't get to know. Nobody gets to know in advance. The criminals find out the same day the people who have the devices find out. And God help you if you get attacked before the manufacturer creates a patch.
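For readers unfamiliar with the bug class in the AT&T story, here is an illustrative sketch of an insecure direct object reference: a record fetched by a guessable sequential number with no check that it belongs to the logged-in user. All names here are hypothetical, for illustration only:

```python
# Illustrative sketch of the AT&T-style bug, an insecure direct object
# reference. All objects and field names are hypothetical placeholders.
def get_account_vulnerable(db, record_id):
    # Anyone who increments the number in the URL sees someone else's record.
    return db.fetch(record_id)

def get_account_fixed(db, record_id, session_user_id):
    record = db.fetch(record_id)
    if record.owner_id != session_user_id:    # authorization check on every access
        raise PermissionError("not your record")
    return record
```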

[Adrian Johns] This is slightly different but so, if Ada’s question was something like what are the rules for deciding what gets censored or not.

[Ada Palmer] Or what have been attempts at rules.

[Adrian Johns] Yeah, I mean it’s worth, if you think back historically at episodes of censorship, we go back to the Darnton era in the 18th century or the colonial America, and actually further than that, if you remember back to when Milton was writing in the 17th century, or the Papal Index people, you know, in the same kind of period, your senses are not like the people in charge of government typically. Though actually, some of Darnton’s are impressively high up. But often they are sort of junior chaplains in the office of the archbishop of Paris or something like that. So they don’t have any privileged access to, sort of, the thoughts of the Royal Court. So when books come into their office and they’re told to censor them, and it’s about, say, the military powers of Prussia, and you’re France or England, it’s not actually that easy to know whether the military powers of Prussia is supposed to be lauded or denigrated. And, in fact, you find it’s not that rare for censors to get fired or punished or something because they misinterpret what they think they’re supposed to be censoring because they think that, you know, it’s rather like, you know, we’ve always been at war with East Asia, so that there’s, to give you an example of this, there’s actually a guy in 1670s in London who is charged with writing a book about Dutch military powers. And it’s one of the, England and Holland go back and forth about being at war in the 1650s through 1680s. And this is at one of the times when they aren’t at war. So he writes this book about how great the Dutch are, and by the time it comes out, they’re at war. So it turns out this is actually the opposite of what the state wants it to say, even though the state actually commissioned the book. And he gets put into, you know, he gets into lots of trouble by that. And this is actually, it’s not only quite common to find that people would get sort of into trouble for this, but if you look for the moments when they do it, it can be very revealing about what the sort of taken for granted expectations of a censor are. What are you supposed to know? How are you supposed to act? Usually that’s invisible. It’s made visible at moments when somebody does something disastrously wrong like that. There’s another moment right up to 1688 when there’s a, there’s a problem with 1687, so in 1688, the English king is terved out and goes to exile in France, and this is what in retrospect is called the Glorious Revolution. There’s a big problem though with explaining what actually happened then because you can’t quite say that the people rose up and threw out the king because that’s too much like an actual revolution. You can’t quite say that, I don’t know, that the king was deposed in some other way. You’re also, so in the end the thing that more or less gets settled on is that the king abdicated by virtue of kind of leaving London, although in fact that’s a rather dubious thing because he came back to London and tried to stay in London, so they had to kind of whitewash that a little bit. But one of the, there’s a moment when one of the censors in the wake of 1688 licenses a book called King William and Queen Mary, Conquerors that argues that what actually happened was that William and Mary, the monarchs of The Netherlands came in and conquered England, and so they are rightful monarchs by virtue of conquest, which is an ancient sort of law category. You could say in principle that was possible. 
But it’s politically completely unacceptable to say that they came in by conquest. You had to have some line about why the previous king of the century abdicated. And he gets into enormous trouble and is eventually thrown out of office for that. And there’s a certain amount of documentation about what he was supposed to have done. So, it’s like a historiographic rule of method. You look for these moments when somebody gets into trouble for sort of getting the rules wrong even though there are no written down rules. It’s actually like science in that respect. You look at the moments when people get it wrong. Very revealing.

[Cory Doctorow] I’m reminded of, there was a World of Warcraft moderator who suspended a user’s account for advertising a guild for LGBT players on the grounds that Warcraft’s terms prohibited harassment of people on the basis of their sexual identity, and advertising your sexual identity in Warcraft would give rise to harassment, and it, you know, eventually the moderator was overturned after kind of a public outcry, but again in terms of revealing, like, the submerged assumption of Warcraft, which is that you don’t, you know, like the first rule of Warcraft club is nobody talks about who they want to have sex with, right. And it was, you know, I clearly what had shocked the moderator wasn’t as much the LGBTness, although that was clearly part of it, it was just that having a guild where sexuality was part of your identity in a guild where you were also an orc, you know.

[Kate Klonick] I think that I’ll just give the example a little bit of, that I don’t think I mentioned before, which was that there was, and it relates to what we were talking about, what Cory was talking about with transparency and the problem with revealing rules, or excuse me, the problem of revealing, the problem with transparency, like the double edged sword of transparency. And, so Facebook in particular, one of the reasons that it took me so long to write this, or the piece that I wrote about Facebook’s history and YouTube and Twitter’s history, was that there was no transparency in what the rules were. There was a set of community standards that was posted for being on Facebook, but, and it said things like no graphic violence and no sexual nudity, but it did not say anything more than that, and what, and so there’s no way to know that they had an exception to nudity for art, right. And there was no way to know that, but what they did have was behind the scenes a secret 90-page Wiki document that was used by the content moderators to do all of this work. So it was a secret list of rules of exactly how you would interpret and decide whether something was sexually impermissible nudity, right, impermissible sexual nudity. And that was completely hidden from view. In 2012, a friend of mine and a journalist, Adrian Chen leaked a copy of the training documents, it wasn’t actually the list of rules, but leaked the training documents that content moderators in the Philippines were trained on, from outsourced group had decided, like, that Facebook had hired to get, to have workers. Someone leaked their, Facebook fired the entire company and the slides are still out there, but they kind of tried to shut it down. People didn’t really know what to do about it. We didn’t hear anything or see what those rules are since 2012 until they released, finally released a kind of whitewashed, cleaned up version of them about 3 months ago. But you will also notice that they are not on one page. You have to click through multiple types of pages to get to them, so you cannot control, and they’re not control f-able. You can’t control f and search them at all. So that’s purposeful because one of the reasons that they gave for years is that if we tell you exactly how we’re taking up your content, taking down your content, then you can game the system. Then people can decide and figure out exactly the loopholes. So the example I give of the problem with rules is, so standards in the law, standards are things like don’t drive too fast. A rule is something like don’t drive over 35 miles per hour, right. But the problem with rules is that if you know that there are certain exceptions to it, then you can break them, like that you can go three miles over the thing and they just won’t bother bringing a case against you or writing you a ticket, right. The example I give is like my crazy uncle was in the military and he had beer bottles lying around his bunk, in his bunk, and his, like, sergeant came in and was like clean up these empty beer bottles, we don’t allow empty beer bottles, like, in this, like, thing, and so he filled them all with water, and then like left them, like left them perfectly in place, right. It’s kind of like if you know, like, the exact rule, you can violate the spirit of it very, very easily. And so that’s kind of why Facebook didn’t want to disclose these for such a long time.

[Ada Palmer] So to repeat that, the comment was about Reddit, which has also struggled with how to police sub-communities, Reddit being a discussion forum that allows you to make sub-folders for discussion of a particular topic. There might be one for discussing fantasy books. And there might be one for discussing specifically my science fiction books. But there have also been ones for hate speech, and underage pornography, and, you know, posting pictures of fat people in order to share fattist, prejudiced comments, and Reddit trying to figure out how to police these communities in a responsible way also falls into this difficult space. Partly paraphrasing that and partly adding to it: Reddit is an interesting space for an intersection of anonymous and non-anonymous participation, because when you have an event like an AMA, which stands for Ask Me Anything, where an author will say, I'm gonna log on at a particular time and you can ask me questions, or, you know, anyone can do this, if you want to schedule an AMA on Reddit, you have to prove your identity. You have to take a photo of yourself holding a piece of paper that says, I will do the Reddit AMA at this time, and share that on your social media, and they'll certify that it's you, and then you do it. But the people who are asking you questions are all doing it anonymously and pseudonymously, so you have an interesting space in which, if you want to be the participant who is the focus of the conversation, you're required to do so in your own identity and to prove your identity, but all of the other people with whom you're interacting, and who can then use that space as an ambient discussion sphere in your absence, can do so anonymously and pseudonymously. So it's hitting a space where those two things overlap. I think the intersections between anonymous, pseudonymous, and mandatorily named publication, where you mandatorily put your name on it, are an interesting sphere. You know, when we look at the Enlightenment, you have these interesting mixes where a philosophe who's part of this radical movement will know perfectly well that if you publish a thing with your name on it, which is what you're supposed to do, you're supposed to get it licensed and put it out with your name on it, and it has certain material, then it'll be banned and you'll get in trouble. So they'll choose very carefully what to publish with their name on it, what to publish anonymously or pseudonymously, and what not to publish at all. And different individuals will make different choices with different levels of conscientiousness about this, such that Diderot, for example, worked very, very hard to keep all of his radical and atheistical materials hidden, especially after a youthful episode where he had published a pornographic novel about a nun, basically, arguably the first lesbian pornographic novel. He got in trouble for that and related things, and after that he was very careful. And when he's doing something like the Encyclopédie project, he's very careful, while putting out that with his name on it, to suppress anything else he might be doing that's radical, so that it doesn't impinge on it, because if he got in trouble, it would get in trouble.
But then you have figures like Voltaire, who puts out an early work with his name on it so that everyone takes it seriously, and then he gets banned from publishing, and he's like, fine, but he's gonna continue publishing, you know, illicitly, and everyone wants to read him because he's Voltaire. He just moves to an area where they can't police him and publishes all sorts of stuff which says Voltaire on it, because it's being released clandestinely. But then he has no way to police other people putting the name Voltaire on stuff. And we have letters of Voltaire writing to and from people in Paris where they're saying, Voltaire, 11 pamphlets that say they're by you came out this week, which ones are you? And he'll write back and say, well, numbers two, three, and eight are me and the other ones are not me. And there's this strange is-it-Voltaire, is-it-not-Voltaire game that you try to keep up with. And, you know, newspapers in London are reprinting chunks of things they think are Voltaire for sure, or others will put out a book saying, with a commentary attributed to Voltaire, but you can never be sure unless you have a letter in your hand from Voltaire saying, "Hi, Émilie's fine, and this is by me, or this is not by me." So, you know, interfacing strategically with anonymous and pseudonymous publication versus not is something that demands a lot of care and savvy on the part of the author and the publisher. You see the explosive counter-case in the example of Jean-Jacques Rousseau, who publishes his Émile, which is a big treatise on education, one of the first articulations of the idea that kids are not just miniature adults and respond differently, and that perhaps education should involve things like games and fun and maybe sports and jokes, and not just hitting them with a stick. Because throughout, you know, throughout my period in the Renaissance, we have allegorical representations of different things, and the allegorical representation of grammar, which is the first class you take when you start school, is a bitter old woman with a stick for beating students. You know, mathematics has a sextant, and astronomy has an orb, and grammar has a beating stick. So it's a very radical treatise. But it has in it a section where we'll have a sample of the religious education of the kid, and a vicar is brought in, and the vicar says, well, basically all religions are the same, and the supreme being who made the world doesn't actually care whether you're Muslim or Catholic or Protestant; as long as you're being a generally good person, everything's fine. And we don't know whether there's really an afterlife, but if there is, it's probably fine, and if there isn't, it's probably fine too, so whatever. And he publishes this in Catholic 18th-century France with his name on it, and his friends come to his house after it's banned and say, you know, Rousseau, your Émile has been banned, you need to get out of Paris. And he doesn't believe them, because he's so convinced that his logic is persuasive that he believes this could never possibly be banned, because it will persuade everyone who reads it. And he won't believe them. And they have to physically kidnap him and drag him out of Paris by force to keep him from being arrested by the Inquisition, because he genuinely doesn't understand that if you want to say these things, you need to say them pseudonymously.
So, we just get all of these fascinating spheres of different levels of savvy and articulation in terms of what can be said anonymously or pseudonymously or affirmatively, with your name on it.

[Adrian Johns] Well, one of the pieces on the reading list, of course, I think for this week, is Michel Foucault's article What is an Author, which is, you know, I think it's…

[Ada Palmer] What is an Author, Foucault.

[Adrian Johns] It’s, what can I say, my instinct is it’s discredited in certain ways, but it’s a good thing to think with, and one of the claims it makes is that the idea of an author is partly descended from leasing activities in the 15th, 16th, 17th century after the invention of printing. You know, there’s a kind of sorting out that has to be done if you’re going to assign not only credit but responsibility. And this whole thing about there being different strategies, maneuvers to anonymously and pseudonymously would be grist to that, grist to the mill of that kind of thesis. Incidentally, it goes on, it’s actually a big issue now in the sciences because in the last 30 or 40 years, especially in life sciences, and actually big to which are life sciences and things like high energy physics, where you have experiments that are very big and very sort of multifaceted, so if you’re doing an experiment at somewhere like SERM, your paper that emerges from it might have 200 authors. And so there’s this big issue of principle. How do you assign authorship among 200 people? And there’s a similar kind of thing in my medicine. Mario Biagioli, who’s a historian of science now at the law school at UC Davis, was instrumental in setting up what’s become an actual policy starting at Harvard and then expanding out through the biomedical sciences. For doing that, for having different kind of categories of authorship within what used to be just a single category, the author in scientific research. So you’d have, you know, this author was responsible for, say, you know, instrumental technology. This author was responsible for, you know, the calculation aspects or something like that. And so the responsibility and the credit, both the negative and the positive side of it, can actually be parsed out explicitly in scientific research papers in a way that’s quite foreign from papers that maybe you would have encountered when you were at school or that would’ve been published, say, up to about 30 years ago.

[Ada Palmer] You’re reminding me of the opposite, which is the most interesting name you ever see in the credits of an anime series. So the name Hajime Yatate is a name that the Gundam series franchise made up for, they realized that there were always, you know, this thing of credits rolls by, but there were inevitably several dozen more people that helped on the project at some point that didn’t get credited, either because they were part-time employees or somebody’s sister came in and helped for a little while or it was the janitor, and so they were indispensable in one sense but didn’t have an artistic credit. So the name Hajime Yatate is, just represents all the other people who helped make Gundam and is always credited in every Gundam series even though there has never been such a human being. And it represents the tomb of the unknown soldier, kind of, the unknown author without whom the series could not have been made as it was but whom it is impossible to credit because crediting bookkeeping itself is a filtered system because of the chaotic structure of team production of art or science.

[Cory Doctorow] There’s a question back there.

[David Copeland] Yeah, there’s a number of questions.

[Ada Palmer] So to repeat the question, the question is in early modern Europe, once a pamphlet is out there, how much do authorities care about actually identifying the author of that pamphlet and tracking that down as opposed to identifying and tracking down, say, the printer, or the readers, or the network through which this pamphlet, which is already out in the world, is moving?

[David Copeland] Yeah, I think there was a care about it in certain situations, about who might have written it. You have to remember that in early modern Europe the number of pamphlets being created was not tremendous, even still, and selling them would sometimes have been difficult. You might have given them away, but you were out to make some money, so you wanted to do that too. I think one of the things you have to remember about the ideas you're thinking about is that once an idea is in the public sphere, anyone can take that idea, morph it, do whatever they want to with it. They can reprint it. So there's a problem there. One of the people I've dealt with was a 17th century Baptist named Benjamin Keach. He was not a very well known human being. His entire output of ideas was done via the printing press, because that was the only way he could get his messages out. By the end of the 17th century, he wasn't so much of a problem; it was early on in his publications. And so what they actually did with one work is they tracked down every copy of it there was, they collected those all, and they burned them. Everything was destroyed, so it could not be reproduced again by a printing press. Unfortunately for whoever wanted to stop his thoughts from getting out, he actually had the whole thing memorized, and so after he got out of jail, he redid it. The purpose of it was to train Baptist children, for education, and this piece that he created ended up feeding into the New England primers. But he was able to remember it all. So I think the idea was to track it down, but in some cases they wanted to get the people, which is what happened early in the Protestant Reformation: if we can find these guys, you know, we'll burn 'em if we can.

[Adrian Johns] I think they do care about authors, actually, maybe more than one would expect. But often the way to get to authors is to go through printers, publishers, booksellers, and so forth. There’s actually an interesting tract whose exact title I can’t quite remember. Sir Roger L’Estrange, who was put in charge of government censorship in England from 1660 to roughly 1685, was called the Surveyor of the Imprimery, and he published a tract called something very close to Considerations and Proposals in Regard to the Regulation of the Press; the first phrase is Considerations and Proposals. He actually goes through what you need to do to regulate the press, and he has this little paragraph where he lists all the people you need to tackle. You might think it would be, sort of, author, printer, bookseller. It’s actually not. There are about 10 or 15 different kinds of people: the author, printer, bookseller, hawker, the mercury women, who were wholesaling women moving around the city selling books to booksellers, higglers, peddlers, binders, stitchers, typographers. Ideally you would have a gaze that extends across all of these people, who to us now are almost completely anonymous. And the author is in there, and he does care about the author. The ideal, almost, is to get the author, because the author is the source of the text. But you often can’t get the author, in which case you go after all of these other people. And, like you say, you roll up the network, or you recruit them as informants. One of the things about censorship, and we talked about this right at the beginning of this afternoon, is that it very often, I won’t say always, goes along with the recruitment of informants. Informants are a key part of it, because the police need to know what’s going on, so there’s very often this symbiotic thing where when you get somebody, you turn them and make them an informant. That’s repeated across places and periods. L’Estrange would certainly do that.

[Ada Palmer] I’ll contribute an Inquisition-related example, and then I’d love to hear modern corollaries from those working on more modern contexts. If you look at the first edition of the Roman Index of condemned authors and forbidden books that’s put out by the Vatican, it’s a list: here are the bad authors, you may not have them; here are the bad books by authors who have some books that are okay and some that are bad, you may have these but not those. But as the Reformation heats up, and as Spain and Portugal, who have far more gung-ho and well-funded inquisitions than Rome does, start banning things faster and further in advance than Rome can, and as printing presses disseminate to local areas, the Inquisition quickly realizes that a list of condemned authors can’t function, because it will be out of date by the time it’s even put to press. So the second edition of the Index lists things like: letters exchanged to and from this particular Bavarian count may not be printed, because he’s been corresponding with Protestants, and even if we don’t name the author, we know anyone involved with that person is likely to be doing Protestant stuff, so we’re banning them in advance. Or they ban things printed in a particular Protestant-controlled or Calvinist-controlled city, where they expect anything produced there is probably a problem; or anything put out by this publisher that we know did this thing; or anything by an associate of this person. They are, in a sense, trying to name authors categorically rather than individually, so they can preemptively ban things that haven’t been written yet, because they know it takes them a decade to put out a new edition of the Index. One of the most fascinating lines in there is “Anything printed on a printing press that has printed the works of Luther is banned.” On the one hand you chuckle: are they thinking of Lutheranism as physically contagious, contaminating the printing press? In a sense they probably are. But they’re also thinking: the printer who printed Luther is probably also going to print Luther’s friends, and probably a lot of material we don’t like but don’t know about yet, so anything from that press is preemptively banned. That’s how you make a category. It’s how you make a filter before you have computers.
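
The categorical bans described here behave exactly like a modern rule-based filter. Below is a minimal sketch of that logic in Python; the specific rule entries (the author, city, and printer names) are invented placeholders for illustration, not anything drawn from the Index itself.

```python
# Categorical pre-banning, Index-style: no one reads the work itself.
# A future, still-unwritten book is already banned the moment its
# author, city, or printer matches a category.
BANNED_AUTHORS = {"Luther"}             # named individually
BANNED_CITIES = {"Geneva"}              # e.g. a Calvinist-controlled city
TAINTED_PRINTERS = {"press_that_printed_luther"}  # guilt by association

def is_banned(work: dict) -> bool:
    """work: {'author': ..., 'city': ..., 'printer': ...}.
    Any single categorical match suffices."""
    return (work["author"] in BANNED_AUTHORS
            or work["city"] in BANNED_CITIES
            or work["printer"] in TAINTED_PRINTERS)

# A pamphlet by an unknown author is pre-banned if its press is tainted:
# is_banned({"author": "?", "city": "Paris",
#            "printer": "press_that_printed_luther"})  -> True
```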

[Adrian Johns] I don’t know whether they did this there, but in London there were certainly serious proposals to police everything by focusing on one access point: the making of type. The idea was that only about three or four people made type in London, so if you get those few and have them make type that is identifiable, within about five years all of the type in the country would essentially carry a little fingerprint. Then you can tell immediately who printed anything that appears. That’s the idea. I don’t think it was ever quite done, but it was a serious proposal.

[Ada Palmer] So that would be particularly useful in cases where someone prints something claiming it was printed in Paris when really it was printed in England, and they just put Paris on it to bypass some kind of censorship, or the other way around. You can look at the typeface and immediately say, yeah, this is English type.

[Kate Klonick] What you’re describing strikes me as exactly how platforms have gradually learned to target spam, and botnets, and trolls in general, and any type of misinformation in communities. We have this enormous problem with spam. Everyone agrees that no one likes spam, and it is definitely allowed in a First Amendment capacity, but no one wants it on their sites, so the platforms decided to take it down. The best way to target spam is to look at where it’s coming from: the IP addresses, and the networks of IP addresses that all put out the same content at the same time. By recognizing patterns in how these networks operate, you can spot the activity as it’s happening and shut down all of the content without ever looking at the actual content, by looking at the activity itself, like looking at the font rather than the words. That’s how they do it.
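
The pattern Klonick describes, catching coordination by source and timing rather than by meaning, can be sketched minimally. The 60-second window, the five-source threshold, and the use of /24 subnets below are illustrative assumptions, not any platform’s actual pipeline.

```python
import hashlib
from collections import defaultdict

WINDOW_SECONDS = 60       # assumed: posts this close together count as "the same time"
MIN_DISTINCT_SOURCES = 5  # assumed threshold for "a network, not a person"

def subnet(ip: str) -> str:
    """Coarsen an IPv4 address to its /24 network: 203.0.113.7 -> 203.0.113."""
    return ".".join(ip.split(".")[:3])

def flag_coordinated_content(posts):
    """posts: iterable of (ip, timestamp_seconds, content) tuples.
    Returns fingerprints of content that many unrelated sources pushed
    out at once. The content is fingerprinted, never interpreted."""
    buckets = defaultdict(set)
    for ip, ts, content in posts:
        digest = hashlib.sha256(content.encode()).hexdigest()
        buckets[(digest, int(ts // WINDOW_SECONDS))].add(subnet(ip))
    return {digest for (digest, _), sources in buckets.items()
            if len(sources) >= MIN_DISTINCT_SOURCES}
```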

[Cory Doctorow] So this becomes a form of guilt by association, right?

[Kate Klonick] Absolutely.

[Cory Doctorow] You know, I helped start an NGO in Uganda, and just try to buy anything with PayPal from Uganda. Every time, it would trip the warning: “You are using your PayPal account from a territory associated with high levels of fraud. Please call us to reset your account now.” Every single time I needed to do an internet transaction from Uganda. But I wanted to talk about another form of guilt by association. We have two main mobile platforms, iOS and Android, and out of the box both of them require you to use app stores provided by their manufacturers. iOS doesn’t have a box you can tick that says “I’d like to trust other app stores,” and Android does: if you tick the little box on Android, you can install unauthorized software. But one of the things Android’s boot loader does is snitch on you if you tick that box. So when you want to install, say, Netflix on a phone that can install third-party software Google hasn’t authorized, Netflix can ask: is there a chance there’s software running on this phone that might let people save Netflix streams to the drive and share them with their friends? And if you have evinced a willingness to install software from a source other than Google, your phone will rat you out to Netflix, and Netflix can opt to boycott you. And here’s where it gets super, super interesting. There are people whose software Google would happily sell in its app store; they just don’t want to give Google 30% of their revenue. The most prominent example is Fortnite, which is so incredibly popular that its makers thought: we are going to distribute this through our own servers and keep the 30% of each sale that would otherwise go to Google, and even if we lose 15% of potential customers because they don’t want to go into their Android settings and tick the box that says “I trust third-party software,” we will still come out ahead on the customers who do. Of course the problem is that now users have to decide: do I want to risk being cut off by Netflix on my phone in order to play Fortnite? So Fortnite is going up not only against the inertia of not wanting to find that setting and tick it, but also against the potential cost of concerted industrial action by the other entertainment companies, the ones willing to force everybody into giving Google 30% of their lifetime revenue per customer in exchange for some guarantee that they can set the terms on which their products will be used.
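
A rough sketch of the server-side gate Doctorow describes. Every field name and value below is invented for illustration; real attestation systems deliver signed tokens with their own schemas, so this shows only the shape of the decision.

```python
# Hypothetical: the streaming service receives a verified report about
# the device's state and decides whether to serve a high-value stream.
def allow_stream(attestation: dict) -> bool:
    if attestation.get("bootloader_unlocked"):           # invented field name
        return False  # the device could run software that saves streams
    if attestation.get("third_party_sources_enabled"):   # invented field name
        return False  # the owner ticked the "trust other app stores" box
    return True

# allow_stream({"third_party_sources_enabled": True}) -> False:
# ticking the box to get Fortnite costs you Netflix, exactly the
# guilt-by-association trade-off described above.
```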

[Ada Palmer] I’m very excited now by the idea of thinking of these premodern efforts to make a thing like a filter, the fact that the problem of how to anticipate and categorize things that should be blocked before those things come to be is an old one. In essence, any time you draft an information policy that spells out “x is allowed, y is not allowed,” it’s another form of the same thing. So I’m thinking again of the history of New Zealand censorship law, which over the course of the last 150 years moved gradually away from a vaguely defined “indecent content,” settled just by case law, the opinion of a particular judge or jury, producing a case history that then determines what is and isn’t indecent. They overturned that and moved to explicit rules: depictions of sexual exploitation of children are indecent, depictions of torture are indecent, but these other things are not. They’re trying to spell it out, to circumscribe in a kind of filtering way what is and isn’t going to be pre-excluded. And the final piece of that I find very interesting is that the moment of really flipping over was caused by the decriminalization of male-male homosexual intercourse in New Zealand. A huge portion of their indecency case law had been: this depicts homosexual intercourse, which is a criminal act, and by depicting a criminal act it is indecent and endorses crime, and therefore it should be censored. When that was decriminalized they said: wait, a lot of our case law is now obsolete. We need to redo our metric for what should and shouldn’t be censored. That is what triggered them to actually write up “it should be this, it should not be that,” flipping over from the equivalent of a room full of college-aged people deciding what should be done to the effort to make a specified, algorithmic tool.

[Kate Klonick] So this seems like a good time to mention the obvious ways that Facebook and YouTube and other sites do use algorithms for content moderation. The two main ones are PhotoDNA and Content ID. PhotoDNA screens for and automatically identifies child pornography, and not because it can look at a picture and recognize child pornography. It’s because there is a known universe of child pornography, and people whose work is to keep that known universe updated, and the system takes the equivalent of a digital fingerprint of the pixels of the picture. In about 0.3 seconds, between the time you try to upload a photo to Facebook or wherever else and the time it posts, they screen it against this known universe, and they can tell whether or not it’s a piece of child pornography they know exists already. The second one is Content ID, which is for copyrighted material; it uses a very similar mechanism for known copyrighted materials, and it automatically takes them down. The interesting thing about PhotoDNA is that it’s perfect enforcement of already-flagged examples, but it cannot catch newly created child pornography that it does not know about yet. So that’s the flaw with that one.
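
A minimal sketch of the screening mechanism just described, with one loud simplification: PhotoDNA and Content ID use robust perceptual fingerprints that survive resizing and re-encoding, whereas this sketch substitutes an exact cryptographic hash purely to keep the idea visible.

```python
import hashlib

# Maintained by human reviewers: the "known universe" of fingerprints.
KNOWN_FINGERPRINTS: set[str] = set()  # in practice, digests of known material

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def screen_upload(data: bytes) -> bool:
    """True means block. Perfect enforcement against already-known
    material; brand-new material hashes to an unseen value and passes,
    which is exactly the flaw noted above."""
    return fingerprint(data) in KNOWN_FINGERPRINTS
```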

[Ada Palmer] And a photo taken with a camera that has taken a photo of Martin Luther.

[Kate Klonick] Right, exactly. Yes, exactly. And then the problem with the second one is obviously over-censorship. It doesn’t read for parody. It doesn’t read for fair use. It can’t read for any number of things.

[Cory Doctorow] And there are no penalties for falsely claiming copyright in works on Content ID, so you can upload works to Content ID and then claim a copyright in them, and there are companies with a thriving trade in this. One company claims most birdsong as its copyright. Someone uploaded 10 minutes of silence and attracted six copyright claims for his silence. And the…

[Kate Klonick] That was the Paul Simon and, sorry, silence?

[Cory Doctorow] Yeah, no. It wasn’t the four minutes and thirty-three seconds either, it was a whole ton of stuff. And the wheeze is that one of the boxes you can tick when you’re claiming a copyright is “allow the work to stay up, but put my ads on it and give me the money for the ads.”

[Kate Klonick] Yup.

[Cory Doctorow] And so there’s a financial incentive to overclaim. New Zealand has a really interesting recent history of copyright censorship, where they created a law but have not yet enforced it: Section 92A, a copyright amendment that established what was called a three-strikes regime, where if you were accused of three acts of copyright infringement, without any convictions, you and your family would be disconnected from the internet for a year and no ISP would be allowed to reconnect you. This was passed in the teeth of very stiff opposition, and after it was passed the opposition actually got stiffer. There were street demonstrations, people blacked out their Twitter avatars, and so on. And they struck the law off the books immediately before the Christchurch earthquake. Then, in a special emergency sitting of Parliament after the earthquake, convened to authorize spending to dig people out of the rubble, the member of Parliament responsible for initially introducing the bill said he would hold up the emergency funding unless they reattached three strikes to the Christchurch relief bill, and it is now back on the books in New Zealand, but it has not been enforced. They’re about to modernize their copyright law; they’re consulting on a new copyright act now, so that’s in train, and they may end up striking it off. They have a new government now that’s a lot more progressive. And this is mostly just down to Peter Jackson, right? It’s The Hobbit’s fault. They have a giant film industry in a small country. The other big industry there is financial secrecy. So those are the two big industries, and what they want ends up on the books.

[Kate Klonick] Also, sheep.

[Cory Doctorow] Also, sheep. Yeah.

[Ada Palmer] So the question is about movie rating codes and comics codes, and I would extend that also to video game rating systems, and how they relate to these questions.

[Cory Doctorow] There’s a very good documentary about this by Kirby Dick called This Film Is Not Yet Rated. The MPAA has a quote-unquote voluntary arrangement by which it rates movies. An NC-17 rating is basically the death knell of a movie, because exhibitors won’t show those movies: the audiences are too small, and they also attract disapprobation from people in their communities. So once a movie is rated NC-17, it doesn’t show on screens, and that’s kind of the end of its life. The MPAA has described its ratings panel, throughout its life, as a voluntary, frequently rotated group of young parents of children, not associated with the studios. But what was well understood in movie-making land was, first of all, that if you were with one of the big studios and you submitted a movie the board wanted to give an NC-17, they would also tell you which scenes to cut to get back to an R rating; if you weren’t associated with the studios, you had to play Battleship with them. I cut this scene: miss. I cut this scene: miss. I cut this scene: you sunk my battleship, you’ve gone from NC-17 to R. The other thing that was understood is that you could have the most gory, violent killings you wanted and that would be R, but the minute two dudes kissed, that was NC-17. And again, the MPAA strenuously rejected all this. So Kirby Dick did this amazing thing. He hired two extremely nice Midwestern middle-aged lesbian private detectives in Los Angeles to figure out who the MPAA ratings board are. And it turns out they’re a bunch of studio executives who’ve been on that board for years and years and years. They’re not frequently rotated parents unassociated with the studios; they’re people who are older, whose children are grown, and who work for the studios, which is why the studios were getting preferential treatment. The best part is he submits an edit of this movie, with the identities of the ratings board revealed, to the ratings board. And then he gets this phone call from Chris Dodd, the former Senator who was their chief lobbyist at the time, who’s in D.C., and it’s recorded, because only one party’s consent was needed for the recording, and it’s in the final cut of the movie, which is not rated. Chris Dodd calls him from D.C. and says, “Hey Kirby, I just watched your movie.” And Kirby says, “How did you watch my movie in D.C.? As far as I know the only copy is here in Hollywood.” And he said, “Well, they made a copy for me and sent it to me in D.C.” And Kirby says, “Well, I never authorized you to make a copy. You sue people who make copies of movies without authorization.” And he said, “But I put it in my vault in D.C. It’s safe in my vault.” Which of course prompted a lot of people to say: so can you copy as many movies as you want, as long as you have a vault… So anyway, they interview all kinds of filmmakers who made queer-positive movies that got NC-17 ratings and were suppressed, and so on. It’s a remarkable movie, and it speaks directly to the way industrial concentration and secrecy and opacity and heteronormativity and corruption all come together, and also to how this kind of playful business is a really good way of exposing it and illustrating it. I really recommend seeing it: This Film Is Not Yet Rated. It’s a wonderful movie.

[Ada Palmer] And a related thing there goes back to something we talked about last week. The New Zealand equivalent, because it’s state-run, has to announce who works for it; the identities of all the employees are publicly accessible, and they write up reports on exactly what they’ve done and why they’ve rated every movie as they have. It’s one of these spheres where, on the one hand, it’s more restrictive, because an unrated movie cannot screen in New Zealand. It is illegal to screen a movie that did not go through the ratings process, and you can go to jail for it. But on the other hand, the ratings process itself is forced to be transparent by public demands for reasonable degrees of government transparency. In the U.S., where it’s an extralegal, voluntary civilian organization, but one you have no choice but to go with, there’s no process by which to demand that it be transparent, or even that it not lie about who’s running it. So there you’re seeing both the upsides and downsides of the two systems.

[Cory Doctorow] And I think, to get back to the earlier point about compelled speech, free speech, compelled action, and censorship, state action and private action, there is one way the state could in fact square the circle here without, I think, falling afoul of the First Amendment. Fox just got permission to sell itself to Disney, right, and that comes through antitrust scrutiny; the FTC has to give its blessing. The FTC could make that blessing contingent on cleaning up these processes. They could say: all right, you have absolute freedom of private action, but when you come to us for a waiver that allows you to do something anti-competitive, like buying the biggest studio that is your rival, we are going to require, in the public interest, a greater degree of transparency in your processes, because we are rightly afraid that there’s a nexus between industry concentration and collusion. So if you want to buy Fox, the quid pro quo is: you have to tell us who’s on the MPAA ratings board, you have to disclose their criteria, you have to rotate them out, and so on. You have to have a due process for challenging ratings. And I think you can do that without having the First Amendment enter the picture and say, well, the state can’t really tell movie studios how they can and can’t structure their businesses. I don’t know, you tell me if you think that passes legal muster.

[Kate Klonick] No, I’m not gonna, I’m not gonna weigh in on that.

[David Copeland] We did that, right, until we got the MPAA. We had motion picture codes and boards in cities and in states. And it was, I think, in ’57 that there was a movie, some of you may know it or have seen it, a Disney movie about the plains, and it was banned in Chicago by the censorship board because it showed the birth of a bison. That was considered obscene, not to be seen.

[Cory Doctorow] And before that, of course, you had the Edison Trust, which controlled all the movie patents and would very strictly regulate what kinds of movies you could make. So a bunch of Eastern European filmmakers in New York went as close to Mexico as they could get, to a little dust-bowl town called Hollywood, and started a bunch of studios where they would violate Thomas Edison’s patents far enough away that Edison’s enforcers in New Jersey couldn’t readily get at them. And they could go across the border to Mexico any time the patent enforcers got too close.

[Ada Palmer] The question is about pseudonyms: is there such a thing going on as the death of the pseudonym? And because of time this is going to have to be our last question. Do people want to weigh in?

[Kate Klonick] Do you have an example that you’re thinking of?

[Cory Doctorow] Oh just Facebook’s real name policy.

[David Copeland] Things with no names.

[Kate Klonick] Oh, the real names policy.

[Ada Palmer] Yeah.

[Cory Doctorow] Well, you know, pseudonyms are hard to maintain, and they get harder with more computation. Computational linguistics is making it harder. You can have shorter-lived ones. One of the things that’s not well explored yet is a discipline called adversarial stylometry, where you use the same computing tools that are used to re-identify text to de-identify it. You ask: what is it in this candidate text I’ve just written that makes it seem like I wrote it? You ask the same software that would be used to unmask you, and then you change all the elements that affirmatively or circumstantially identify you, and you try to make the text as anonymous as possible. So we’re not actually sure where the limits of pseudonymity are, but you’re right that Facebook sucked a lot of the oxygen out. Zuckerberg had this doctrine that people who don’t use their real names, or who present more than one facet of their identity to more than one group of people, are, quote, two-faced. And you can read early accounts of sociologists who worked with Zuckerberg, who got hired by the company or consulted for it, who just said: this is ridiculous. What are you talking about? I don’t talk to my lover the same way I talk to my child, the same way I talk to my boss. Of course I have more than one facet to my identity. That is the most awful thing I’ve ever heard, that we can’t have more than one facet to our identity. And of course the element Zuckerberg was most interested in is that if you have only one facet to your identity and it’s all linked, it’s much easier to figure out how to target you with advertising. So he formulated a moral doctrine with this very self-serving element. And if all the people you want to talk to are on Facebook, it’s very hard to go be on some other forum and have that discussion there. Now the corollary is that if Facebook gains value every time a new user joins, because it multiplies the number of possible connections, it loses value every time a user leaves. So they live by the sword and they’re dying by the sword. User numbers are down; they’re hemorrhaging them. With any luck we’ll have Facebook remembered as an awkward growing pain of the internet.
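
A minimal sketch of the adversarial-stylometry loop Doctorow describes, using function-word frequencies as a stand-in for the much richer features real stylometry tools extract.

```python
from collections import Counter

# Classic, simplest authorship signal: how often you use common function words.
FUNCTION_WORDS = ["the", "of", "and", "to", "a", "in", "that", "is",
                  "was", "it", "for", "on", "with", "as", "but"]

def profile(text: str) -> list[float]:
    """Relative frequency of each function word in the text."""
    words = text.lower().split()
    total = max(len(words), 1)
    counts = Counter(words)
    return [counts[w] / total for w in FUNCTION_WORDS]

def looks_like_me(draft: str, my_past_writing: str) -> float:
    """Higher means more identifiable. Run this on each revision of the
    draft and rewrite until the score drops: the same measurement an
    unmasker would use, turned into a de-identification tool."""
    a, b = profile(draft), profile(my_past_writing)
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / len(FUNCTION_WORDS)
```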

[Kate Klonick] So one of the other things that I write about is online shaming, and the real names policy was really designed to curb online harassment and online shaming, which it has not necessarily been shown to do at all.

[Cory Doctorow] Yeah.

[Kate Klonick] What?

[Cory Doctorow] Yeah.

[Kate Klonick] Yeah, I know. So it’s not actually effective, but the idea behind it is interesting. Take the classic Scarlet Letter example of what happened with shaming: someone did something that violated a social norm, and the way we punished them was to shame them. And that involved a couple of things. Say they’re in the stocks. They had to go and stand in the stocks, and you had to walk by them, with your real face, and throw an apple or a tomato at them. It was a communal activity. There was social meaning in you shaming them; people had to see you do it. What is different about the internet, with anonymity, is that you can shame and pile on and destroy someone’s reputation, and you can do it without ever putting your own personal reputation at risk. So I think that was the main supposed justification for the real names policy, besides the more nefarious purpose Cory described. But it does not seem to have borne out, necessarily. People still do crazy stuff, because the internet feels not real in various ways, and they don’t think it has the kind of ramifications their actions would have in real space.

[Ada Palmer] What you’re making me think of, historically speaking, and then we’ll have to wrap up, is that anonymity and pseudonymity continually get undermined as pursuers figure out what you have to do to undermine them in each new technology. The printing press is developed, people publish anonymously and pseudonymously, and for a while that burgeons. But then we realize we can police it if we control typefaces. We can police it if we turn the printers’ assistants into informants. We can police it if we look at all the woodblocks and realize we can identify that curly, fancy letter A the printer uses over and over, even if they change the name on the title page. The weaknesses of the pseudonymity and anonymity of the new technology come out bit by bit, and the people who need to be anonymous or pseudonymous move to a new method, a new technology, to avoid those methods. So you have, I think, repeated deaths of a particular anonymity, resulting in people moving to new ways of aspiring to anonymity, new temporary anonymities, as they try to keep ahead of the anonymity-hunting tools of their pursuers.

[Cory Doctorow] I know we’re out of time, but this works retrospectively too. You make a bunch of blockchain transactions; they’re all sitting out there in this public, indelible ledger. Then one day you do something that links your Bitcoin identity with your real identity, and every transaction you’ve ever made unwinds. All the people you gave money to are now known to be within your friend group, and with a little bit of social graph analysis we figure out who they all are too.
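
The retrospective unwinding Doctorow describes is, at bottom, a walk over a public graph. Here is a minimal sketch with a toy ledger (the wallet names and transactions are invented); real chain analysis adds clustering heuristics, amounts, and timing.

```python
from collections import deque

# Toy public ledger: (sender_wallet, receiver_wallet) pairs.
TRANSACTIONS = [("w1", "w2"), ("w2", "w3"), ("w4", "w1"), ("w3", "w5")]

def linked_wallets(unmasked: str, max_hops: int = 2) -> set[str]:
    """Breadth-first walk outward from one wallet tied to a real person;
    everyone within max_hops transactions becomes a lead."""
    neighbors = {}
    for a, b in TRANSACTIONS:
        neighbors.setdefault(a, set()).add(b)
        neighbors.setdefault(b, set()).add(a)  # leads flow both directions
    found, queue = {unmasked}, deque([(unmasked, 0)])
    while queue:
        wallet, hops = queue.popleft()
        if hops == max_hops:
            continue
        for nxt in neighbors.get(wallet, ()):
            if nxt not in found:
                found.add(nxt)
                queue.append((nxt, hops + 1))
    return found - {unmasked}

# linked_wallets("w1") -> {"w2", "w4", "w3"}: one identity slip exposes
# the friend group, two hops out.
```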

[Ada Palmer] Yes. Well, and everyone who’s printed an anonymous pamphlet with your printer. Once you’re busted, the printer’s busted, they’re busted, and it all spreads.

[Cory Doctorow] Privacy’s a team sport.

[Ada Palmer] Alright. Next week, see you all. Thank you.