The Means. Harm: does the technique cause unwarranted physical or psychological harm?
Boundary: does the technique cross a personal boundary without permission, whether involving coercion or deception, or a bodily, relational, or spatial border?
Trust: does the technique violate assumptions that are made about how personal information will be treated, such as no secret recordings?
Personal relationships: is the tactic applied in a personal or impersonal setting?
Awareness: are individuals aware that personal information is being collected, and who seeks it and why?
Golden rule: would those responsible for the surveillance, both the decision to apply it and its actual application, agree to be its subjects under the conditions in which they apply it to others?
Public decision-making: was the decision to use a tactic arrived at through some public discussion and decision-making process?
Human review: is there human review of machine-generated results?
Right of inspection: are people aware of the findings and how they were created?
Right to challenge and express a grievance: are there procedures for challenging the results, or for entering alternative data or interpretations into the record?
Redress and sanctions: if the individual has been treated unfairly and procedures violated, are there appropriate means of redress? Are there means for discovering violations and penalties to encourage responsible surveillant behavior?
Adequate data stewardship and protection: can the security of the data be adequately protected?
Equality-inequality regarding availability and application: is the means widely available, or restricted to only the most wealthy, powerful, or technologically sophisticated?
The symbolic meaning of a method: what does the use of a method communicate more generally?
The creation of unwanted precedents: is it likely to create precedents that will lead to its application in undesirable ways?
Negative effects on surveillors and third parties: are there negative effects on those beyond the subject?
Beneficiary: does application of the tactic serve broad community goals, the goals of the object of surveillance, or the personal goals of the data collector?
Male Speaker: Do you find that it is actually impractical, that agents are being completely overwhelmed by the numbers they're seeing every day -- and not at the higher levels, but down, among the folks who are actually doing the collection and [unintelligible]?
John Donvan: Stewart Baker? Stewart Baker: I'll try that. There is always a risk that you will be overwhelmed by data. I think about 40 percent of that is my kids' Facebook postings. And no one at NSA said, "We cannot keep up with this data." Once you've put it in a proper framework, they were only doing, after all, identifier searches in a year. It was not a problem doing those searches.
It was not a problem collecting and putting the data into a database. There may be times when an ordinary agent says, "I've got too much data," but in this program and in the program I talked about earlier, with travel data, the computer systems allowed us to use it very effectively. John Donvan: I want to -- that was kind of Michael German's opening point, that there is too much. So I'd like to hear your response to Stewart's response to that. Michael German: Well, one of the major controversial programs was a similar program collecting internet data that ran.
That was the whole hospital room confrontation with attorney general -- Male Speaker: Ashcroft. Michael German: -- Ashcroft. I'm sorry. And they ended that program in because they found that it wasn't actually very effective. So for ten years, they collected our records, and it took them that long to decide they weren't actually very useful. John Donvan: Ma'am, you just had your hand up. Right behind you, sir. Female Speaker: Thank you. My name is Shelly. And I'd like to talk to the point -- ask the question about the point of what data mining means to our privacy and our safety.
I have to admit that whenever I hear a government person say, "Trust me," I get very skeptical about what they're doing. We have a long history in this country -- John Donvan: Okay. I'm going to stop you there because I would rather let them make the speeches.
But your question is good, and when I -- and it feels like a big softball to this side. But -- but I think it's worth getting some more detail about the way in which -- that you -- you know, you proposed something that's not happening, the TV camera in the bedroom. But let's say that data mining is happening. And what are actually the risks to privacy, aside from the kind of creepy feeling that you described having, that it can be done? What are the actual risks in the implementation?
And I'll let David Cole take that. David Cole: Well, the risks are that with all of the -- I mean, the same logic that gave them access to the phone records would give them access to your email records, to your internet records, to your credit card records, to your bank records, to your phone location data.
And the danger is you put all that information together, and they can determine everything about us. They can know more about us than our closest friends know, than our spouses know. And the only thing that -- if you give them all that data, then the only thing that's stopping them from doing that are these so-called backend safeguards which were routinely violated by the NSA and which we've -- we didn't have any opportunity to debate as to whether they were adequate or not because they were put in place entirely in secret.
Richard Falkenrath: The "they" that you just described, "They can put all this together, they can understand --" David Cole: The government. Richard Falkenrath: Yeah. It sounds a lot more like Google than the government, because -- and you have to take this -- this is actually a serious point. This is not the case that the government is tracking everything you do and can put it all together. And the government doesn't care what you're doing. Actually, though, in the last 10 years, with the explosion of social media and the explosion of terms-of-service agreements and informed consent that no one reads, what you just described is emerging in the private sector, with the legal basis being a document which no one can really read and give informed consent to -- unlike the program we're talking about, which is subject to extreme safeguards, rooted in the Constitution, backed up by both chambers of Congress and by the judiciary.
And so I think you've really got to shift the terms of your argument here. This isn't the '70s with the FBI running around at the behest of Richard Nixon probing into people's lives. This is a world where the government is tightly regulated, overseen, and subject to law. And it's the private sector, if anything, which is emerging as the "they" in your scary scenario.
David Cole: Well, the fact that the private sector may -- may threaten our privacy is not a justification to allow the government to invade our privacy, for two reasons. One, we can establish limits on the private sector. But two, for the reasons I suggested earlier, there are lots more reasons to be concerned about the government having access to this information than the private sector having it.
And that's reflected in our Constitution, which constitutionally limits government access to data, does not constitutionally limit private access to data. There's a reason for that. And it's a good reason. John Donvan: I want to point out, tonight's debate is being broadcast worldwide on our website, iq2us. It's also being seen on Arizona State on live stream.
And I'm just saying that because, in the time we have left, if you're watching and would like to join in the audience participation question, you can try sending us a Tweet on it. And if we can pick it up in time, our hash tag is "spy debate."
John Donvan: So think carefully. Another question, right up there. Female Speaker: Hi. My name is Kadeeja [spelled phonetically]. I was going to say, if the NSA wanted to be safer, why did they lie to Congress about spying on us? John Donvan: Stewart Baker. Stewart Baker: I'm -- John Donvan: Well, really, the question is -- really, I think the question is, should we be troubled that the NSA is accused of having been dishonest?
Stewart Baker: He was, as far as I can tell, surprised by the question and made an error in his answer. How could he have said, "I'm going to lie to people," after having gone out and said, "I want everybody to understand this program"?
And so I don't think that we should be saying that there's an intentional lie being undertaken in that context. John Donvan: Other side want to respond? David Cole: And it was an unintentional lie? At that time, nobody knew about this program. We didn't know.
Edward Snowden hadn't stolen the information. He was asked, point blank -- Stewart Baker: This information was provided to everybody in Congress. David Cole: Everybody in -- a letter was written that we -- you're assuming every member of Congress -- every member of Congress -- John Donvan: You know, I want to move on because -- David Cole: -- reads every letter that's sent to [unintelligible].
John Donvan: -- said will not help the audience vote on the motion, so I'd like to move on. Michael German: But Senator Wyden did come out and say that he had given the question 24 hours in advance. John Donvan: Okay. Right up here. If you can stand, please. Female Speaker: My name is Cassandra. I wanted to ask about the effect of the revelations of these programs on our foreign policy.
Could you speak to that, please? John Donvan: That's another one where -- well, I'm not sure that that's relevant to the question of how we should feel about being spied on ourselves, so I'm going to pass on that question. Male Speaker: [unintelligible] I can take that. Male Speaker: Hi. My name is Jamie. It seems to me the central issue here is reasonable suspicion. So my question is, is the massive surveillance that is being debated here necessary to ascertain whether there is reasonable suspicion about someone or not?
Stewart Baker: Let me try this one. There -- no one looks at these numbers without reasonable suspicion. That's the standard that is required by some of the safeguards -- not all of them, but some of the safeguards that are already built into this program. John Donvan: And Stewart, just to clarify, so that means the data's available, but nobody actually goes and looks at it.
Stewart Baker: Right. That's right. So that was done quite consciously, because everyone at NSA and in the government believed that the right answer was not to look at data without some reason to believe that the person that you're looking at is engaged in suspicious activity.
The only difference between a standard law enforcement search and the searches we're talking about in the context of NSA is they gather the information first and put it in a database, but didn't search it without a reasonable suspicion.
The reason that they gathered it first was because it was not practical to leave it where it was. They would not have been able to do the searches in the time and with the efficiency that they needed. John Donvan: Would someone like to respond?
Michael German: -- part of it is that the harm comes from the initial collection, right? If we know -- and there are studies to show this -- if we know we're under surveillance, our behavior changes, right? Every time you're on Google and you hesitate before you put that search term in, or you hesitate to go to that website, that does damage to the fabric of our society, to the idea that there's a marketplace of ideas. But even with the limited number of searches, they go three hops.
So it's the reasonably suspicious number, but also the people they called, the people they called, and the people they called. And then that -- it's like a big scoop that goes into this database and pulls out all those numbers, which could rise into the millions -- Stewart Baker: Pulls out numbers.
They put in , they brought out , and they gave numbers to the FBI. They didn't bring out a great gob of data. Michael German: They said they go three hops out, so if you only talked to two people, perhaps five -- Stewart Baker: No, they go to many places looking for suspicious numbers. When they find the suspicious numbers, they take it to the FBI.
They only took numbers to the FBI. John Donvan: I thought I saw a hand. David Cole: That doesn't say how many numbers they looked at. That just says how many numbers they -- [talking simultaneously] Stewart Baker: [inaudible] names attached, it was just -- Michael German: The minimization standard says the numbers from the three hops go into the corporate store.
The corporate store can be searched by anyone for any reason. If there's -- the minimization limits aren't on the corporate store. So it's like a big scoop that gets it, puts it in the store, that data can be used for a myriad purposes.
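German's "three hops" point is, at bottom, multiplication: if each phone number is in contact with an average of c others, a three-hop query can touch on the order of c + c² + c³ numbers. A minimal sketch, assuming a purely hypothetical average of 100 contacts per number (the figure and function name are illustrative, not from the debate):

```python
# Hypothetical illustration of three-hop contact chaining: start from one
# "reasonably suspicious" seed number and expand outward through contacts.
# The average of 100 contacts per number is an assumed figure.

def contact_chain_size(avg_contacts: int, hops: int) -> int:
    """Upper bound on distinct numbers reached from one seed,
    ignoring overlap between contact lists."""
    total = 0
    reached = 1  # start with the seed number itself
    for _ in range(hops):
        reached *= avg_contacts  # each number reaches avg_contacts more
        total += reached
    return total

# One hop: 100 numbers. Two hops: 10,100. Three hops: 1,010,100.
print(contact_chain_size(100, 3))  # 1010100
```

With 100 contacts per number, the third hop alone contributes a million numbers, which is the scale behind "rise into the millions"; real contact lists overlap heavily, so this is an upper bound, not a count.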
John Donvan: We have time for one more question, so make it good. Male Speaker: And my question is for Mr. It seems at the heart of your argument is the sense of credible threat, that the public believe that there is a credible threat. But the government can't tell us what it is. They cannot reveal -- they can only say, "We've uncovered plots."
As a journalist, I've had the privilege of talking with numerous people who do counterterrorism and journalists who cover that. And they've looked me in the eye, and they said, "The threat is real." Stewart Baker: I think that's a fair question. And the real answer is you don't need to believe the government. Sure, the government has access to particular threats. But it doesn't take a security clearance to know that there are a lot of people who would like nothing better than to kill everybody in this auditorium and would be delighted to have done it.
We live -- that's the world we live in, and the technology that we all enjoy has empowered them. What I'm really suggesting is that having empowered everyone, and increasingly empowered people on the other side of the world who hate us, to cause serious damage here, we need to let the government use the technical tools that are created by lower cost for storing data to offset the advantage that the terrorists have.
That's -- it seems to me you don't have to have a security clearance to have a commonsense appreciation of what the threat is and how empowered terrorists are these days. John Donvan: [unintelligible] you had a good long run, and we're going to wrap up. I want to give you 15 seconds for a last word.
Michael German -- Michael German: Sure. John Donvan: -- for 15 seconds if you can do it. Land this thing. Michael German: So what's happened now is that we have less knowledge of what the intelligence community is doing and how effective it's being. That can't work. We have to have transparency. That's how we get effective government.
Debate where our motion is "Spy on me, I would rather be safe." We are about to hear brief closing statements from each debater in turn. They will be two minutes each. And remember how you voted before the debate; we're going to have you vote again immediately after these closing statements. And the team whose numbers have changed the most in percentage-point terms will be declared our winner. On to round three, closing statements.
Our motion is "Spy on me, I'd rather be safe." Michael German, ladies and gentlemen. Michael German: Thank you very much for having me tonight, and thanks to Stewart and Richard for a great debate. Let me close by talking about this problem from a different standpoint. Terrorists want you to be afraid. Their only tool is to use horrible violence to try to provoke a government into taking measures that damage itself.
They know that when people are afraid they make bad decisions, they act irrationally, and their hope is that the government is going to overreact to the threat, right?
One of the interesting things I found working in terrorist groups: they always have a manifesto, right? They create a clandestine organization, and the first thing they do is tell everybody in the world who they are. John Donvan: Thank you, Michael German. Richard Falkenrath: I hope the audience understands that neither Stewart nor I is in favor of an unfettered, unchecked executive authority to commit domestic spying. We understand that this is an incredibly difficult area of governance and requires tough oversight, the involvement of Congress, and the involvement of the judiciary.
We were part of the FBI joint terrorism task force. In September , we learned of a case. We knew nothing about it until we were told, as a result of electronic surveillance, that there was an individual in Denver developing a bomb and intending to transport it back to New York City for the purpose of attacking the New York City subway around, he thought, September 14, 15, or . We found out about that because of his electronic communication with his bomb-making trainer in Pakistan.
He drove across the country; an FBI surveillance team in Denver acquired him and began surveilling him in the course of his drive across the country with one to two kilograms of TATP explosive in his trunk. We began an investigation of his contacts, who he was in telephonic communication with. As a result of that, dozens of people he was in communication with were identified -- quickly, his two key co-conspirators, who were then subject to much higher levels of intrusion -- of intrusive investigation.
This was a real plot against the city of New York, where I was at the NYPD, and it was stopped -- not entirely, but in large measure -- because of the techniques we were talking about here tonight. This is not abstract. There is another side to this, and it is something which is very, very valuable. David Cole: Technology has changed the calculus of surveillance in a dramatic way. That check has gone out the window because of the -- because of Al Gore and the internet. We now -- it's now possible to learn everything about us through this third-party information, which Richard Falkenrath says we shouldn't be concerned about at all.
I think we need to be concerned about it. I think we can strike a proper balance between the technology that makes it possible for this kind of very, very broad surveillance and the need to find bad guys. But we can't do so if the programs are run in secret, if the NSA is lying to us about it, and if we haven't had an opportunity to have a democratic deliberation. And when we don't have that democratic deliberation, it seems to me it's very likely that the security people are going to go overboard on the side of security.
And when they're collecting texts, they're collecting data on every text that I send to my high school daughter when I go to pick her up from school, and she hasn't come out, and I say, "Where are you? I'm here. Where are you?" The only reason that they have access to that information is because they did it in secret, because if they had done it in public and told us they wanted to gather that information to keep us safe, I think we would have said no. And you should say no.
Stewart Baker: You know, when I started as general counsel of the National Security Agency, Janet Reno, the attorney general, came out for a visit. And this was a high-stakes meeting. She was deeply skeptical about whether this spy agency could be trusted at all, whether it understood what the Fourth Amendment was.
And we were walking through an operations center with her when the director stopped, looked at a corporal who was going over some intercepts, and said to the corporal, "Stand up." And the corporal said, "Sir, we cannot disseminate it unless there's foreign intelligence in it. We must take the -- we must anonymize the data and destroy it if there's no intelligence in it," which was the rule.
And I thought to myself, you know, the first rule of lawyering is don't ask a question if you don't know what the answer's going to be. But the director was absolutely sure that you could pick anybody out of that agency and ask him what the rules were. He would tell you and would be proud of the fact that he knew them and would obey them. That is the culture of the National Security Agency.
If you give them the rules, they will follow those rules. They have -- we have given them rules in this context. They are subject to lots of constraints. We cannot make all of this public, or we might as well not try to gather intelligence. I think we can do it. John Donvan: And that concludes closing statements. And now it's time to learn which side you feel has argued the best here. We're going to ask you again to go to the keypad at your seat and press keys number one, two or three.
Remember, the motion is this: Spy on me, I would rather be safe. If you agree with this team that's arguing for the motion, push number one. If you agree with their opponents, this team, push number two. And if you became or remain undecided, push number three. And you can ignore the other keys. And if you make a mistake, just correct it, and the system will lock in your final result, and we're going to lock this out in about 15 seconds, and then we need about two minutes to calculate the results for you.
So while that's being done, one thing I want to say is I've been exchanging glances with our executive producer who sits in the front row.
And the thumbs-up came up about halfway through. I think this is one of the best debates that we've ever had in terms of not only the material that the teammates -- that the opponents and teammates brought to the table, but also the spirit in which this was conducted and the respect, and the -- our favorite word -- the intelligence of it. So I just want to invite a round of applause for all of them.
There was nothing wrong with any of them. This is the kind of thing that really makes you think, and your brain starts going, and I think in fact a couple of the questions that I turned down in themselves would make excellent topics for debate. So everybody who put their hands up and those who got up to ask questions, I want to thank you as well for doing that.
That's us. The hash tag is SpyDebate. And I happen to know that there are some of our New York audience members who came down for this debate. So you've got to copy that and go up to New York. The topic on December 4 will be "Don't eat anything with a face."
Neal Barnard, he is a clinical researcher who studies the effects of diet on health. Also for the motion, Gene Baur, he is president and cofounder of Farm Sanctuary. Time Magazine called him the "conscience of the food movement." A limited number of tickets are still available for that one on our website, www.
They are debating this question, "Drone wars, are we going too far?" To reserve a ticket for that, go to their website, McCainInstitute. And for people who couldn't join our live audience, there are a lot of ways to catch these debates, going forward, at IQ2US.TV, and the McCain Institute debates can be seen on their site as well. And we would love to have you listen to all of our debates on NPR stations across the country. We'll almost definitely be broadcast here in Washington on WAMU, so tune in when that happens, and you can listen to your applause.
And we're hoping to make it back to Washington again next year, so go to our website for up-to-date information on that. And before I announce the final results, I want one more time to turn the lectern over to Ambassador Kurt Volker. And I want to hear -- do you agree with that? Kurt Volker: Oh, thank you. And we look forward to seeing you, as John announced, here in this room December 5, , on "Drone policy, are we going too far?" John Donvan: Thank you, Kurt. I have the results.
Remember, we have you vote twice, once before the debate and once again after the debate. And the team whose numbers have moved the most in percentage-point terms will be declared our winner. Our motion is this: "Spy on me, I would rather be safe." So those are the first results. Remember, you need to move the most in percentage-point terms to win this game. Here is the second vote. The team arguing for the motion, their second vote is 29 percent.
They went from 26 percent to 29 percent. That's a 3 percent increase.