1:11:18

Julian Assange Reddit AMA 1-10-2017.mp3

01/10/2017
Julian Assange
00:03:27 One, two, three, one, two, three. This is Julian Assange. Can you hear me?
00:00:11 OK, I hope everyone can see me and hear me.
00:00:16 I'm here responding to this.
00:00:19 Reddit, OK.
00:00:25 I'm Julian Assange, founder and editor of WikiLeaks.
00:00:30 As most of you will know, WikiLeaks has been publishing for 10 years now, and during that time we've had many battles.
00:00:38 In February, the UN ruled that I've been unlawfully detained for the past six years without charge in the United Kingdom, a ruling that was
00:00:49 reinforced, though not many people noticed, again on November 29th.
00:01:00 We're entirely funded by our readers, and during the US election, Reddit users mined our material, our publications, in a really very impressive way.
00:01:16 Some other people on Twitter and Voat and 4chan are also doing similar things.
00:01:21 Yes, OK.
00:01:22 Coming up with sometimes a lot of rubbish, but amongst that, excellent, excellent material. All of that
00:01:34 occurred particularly in the Reddit WikiLeaks group,
00:01:38 the Reddit The_Donald group, which is a Donald Trump support group,
00:01:43 the DNC leaks group, and a couple of groups associated with the Bernie Sanders campaign. So,
00:01:54 staff here at WikiLeaks are really happy to see that happen.
00:02:00 And as a result, well, partly as a result, WikiLeaks was the number one referenced political topic across Facebook during October and the first week of November.
00:02:16 And the same is also true for Twitter.
00:02:20 On Facebook, WikiLeaks was the number two topic in absolute terms, not just political topics but of all
00:02:29 topics, during October; the number one topic was people sending pictures of themselves for the Halloween period.
00:02:46 We have an enormous publishing year ahead, something really very important and special and will be difficult for this organisation.
00:02:56 The difficulties have already started.
00:02:59 We're used to difficulties; a lot of the
00:03:03 staff quite like that feeling of challenge.
00:03:08 The difficulties for me have been pretty much the same the last few years.
00:03:14 That's tough.
00:03:14 But having something grand to work on.
00:03:23 That is really important, intellectually and politically, is a very pleasant distraction.
00:03:33 So, people, I'll go through the list of questions strictly in order of votes, provided there are no duplicates.
00:03:44 And people can transcribe my answers and we'll cut and paste the best transcriptions into my comments posted on here.
00:03:58 So there's a nice written record as well.
00:04:02 And by best, I don't mean the transcriptions which have every sound
00:04:08 that I make, but rather the ones that are the easiest to read and the most faithful to the ideas being expressed.
00:04:27 It's a.
00:04:28 Well, no.
00:04:28 It's very lovely.
00:04:29 Yeah, very, very nice.
00:04:31 OK, fine.
00:04:34 You can see that one there.
00:04:37 Where's the sound?
00:04:38 OK, let's get it.
00:04:39 Yeah, if you put.
00:04:43 People are saying
00:04:44 the sound is too low, so I'm going to turn it up a bit.
00:05:00 Can you hear me?
00:05:01 Is it OK?
00:05:03 I think that's better.
00:05:08 One, two, one, two.
00:05:14 One, two.
00:05:17 OK, alright, I'll just summarize all that again.
00:05:23 OK, my name is Julian Assange.
00:05:24 I'm the founder and editor of WikiLeaks.
00:05:27 As many of you know, WikiLeaks has been in the news quite a
00:05:30 lot this year.
00:05:31 We've been going for 10 years.
00:05:34 We're entirely funded by our readers and publication sales.
00:05:40 We've had many, many battles.
00:05:43 A lot of those are still ongoing.
00:05:44 We have about a dozen court cases: us suing people, filing criminal charges against the US government, and the same thing happening back in the other direction.
00:05:55 On February the fifth this year, the United Nations announced the result of 18 months of litigation by me against Sweden and the UK, and they lost, and the UN ruled that I am being illegally detained without
00:06:14 charge and I should be immediately released. The UK tried to appeal and it lost the appeal
00:06:23 on November 29 this year. It wasn't widely reported because it came out on a Friday, but it was a major victory for us. So, during the US election,
00:06:38 many Reddit users produced scoop after scoop based on our material. So, we publish a lot of material.
00:06:47 We do analysis and promotion ourselves, but to publish, to analyze,
00:06:55 you know, about 50,000 documents during a few months is actually impossible for journalists, even if they were so motivated.
00:07:05 There's just not enough of them. And in the US electoral process, of course, they had four-fifths of the media, perhaps more
00:07:14 if you include the print media,
00:07:16 really not wanting to produce stories that were critical of the Hillary Clinton campaign. There's some fine exceptions,
00:07:27 There were some good exceptions, but overall actually an attempt to avoid doing serious work.
00:07:35 But the
00:07:36 result of that mining on Reddit and a few other places, but principally Reddit,
00:07:42 is that those discoveries were then picked up in social media and in the conventional press, and as a result, WikiLeaks during the last five weeks of the election was the number one political topic on Facebook
00:08:03 And the number one political topic on Twitter, according to stats from those companies and independent research.
00:08:13 We have a huge publishing year ahead, very serious, very special, a bit scary to us, but we like the challenge that that causes and and I encourage all of you to to help and back us because we're going to need.
00:08:34 A lot of support to properly deal with the the backlash and make sure that the.
00:08:43 You know the real importance of what we're publishing comes out and is not distorted.
00:08:49 Many of you will have seen the incredible distortions that have been pushed coming out of the electoral period.
00:08:57 But a lot of those things happen even outside that. It's natural: if you're a
00:09:01 publishing organization which specializes
00:09:04 in exposing powerful institutions and governments,
00:09:11 well, they're powerful
00:09:12 by definition, right? And the establishment media, by definition, is used to project the perception that, broadly speaking, the establishment wants. So
00:09:31 it's one of the tools used by the ruling class to cement its rule.
00:09:39 Noam Chomsky and Edward Herman have put it this way, which is: the press is to democracy as the truncheon is to dictatorship.
00:09:51 It is part of the system that is used to maintain rule and order, the dominance of the existing ruling class. So
00:09:59 it's no surprise that you see
00:10:03 attacks on us and on independent media.
00:10:07 There's a lot of bad independent media as well.
00:10:09 To be fair.
00:10:10 OK, now go through the comments.
00:10:15 Can you explain your whole October?
00:10:18 Well, most of it.
00:10:21 Most of it I was extremely busy. So,
00:10:28 just try and conceptualize this:
00:10:32 I have been in an embassy siege for the last four and a half years.
00:10:37 It's a small embassy.
00:10:40 The embassy is surrounded by a police and intelligence operation, of which there are numerous pictures and admissions by the British state. They spend about
00:10:55 $6 million a year; they admit to spending about £4 million a year just on the overt and covert police surveillance, and of course there's MI5, et cetera. They have
00:11:11 Robot cameras, quite sophisticated types that they've installed in different buildings, plainclothes police operating on the street, and they've done deals for which we have the paperwork with some of the opposing buildings, which are owned by Harrods, which is.
00:11:27 a big department
00:11:28 store here, but Harrods itself is owned by the
00:11:31 sovereign fund of Qatar. So it's not an easy environment to work in:
00:11:39 spying on the outside, spying on the inside, informers, cameras, etc.
00:11:48 And then during October, there was pressure applied by John Kerry and the US administration, and perhaps some other forms of pressure domestically,
00:12:06 which
00:12:08 resulted in my Internet connection being cut off, and quite an increase in the security environment here in terms of people getting in and out of the building easily, etc.
00:12:27 I think it was a wrong thing to do for
00:12:32 John Kerry to politicise his office as Secretary of State and try and use that to domestic political advantage
00:12:42 by pressuring me in relation to my political asylum. WikiLeaks does not publish from the embassy, doesn't work from the embassy.
00:12:54 I am a political refugee stuck in this embassy because the UK refuses to obey international law.
00:13:03 And respect my asylum rights.
00:13:07 We published from France, Germany, Netherlands and so on.
00:13:13 Wide range of countries, not Ecuador.
00:13:15 So Ecuador was purely pressured because
00:13:19 they are responsible for my physical security as a political refugee, which is, which is pretty disgraceful. To be fair to Ecuador,
00:13:35 Ecuador has denied that they were pressured. That's not what our sources say. And it's a small country, 16,000,000 people, quite an innovative Latin American
00:13:48 country which has been tough in standing up to the kind of pressure from the US and UK, but it has its own election on
00:14:01 February 17, and you can see that it wouldn't want an allegation that it had interfered, which it hasn't,
00:14:09 with the US election being used as an excuse by Hillary Clinton, who was the predicted president, to interfere in the election in Ecuador. So, it's quite a, quite intense
00:14:24 security and diplomatic situation. In terms of the security situation, yes, there were conspicuously armed
00:14:33 British police, which I took a photo of and which we published, parking their vans right next to the
00:14:43 embassy, which they haven't done since back in 2012, when the first kind of standoff occurred when
00:14:52 I was in
00:14:52 the embassy. So
00:14:54 it's a kind of, you know, kind of
00:14:58 show of force, presumably to apply some kind of pressure for WikiLeaks to stop publishing.
00:15:05 But we're set up to continue on regardless of what happens to me.
00:15:13 No one person in WikiLeaks can become a single point of failure.
00:15:18 Why? Well, number one, because we don't want to fail; number two, because if that person is perceived to be a single point of failure, it's dangerous for that person.
00:15:35 So there's a question on Edward Snowden:
00:15:50 full publication, limited publication, etc.,
00:15:53 and do we differ in our perspectives? Well,
00:16:00 Edward Snowden is a whistleblower.
00:16:04 He committed a very important and brave act which we fully supported, to the degree that I arranged with our legal team to get him out of Hong Kong
00:16:20 and to a place of asylum. Now, not a single other media organization did that, not The Guardian, which had been publishing his material, not Amnesty, not Human Rights Watch, not even any
00:16:35 other institution, nor a government.
00:16:39 So WikiLeaks, as a small investigative publisher which understands computer security, cryptography, the National Security Agency, which I've been publishing about for 10 years, sorry more than 10 years.
00:17:00 And asylum law, because of my situation.
00:17:03 So we can't have a situation where Edward Snowden ends up in a position like Chelsea Manning and is used as a general deterrent to other whistleblowers stepping forward, and he would have been imprisoned at any moment in Hong Kong.
00:17:25 and would have then been sold to the world as, well,
00:17:29 look, if you're trying to do something important as a whistleblower, your voice will be stopped.
00:17:35 You'll be placed in prison in very adverse conditions.
00:17:40 We wanted the opposite.
00:17:41 We wanted a general.
00:17:44 Incentive for others to step forward.
00:17:47 Now, those are philosophical reasons.
00:17:51 It's because we understand the threat of mass surveillance.
00:17:56 But it's also very understandable for institutional reasons.
00:18:01 WikiLeaks specializes in publishing what whistleblowers reveal, and if there's.
00:18:07 A chill on sources stepping forward, that's not good for us as an institution.
00:18:13 On the other hand, if people see that, yes,
00:18:15 it's good for sources
00:18:16 to step forward, then there'll be more of them.
00:18:19 On the question of
00:18:22 full publication versus extremely limited publication, Edward Snowden hasn't really had a choice.
00:18:31 He has had various views that have shifted over time, but he's in a position where we made sure that he had given all his documents to journalists, Glenn Greenwald principally, but also some to The Guardian, before he left Hong Kong, because both Edward Snowden
00:18:51 and I assessed that it would be a kind of,
00:18:56 a dangerous bait for him to be carrying laptops with material on them as he transited through Russia to Latin America.
00:19:08 That might be something that would cause the Russians to hold him, so we made sure he had nothing.
00:19:13 So actually, since the
00:19:15 point of those initial disclosures, Snowden hasn't been able to control how his publications have been used.
00:19:27 He's been a very important voice in talking about the importance of different aspects of them, but he's had no control.
00:19:36 The result
00:19:37 is that more than 97% of the Snowden documents have been censored, enormously important material censored. And while there have been some pretty good journalists working on them, and Glenn Greenwald, I think, is one of
00:19:55 the best journalists
00:19:57 publishing in the United States, you
00:20:03 have to
00:20:03 have hundreds of people working on material like this, and engineers, et cetera, to understand what's going on.
00:20:08 So we have quite a,
00:20:11 a different position to those media organizations that have effectively privatized that material and limited it.
00:20:21 Now, you can't say that actually the initial publications were all the important stuff, because there have been many more publications as time goes by,
00:20:33 even some within the past two months, and those publications, for example, include ways to find
00:20:40 sites in the United States used by the National Security Agency, and procedures for visiting those sites. Now, if those had been released in 2013,
00:20:55 investigative journalists and individuals could have gone to those sites before
00:21:00 there was a cover-up.
00:21:01 And that's true in the United States,
00:21:03 and it's true in
00:21:03 Europe and elsewhere.
00:21:05 I'm a bit sad, in some ways, about how the impact of the Snowden archive has been minimized
00:21:14 as a result of
00:21:18 not having the greatest number of eyeballs.
00:21:31 Too many people in the overall general public have the mindset.
00:21:35 If I have nothing to hide, then I have nothing to fear in regards to privacy.
00:21:40 This is absolutely false.
00:21:42 I'm reading the question.
00:21:53 I mean, it's a statement really.
00:21:57 Well, you can reverse this statement, this extremely irritating statement, I mean, when you hear people say that. So, I mean, this is so 21st century, so, you know, Generation Z: it's not about you. It's not about whether you have
00:22:17 something to hide. It's about whether society can function and what sort of society it is. The key actors in society who influence its political process, the people who publish, publishers, journalists, MPs, civil society,
00:22:38 if they can't operate in the society, then you have an increasingly authoritarian and conformist state.
00:22:48 Even if you're someone who thinks that you yourself are of absolutely no interest,
00:22:53 the result is
00:22:55 you have to suffer the consequences of the society that has evolved.
00:23:03 Also, you're not an island, so when you don't protect your own communications.
00:23:12 It's not just about you.
00:23:13 You're not communicating with yourself.
00:23:15 You're communicating with other people and you're exposing all those other people.
00:23:19 And even if you assess at the moment that they're not at risk, are you sure your assessment is correct?
00:23:26 And are you sure they're not at risk going into the
00:23:29 future? I think the,
00:23:31 the biggest problem with mass surveillance, actually,
00:23:37 Is that the knowledge of mass surveillance and fear about it produces intense conformity, so people start censoring their own conversations, and eventually they start censoring.
00:23:50 Their own thoughts.
00:23:52 So it's not enough to create fears about mass surveillance; one at the same time
00:23:57 has to
00:23:58 create understanding of how to avoid mass surveillance, or understanding that at the moment most of the mass surveillance authorities, like the National Security Agency and the
00:24:10 like, are pretty incompetent.
00:24:13 That can change as artificial intelligence
00:24:18 merges with mass surveillance, where those data streams from the
00:24:23 NSA and PRISM program are massaged by artificial intelligence.
00:24:53 OK, it's a question from Sam C0:
00:25:00 and have you
00:25:01 seen the WikiLeaks post on Twitter saying that they're thinking about making a database of verified Twitter users, complete with full names, addresses, phone numbers? Seems a bit silly to me.
00:25:08 Well, of course we didn't.
00:25:10 It's a false story.
00:25:13 WikiLeaks never posted any such thing on Twitter.
00:25:18 The primary WikiLeaks support group,
00:25:20 WikiLeaks Task Force, said
00:25:23 we are thinking about
00:25:27 what data points are needed to create a map of influence,
00:25:32 sorry, a map of relationships, to understand the relationships between people who are involved in
00:25:46 influencing on Twitter. So verified users are influential; who influences those users? But that's a,
00:25:55 a discussion question by a support group, and it explicitly stated that it was not about publishing addresses. So,
00:26:11 seeing that story spread, well, why is it spreading? It's spreading
00:26:15 for two reasons. Number one, as a result of the efficacy of our publications and the damage to the ruling class in Washington and,
00:26:33 More broadly, in the United States, there's a desire to reduce our reputation.
00:26:42 In the establishment press.
00:26:46 And so those things are grabbed onto, taken out of context, and promoted.
00:26:51 There is a second reason, which is pretty interesting.
00:26:54 And the second reason,
00:26:57 at least it's my analysis, so this is the second reason, is that there exists a two-level class hierarchy on Twitter: people with blue ticks, people without blue ticks. There's about 230,000 with blue ticks, and they correspond to something like about 30
00:27:18 percent of those people who would consider themselves to be members of the establishment in the English-speaking countries.
00:27:30 So those are MPs, journalists,
00:27:33 CEOs, etc., people who are
00:27:35 representative in some way and therefore have a need to interface with the public. So about a
00:27:39 third of those types,
00:27:42 in particular the younger and more up-and-coming ones, are on Twitter, and they're blue ticks.
00:27:51 So you have here both an identity phenomenon.
00:27:57 where someone is branded with an identity, the blue tick, and so an identity politics emerging in this group, and also a class phenomenon.
00:28:06 And so the recontextualization of the WikiLeaks Task Force discussion point into a
00:28:18 threat against this identity group was then widely spread by that identity group, and lined up fairly neatly with the politics of something like maybe 80% of that identity group. It's quite interesting if you,
00:28:38 you think about
00:28:40 this new emerging identity class.
00:28:45 Well, there is an equality within the blue-tick class, that is, you have your blue tick or you don't,
00:28:53 and then the number of followers and such
00:28:59 metrics. Looking
00:29:00 at what the relationships are between those people in the blue-tick identity class and
00:29:09 exterior class dynamics,
00:29:12 so relationships to power of various kinds, removes some part of the egalitarian nature within the blue-tick identity class, which in some ways is a
00:29:29 threat to those people who have gained the
00:29:33 blue tick but are otherwise perceived to be
00:29:35 not at the heights of power in the exterior class.
00:29:40 It's it's interesting.
00:30:10 It's a.
00:30:11 It's a bit difficult to pick the points, as these questions are changing extremely rapidly, so
00:30:16 it's a little hard
00:30:19 to respond to them all in order.
00:30:33 OK.
00:30:39 I'm going to just
00:30:40 go to the bottom of these questions, because they're changing in scoring quite quickly.
00:30:49 So there's a question about what I said in
00:30:53 August, that we
00:30:54 have some information about the Trump campaign. But from the point of view of investigative journalism,
00:30:59 it's pretty difficult to deal with,
00:31:05 to compete with, what Donald Trump simply says himself.
00:31:09 So yes, we did.
00:31:11 We received a couple of company registration extracts,
00:31:19 and then our team looked at them and they were already public, so it's already public information.
00:31:28 And WikiLeaks specializes in the publication of information that's not yet public.
00:31:49 Why release emails in a constant trickle near the end of the campaign?
00:31:54 If the truth is your goal, surely releasing them in a couple of batches would work just as well.
00:31:58 The constant drip-drip for the last month of the campaign suggests WikiLeaks was hoping to have maximum political impact on the campaign.
00:32:06 It's an interesting question.
00:32:10 Why the iteration?
00:32:12 So, why the iteration compared to publishing all at once? People, you can imagine, if we published all at once, would say you deliberately made a giant bomb,
00:32:25 you deliberately published all at once in order to have maximum impact.
00:32:30 Well, in WikiLeaks publications over the last 10 years, we've used a variety of publication strategies, depending on the amount of material, how readily engaged the audience is, what the time frame is for publication, and so on.
00:32:49 So what we've found is that you want to closely match the demand curve with the supply curve.
00:33:00 So people can read a limited amount of words each day.
00:33:03 Just think about it.
00:33:04 There's a finite number of people, a finite amount of time, and finite reading speed, and so
00:33:13 Their demand for words, even if they are 100% interested in that subject, is finite.
00:33:23 So it's optimal to match the demand for a particular type of information with the supply of that information.
00:33:33 If there's oversupply of information above the demand for it, then the oversupplied part
00:33:44 is not read.
00:33:47 Of course, we want our publications to have maximum possible readership and understanding, and our sources of all kinds want maximum possible impact; they don't want to go through these risks for their material to not be read.
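A rough sketch of the supply-and-demand matching idea described above: a reader has a finite daily word budget, so any daily release larger than that budget goes partly unread by the typical reader. All figures below are illustrative assumptions, not WikiLeaks numbers.

    # Illustrative assumptions only, not real figures.
    READING_SPEED_WPM = 230          # assumed average reading speed, words per minute
    ATTENTION_MINUTES_PER_DAY = 25   # assumed daily attention a motivated reader gives the topic

    def reader_daily_capacity(wpm: float = READING_SPEED_WPM,
                              minutes: float = ATTENTION_MINUTES_PER_DAY) -> float:
        """Words a single engaged reader can actually absorb in a day."""
        return wpm * minutes

    def unread_fraction(words_released_today: float) -> float:
        """Fraction of today's release the typical reader never gets to."""
        capacity = reader_daily_capacity()
        if words_released_today <= capacity:
            return 0.0
        return 1.0 - capacity / words_released_today

    if __name__ == "__main__":
        for release in (5_000, 50_000, 500_000):
            print(f"{release:>7} words released -> "
                  f"{unread_fraction(release):.0%} unread by a typical reader")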
00:34:14 I have to say, on the strategy of our
00:34:19 publication of our, you know, election-related documents, we're pretty proud of it.
00:34:25 Actually, there's limited time, limited resources.
00:34:31 Yes, we could have done things slightly differently if we had had, you know, more money, more staff, etc.
00:34:40 But within our resource constraints,
00:34:44 we put together, I think, a pretty ******* publishing schedule, designed to maximize uptake, readership, engagement, and knowledge extraction from our publications,
00:35:04 designed deliberately to make it hard to spin what we were publishing.
00:35:11 What do I mean by that?
00:35:15 In this particular case, we had the Democratic campaign of Hillary Clinton and her associated media allies doing everything they could to spin what we were publishing.
00:35:31 And I know how this works:
00:35:34 with knowledge that WikiLeaks is going to be publishing, say, over a month-long period,
00:35:41 a crisis team is set up. We've had a number of those WikiLeaks war rooms and crisis teams set up against us by different governments and companies, from
00:35:50 Bank of America to the Pentagon to the State Department, and they get ready each morning, wait for our publication, and then they try and spin it. So,
00:36:01 insofar as our publications are at all predictable,
00:36:06 that spin can be lined up ahead of time and those war rooms can be resourced. So we made sure that what we were going to publish
00:36:18 was unpredictable, when we were going to publish
00:36:21 was unpredictable, how much we were going to publish each day was unpredictable. We had both a,
00:36:29 a human element, looking closely at what was happening with the news and the finds on Reddit and so on, and an algorithm which also introduced cryptographically secure noise into publication decisions in relation to amounts and timings,
00:36:51 making that decision on the fly, not a month ahead of time with the schedule planned out.
00:37:00 Because if we were hacked,
00:37:03 we didn't want, in this case, our algorithm, the stochastic terminator, its programmatic output, to be known in advance, because that would permit the Clinton campaign and others to attempt to counter-spin our publications at each moment.
00:37:22 And we want our publications
00:37:24 to be as unspun as possible.
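The transcript gives no detail of the "stochastic terminator" itself, so the sketch below only illustrates the general idea just described: drawing batch sizes and release times from a cryptographically secure random source at the last moment, so that neither the amount nor the timing can be predicted, or reconstructed from a stolen schedule. All function names and parameters are hypothetical.

    import secrets  # CSPRNG in the Python standard library

    def pick_batch_size(remaining_docs: int, min_batch: int = 500, max_batch: int = 3000) -> int:
        """Choose today's batch size uniformly at random from a cryptographically secure source."""
        upper = min(max_batch, remaining_docs)
        if upper <= min_batch:
            return remaining_docs
        return min_batch + secrets.randbelow(upper - min_batch + 1)

    def pick_release_hour() -> int:
        """Choose an unpredictable release hour (0-23, UTC) for the day."""
        return secrets.randbelow(24)

    if __name__ == "__main__":
        remaining = 50_000
        day = 1
        while remaining > 0 and day <= 5:      # show only the first few days
            batch = pick_batch_size(remaining)
            hour = pick_release_hour()
            print(f"day {day}: release {batch} documents at {hour:02d}:00 UTC")
            remaining -= batch
            day += 1

Because each draw is made on the day itself, compromising the code or an earlier schedule reveals nothing about future amounts or timings.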
00:37:54 OK, there's a question.
00:37:56 Please address the allegations that WikiLeaks has a friendly relationship with Russia, the timeline.
00:38:01 Blah blah blah.
00:38:03 So, I've seen this,
00:38:05 I've seen this rubbish again and again and again.
00:38:09 Let's pull back and understand what's going on.
00:38:13 WikiLeaks has published more than 10 million documents over 10 years. We have a 100% accuracy rate on authenticating our publications. Everyone in the media knows that we have a 100% accuracy rate. So despite our publications
00:38:32 affecting powerful groups which are related to the media, the establishment media is in a difficult position, which is:
00:38:39 we have perfect
00:38:40 credibility of our publications. So
00:38:45 one has to make ad hominem attacks to try and colour perception of the organization involved, and therefore colour the publications, because no direct attack is possible. And so we get all sorts of ad hominem attacks, about WikiLeaks, about our sources, about me.
00:39:05 I think I've been called a dog torturer, a Mossad agent, a CIA agent, a Russian agent, now a **********,
00:39:14 recently on MSNBC,
00:39:18 sorry, on CNN, just twice in the last week. So,
00:39:24 yeah, attacks associating us with Russia are just one of those factors. WikiLeaks has published more than 800,000 documents that relate to Russia or Vladimir Putin,
00:39:44 and most of those
00:39:45 are critical. More than two million are related to Syria, from Syria. There are documents about China, a lot of material from China.
00:39:56 We've been banned in China, etc.
00:39:57 So in each country,
00:40:02 its establishment tends to perceive
00:40:06 WikiLeaks as something that is difficult for them and that erodes the authority of state institutions.
00:40:17 And that's true in the United States, so.
00:40:24 Yes, WikiLeaks said it was ready to give up a
00:40:27 bombshell on Russia? Not
00:40:28 quite. We said we have important documents
00:40:30 pertaining to Russia.
00:40:32 And yes, the FSB apparently was quoted as saying,
00:40:37 well, we can electronically attack WikiLeaks, or something. And then we did publish those documents.
00:40:42 Those were the Russia-related documents
00:40:47 in the US diplomatic cables, and they were extremely strong,
00:40:50 speaking about Chechnya, Russian crime. A number of books were written from that material, calling Russia a mafia state.
00:41:01 A number of successful lawsuits against the Russian state have made use of those documents and other documents, etcetera.
00:41:13 Another common untruth that is told is the claim that I worked for RT, the Russian state TV channel.
00:41:24 It's absolutely false.
00:41:27 In 2012.
00:41:30 We set up a production company, and our production company worked with Dartmouth Films, a UK production company, and a distributor whose name I can't quite remember, Journeyman Pictures.
00:41:45 And ten, sorry, twelve episodes were filmed of me interviewing people. It was called The World Tomorrow.
00:41:54 So it's my first TV thing, and we licensed that to a dozen different organizations, and RT was one of them.
00:42:02 And RT aggressively promoted it internationally.
00:42:08 And then people tried to twist this story into me working for RT.
00:42:24 Question: am I,
00:42:25 how am I, in direct control, as editor or author, of the WikiLeaks Twitter account,
00:42:30 the WikiLeaks Task Force account, etc.?
00:42:32 Why the change in tone?
00:42:37 Well, those accounts are maintained by a number of people, including me.
00:42:42 I am the editor and publisher of WikiLeaks, so.
00:42:45 Normally I have all the
00:42:47 control, blame and responsibility, and unearned credit for what we do.
00:42:54 Yes, of course.
00:42:55 If there's a difficult situation at the embassy or the Internet is cut off, it changes,
00:43:03 if you like, the amount of,
00:43:07 the speed at which I can have input into the rest of the team, so you might see a slight shift.
00:43:15 But I have full confidence in my people.
00:43:20 I mean, they've.
00:43:23 They've gone through hell and back with me.
00:43:26 They're really tough, smart people. Incidentally,
00:43:32 you look at how tough
00:43:33 they are: so when
00:43:36 my Internet was cut off here, and there were also problems with radio signals coming out of this embassy,
00:43:44 it was a difficult time, lots of pressure on me and lots of pressure on WikiLeaks itself.
00:43:50 So that's a situation like,
00:43:54 if you like, troops
00:43:55 losing their commander in the middle of a battle with bombs raining down on them.
00:44:03 What usually happens to troops like that?
00:44:07 Well, they'll scatter,
00:44:09 they get scared, you know, they go home, and they get killed.
00:44:14 Not our people.
00:44:18 In the middle of that battle, with lots of political and ideological attacks, smears, and more difficult contact with me,
00:44:27 no, we didn't even lose a single day in our publications as a result.
00:44:33 And that comes about because of planning.
00:44:38 And because of the experience and robustness of the staff of
00:44:43 WikiLeaks. And
00:44:47 exterior support also, of course, keeps our spirits up.
00:44:56 Shortly after the Internet access was cut, the head moderator of /r/WikiLeaks added 16 users to the moderation team, etc.
00:45:05 We don't know anything about who's moderating Reddit.
00:45:09 Reddit has as most of you will know, censored things from time to time.
00:45:12 It's owned
00:45:13 by Condé Nast.
00:45:16 It is perhaps the place, of
00:45:20 all those owned by the traditional media or a media holding, which has the greatest freedom of expression.
00:45:29 But Reddit is not free from censorship.
00:45:33 They've seen that many times.
00:45:34 On the other hand, it's fairly easy for people to constantly repost things that are being censored.
00:45:48 OK, so there's a question on.
00:45:54 Am I alive?
00:45:55 Am I kidnapped etc etc.
00:45:59 OK, so this is the whole proof of life topic.
00:46:03 So it's pretty, it's, it's very,
00:46:07 we saw that evolve, and
00:46:11 it's both gratifying and a bit alarming. I'll explain why.
00:46:20 Personally, I and the rest of the team at WikiLeaks were very pleased that there was such an expression of concern about how we were doing.
00:46:31 We expected all these attacks, and if you looked at, like, public statements and some of the statements tweeted by WikiLeaks in the
00:46:40 lead-up to my Internet being cut off and to that difficult diplomatic situation, we were saying, you know, the attacks are going to come in, we're going to need people to defend us.
00:46:53 We're going to basically need an army to get through this.
00:47:03 Then the, um,
00:47:06 concern for how I was doing and why I wasn't being seen, etc.,
00:47:10 arose. So the answer to
00:47:17 this proof-of-life question is that we're interested in something quite different.
00:47:22 So anything that we did that claimed to be some kind of proof of life would be to set the precedent on what,
00:47:34 what reduction,
00:47:35 what mechanism could be used to reduce concern.
00:47:41 So there were calls, for example, that I
00:47:45 issue a PGP-signed message.
00:47:50 OK, well, it's fine if you can understand that it's me who's issuing the PGP-signed message, but the PGP-signed message doesn't tell you who has issued it at all.
00:48:01 It's just a claimed message.
00:48:02 So let's look at what kind of precedent
00:48:06 we would be setting. We'd be setting the precedent that when there's a concern about whether one of our staff has been kidnapped, or me, that concern can be alleviated
00:48:22 simply with the issuing of a message of text which is coupled to a particular cryptographic key.
00:48:33 Now, if WikiLeaks is under serious threat, then it's quite possible it might lose control over its keys.
00:48:43 And the,
00:48:47 the reality is it's quite hard to protect keys
00:48:54 from that kind of interference. The way WikiLeaks manages its keys, its submission keys
00:49:04 for example, they are not used to sign messages.
00:49:08 But even if they did sign a message in this case, what would
00:49:12 it be saying? It would be,
00:49:13 it'd be setting a precedent that could be very dangerous in the future.
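To illustrate the point being made about PGP-signed messages: all a verifier learns is that some holder of a particular private key signed some text, nothing about who typed it or under what conditions. A minimal sketch, assuming GnuPG is installed and a clear-signed file named statement.txt.asc exists (both are assumptions for illustration, not a WikiLeaks procedure):

    import subprocess

    def signature_checks_out(signed_file: str = "statement.txt.asc") -> bool:
        """True if gpg accepts the signature against a key already in the local keyring."""
        result = subprocess.run(["gpg", "--verify", signed_file],
                                capture_output=True, text=True)
        return result.returncode == 0

    if __name__ == "__main__":
        if signature_checks_out():
            # This only establishes possession of the signing key at signing time,
            # not the signer's identity, freedom, or absence of duress.
            print("Valid signature: someone holding that key signed this text.")
        else:
            print("Signature did not verify.")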
00:49:19 You don't have to, if you like, produce the person and show that they're not under duress.
00:49:25 You can either hack a WikiLeaks key or take control of infrastructure,
00:49:32 take control of the person, and then have them produce, or rather claim that they had produced, some signed message. So we're much more interested in
00:49:42 a situation creating a precedent for proof of life, creating a precedent for proof of freedom from duress, or making it hard for our people to be under duress, and the best way to do that is live.
00:50:04 Because even if you were under duress, and there are various forms of duress that could be applied, if it's live you've got a few seconds to put things out.
00:50:17 You can slip in code words into what you're saying.
00:50:20 I'm not, by the way, I'm not, but
00:50:23 you could slip in code words into what you're saying that your people could then see.
00:50:31 And so yes, I'm alive and free from duress, but I am in a very difficult situation.
00:50:42 I have been for six years.
00:50:44 Let's not think that I'm not in a difficult situation.
00:50:48 As I explained, this embassy is surrounded by
00:50:51 a high-tech police operation, an intelligence operation. It's a really difficult situation. I haven't seen the sunlight in four and a half years.
00:51:03 It's a tough situation.
00:51:05 I'm tough.
00:51:06 But you know, you should be concerned about the situation.
00:51:12 And what we had
00:51:13 hoped is that those people concerned with my safety would direct their attention
00:51:24 to those people who are responsible for the situation: that's the UK government, the US government, and the Ecuadorian government.
00:51:35 Now, some of you did, and that's quite possibly why the Internet was put back on, because of that expression of concern. But
00:51:49 when the concern became very prominent, the result was a black PR campaign that tried to infest the concern and take it off somewhere else, and largely succeeded.
00:52:04 That was very interesting to watch and play out.
00:52:10 What happened?
00:52:13 Fabricated messages claiming to be from our staff were posted on 4chan, on Reddit, et cetera, and videos claiming to be from Anonymous, completely fabricated, dozens of them.
00:52:30 What was their intent?
00:52:34 What were they calling for?
00:52:36 They were calling for people to not trust WikiLeaks to not give it leaks and to not give it funds.
00:52:48 I mean, it's obvious who benefits from the production of such a black PR campaign, and it should be obvious in hindsight to all those people who were trying to support me that
00:53:04 those types of messages were
00:53:09 deliberately intended to undermine WikiLeaks and in fact undermine my support.
00:53:17 Just, if this sort of thing happens in
00:53:19 the future,
00:53:20 think to yourselves:
00:53:24 does what is claimed undermine
00:53:29 the ability for WikiLeaks to operate, the ability for it to get new information, and the ability of it to financially support itself?
00:53:40 And if the answer is yes, then you should be extremely skeptical about what the claim is.
00:53:48 OK.
00:53:52 But having seen how concern for us can be manipulated and misled,
00:53:59 but also the
00:54:00 degree of concern,
00:54:01 we now have a game plan
00:54:05 if this kind of thing happens again, and I am pleasantly confident
00:54:12 about the kind of worldwide support we can get if we get a similar type of attack again in the future.
00:54:22 And yeah, once again,
00:54:27 yeah, you can see that I'm speaking and
00:54:30 maybe apparently fine, but
00:54:34 don't reduce your concern.
00:54:36 I am in a difficult situation. That's the reality, but the difficulty of the situation is well expressed on justice4assange.com, the UN findings, etcetera. WikiLeaks itself, and also its staff, are in a difficult situation, constantly spied upon, harassed, et cetera.
00:54:57 So yeah, support us now.
00:55:00 Don't wait until.
00:55:04 We are in a difficult situation that might be difficult to get out of.
00:55:09 Make sure we're strong now going into difficult situations as a result of what we publish.
00:55:34 I'll just take two more questions.
00:55:40 I'll try and refresh to see if anything has has come up that's new.
00:56:01 There's the question at the top about collateral murder and supercomputer time.
00:56:09 I can't confirm or deny anything relating to our sources there, however.
00:56:20 Yes, there is a disappeared video, and that video is of the Garani massacre: over 80 children killed in a US airstrike in Afghanistan, and more than 100 people in total.
00:56:36 Quite a serious video. And you can, if
00:56:41 you search for
00:56:43 Assange affidavit, read an affidavit about how Sweden conducted an intelligence operation on September 27th, 2010 to seize three laptops,
00:57:02 not the higher-security laptops but backup laptops that were encrypted, which ended up being the only copy that we had of that video.
00:57:13 We had other copies, and they were also attacked.
00:57:17 So that's
00:57:24 a great sadness for us, that this terrible proof of a war crime has been
00:57:35 possibly lost to history as a result of very difficult attacks on us.
00:57:43 It's something that we're a lot less susceptible to now because we have.
00:57:47 A bigger infrastructure.
00:57:55 OK, I think, OK, well, the question on the Panama Papers. So, the claim repeated by, you know, the usual idiots in the ruling-class press that
00:58:14 WikiLeaks said the Panama Papers had been produced by the CIA, US
00:58:18 intelligence, to attack Vladimir Putin?
00:58:21 Absolutely not.
00:58:22 In fact, we explicitly stated that we did not believe that was so. The key journalists and newspapers who collected the Panama Papers, in Germany,
00:58:32 are our publishing partners.
00:58:35 So we knew about the story, and we aggressively promoted it.
00:58:40 However, the particular story that came out on Vladimir Putin, and which was pushed as the leading story in the Western press rather than issues related to, say, David Cameron or other Western figures coming out of the Panama Papers,
00:58:58 was funded and produced,
00:59:02 sorry, was funded, by USAID
00:59:08 and the Soros foundation.
00:59:12 And they fund an organization called OCCRP, which does sometimes good work,
00:59:19 but it's based, I think it's based in Maryland, and it focuses exclusively on negative
00:59:28 stories about Russia and the former Soviet states.
00:59:33 So you have a story on Vladimir Putin
00:59:38 produced by
00:59:41 an organization which exclusively focuses on Russia and the former Soviet states,
00:59:48 is based in Maryland, and whose only listed funders are USAID and the Soros foundation. That
01:00:03 is no model for integrity, and that's what we said: there's some good journalism, but this
01:00:09 is a difficulty, when you have negative stories about Putin being pushed forward and funded by the US government.
01:00:19 So we're trying to distinguish WikiLeaks' model of publication, where we're funded by our readers and not by
01:00:29 dodgy foundations or the US government.
01:00:33 And, you know, we don't
01:00:35 like it when we're in competition with an organization based in Washington, DC, funded by the US government, and readers should be able to distinguish
01:00:49 which source is more reliable to give you the truth: one that's funded by a government that is attacking another government, or one that's funded by its readers and has a track record of publishing everything,
01:01:07 if it has a
01:01:09 collection of material, eventually.
01:01:26 OK, there's some people saying I should,
01:01:33 because of the advances in technology in relation to video editing and audio, etc., that I should
01:01:40 try and do something that
01:01:44 establishes that what I'm saying
01:01:48 I'm saying now, as opposed to these questions having been planted and
01:01:53 answered some time ago.
01:01:55 Well, it's a,
01:01:56 I have to say it is a little
01:01:57 bit silly, uh,
01:02:03 not in relation to us being under pressure, we have been under a lot of pressure, but we're very good at resisting pressure, but
01:02:11 in relation to whether
01:02:14 I'm alive or kidnapped,
01:02:16 actually, it is a bit silly.
01:02:18 So if you look at people like John Pilger, for example, a long-term friend of mine who runs my defense fund and is a famously brave investigative reporter,
01:02:31 my lawyers, close friends,
01:02:34 people like Lauri Love, the Ecuadorian
01:02:37 government, if you think about the number of people who would actually have to conspire and the amount of work that
01:02:42 would have to be done
01:02:45 to produce these false images, there's too many.
01:02:48 That's a social proof, and to understand that, one needs
01:02:54 to look at the
01:02:58 costs and understand the costs involved in trying to
01:03:03 pull together all those people and trying to keep a lid on them, and engage in all this kind of fabrication technology, which does not yet exist, as far as anyone can tell, in a capacity to do what it would need to have
01:03:19 done, to do all that. That's the cost.
01:03:23 And then look for what benefit?
01:03:26 That's an interesting question.
01:03:29 But in thinking about real-time
01:03:32 proof of life, well, intellectually
01:03:35 the most interesting one is to take the most recent block in the blockchain, in the Bitcoin blockchain,
01:03:47 give the number and at least, you know, 8 digits or something of the hash,
01:03:53 and then maybe to spell out its hash by sign language.
01:03:58 That's very.
01:03:59 That's kind of intellectually entertaining.
01:04:06 What is the problem with it?
01:04:08 Well, let's see if I can get.
01:04:10 A recent hash.
01:04:11 But while it's intellectually entertaining.
01:04:16 The problem with it is this:
01:04:17 it's very complicated,
01:04:20 the underlying technology, and so it has, it has the same
01:04:25 flaw that sophisticated voting machines have, cryptographic voting machines, which is that the average person can't
01:04:35 understand whether the security claims are in fact borne out.
01:04:43 Now experts might be able to.
01:04:45 The average person can't, and so then you're back to a social proof:
01:04:48 does the average person trust the expert?
01:04:51 And then how do they know that those experts are really experts and haven't been compromised?
01:04:56 So in fact, while it's intellectually entertaining,
01:05:02 it's not at all, ah, a good type of proof of currency. I'll give one anyway. So this is block 445706,
01:05:16 and the hash is 178374F687728789CAA92ECB49.
01:05:50 OK, I think I made a mistake in
01:05:51 the block number,
01:05:54 which is just going to drive everyone crazy. So that's
01:05:59 block number 447506.
01:06:06 See, this is how you can tell it's real time, with mistakes.
01:06:11 Hash 178374F687728789CAA92ECB49.
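A sketch of the "proof of currency" exercise just performed by hand: quote the height and a few hex digits of the newest Bitcoin block, which cannot be known before it is mined. This assumes the public Blockstream block-explorer API is reachable; any block explorer, or a local node, would serve the same purpose.

    import urllib.request

    BASE = "https://blockstream.info/api"   # assumed public block-explorer endpoint

    def fetch(path: str) -> str:
        """GET a plain-text value from the block explorer."""
        with urllib.request.urlopen(BASE + path, timeout=10) as resp:
            return resp.read().decode().strip()

    if __name__ == "__main__":
        height = fetch("/blocks/tip/height")    # number of the most recent block
        block_hash = fetch("/blocks/tip/hash")  # its full hash
        # Quote the height plus the last 8 hex digits: the leading digits are
        # zeros by construction, so they prove nothing on their own.
        print(f"block {height}, hash ...{block_hash[-8:]}")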
01:06:25 OK, intellectually, intellectually entertaining.
01:06:27 You don't actually have to read out the whole hash number; maybe 8 digits or something, combined with the block number, would be enough
01:06:36 to show currency within a 10-minute or hour period, something like that. But actually the better way to show currency is
01:06:53 something that can be widely checked, is widely spread, and is unpredictable
01:07:00 before it happens. The best would be, you know, a few different natural disasters, maybe
01:07:08 a lot of
01:07:11 weather measurements.
01:07:34 And we needed,
01:07:44 the, yeah, the,
01:07:45 so otherwise you need something that's not easily predicted and which can be widely checked or was widely seen at the time.
01:07:57 And a good example of that is sports scores.
01:08:03 So, for example,
01:08:05 the New Orleans Pelicans versus the New York Knicks, 110 to 96;
01:08:15 Oklahoma 109 versus 94 Chicago; Dallas 92 versus 101 from Minnesota.
01:08:27 OK, so that can give you currency.
01:08:31 In terms of any future precedent,
01:08:39 if I disappear or someone else disappears,
01:08:44 the answer to whether we are OK and not under duress is given by two things, or should be given by two
01:08:51 things in future.
01:08:54 Number one,
01:08:55 by lawyers, friends,
01:08:58 by lawyers, publicly associated close friends, people who fund my defense campaign.
01:09:05 So let's look at those:
01:09:08 John Pilger, the Courage Foundation, people associated with it, my lawyers such as
01:09:18 Jennifer Robinson,
01:09:23 Margaret Ratner in the United States,
01:09:28 Melinda Taylor.
01:09:33 And the ability to do live interactive video where someone, even though theoretically they could be under duress, can interject in the stream quickly
01:09:50 to say such a thing, or
01:09:51 to, you know, give a variety of messages in a live way, where each one is not comprehensible at the time that it is said, but the last one, if you like, provides the conceptual key to decrypt them.
01:10:08 I'm not doing this now.
01:10:10 I'm not doing this now.
01:10:12 So yeah, I very much appreciate the support.
01:10:17 It had some good effect.
01:10:20 I think it probably contributed significantly to restoring my Internet.
01:10:28 A lot of.
01:10:28 That well-intentioned support was waylaid by a black PR campaign. So.
01:10:37 Don't let that happen again.
01:10:42 That's it.
01:10:43 Thank you, Reddit.
01:10:44 Thank you,
01:10:44 Redditors, for spending so much
01:10:48 time on our material.
01:10:50 Yeah, we're really, really happy with it.