• Welcome to Taiwan. I’m glad to see a new organization formed here, the Doublethink Lab.

  • I understand that the topic today is about the media landscape, media competency-building, as well as misinformation, disinformation, and malinformation, the so-called MDM, and many other topics, however you want to frame them. We usually begin with a round of introductions, with people raising any topics they want to address.

  • Then we’ll just have a conversation based on the agenda that we crowdsource here.

  • I’m Audrey Tang, Digital Minister in charge of open government, social innovation, and youth engagement. As part of social innovation, we promote the work of social innovators who tackle the disinformation crisis, including the CoFacts project, Trend Micro, which is an antivirus company, and many civic tech groups, such as Doublethink Lab, that tackle these issues. On the other side, in open government, we also work with stakeholders of all kinds.

  • For example, there are a lot of automated scam bots on Facebook that send people private messages to sell iPhones or hard disks, and what they deliver to the door is a brick, literally a brick. [laughs]

  • This is now very cheap to do, because synthetic images, synthetic text, and synthetic chatbot technology are now so democratized that anybody can write such a scam bot, abusing the postal delivery system. For that, I’m the semi-ambassador to semi-sovereign entities – read: Facebook – to talk about how to tackle these issues together.

  • I’m both a bridge to the civic tech community to work with the government, and also with the multinationals to work with the government on specific issues concerning Internet governance. That’s my interest area and topic. Is it OK if we start from the left-hand side, perhaps?

  • Hi, I’m Fi from Doublethink Lab. It’s a new organization targeting disinformation.

  • We’re building tools to check it. I’m the lead of the social communication team. We’re now focusing on research into how people are affected by disinformation and how we can build counter-narratives.

  • Social effect and counter-effect. That’s a full-time job.

  • (laughter)

  • Is it better to use microphone?

  • No, it’s fine. This is a Sony top-of-the-line recorder.

  • (laughter)

  • Hi. My name is Adam King. I’m with IRI. I work in a Northeast Asia portfolio, which includes Korea, Mongolia, and anything that would have the word China involved with it. We’re about to receive a grant that looks at disinformation in Taiwan focusing on the elections. We will be working with Doublethink Lab, probably g0v. We’re just getting that up and running.

  • This is a great opportunity to come and be here as we’re getting that going.

  • Hi. I’m Dean Jackson with the International Forum at the National Endowment for Democracy. I’m a program officer for research and conferences there. My portfolio touches on anything having to do with disinformation, media, and technology.

  • Hi. I’m from an organization called the Association for Progressive Communications. It’s a network organization that deals with digital rights. We deal with a lot of issues, in Southeast Asia, South Africa, and Latin America. We’re thinking of having research done on Chinese influence.

  • Hi. I’m Felix Wu. I’m a faculty member in computer science at the University of California, Davis. I’ve been working mainly on social algorithms for social media systems, and on how social algorithms are being used by accounts such as the Electronic Army to try to spread disinformation.

  • By Electronic Army, you mean people coordinated by electronic means?

  • It could be people, it could be bots.

  • Also, automated bot networks programmed by people or, nowadays, programmed by programs: software-defined software. But yes, I see.

  • We’re making progress.

  • That’s my wife.

  • I’m Felix’s wife.

  • (laughter)

  • She’s coming with me.

  • Hi, I’m Dora, and I’m from Doublethink Lab. My main concern is disinformation campaigns around elections, because currently it is all about the election campaign. I really want to know: do we already have proof that Chinese disinformation is impacting Taiwan’s elections, especially next year’s election?

  • The next election, the coming election?

  • Yes, the coming election. How do we act? What role can an organization play to prevent this, or to educate people to know that disinformation is among us?

  • Two things. You talk about raising awareness, which is the same as the social effect part, but you’re also talking about attribution, like having data with academic rigor that shows disinformation campaigns being waged.

  • Hello, everybody. I’m from Doublethink Lab. I do the software development work to help the social communication team and our organization. I build software and custom tools. I participated in the MoeDict project, too.

  • That’s right. We worked on the crowdsourced dictionaries for indigenous languages, Taiwanese Holo, Hakka, and many other languages.

  • MoeDict places much less pressure on all the developers. All the developers are very happy, sharing food. Joy takes us further. I wonder if we can turn some of this into a more humorous direction.

  • There was a new party formed just a day or two ago that focuses on building counter-narratives, mostly built on jokes. [laughs] They’re professional comedians, three of them. They formed a new, happy party called “Can’t Stop This Party”.

  • (laughter)

  • Or 歡樂無法黨 in Mandarin. There may be some angle in that as well. Back in MoeDict, we made a lot of those kinds of jokes to attract people to crowdsource the dictionary. That may be an interesting angle. [laughs]

  • I’m Pablo Wang. I’m a journalist based in Hong Kong. Previously, I was working for an NGO on a project to collect, save, and analyze data from social media, from media. I think that this project could probably be quite useful for a lot of the actors here in this room.

  • Also, institutional media, not just social media, or just social media?

  • I’m a writer and a journalist. I’m also interested in information coming from China.

  • Hi, I’m Martha Ramos. I’m a journalist from Ecuador. We are interested also in what we call cyber wars.

  • Cyber wars, that’s something else altogether.

  • …disinformation, surveillance, cyber attacks, and all those phenomena. As journalists, we have sometimes been on the receiving end of those instruments. We are also interested in disinformation in society and how we, as journalists, have the possibility to prepare a counter-narrative to disinformation.

  • Journalists as subjects of attacks that take away people’s digital rights, including hard cyber attacks, and also journalists’ output as a way to help develop counter-narratives?

  • Hi, I’m Roger Potocki from the Europe team of the National Endowment for Democracy. We’re a grant-making institution. We’re funding a lot of the programs that are working on Russian and Chinese disinformation. I know mostly the European side of this. I’m trying to learn more about the Asian side of this.

  • I’m Miriam Kornblith, also with the National Endowment for Democracy. I oversee the Latin American/Caribbean program. Like Roger, I’m learning more about Asia. We are very interested in everything related to the information space, the pollution of the information space with all the different things that have been mentioned. That’s our priority for now.

  • Pollution is a great metaphor.

  • Pollution of the information space.

  • I’m Norwegian, and we primarily work to make the case in Norway, to explain what the possible consequences of surveillance can be. For example, we put up cameras and all these things and say it’s great, the technology can progress, but what is the consequence if you don’t protect people’s privacy?

  • Additionally, we’re working on a project with the Uyghurs. It’s collecting testimonies of people who are in the concentration camps.

  • Not just digital rights, but very basic civil rights or even human rights and technology that progresses very quickly in the negative direction. This is Joe.

  • I’m Joel from Minister Tang’s office, the PDIS. Actually, I’m from the Ministry of Foreign Affairs, Taiwan, and I’m the delegate here. I thought originally that I would work for her, but I found out that I’m working with her.

  • That’s right. The Foreign Service is one of the many ministries. Each ministry can send one delegate to my office. I understand there are like a dozen in the Foreign Service. Joe has been collaborating very well with me.

  • Me, I’m counsel in the Digital Minister’s office. I come from the National Communications Commission. My previous job was director of the Legal Department at the Commission. In this office, my first job is to improve the communication between the minister and the civil servants.

  • That’s right. Important to work across different languages.

  • (laughter)

  • The second is to make sure everything we are doing is legal. Sometimes, some legal thinking is needed.

  • If what we are doing is not illegal, but unprecedented, we work with the regulators to make it legal.

  • (laughter)

  • That’s the introductions. I think there are many topics on which we can begin a conversation. This central triangle between media, disinformation, and tech may be something that we are all focusing on. I’ll just share a little bit of my understanding of this triangle, and then maybe we can have a back and forth.

  • Both of my parents are journalists. My dad covered the Tiananmen protest up until the 1st of June. That was also the first time that digital photography was used, with fax over modem lines, to transmit the tank man and other pictures out.

  • Technologists, like the people working at Kodak, and the journalism community withheld the news about this new capability of a camera directly transmitting signals through telephone lines to the international community. At that time, the PRC was posting a lot of guards at all the borders to prevent film from being brought out of PRC territory.

  • They did not watch the telephone lines, and that is how many of the pictures went out and became journalistic contributions. It was not until later, around the end of 1989, that people learned that there is such a thing as digital photography that can connect directly to a phone line.

  • I use this anecdote to show the complex interplay between those different fields, and how journalists work essentially as a defense against any disinformation campaign. A disinformation campaign relies on the fact that propaganda, rumors, and other information pollution spread faster than genuine information: information with a source, with attribution, with evidence backed by academic rigor, and so on.

  • For me, democracy is never about running campaigns or elections, which is like game theory, trying to win a simple game, but rather about mechanism design: you start with the goal of having people share their personal experiences in order to build toward a common value. You start with the end goal and reverse-design the game so that all the players have incentives to play in a more pro-social way.

  • That’s my take on the tech part. Tech is never neutral. It amplifies whichever values the technologists imbue it with. I choose that direction as my guiding principle. I’m really happy that in Taiwan, we have a lot of people whose day job is in hardware technology, firmware technology, renewable energy, whatever.

  • They use their weekends and their after-overtime personal time to contribute to democracy by working on technologies in this direction of social innovation toward democracy, which nowadays we call civic tech. That’s my perspective on this matter.

  • I try to work with all the large social media platform companies – Facebook, Twitter, and friends – to try to convert them into companies with social purposes, or so-called social enterprises, that further this vision rather than any counter-vision. We are seeing new norms emerge this year where, instead of saying they are mere conduits, they have agreed that they are also doing governance.

  • We are talking, governance to government, as a way to align on shared values and standards, and to build those norms into code, rather than having the code dictate the social norms, which pollutes the media landscape and everybody’s cognitive space as well. That’s my basic principle. I don’t know which questions you’d like to ask. In any order, the floor is yours.

  • How has this interaction been with the companies? To what extent are the companies accepting this perspective? How willing are they to change their views?

  • Fi and I were both demonstrators in a certain demonstration five years ago. That demonstration is unique in that the demonstrators don’t see themselves as protesters only. They see themselves as literally people who demo, as showing a better alternative.

  • I remember around 20 NGOs, each focusing on one part of the Cross-Strait Service and Trade Act, the CSSTA, talking about aspects of the environment, of labor, of gender, of supply chain, agriculture. I can’t remember all 20, but you probably can.

  • The 20 NGOs each convened deliberative camps, literally camps around the Parliament, and they just had a conversation with anybody who cared to walk by, which on one day was half a million people.

  • They demonstrated that with a good mechanism, even half a million people on the street, and many more online, which is more connected to the part that I was doing at that point, could converge after three weeks on five demands, no less, on the Parliament.

  • The great thing is that the head of Parliament at the time accepted that consensus as the people’s will around the CSSTA. The Occupy was a victory. I use this anecdote to show that although it began with miscommunication from the government side about due process with the citizens, the citizens didn’t just storm the buildings to protest the bad process, although they did that, too.

  • They also did a valuable demonstration, so that the government could then say, we’ll just… In Taiwan we say harvest. We will just harvest the result of those demonstrations and make civic tech the foundation of gov tech. At the end of that year, everybody who supported such mechanisms became a mayor; some of them hadn’t even prepared an inauguration speech. People who did not support it did not become mayors.

  • I use this to show that in our experience, a viable demonstration is usually the best way to get large companies to change their direction, to change their minds. I personally participated in the original spam wars, where we coded software called SpamAssassin. It’s a very warlike metaphor, assassin, ninja, whatever.

  • In the end, the issue of spam was solved by this kind of demonstration, by people showing Hotmail, Gmail, and all the mail providers that it is to everybody’s benefit to have a “flag as spam” button in your inbox, and that it’s better if everybody works with Spamhaus instead of having only an in-house analysis team.

  • It’s for everybody’s benefit if the signatures from Spamhaus inform the algorithms, so the incoming spam goes to the junk mailbox rather than to the inbox. Nowadays, we call it burying the incoming spam messages so that spam is no longer economically viable.

  • This would not have happened had we not already developed browser extensions, Bayesian filtering, AIs, or whatever, and shown the multinational companies that they can, at very little cost, incorporate this new thinking in a way that creates a social benefit.
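
  • To give a concrete sense of the Bayesian filtering just mentioned, here is a toy sketch. It is only an illustration, not SpamAssassin’s actual implementation; the training messages, tokenization, and the junk/inbox decision rule are invented for the example.

```python
# Toy naive-Bayes spam filter: the "flag as spam" button supplies labeled
# examples, and new mail is buried in the junk box when the spam score wins.
import math
from collections import Counter

def tokens(text):
    return text.lower().split()

class NaiveBayesFilter:
    def __init__(self):
        self.word_counts = {"spam": Counter(), "ham": Counter()}
        self.message_counts = {"spam": 0, "ham": 0}

    def train(self, text, label):
        # Each press of the "flag as spam" (or "not spam") button is one call here.
        self.message_counts[label] += 1
        self.word_counts[label].update(tokens(text))

    def score(self, text, label):
        # Log-probability of the message under one class, with add-one smoothing.
        total = sum(self.word_counts[label].values())
        vocab = len(set(self.word_counts["spam"]) | set(self.word_counts["ham"])) or 1
        prior = self.message_counts[label] / max(sum(self.message_counts.values()), 1)
        logp = math.log(prior or 1e-9)
        for w in tokens(text):
            logp += math.log((self.word_counts[label][w] + 1) / (total + vocab))
        return logp

    def is_spam(self, text):
        return self.score(text, "spam") > self.score(text, "ham")

# Hypothetical training data standing in for crowdsourced "flag as spam" reports.
f = NaiveBayesFilter()
f.train("cheap iphone hard disk free delivery", "spam")
f.train("limited offer cheap iphone click now", "spam")
f.train("meeting agenda for the open government workshop", "ham")
f.train("notes from the civic tech hackathon", "ham")

print(f.is_spam("free iphone delivery click now"))  # True  -> junk mailbox
print(f.is_spam("agenda for the workshop"))         # False -> inbox
```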

  • We’re seeing more or less very similar things going on around the disclosure of advertisements, around work with the social sector and third-party fact-checkers, and around opening up the workbench that they provide to advertisers, such as CrowdTangle, to social scientists as well. I would argue it should open up to everybody else at some point, too. Twitter is going in that direction, Facebook maybe slower, and things like that.

  • They’re waiting for their judicial branch, the oversight board, to be formed, and things like that. That is where we are at the moment. Whenever we show there is a viable alternative, they are actually willing to listen and sign on to a counter-disinformation package.

  • If the norm is not yet set and we dictate something into law, it is harder, unless there is an existing law that already speaks very strongly to it, like spreading disinformation about diseases when an epidemic starts spreading. Of course, if we extend “speaking in public spaces to cause harm” to also cover digital spaces, nobody argues against that.

  • If there is a new law that doesn’t have an analogue equivalent, then there’s a lot more conversations and tensions with the multinationals. That’s the landscape.

  • Can I ask you, you talk about how journalists are one of the groups who can defend against disinformation, but I often see them as one of the worst culprits as well, because often, they will latch onto the disinformation and not do their own due diligence. They will then become a primary spreader of the disinformation and give it some legitimacy.

  • What guidance are you providing, or what efforts are you making, to educate journalists or to do anything to stop that sort of disinformation?

  • This issue is especially confused in Taiwan, because journalism, 新聞業, is the same word as news, 新聞. Imagine a kind of newspeak where journalism is just news work, and journalists are news workers. Then it becomes literally impossible to disambiguate when you hear the F word, “fake news.”

  • Whether it’s an attack on journalistic integrity, or whether it’s saying that random people are pretending to be journalists, or that journalistic standards are being lowered by editors who chase clickbait, and things like that, these are very different, non-overlapping issues. It’s difficult to address any one of them when the word is the same.

  • In Taiwan, we’re facing this issue because it’s the same word that’s used to discuss all these issues, so we have to make new words: I’m not talking about this, I’m talking about that. In Taiwan, we actually made a new legal definition that uses existing legal terms: intentionality; harm to the public, not just to the minister’s image, which is good journalism by the way; and untruth, knowing that it’s not true.

  • Any piece of communication, message, or information that fits all three criteria simultaneously is now legally defined as disinformation, or 假訊息, which doesn’t touch the word news. This has two benefits. First, exactly as you said, when institutional media produce a news item without due diligence, source checking, and fact checking, but just take a picture from social media, that more clearly turns a 訊息 into a 新聞. It’s very clear now that they are doing something that’s against journalistic integrity.

  • For that, the NCC has taken some actions on broadcast spectrum, which is public, and on cable TV, which uses public land and is within the NCC’s purview. The NCC did issue fines, penalties, rulings, and regulations about institutional media doing exactly that sort of thing. Then there is the other part you mentioned, where they create their own narratives. Simply taking a picture from social media and repeating the message is very quickly finable, because in their original license application they said they would have a fact-checking process.

  • If that process is shortened to five seconds, maybe there is some innovative technology, and I would want to know about it, but more likely than not, they are just cascading the message. That is very easily finable.

  • The other one, which is more insidious, is when, instead of just repeating what social media says, they intentionally create a narrative that fails journalistic standards by using clickbait titles and question marks, lots of question marks, to intentionally create the possibility of doublethink.

  • They can say that what they have printed is literally not wrong, but it creates a lot of opportunity for people, because phone screens are small, to take a screenshot of partial information, which would then fit the disinformation criteria, essentially creating weapons of mass confusion, destruction, distraction. That is the more insidious point.

  • The remixing for disinformation is done through A/B testing on online channels. Meanwhile, the ministries are posting clarifications. This process now, on average, finishes within an hour. Then you will see the popular disinformation package being quoted, but always in fewer than 20 characters, which say, “Perming your hair multiple times in a week will be subject to a one-million-dollar fine? It’s not true.”

  • The first image, showing the Premier as he looked when he was young, says, “I may be bald now, but I will not punish people like that for their hair,” with fine print saying, “What we have actually done is introduce labeling requirements for hair products that take effect in 2021.” This part must be fewer than 200 characters.

  • Finally, because it needs to include at least two pictures, the second image, showing the Premier as he looks now, says, “If you perm your hair many times, there’s no fine, but you will damage your health. When it’s serious, you may end up looking like me.”

  • (laughter)

  • It fits on the phone screen. If you click on it, of course, it leads to the fuller factual narrative, but even just the picture itself went viral. If you type “perm hair fine” into a search engine, Google Images just shows this image. It never shows the disinformation packages, because this is more viral, has more inbound links, so its PageRank is higher, and so on. It’s more relevant.

  • We have now shown – it’s been two years – that if the clarification is structured like this and published within this amount of time, almost always, I would say always, it gets more viral than the repackaging you just alluded to. People who have seen this and laughed will not share the disinformation package, because humor is a stronger emotion than outrage.
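
  • In code terms, the clarification format described above reduces to a few checkable constraints: a headline of fewer than 20 characters, an explanation of fewer than 200 characters, at least two images, and publication within roughly an hour of spotting the rumor. The sketch below is only my own illustration of those constraints, with made-up content; it is not an actual government tool.

```python
# Toy checker for the humorous-clarification format described above:
# short headline, short explanation, at least two images, published quickly.
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class Clarification:
    headline: str                               # under 20 characters
    explanation: str                            # the fine-print policy, under 200 characters
    images: list = field(default_factory=list)  # at least two pictures
    rumor_seen_at: datetime = None
    published_at: datetime = None

    def problems(self):
        issues = []
        if len(self.headline) >= 20:
            issues.append("headline must be under 20 characters")
        if len(self.explanation) >= 200:
            issues.append("explanation must be under 200 characters")
        if len(self.images) < 2:
            issues.append("needs at least two images (young Premier, Premier now)")
        if self.rumor_seen_at and self.published_at:
            if self.published_at - self.rumor_seen_at > timedelta(hours=1):
                issues.append("published more than an hour after the rumor")
        return issues

# Hypothetical example loosely based on the perm-fine rumor.
c = Clarification(
    headline="Perm fine? Untrue.",
    explanation="No fine for perming your hair; only labeling rules for hair "
                "products take effect in 2021.",
    images=["premier_young.jpg", "premier_now.jpg"],
    rumor_seen_at=datetime(2019, 11, 1, 9, 0),
    published_at=datetime(2019, 11, 1, 9, 45),
)
print(c.problems() or "passes the format check")
```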

  • I think I will just go with my experience about the Uyghurs. Maybe you know about the Xinjiang situation. It’s really horrible, and we see that technology is not always a good thing, or it can be misused in the future.

  • Of course. In the now.

  • Yeah. Coming to Taiwan, it was really creepy for me, because I see cameras all the places.

  • There is even a Sony recording.

  • The effect for me is a little bit different, because there are no labels saying there are cameras or surveillance. In the buses, in the taxis, in the street, all over the place we can see there are cameras. You don’t know the intention or the usage. Is there any platform that will protect the Taiwanese against misuse? Have you done some work? Can you elaborate?

  • Yes. Two things. The first one is, I hear you. Whenever I walk on the street, there are random people who take out their phones, rush to my side, and take selfies with me. It’s not just stationary cameras for me; there are any number of mobile cameras on the street wherever I walk. This really delineates a clear “this is a public space” feeling for people. It creates a psychological tension. This is true.

  • The second part, though, I want to say – maybe Ning has more to contribute – is that we are a jurisdiction seeking GDPR adequacy, almost at the finalization stage. Our Personal Information Privacy Protection Act, or PIPPA, is modeled quite closely on the European privacy law prior to GDPR. We are making the necessary adjustments according to GDPR.

  • For us, personal data is the beginning of a relationship. The relationship begins with the right to ask for a copy, to update, to delete, to start an accountability relationship with. While it is true that, for example, I’m recording this conversation, I’m required by PIPPA to say what the intended use is.

  • For anything that would be construed as misuse according to that stated purpose, I’m required to ask for your consent before I even press the record button. There is that side of it as well. There are many conversations nowadays specifically around facial recognition. It’s one thing to have a low-resolution camera that just measures the height of the water during a flood.

  • We have a lot of camera angles for that particular purpose, because we are an island of resilience, meaning that water flows easily. Instead of just making floods not happen, we have a lot of cameras designed to make floods go away very quickly, and we need that information, but those cameras must never do facial recognition, which would be use outside the original purpose and scope.

  • There is a whole governance structure overseeing that, with dedicated personnel, including cyber security personnel, and also a privacy protection office in the National Development Council. Each ministry also oversees privacy invasions and misuse in all the different commercial and noncommercial entities that it is the ministry in charge of, so there are two layers of oversight.

  • That still doesn’t convince everybody. Actually, there are many conversations right now around cameras in public places such as railroad stations. Not only can they not be used outside their original design purpose; perhaps we shouldn’t install them in the first place.

  • We should use more IR or other technologies that can achieve the same thing of detecting anomalies, people stepping onto the railway track trying to commit suicide and things like that, but without even the chance of being abused. Even if we say we don’t keep a log, and the independent auditor says, yeah, they don’t keep a log, they don’t even have a hard disk.

  • It doesn’t really convince everybody. The easiest way is just to say this is not a camera. This is just a motion detector. It detects when somebody crosses into the track. We understand that this is now bringing bad associations, even in light of regulations and control mechanisms.

  • We’re also looking into different devices that are also clearly marked, but that measure only the things needed for very narrow purposes that could otherwise be abused, rather than being general-purpose cameras. That’s my take, but Ning may have a different take.

  • (laughter)

  • Just to say, your selfie or photo taken by others, that is a different case because you are a public figure.

  • I see… I’m a celebrity?

  • (laughter)

  • It’s not the same for ordinary people. The second point, as Audrey mentioned: a camera or a recording in a public space is, as a basic matter of law, allowed, because it’s the public domain. But if we use the video or picture to identify someone or to… First of all, that would be doing something more than the original purpose, such as traffic or, I don’t know, maybe safety, and that is not allowed in that case.

  • So, statistical purpose.

  • Unless you have, for example, a criminal case, and a prosecutor or the police want to collect videos or pictures taken by others to investigate that specific case; by law, that is allowed.

  • I just want to say that we make the recorder so large, and the video cam that you saw so prominent, and the motion detectors for people crossing over the tracks, assuming the railway station is going to switch to those, are going to be very visible, because these are required by law to be visible, to clearly indicate that this is measuring data about…

  • Facial recognition for a wanted criminal. It’s good to hear about this. I’m coming from Norway, and Norway is really trying to keep this privacy thing. There was actually one interesting incident, issue, or whatever. You know these cameras that register the passing of cars… We have tolls as you get close to the city, in order to protect the environment.

  • Some of the plates are not read automatically. It was revealed, and it was really shocking, that the car plate information had been sent to China for manual verification of the plates, just to verify it’s the correct number. So even when data is collected under a regulation, it can still be sent to a different party that is maybe not a good place to send it to. How can you…?

  • In the GDPR worldview, you classify the world into adequate places, where if you send your data there, it will not create toxicity or pollution, and non-adequate places, where the individual data processors have to prove that they are adequate even though they are in a desert of non-adequate laws, regulations, and markets.

  • Because we are in the process of finalizing our GDPR adequacy, we understand the general requirements of having a data controller. But even the EU itself doesn’t have a good idea of when things are jointly controlled, like in a civic tech project where people donate their own air quality measurements.

  • That’s my favorite example, because it’s a controllership of, what, 2,000 or 3,000 super nodes, whereas in the Sunflower movement it was 20 or so super nodes. Each NGO, each vote, right? In Hong Kong, there are, I don’t know, 10,000 super nodes. When the joint data controllers number that many, it’s very unclear whether what you said can still be applied.

  • We cannot independently audit all 10,000 data controllers to make sure they’re not sending data to PRC territory or other non-adequate territory. Some other mechanism needs to be designed, both in law and in algorithm, to make joint data controllership work, for crowdsourcing to work, without over-relying on just a single institutional entity, a model that worked only because data collection and processing used to be expensive.

  • Nowadays, it’s very cheap. It’s an open problem when everybody can use a mobile phone. It’s not just celebrities; it’s also cute children. We really need to empower children to require deletion of their pictures once they are out there. The point is that the norms around this are evolving, but we at least know that the adequacy criteria are a good touchstone against which to test a specific case like the one you mentioned.

  • Imagine this not as surveillance but as sous-veillance: when everybody keeps everybody else’s GPS or tracking positions, and self-driving cars keep each other’s positions, that becomes joint data controllership, and that’s the next horizon we’re currently intently working on.

  • I don’t know if I understood well, but you said there were a lot of people from the government overseeing that these cameras were used properly, but is there nobody from another branch of power keeping control, like the legislature or something like that? In countries like mine, if you say that the executive or some ministry is going to control that, it’s like saying nothing. You have to…

  • We have a branch just called the control branch. It’s independent of the legislature, the administration, and the judiciary. It’s another branch. The control branch handles, for example, campaign donations at the moment. Those are sent to the control branch. Nowadays they publish the unstructured data and the raw data for people to analyze.

  • It used to be that they did all the data analysis to detect corruption and so on themselves, exactly as you said. Otherwise, if it were controlled by legislative investigators, it would pay exponentially to bribe them, because once they get into the legislative chamber, they would then cancel the investigation into their own election.

  • Having this control branch as a separate branch with independent budget and independent personnel is a constitutional design that specifically guards against this kind of abuse. That’s the first thing. There is currently serious talks about making the Control Yuan also the national branch for human rights enforcement.

  • That would then expand it beyond just public places and public politicians to any human rights abuse that occurs within Taiwan’s territory. That’s the idea.

  • May I follow up on that question but from a different angle?

  • You’re talking about the Taiwanese government working with international media companies, I guess for instance Facebook or Google.

  • Twitter, Google, Line.

  • It’s also a fact, and something I, as a Chinese journalist, can probably add a line to, that the Chinese government controls basically all media in China, and the Chinese government has access to all the back doors of Chinese social media companies including, of course, Weibo, of course, WeChat, etc., etc.

  • When they don’t, they change the chair.

  • Yeah. What is the legal relationship? How do we deal with these companies in Taiwan?

  • You mean how do we deal with?

  • With Chinese companies, Chinese tools, and Chinese media in Taiwan.

  • (laughter)

  • That is user choice. If you choose to use WeChat, that means you give your privacy or personal data to the PRC government.

  • I’m talking more about, for instance, Weibo. We know that people go on Weibo. People go on Taobao to buy stuff. There is a promotion on my phone on Line, not WeChat, Line, telling me that there is a special package, to be paid on delivery. What is the legal framework for these kinds of exchanges?

  • That is exactly what I said. If you buy something in Taobao or Alibaba online…

  • Or on LINE, actually…

  • You have given your personal data, your privacy, to that company. That is the same story on WeChat. Furthermore, you also give your personal data and the privacy to the Chinese government. For me, that’s a user choice, but maybe the government duty is to raise the public awareness of that.

  • Let me rephrase it this way. It’s not so much the privacy part, but mostly the media literacy and disinformation part. You get a lot of information from social media. My parents get a lot of terrible fake news from WeChat, but they don’t know it’s from WeChat, because they’re using Line. Or they get it from their friends who forward it.

  • For example, there is the disinformation case we have all talked about: the Kansai Airport case. The fake news actually originated on Weibo. Someone saw it on Weibo and then posted it on PTT, and that became the headline the next morning. It’s actually a chain of disinformation, and we’re affected.

  • I know there probably is not much that we can do about this within freedom of speech, but it’s a serious problem that the government should be aware of.

  • I think there is plenty that we can do to protect freedom of speech. That’s why the Public Digital Innovation Space exists. The thing is that what you’re saying is right. The shortcut, the easy way, especially for our nearby jurisdictions but also for jurisdictions around the world, is passing a law saying a minister’s word is better than a journalist’s word.

  • Then a minister can force take-downs, apologies, changes of language, shaming the journalist. In pretty much all the jurisdictions that are looking at a more top-down way of controlling the media in the first place, disinformation is a perfect excuse to infringe on journalistic freedom.

  • It is so perfect now that, according to the CIVICUS Monitor, we are the only one among our nearby jurisdictions that is not backsliding because of this disinformation crisis. Japan is doing OK, but for everybody else, excluding, of course, Australia and New Zealand, all the way to East Africa, this is providing a perfect excuse for top-down governance to become even more top-down when it comes to freedom of journalism.

  • If we don’t do anything, our nearby jurisdictions will think that the only viable way is to infringe on journalistic freedom. If we develop useful, effective ways that build more trust in the journalistic community, then our nearby jurisdictions will take notice. We just ran a workshop at Chulalongkorn University in Thailand.

  • Their media people, like PBS – that day I had like seven interviews – and all the journalists, despite their original outlines, expressed a strong preference for the Taiwan model to prevail, because they want to be honored as people in the journalistic profession. There is plenty that we can do. There’s a more philosophical point I want to make: the clarifications are more on the government side.

  • The social sector, which in people’s minds in Taiwan is more trustworthy than the government anyway – the social sector with higher legitimacy – is actually driving most of the coping mechanisms when this kind of message comes.

  • The TFCC in particular, the Taiwan FactCheck Center, said in a public interview that if they had been set up a bit earlier, with a bit more funding and staff at the beginning, they could have prevented that airport issue, because they were already investigating it when the PTT post happened.

  • They were calling the Japanese airport, but they were waiting for a Japanese interpreter, I think, to translate the response from the airport. It took them quite a few days, and then the suicide of our Foreign Service officer happened. Then after, I don’t know, 40 hours or something, TFCC published their clarification. It’s a race against time.

  • If they had published 72 hours earlier, they probably could at least have prevented the suicide. Of course, not the disinformation, but the suicide.

  • If they had had the algorithmic pathway into Facebook that they do now, they could also have made everybody who saw this information on Facebook see a mandatory related link to the clarification report, and buried this disinformation so deep that you would have to scroll for two hours to see it, to reduce its virality and therefore the pressure on the Foreign Service.

  • If this algorithmic co-governance had been in place at that time, I wouldn’t say it would have prevented the suicide, but it would have massively reduced the likelihood of it leading to a Foreign Service suicide. That’s the social sector intervention. There is a lot to do in this regard.

  • It’s in our best interest to make sure that every platform provides analyzable information on its advertisements, bans foreign-sponsored propaganda, and so on, so that people working on the workbench, such as Doublethink Lab, can provide quantitative evidence to the qualitative, TFCC-like communities, to very quickly empower them to publish counter-narratives.

  • This is a very good example. This is File Reader Plus. They run a public, open source GitHub data analysis on the raw data published by the Control Yuan to find political contributions and how they connect to international as well as inter-jurisdictional commercial networks. This is a good example of investigative data journalism. The same team I think is now teaming up with a lot of institutional media.

  • You probably already know this, but I will bring it up anyway: the presidential fact-check project. They ran a fact check around the presidential election, which interestingly is a combination of institutional, social, and crowdsourced media, where they ask people to type in all the transcripts of the presidential candidates’ utterances in public media.

  • After they are typed into transcripts, they then turn their attention to tagging, labeling the different parts of the sentences as perhaps making an overclaim or perhaps having no factual basis. This part is also crowd-contributed. The point here is that, at the end, it is the institutional media that reviews the claims. “Reviews,” I think, is the key word to use.

  • It’s the institutional media that finally looks at those crowdsourced reports from… This is the crowd, a large crowd, [laughs] and these are the crowd contributions. Sorry, I was browsing with Firefox Focus which, by default, hides all images to protect my emotional landscape.

  • (laughter)

  • This is a better way. These are the institutional media that look at all those individual claims and then use their expertise to do journalistic source checking and fact-checking on the claims of all the presidential candidates. For example, Dr. Tsai Ing-wen claimed that “Ohashi Mitsuo visited Taiwan 14 times,” but he actually visited 19 times.

  • That’s, so far, the only piece of nonfactual information that Dr. Tsai has said.

  • In any case, that is what I mean by the institutional media working with the social sector to empower the social sector and, along the way, getting more people interested in the hobby of fact-checking, becoming kind of part-time journalists.

  • They then contribute to the veracity of the institutional media that participate in this kind of collaboration, which is, I think, a viable defense against disinformation, while empowering, not belittling, institutional journalists.

  • Any other questions? Thoughts?

  • I can comment on the Weibo thing.

  • As you mentioned, it’s like you have a choice, but for Weibo, it’s not a choice. You just have to, in order to keep your content. We’re aware that the [laughs] information can be taken from there and misused against us, because there has been a lot of harassment, because they are using robo-calls and other technologies. What would be your suggestion in this case for us?

  • You said you cannot use non-Weibo software?

  • Because it’s forbidden to use WhatsApp or any kind of alternative in Xinjiang. You can only use WeChat if you want to contact your family members and loved ones.

  • Is it currently illegal to send encrypted messages over Weibo? I understand the cryptography act is being enacted…

  • In Weibo, it’s a little bit different. In Weibo, you’re disabled. You can’t write. You can’t comment. You can just read.

  • Yeah, it’s reading.

  • There’s no way to send messages?

  • Yeah, for the accounts of Weibo, not all.

  • So your question is in a read-only platform, that is the only platform that you’re legally allowed to use…

  • WeChat and Weibo. In Weibo, you have just reading.

  • In WeChat, of course, everything is controlled and all the information is checked by the Chinese government. What would be your suggestion?

  • I’m now talking as a technologist, not as digital minister. In my previous life…

  • (laughter)

  • …I worked with the localization team of Freenet and also many related technologies. Nowadays, I think about it, that’s how adversarial training on both sides, that’s how Great Firewall became so complex because we keep pushing to make it grow, I guess.

  • Back in the day, we would primarily rely on steganography, like hiding messages in images, hiding messages in words. We would then use end-to-end encrypted tools such as, at that time, Pretty Good Privacy, or OpenPGP. Nowadays, there are many lighter alternatives; I will not get into technical details.

  • Once we have met face to face and exchanged our keys, it becomes possible to send you secure messages that look like a harmless picture or a harmless message, but that you can decrypt on your end into the original message that I’m trying to send. At that point, the GFW has no way to do content-layer deep learning or deep packet inspection, so that’s a way to masquerade circumventing traffic as legal traffic, as normal traffic.

  • One has to rely on more creative ways in other parts of Internet protocol, though. It’s not clear that this will continue to hold after IPv6. That’s another thing to watch.
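
  • As a rough sketch of the combination of end-to-end encryption and steganography described above: this example uses a modern symmetric cipher (Fernet) plus least-significant-bit embedding instead of OpenPGP, purely for illustration, and it assumes the key was exchanged face to face beforehand. It is a teaching toy, not a vetted circumvention tool.

```python
# Toy sketch: encrypt a message with a pre-shared key, then hide the ciphertext
# in the least-significant bits of an ordinary-looking image (steganography).
# Requires: pip install cryptography pillow. Illustrative only, not a secure tool.
from cryptography.fernet import Fernet
from PIL import Image

def embed(img, data: bytes) -> Image.Image:
    # Prefix with a 4-byte length, then write one bit per colour channel.
    payload = len(data).to_bytes(4, "big") + data
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    out = img.copy()
    pixels = out.load()
    w, h = out.size
    assert len(bits) <= w * h * 3, "cover image too small for this payload"
    idx = 0
    for y in range(h):
        for x in range(w):
            r, g, b = pixels[x, y]
            channels = [r, g, b]
            for c in range(3):
                if idx < len(bits):
                    channels[c] = (channels[c] & ~1) | bits[idx]
                    idx += 1
            pixels[x, y] = tuple(channels)
    return out

def extract(img) -> bytes:
    # Read every LSB, then use the 4-byte length prefix to cut the payload.
    pixels = img.load()
    w, h = img.size
    bits = []
    for y in range(h):
        for x in range(w):
            for value in pixels[x, y][:3]:
                bits.append(value & 1)
    def to_bytes(bitseq):
        return bytes(
            int("".join(map(str, bitseq[i:i + 8])), 2)
            for i in range(0, len(bitseq), 8)
        )
    length = int.from_bytes(to_bytes(bits[:32]), "big")
    return to_bytes(bits[32:32 + length * 8])

# Pre-shared key, exchanged face to face (hypothetical).
key = Fernet.generate_key()

secret = Fernet(key).encrypt(b"meet at the usual place, 8pm")
cover = Image.new("RGB", (128, 128), color=(200, 180, 160))  # stand-in for a harmless photo
stego = embed(cover, secret)
stego.save("holiday_photo.png")        # looks like an ordinary picture

recovered = extract(Image.open("holiday_photo.png"))
print(Fernet(key).decrypt(recovered))  # b'meet at the usual place, 8pm'
```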

  • In your case, then, it depends on whether it is illegal to do steganography, because if it is, all of these suggestions I’m making would be putting you at risk. From what I understand of the cryptography law that was just being considered, or has now passed, in the PRC, the behavior I just mentioned might actually become illegal, so I am not making any technical recommendations now.

  • I know it’s not a satisfying answer… It’s not a satisfying network out there.

  • No more questions? Are we good? Cheers. Thank you for visiting.