Evidently, in Taiwan, for all the language models sponsored by the government, including the national academy, people insist that the models not just look like they're from Taiwan in the traditional characters they use, but that the vocabulary and the ideology also not conform to the Beijing standard. And so the red teaming and so on was all about trying to elicit PRC responses in a Taiwanese context. And then, of course, once this part is generated, it can be automatically verified. And GPT-3.5, which is just 20 billion parameters, we heard, fails habitually. Next slide.
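To illustrate what "automatically verified" could mean here, this is a toy sketch, not the speakers' actual pipeline: a check that flags generated text containing simplified characters where traditional characters are expected, using a small hand-picked character set. The character lists are illustrative assumptions; a real verifier would use a full conversion table (e.g. OpenCC) and also check vocabulary.

```python
# Toy verifier sketch (assumed, not the actual system):
# flag simplified-only characters in a model response that
# should be written in traditional characters.

SIMPLIFIED_ONLY = set("国马体们东语汉见")  # small illustrative sample
TRADITIONAL     = set("國馬體們東語漢見")  # traditional counterparts

def flags_simplified(text: str) -> list[str]:
    """Return the simplified-only characters found in `text`, in order."""
    return [ch for ch in text if ch in SIMPLIFIED_ONLY]

# A response that mixes in PRC-standard simplified forms gets flagged:
print(flags_simplified("台灣使用繁體字，但这里混入了简体国字"))  # → ['体', '国']
print(flags_simplified("繁體中文"))  # → []
```

Because the check is a pure string scan, it can run automatically over every generated response, which matches the idea that this part of the red teaming can be verified without human review.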