Yeah. So, of course, many of them do not have access to these full-size GPUs, which are very expensive and also energy-intensive. But I’m quite happy that recently the language model and other generative communities on Hugging Face, along with Stanford Alpaca, RedPajama, and many other teams, have worked on so-called quantized forms of these models that can easily run on a Raspberry Pi or similar-sized hardware, which is much more affordable to people in the Global South. That’s also one part where Taiwan can help, because we can also produce inexpensive chips dedicated to running this kind of generative model in an energy-efficient way.
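As a minimal sketch of what running such a quantized model on small hardware can look like, assuming the llama-cpp-python bindings and a locally downloaded GGUF-quantized checkpoint (both illustrative choices, not anything named in the remarks above):

```python
# Minimal sketch: running a 4-bit quantized model on CPU-only hardware
# (e.g. a Raspberry Pi-class device) with llama-cpp-python.
# The model path is a placeholder; any GGUF-quantized checkpoint
# (for example, a quantized Alpaca- or RedPajama-style model) works the same way.
from llama_cpp import Llama

llm = Llama(
    model_path="models/example-model-q4_0.gguf",  # hypothetical local file
    n_ctx=2048,   # context window size
    n_threads=4,  # small-board CPUs typically have around 4 cores
)

# Run a short completion entirely on the local CPU.
output = llm("Q: Why does quantization help small devices? A:", max_tokens=64)
print(output["choices"][0]["text"])
```

The point of the sketch is that the quantized weights fit in a few gigabytes of memory and need no GPU at all, which is what makes this class of hardware viable.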