Your phone: a future customer of artificial intelligence?
I’ve spent some time thinking about these columns and about what tomorrow will bring. Their purpose is to offer a glimpse of future potential. Today I want to return to a term I introduced briefly in an earlier column and go a step beyond it. Today’s thesis, central claim, or whatever you want to call it, is not a prediction; it’s just a hypothesis: artificial intelligence will reshape the device in your pocket. Or, to put it more directly: the cell phone is a future customer of artificial-intelligence systems.
First of all, at least with their current processing power, mobile phones cannot run a large language model locally. That takes serious GPU or CPU throughput and a great deal of memory, more memory than most phones have. A large language model with 7 billion parameters, i.e., 7 billion values learned when the model was built, needs roughly 13 GB of storage just for its weights. Today’s largest phones may carry plenty of flash storage, but the memory available to a single app falls far short of that. And you want to avoid constantly swapping pieces of the model in and out of memory while using it on your phone; you would regret that very quickly. Instead, today’s major language models serve their customers through client apps, and many companies have published such clients in the various app stores. You download an AI client, run it on your phone, and the model itself runs elsewhere.
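The 13 GB figure follows from simple arithmetic. A minimal sketch, assuming 16-bit (2-byte) weights, a common format for deployed models (the function name and exact byte width here are illustrative, not taken from any specific model):

```python
def model_size_gib(num_params: int, bytes_per_param: int = 2) -> float:
    """Estimate the memory needed just to hold a model's weights, in GiB.

    Assumes each parameter is stored as a 16-bit (2-byte) value;
    8-bit or 4-bit quantization would shrink this proportionally.
    """
    return num_params * bytes_per_param / 2**30  # bytes -> GiB

# A 7-billion-parameter model at 2 bytes per weight:
print(f"{model_size_gib(7_000_000_000):.1f} GiB")  # ~13.0 GiB
```

Quantizing the weights to 4 bits would cut that estimate to roughly a quarter, which is why heavily compressed models are the usual route when anyone does try to run one on-device.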