
Bringing AI to the iPhone? Apple's latest paper offers a breakthrough solution

An "Apple GPT" in your pocket? That may become a reality sooner than expected.
Apple artificial intelligence (AI) researchers recently published a paper on the preprint site arXiv describing an innovative flash-memory technique for deploying large language models (LLMs) on the iPhone and other memory-constrained Apple devices, a potentially major breakthrough.
Memory constraints
Chatbots built on LLMs, such as ChatGPT and Claude, are heavily dependent on data and memory: they process large amounts of data at once and typically need a great deal of memory to run.
Running an LLM is therefore a challenge for devices like the iPhone, whose DRAM (that is, main memory) capacity is limited.
The standard approach is to load the model's data from flash storage into DRAM and then perform inference in DRAM.
High-performance DRAM is far faster than flash, but its downside is capacity: keeping everything in DRAM severely limits the maximum model size that can be run.
To address this, Apple's researchers developed a new technique that stores the AI model's data in the much larger flash memory and transfers it into DRAM only when it is needed for processing.
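To make the general pattern concrete, here is a minimal, hypothetical sketch (not Apple's implementation): the full weight matrix stays in "flash", simulated below with a memory-mapped file, and only the rows a computation actually touches are pulled into a small DRAM-resident cache. The sizes, file name, and eviction policy are illustrative assumptions.

```python
# Minimal sketch of on-demand weight loading (illustrative only, not Apple's code).
# The "flash" copy of a layer's weights lives in a memory-mapped file; a small
# dict plays the role of the DRAM-resident cache.
import numpy as np

ROWS, COLS = 4096, 4096        # hypothetical layer size
DRAM_BUDGET = 512              # max number of weight rows kept in DRAM

# Create a dummy flash-resident weight file for the sketch.
flash_weights = np.memmap("layer0.bin", dtype=np.float16,
                          mode="w+", shape=(ROWS, COLS))

dram_cache = {}                # row index -> row currently held in DRAM

def get_rows(needed):
    """Return the requested weight rows, reading cache misses from flash."""
    for r in needed:
        if r not in dram_cache:
            if len(dram_cache) >= DRAM_BUDGET:
                dram_cache.pop(next(iter(dram_cache)))   # naive eviction
            dram_cache[r] = np.array(flash_weights[r])   # one flash read
    return np.stack([dram_cache[r] for r in needed])

# Example: a computation that only touches a handful of rows.
block = get_rows([3, 17, 42])
print(block.shape)             # (3, 4096)
```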
Storing AI on flash memory
In the new research paper, titled "LLM in a Flash: Efficient Large Language Model Inference with Limited Memory", the authors point out that flash storage is far more abundant in mobile devices than the DRAM traditionally used to run LLMs.
Their method cleverly sidesteps the capacity limit. The paper proposes two key techniques that minimize data transfer from flash and maximize read throughput:
The first is called "windowing" and amounts to a form of recycling: instead of loading new data for every token, the model reuses parameters it has already brought in for recent tokens, which reduces constant fetches from memory and makes the process faster and smoother (a rough sketch of this idea follows after the next paragraph).
The second is called "row-column bundling". It groups data more effectively by laying out related rows and columns according to how flash memory is best accessed, so that data can be read from flash in larger contiguous chunks, speeding up the model's ability to understand and generate language (a sketch of this layout also follows below).
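As a rough illustration of the windowing idea, the sketch below keeps parameters resident in DRAM for neurons that were active over the last few tokens and only fetches the ones that newly become active, freeing those no token in the window still uses. The window size, the `step` function, and the active-neuron sets are simplifications for illustration, not Apple's code.

```python
# Hedged sketch of the "windowing" idea (illustrative, not Apple's implementation):
# keep parameters for neurons active over the last WINDOW tokens in DRAM, and
# only touch flash for neurons that newly become active.
from collections import deque

WINDOW = 4                                  # hypothetical sliding-window size
recent = deque(maxlen=WINDOW)               # active-neuron sets, one per token
resident = set()                            # neuron ids currently held in DRAM

def step(active_now):
    """Process one new token's set of active neuron ids."""
    recent.append(active_now)               # the oldest set falls out automatically
    still_needed = set().union(*recent)     # everything any token in the window uses
    to_load = still_needed - resident       # only these incur flash reads
    to_free = resident - still_needed       # safe to drop from DRAM
    resident.difference_update(to_free)
    resident.update(to_load)                # placeholder for the real flash I/O
    print(f"token step: loaded {len(to_load)}, freed {len(to_free)} neuron rows")

# Example: successive tokens activate overlapping neuron sets, so most
# parameters are reused instead of being re-read from flash.
step({1, 2, 3}); step({2, 3, 4}); step({3, 4, 5})
```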
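A similarly hypothetical sketch of the row-column bundling layout: for each feed-forward neuron, its column in the up-projection and its row in the down-projection are stored back to back, so a single contiguous flash read retrieves both. Dimensions, file names, and helpers are made up for the example.

```python
# Hedged sketch of "row-column bundling" (illustrative layout, not Apple's code):
# neuron i's up-projection column and down-projection row are stored contiguously,
# so one sequential read from "flash" (a memory-mapped file here) fetches both.
import numpy as np

D_MODEL, D_FF = 8, 32                                     # toy dimensions
up = np.random.randn(D_MODEL, D_FF).astype(np.float32)    # column i belongs to neuron i
down = np.random.randn(D_FF, D_MODEL).astype(np.float32)  # row i belongs to neuron i

# Bundle: concatenate each neuron's column and row into one fixed-size record.
bundled = np.concatenate([up.T, down], axis=1)            # shape (D_FF, 2 * D_MODEL)
bundled.tofile("ffn_bundled.bin")                         # flash-resident layout

flash = np.memmap("ffn_bundled.bin", dtype=np.float32,
                  mode="r", shape=(D_FF, 2 * D_MODEL))

def read_neuron(i):
    """One contiguous read returns both halves of neuron i's weights."""
    record = np.array(flash[i])                           # single sequential chunk
    return record[:D_MODEL], record[D_MODEL:]             # up column, down row

up_col, down_row = read_neuron(5)
print(up_col.shape, down_row.shape)                       # (8,) (8,)
```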
According to the paper, the combination of these methods allows models up to twice the size of the iPhone's available DRAM to be run. Under this approach, inference is 4-5 times faster on the CPU and an impressive 20-25 times faster on the GPU than with traditional loading methods.
The paper's authors write: "This breakthrough is particularly crucial for deploying advanced LLMs in resource-limited environments, thereby expanding their applicability and accessibility."
Apple's AI Strategy
This breakthrough in AI efficiency opens up new possibilities for future iPhones, such as more advanced Siri features, real-time language translation, sophisticated AI-driven photography, and augmented-reality capabilities.
The technique described in the paper also lays the groundwork for the iPhone to run complex AI assistants and chatbots on the device itself, something Apple is reportedly already working on.
Apple's work on generative AI may eventually be folded into its voice assistant, Siri. Apple presented its large language model work to employees at an internal AI summit in February this year. According to earlier media reports, Apple's goal is to launch a smarter version of Siri that is deeply integrated with AI.
There are also rumors that Apple plans to add AI to as many of its own applications as possible.
In addition, Apple is reportedly developing its own generative AI model, "Ajax", said to run on 200 billion parameters, to compete with OpenAI's GPT-4.
Known internally as "Apple GPT", Ajax aims to unify machine learning development across Apple, underscoring the company's broader strategy of weaving AI more deeply into the Apple ecosystem.
According to the latest reports, Ajax is considered more capable than the early ChatGPT 3.5. However, OpenAI's newer GPT-4 model may already have surpassed Ajax's capabilities.
Apple supply-chain analyst Jeff Pu has said that Apple will launch some kind of generative AI feature on the iPhone and iPad around the end of 2024, as part of iOS 18. Pu also said that Apple was building hundreds of AI servers in 2023, with more to come in 2024.