A new small language model (SLM) developed by AI researchers at Adobe Inc., working with teams at Auburn University and Georgia Tech, will let smartphone AI applications process documents directly on the device, with no cloud connection required. The model is called SlimLM, and the researchers have published its details on arXiv.
Efficient On-Device Processing
"Through thorough experiments on a Samsung Galaxy S24, we pinpoint the best trade-offs between the model's size (which varies from 125M to 7B parameters), context length, and inference time for effective processing on the device," the researchers write. "SlimLM is pre-trained on SlimPajama-627B and fine-tuned on DocAssist, our specially created dataset for tasks like summarization, question answering, and suggestions. Our smallest model shows great performance on the S24, while the bigger versions provide enhanced features within the limits of mobile devices."
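To see why model size is the central constraint here, a rough back-of-the-envelope estimate of weight storage illustrates the range the researchers tested. This is a hypothetical sketch, not code from the paper; it assumes 16-bit weights and ignores activation memory and quantization.

```python
# Rough memory-footprint estimate for on-device model weights.
# Hypothetical helper, not from the SlimLM paper; assumes 16-bit
# (2-byte) weights and defines 1 GB = 1e9 bytes.

def weight_footprint_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Approximate storage needed for model weights, in gigabytes."""
    return num_params * bytes_per_param / 1e9

# The parameter counts below match the range mentioned in the quote.
for label, params in [("125M", 125e6), ("1B", 1e9), ("7B", 7e9)]:
    print(f"{label}: ~{weight_footprint_gb(params):.2f} GB at 16-bit precision")
```

By this estimate, a 125M-parameter model needs only about 0.25 GB for its weights, while a 7B-parameter model needs roughly 14 GB, which is why the largest sizes push against the memory limits of a phone like the Galaxy S24.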
Future Availability
SlimLM isn't available to the public just yet, but that could change soon. With this technology, smartphones can harness AI's power locally to process documents without needing internet access. While cloud-based AI services can draw on far more computing power than apps running directly on a device, the key benefit of on-device processing is privacy: documents never leave the phone. Major players in this space, like Google, Apple, and Meta, have already launched similar applications, but those remain largely experimental and haven't reached widespread use. If the researchers achieve their goals, SlimLM could become the first widely adopted solution of its kind.
Source: Link