ArXivIQ

MemOS: A Memory OS for AI System

Jul 10, 2025

Authors: Zhiyu Li, Shichao Song, Chenyang Xi, Hanyu Wang, Chen Tang, Simin Niu, Ding Chen, Jiawei Yang, Chunyu Li, Qingchen Yu, Jihao Zhao, Yezhaohui Wang, Peng Liu, Zehao Lin, Pengyuan Wang, Jiahao Huo, Tianyi Chen, Kai Chen, Kehang Li, Zhen Tao, Junpeng Ren, Huayi Lai, Hao Wu, Bo Tang, Zhengren Wang, Zhaoxin Fan, Ningyu Zhang, Linfeng Zhang, Junchi Yan, Mingchuan Yang, Tong Xu, Wei Xu, Huajun Chen, Haofeng Wang, Hongkang Yang, Wentao Zhang, Zhi-Qin John Xu, Siheng Chen, Feiyu Xiong
Paper: https://arxiv.org/abs/2507.03724
Code: https://github.com/MemTensor/MemOS

TL;DR

WHAT was done? The paper introduces MemOS (Memory Operating System), a framework that treats memory in Large Language Models (LLMs) as a first-class, manageable system resource. It unifies the management of three disparate memory types: plaintext (external documents), activation (KV-cache and hidden states), and parameter (model weights), all represented through a standardized unit called the MemCube. Each MemCube encapsulates not only the memory content but also metadata for provenance, versioning, and access control. MemOS is built on a three-layer architecture that provides OS-like services, including MemScheduler for dynamic memory loading, MemLifecycle for state management, and MemGovernance for security and compliance.
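
To make the MemCube abstraction concrete, here is a minimal sketch in Python (my own illustration, not the authors' code from the linked repository) of one plausible shape for such a unit: a memory-type tag plus the provenance, versioning, and access-control metadata the paper describes. All class and field names are assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum
from typing import Any


class MemoryType(Enum):
    PLAINTEXT = "plaintext"    # external documents, retrieved passages
    ACTIVATION = "activation"  # KV-cache entries, hidden states
    PARAMETER = "parameter"    # model weights or adapter deltas


@dataclass
class MemCube:
    """Hypothetical unified memory unit: content plus governance metadata."""
    memory_type: MemoryType
    payload: Any                    # the memory content itself
    provenance: str = "unknown"     # where this memory originated
    version: int = 1                # bumped on each lifecycle update
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )
    access_policy: set[str] = field(default_factory=set)  # allowed principals

    def can_access(self, principal: str) -> bool:
        # Governance check: only listed principals may read this cube.
        return principal in self.access_policy


# Example: wrap a retrieved passage as plaintext memory.
note = MemCube(
    memory_type=MemoryType.PLAINTEXT,
    payload="User prefers metric units.",
    provenance="chat-session:2025-07-01",
    access_policy={"assistant"},
)
assert note.can_access("assistant")
```

In this picture, a MemScheduler would decide which cubes to load into the context (or directly into the KV cache) for a given request, while MemLifecycle tracks version transitions and MemGovernance enforces the access policy.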

WHY it matters? This work represents a significant paradigm shift from stateless, ad-hoc memory solutions like Retrieval-Augmented Generation (RAG) to a deeply integrated, stateful memory architecture. By enabling controllability, plasticity, and evolvability, MemOS addresses critical LLM limitations in long-context reasoning, continual personalization, and knowledge consistency. Experimentally, it achieves state-of-the-art performance on the LOCOMO benchmark and dramatically reduces inference latency via KV-cache acceleration. This provides a foundational infrastructure for developing the next generation of persistent, adaptive, and scalable AI agents.
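
The latency gain is mechanical rather than magical: when the memory portion of a prompt is stable, its KV cache can be computed once, persisted, and reattached to later requests. Below is a minimal sketch of that prefix-cache reuse pattern using the Hugging Face transformers API; it illustrates the general mechanism, not MemOS's actual scheduler, and the model name and prompt are placeholders.

```python
import copy

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; any causal LM works
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).eval()

# A stable "memory" prefix shared by many requests (placeholder content).
memory_prefix = "Known user context: prefers concise, metric-unit answers.\n"
prefix_ids = tok(memory_prefix, return_tensors="pt").input_ids

with torch.no_grad():
    # One forward pass over the prefix yields its KV cache: this is the
    # "activation memory" that can be stored and reloaded instead of text.
    cached_kv = model(prefix_ids, use_cache=True).past_key_values


def answer(query: str, max_new_tokens: int = 32) -> str:
    ids = torch.cat(
        [prefix_ids, tok(query, return_tensors="pt").input_ids], dim=-1
    )
    with torch.no_grad():
        out = model.generate(
            ids,
            # Reuse the precomputed cache: the prefix is not re-encoded,
            # which is what cuts time-to-first-token. Deep-copied because
            # generation extends the cache in place.
            past_key_values=copy.deepcopy(cached_kv),
            max_new_tokens=max_new_tokens,
        )
    return tok.decode(out[0], skip_special_tokens=True)


print(answer("Question: how far is 10 km in miles? Answer:"))
```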

Details
