
Memory Stream & Retrieval

A central limitation of modern AI is the restricted context window: LLMs such as ChatGPT can only process a finite number of tokens (on the order of 128k), so older information is eventually discarded. To simulate long-term cognition, Generative Agents use a Memory Stream, a dynamic, structured log of experiences that preserves and prioritizes meaningful information.

Each memory in the stream contains:

  • A natural language description

  • A creation timestamp

  • A last-accessed timestamp
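The memory record described above can be sketched as a small data structure; the class and field names here are illustrative, not part of AIVille's actual codebase:

```python
from dataclasses import dataclass, field
import time

@dataclass
class Memory:
    """One entry in an agent's memory stream (illustrative sketch)."""
    description: str                                          # natural language description
    created_at: float = field(default_factory=time.time)      # creation timestamp
    last_accessed: float = field(default_factory=time.time)   # last-accessed timestamp

# Recording a new observation appends it to the stream
stream: list[Memory] = []
stream.append(Memory("Arjay is composing a new song."))
```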

The fundamental unit is an observation โ€” a direct perception or experience. For example:

  • Arjay is composing a new song.

  • Logan is discussing town planning with Owen.

  • Lulu is organizing a bonfire event with Arjay.

  • A new visitor arrived at Selena's tavern.

To keep context relevant, the agent continuously retrieves a subset of these memories based on:

  • Recency: Prioritizing recent events using an exponential decay function.

  • Importance: Scored by the LLM at memory creation; highly significant events persist longer.

  • Relevance: Calculated using embedding similarity between memory text and the current situation.

This ensures that only the most contextually relevant and meaningful memories are fed into the LLM for behavior generation.
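The three retrieval criteria can be combined into a single ranking score. The sketch below assumes equal weights, an hourly decay factor of 0.995, and importance scores normalized to [0, 1]; these specific constants and function names are illustrative assumptions, not documented AIVille parameters:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Relevance: embedding similarity between memory text and the current situation."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieval_score(hours_since_access: float, importance: float,
                    memory_emb: list[float], query_emb: list[float],
                    decay: float = 0.995) -> float:
    """Sum of recency, importance, and relevance (each assumed in [0, 1])."""
    recency = decay ** hours_since_access            # exponential decay over time
    relevance = cosine_similarity(memory_emb, query_emb)
    return recency + importance + relevance

def top_k(memories: list[dict], query_emb: list[float],
          now_hours: float, k: int = 3) -> list[dict]:
    """Keep only the k highest-scoring memories for the LLM context."""
    ranked = sorted(
        memories,
        key=lambda m: retrieval_score(now_hours - m["last_accessed_h"],
                                      m["importance"], m["embedding"], query_emb),
        reverse=True,
    )
    return ranked[:k]
```

A fresh, important, and perfectly relevant memory scores close to the maximum of 3.0, while stale or off-topic memories decay toward 0 and are filtered out before the prompt is assembled.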
