3.10 Security & Data Privacy
From the very beginning, Pop was designed around core principles of privacy protection, local-first processing, and secure, user-controlled operation.
Whether you are using local models, cloud models, knowledge bases, plugins, or workflows, Pop ensures transparency, safety, and user control over all data processing.
This section explains Pop’s security mechanisms and privacy protection strategies in detail.
🔐 1. Local First
Pop stores all data locally by default, including:
- Chat history
- Knowledge base documents and vectors
- Workflow configurations
- Plugin and tool configurations
- Document editing content
Unless you explicitly enable a cloud model, no data is uploaded anywhere.
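As a rough illustration of this local-first layout, the sketch below shows how a desktop client might resolve these stores under the user’s application-data folder. The directory names and paths are hypothetical examples, not Pop’s actual layout.

```ts
// Hypothetical sketch of a local-first storage layout.
// Directory names and paths are illustrative only, not Pop's actual structure.
import * as path from "path";
import * as os from "os";

// Resolve a per-user application data root (platform conventions vary).
const appDataRoot =
  process.platform === "darwin"
    ? path.join(os.homedir(), "Library", "Application Support", "Pop")
    : process.platform === "win32"
    ? path.join(process.env.APPDATA ?? os.homedir(), "Pop")
    : path.join(os.homedir(), ".config", "pop");

// Everything the app persists lives under this local root.
const localStores = {
  chats: path.join(appDataRoot, "chats"),         // chat history
  knowledgeBase: path.join(appDataRoot, "kb"),    // documents + vector index
  workflows: path.join(appDataRoot, "workflows"), // workflow configurations
  plugins: path.join(appDataRoot, "plugins"),     // plugin/tool configurations
  documents: path.join(appDataRoot, "documents"), // document editing content
};

console.log(localStores);
```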
🔑 2. Model API Keys Stored Locally
All model API keys (OpenAI, DeepSeek, self-hosted models, etc.) are:
- Stored locally, optionally in encrypted storage
- Never uploaded to any server
- Used only for model requests you manually trigger
You can delete, reset, or replace API keys at any time.
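As an illustration of what encrypted local storage can mean in practice, here is a minimal sketch using Node’s built-in crypto module (AES-256-GCM with a scrypt-derived key). The passphrase source and file name are assumptions for the example, not Pop’s actual implementation.

```ts
// Minimal sketch: encrypt an API key at rest with AES-256-GCM.
// The passphrase source and file path are illustrative assumptions.
import * as crypto from "crypto";
import * as fs from "fs";

function encryptApiKey(apiKey: string, passphrase: string): string {
  const salt = crypto.randomBytes(16);
  const key = crypto.scryptSync(passphrase, salt, 32); // derive a 256-bit key
  const iv = crypto.randomBytes(12);
  const cipher = crypto.createCipheriv("aes-256-gcm", key, iv);
  const ciphertext = Buffer.concat([cipher.update(apiKey, "utf8"), cipher.final()]);
  const tag = cipher.getAuthTag();
  // Store salt, IV, auth tag, and ciphertext together; nothing leaves the device.
  return Buffer.concat([salt, iv, tag, ciphertext]).toString("base64");
}

function decryptApiKey(blob: string, passphrase: string): string {
  const raw = Buffer.from(blob, "base64");
  const salt = raw.subarray(0, 16);
  const iv = raw.subarray(16, 28);
  const tag = raw.subarray(28, 44);
  const ciphertext = raw.subarray(44);
  const key = crypto.scryptSync(passphrase, salt, 32);
  const decipher = crypto.createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString("utf8");
}

// Example: persist the encrypted key locally, never to a server.
fs.writeFileSync("keys.enc", encryptApiKey("sk-example", "local-device-secret"));
```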
🔏 3. Full Control Over Model Request Content
When using cloud models (GPT, DeepSeek, etc.), Pop sends only the following:
- Your text input
- Files you explicitly upload (images, PDFs, etc.)
- Necessary model parameters (temperature, max_tokens, etc.)
Pop will not send:
- Your entire local knowledge base
- Other sessions
- Unrelated private data
You control exactly what gets sent.
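To make that boundary concrete, the sketch below shows roughly what a single outbound request looks like when using an OpenAI-compatible chat endpoint: only the user’s text and the chosen parameters appear in the payload. The endpoint and field names follow the public chat-completions format; treat the surrounding code as illustrative rather than Pop’s exact networking layer.

```ts
// Sketch of an outbound cloud-model request (OpenAI-compatible format).
// Only the user's input and explicit parameters are included; no local
// knowledge base, other sessions, or unrelated data appear in the payload.
async function sendChat(userText: string, apiKey: string) {
  const payload = {
    model: "gpt-4o",                                 // the model the user selected
    messages: [{ role: "user", content: userText }], // your text input only
    temperature: 0.7,                                // necessary model parameters
    max_tokens: 1024,
  };

  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`, // key read from local storage
    },
    body: JSON.stringify(payload),
  });
  return res.json();
}
```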
🛡️ 4. Tool Permission Control (MCP / Plugins)
Tools may involve accessing local files or executing commands, so Pop enforces strict permissions:
- Tools are disabled by default (manual enable required)
- High-risk tools (Shell, file writing) require explicit confirmation
- All tool operations are auditable with visible execution logs
- Tools follow the principle of least privilege
This ensures AI cannot access your system without permission.
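A hedged sketch of such a permission gate is shown below; the ToolDefinition shape, risk levels, confirmation hook, and audit log are hypothetical names used only to illustrate the flow.

```ts
// Hypothetical sketch of a least-privilege tool gate.
// Tool names, risk levels, and the confirmation hook are illustrative.
type RiskLevel = "low" | "high";

interface ToolDefinition {
  name: string;
  risk: RiskLevel;
  enabled: boolean; // disabled by default
  run: (args: Record<string, unknown>) => Promise<string>;
}

async function invokeTool(
  tool: ToolDefinition,
  args: Record<string, unknown>,
  confirmWithUser: (prompt: string) => Promise<boolean>,
  auditLog: (entry: string) => void,
): Promise<string> {
  if (!tool.enabled) {
    throw new Error(`Tool "${tool.name}" is disabled; enable it manually first.`);
  }
  // High-risk tools (shell, file writes) require explicit user confirmation.
  if (tool.risk === "high") {
    const ok = await confirmWithUser(
      `Allow "${tool.name}" to run with ${JSON.stringify(args)}?`,
    );
    if (!ok) throw new Error(`User declined execution of "${tool.name}".`);
  }
  // Every invocation is recorded so tool activity stays auditable.
  auditLog(`${new Date().toISOString()} ${tool.name} ${JSON.stringify(args)}`);
  return tool.run(args);
}
```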
🔍 5. Knowledge Base Security
All knowledge base content is stored locally:
- Documents are not uploaded to external models
- Vector indexes are saved in local storage
- Chunking, slicing, and summarization are performed locally
During knowledge base Q&A:
- Pop sends only small, relevant text fragments to the model
- All other content stays local and is never exposed to the model
This makes it well suited to enterprise documents, private notes, internal architecture files, and other sensitive data.
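Under simple assumptions (a local vector index plus a local embedding function), the retrieval step might look like the sketch below; only the top-k matching fragments are ever placed into the prompt. The types and function names are placeholders, not Pop’s actual components.

```ts
// Illustrative retrieval sketch: only the top-k matching chunks are placed
// into the prompt; the rest of the knowledge base stays in the local index.
interface Chunk {
  text: string;
  vector: number[];
}

// Cosine similarity between two embedding vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

function buildPrompt(
  question: string,
  localIndex: Chunk[],
  embed: (text: string) => number[], // hypothetical local embedding function
  k = 4,
): string {
  const queryVec = embed(question);
  const topChunks = [...localIndex]
    .sort((a, b) => cosine(b.vector, queryVec) - cosine(a.vector, queryVec))
    .slice(0, k); // only these fragments are sent to the model
  const context = topChunks.map((c) => c.text).join("\n---\n");
  return `Answer using only the context below.\n\nContext:\n${context}\n\nQuestion: ${question}`;
}
```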
🧊 6. Multi-Window Isolation
Each chat window maintains its own independent context:
- Window A’s messages do not affect Window B
- Switching roles does not inherit other window contexts
- Different models do not share conversation data
Perfect for handling sensitive tasks across separate contexts.
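Below is a minimal sketch of this isolation, assuming a per-window message store keyed by window id; the Message shape and store API are illustrative, not Pop’s internals.

```ts
// Sketch: each chat window keeps an isolated message history.
// The Message shape and store API are illustrative examples.
interface Message {
  role: "user" | "assistant";
  content: string;
}

class WindowContextStore {
  private contexts = new Map<string, Message[]>();

  append(windowId: string, message: Message): void {
    const history = this.contexts.get(windowId) ?? [];
    history.push(message);
    this.contexts.set(windowId, history);
  }

  // A model request for one window sees only that window's history.
  historyFor(windowId: string): readonly Message[] {
    return this.contexts.get(windowId) ?? [];
  }
}

const store = new WindowContextStore();
store.append("window-A", { role: "user", content: "Sensitive task A" });
store.append("window-B", { role: "user", content: "Unrelated task B" });
console.log(store.historyFor("window-A")); // contains only window A's messages
```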
🗄 7. Local Data Access & Storage Location
Pop allows users to view local storage directories, including:
- Chat sessions
- User settings
- Model configurations
- Cache files
- Knowledge base indexes
You may:
- Clear caches
- Delete sensitive data
- Move knowledge bases to encrypted drives
- Use private cloud or on-premise server mode (enterprise edition)
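For example, clearing the cache is an ordinary filesystem operation on your own machine; the cache path below is a hypothetical example, not Pop’s actual directory.

```ts
// Sketch: cache cleanup is a plain local filesystem operation.
// The cache path is a hypothetical example, not Pop's actual directory.
import * as fs from "fs/promises";
import * as path from "path";
import * as os from "os";

async function clearCache(appDataRoot: string): Promise<void> {
  const cacheDir = path.join(appDataRoot, "cache");
  // Remove the cache directory and recreate it empty.
  await fs.rm(cacheDir, { recursive: true, force: true });
  await fs.mkdir(cacheDir, { recursive: true });
}

clearCache(path.join(os.homedir(), ".config", "pop")).catch(console.error);
```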
🧯 8. Privacy Mode (Future Expansion)
Future versions of Pop will include Privacy Mode, offering even stricter constraints:
- Disable cloud model access
- Prohibit file uploads
- Disable all tool calls
- Enable silent local-only mode
Suitable for enterprise, government, or highly sensitive environments.
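Because Privacy Mode is still a planned feature, the sketch below only illustrates what such a configuration and its enforcement might look like; every flag name is hypothetical.

```ts
// Hypothetical sketch of a Privacy Mode configuration and its enforcement.
// All flag names are illustrative; the feature is planned, not shipped.
interface PrivacyModeConfig {
  allowCloudModels: boolean; // false => only local models may be used
  allowFileUploads: boolean; // false => uploads are rejected
  allowToolCalls: boolean;   // false => all tool/plugin calls are blocked
}

const strictPrivacyMode: PrivacyModeConfig = {
  allowCloudModels: false,
  allowFileUploads: false,
  allowToolCalls: false,
};

function assertAllowed(action: keyof PrivacyModeConfig, config: PrivacyModeConfig): void {
  if (!config[action]) {
    throw new Error(`Blocked by Privacy Mode: ${action} is disabled.`);
  }
}

// Example: a cloud request would be rejected before any data leaves the device.
try {
  assertAllowed("allowCloudModels", strictPrivacyMode);
} catch (e) {
  console.log((e as Error).message);
}
```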
🧾 9. Your Data Is NOT Used for Model Training
Anything you input into Pop:
- Is not used for model training
- Is never shared with other users
- Is never provided to third-party datasets
This applies to:
- Chat messages
- Uploaded files
- Documents
- Knowledge base materials
You retain full ownership and control.
🧩 10. Enterprise / Team Advanced Security (Optional)
Enterprise deployment supports:
- Local model servers
- Intranet-only knowledge base indexing
- Zero data exfiltration
- Audit logs
- Centralized key management (KMS)
- RBAC permission control
- Encrypted storage
This ensures enterprise-grade security, compliance, and isolation.
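As one example from this list, role-based access control usually reduces to checking a user’s role against the permission a resource requires; the roles and permission names below are illustrative only.

```ts
// Illustrative RBAC sketch: roles map to permissions, and every access is checked.
// Role and permission names are hypothetical examples.
type Permission = "kb:read" | "kb:write" | "workflow:run" | "admin:manage-keys";

const rolePermissions: Record<string, Permission[]> = {
  viewer: ["kb:read"],
  editor: ["kb:read", "kb:write", "workflow:run"],
  admin: ["kb:read", "kb:write", "workflow:run", "admin:manage-keys"],
};

function can(role: string, permission: Permission): boolean {
  return (rolePermissions[role] ?? []).includes(permission);
}

console.log(can("viewer", "kb:write"));         // false: viewers cannot modify the knowledge base
console.log(can("admin", "admin:manage-keys")); // true
```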
📌 Summary
Pop’s security system is built on three principles:
✔ Local First
All data is stored on your own device.
✔ Explicit Permissions
AI cannot perform system actions without your approval.
✔ User Control
All uploads, transfers, and tool calls are explicitly user-triggered.
This allows Pop to deliver powerful AI capabilities while ensuring your data always remains private, secure, and under your full control.