This is an automated archive made by the Lemmit Bot.
The original was posted on /r/opensource by /u/Another__one on 2026-02-07 17:15:11+00:00.
For the past two and a half years I’ve been working on an open-source project aimed at giving people more control over how they interact with their personal data. It’s called Anagnorisis, a completely local recommendation and search system for personal media libraries.
The problem I want to solve is that recommendation algorithms on cloud services optimize for their business metrics, not for what users actually want. I figured there should be a transparent, open-source alternative where the algorithm works for you, not against you.
The technical architecture is straightforward. Point it at your local folders containing music, images, documents, or videos. It uses open embedding models (LAION CLAP for audio, Google SigLIP for images, Jina embeddings v3 for text) to enable semantic search across everything. You can search for “relaxing instrumental music” or “papers about neural networks” and it understands actual content instead of just matching filenames.
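To make the semantic search idea concrete, here is a minimal sketch of how embedding-based search over an image folder can work with the SigLIP model mentioned above, loaded through Hugging Face Transformers. The folder path, model checkpoint and top-k value are just illustrative assumptions; the actual Anagnorisis pipeline handles batching, caching and multiple media types.

```python
# Sketch: semantic image search with SigLIP embeddings (illustrative only).
from pathlib import Path

import torch
from PIL import Image
from transformers import AutoModel, AutoProcessor

model = AutoModel.from_pretrained("google/siglip-base-patch16-224")
processor = AutoProcessor.from_pretrained("google/siglip-base-patch16-224")

@torch.no_grad()
def embed_images(paths):
    images = [Image.open(p).convert("RGB") for p in paths]
    inputs = processor(images=images, return_tensors="pt")
    feats = model.get_image_features(**inputs)
    return torch.nn.functional.normalize(feats, dim=-1)

@torch.no_grad()
def embed_text(query):
    inputs = processor(text=[query], padding="max_length", return_tensors="pt")
    feats = model.get_text_features(**inputs)
    return torch.nn.functional.normalize(feats, dim=-1)

paths = sorted(Path("~/Pictures").expanduser().glob("*.jpg"))
image_embs = embed_images(paths)                     # (N, D), precomputed once
query_emb = embed_text("a dog playing in the snow")  # (1, D)

scores = (image_embs @ query_emb.T).squeeze(-1)      # cosine similarity
for idx in scores.topk(min(5, len(paths))).indices:
    print(paths[idx], float(scores[idx]))
```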
The recommendation engine lets you rate files on a 0-10 scale, then fine-tunes PyTorch models to learn your preferences and predict how you would rate content you haven't seen yet. Everything is processed locally on your hardware with full transparency into how the algorithm works.
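The general shape of this preference learning looks something like the following: a small regression head trained on precomputed file embeddings and your 0-10 ratings. This is my own illustration of the approach with random placeholder data, not the project's actual training code; the embedding dimension and architecture are assumptions.

```python
# Sketch: learning a 0-10 rating predictor from file embeddings (illustrative).
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

EMBED_DIM = 768  # depends on the embedding model used

class RatingHead(nn.Module):
    def __init__(self, dim=EMBED_DIM):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, 256), nn.ReLU(),
            nn.Linear(256, 1),
        )

    def forward(self, x):
        # Squash the output into the 0-10 rating range.
        return torch.sigmoid(self.net(x)).squeeze(-1) * 10.0

# Placeholder training data: embeddings of rated files plus their 0-10 ratings.
embeddings = torch.randn(500, EMBED_DIM)
ratings = torch.randint(0, 11, (500,)).float()

model = RatingHead()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loader = DataLoader(TensorDataset(embeddings, ratings), batch_size=32, shuffle=True)

for epoch in range(10):
    for x, y in loader:
        loss = nn.functional.mse_loss(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

# Predicted ratings for unrated files can then be used to rank recommendations.
with torch.no_grad():
    print(model(torch.randn(5, EMBED_DIM)))
```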
Right now three search modes are available: filename-based fuzzy search, content-based semantic search using embeddings, and metadata-based search that analyzes file metadata plus custom notes stored in simple .meta text files. Temperature control adds controlled randomness to results, useful for discovery while maintaining relevance.
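The temperature idea can be sketched like this: at temperature 0 results come back strictly sorted by relevance, while higher temperatures sample the ordering from a softmax over the scores, so lower-ranked items occasionally surface. The function name and scores below are hypothetical; this is just the concept, not the project's implementation.

```python
# Sketch: temperature-controlled reordering of search results (illustrative).
import numpy as np

def rank_with_temperature(scores, temperature=0.0, rng=None):
    rng = rng or np.random.default_rng()
    scores = np.asarray(scores, dtype=float)
    if temperature <= 0:
        return list(np.argsort(-scores))  # deterministic, best first
    probs = np.exp(scores / temperature)
    probs /= probs.sum()
    # Sample every index without replacement, biased toward higher scores.
    return list(rng.choice(len(scores), size=len(scores), replace=False, p=probs))

relevance = [0.91, 0.88, 0.40, 0.35, 0.10]
print(rank_with_temperature(relevance, temperature=0.0))  # strict relevance order
print(rank_with_temperature(relevance, temperature=0.5))  # mildly shuffled
```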
Version 0.3.1 was just released with a unified search interface. You can watch a video showcasing the update here: youtu.be/X1Go7yYgFlY
Stack is Flask backend, Bulma CSS frontend, PyTorch and Transformers for ML. Runs in Docker with CUDA support.
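To give a feel for how those pieces fit together, here is a minimal, self-contained Flask endpoint that serves semantic search results over precomputed embeddings. The route name, response shape, and the toy data are hypothetical; see the repository for the real API.

```python
# Sketch: a toy Flask search endpoint over precomputed embeddings (illustrative).
import torch
from flask import Flask, jsonify, request

app = Flask(__name__)

# Toy stand-ins: in the real app these would be the library's file paths and
# their precomputed, normalized embeddings.
paths = ["a.jpg", "b.jpg", "c.jpg"]
embeddings = torch.nn.functional.normalize(torch.randn(len(paths), 768), dim=-1)

def embed_text(query: str) -> torch.Tensor:
    # Placeholder: a real implementation would call the embedding model here.
    return torch.nn.functional.normalize(torch.randn(1, 768), dim=-1)

@app.route("/api/search")
def search():
    query = request.args.get("q", "")
    scores = (embeddings @ embed_text(query).T).squeeze(-1)
    top = scores.topk(min(10, len(paths)))
    return jsonify([
        {"path": paths[i], "score": float(s)}
        for i, s in zip(top.indices.tolist(), top.values.tolist())
    ])

if __name__ == "__main__":
    app.run(debug=True)
```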
Licensed under the AGPL-3.0 license. Contributions and feedback welcome. Happy to discuss implementation details or answer technical questions.