Week 3 2026
Highlights
New datastore implementation: Fast dataset store running at SSD speeds.
Specs: Dataset, DatasetStore and Merkle trees
UI progress: New Logos Storage UI repository with Nix integration, debug panel implemented
Research: RLN improvement PoC with 3x performance improvements
Status Integration
- achieved:
- Working through the tasks from the review of the integration PR. We have a much cleaner history now, which keeps growing again while I work through it, but it should be resolved soon. This turned out to be a slightly bigger task than anticipated. So far achieved:
- Moving the codex-related code to its proper place. There are many dependencies, and some things require a bit more thinking, so it is not just a mechanical operation.
- Big renaming: we cannot keep “codex” in all abstractions, logs and comments, so I have already renamed everything to “LogosStorage”.
- next:
- rename “libcodex” to “liblogosstorage” (or something similar); this is what I will be looking at next
- update the logos-storage-related Nix configs to take advantage of the new flakes in logos-storage
Testnet filesharing client
- achieved:
- brand new implementation of a fast dataset store which can now run at SSD speeds; comes with a spec draft: [11/DATASET-STORE];
- new spec draft for the Logos Storage data model: [10/DATASETS];
- Jacek’s improvements to our Merkle trees merged;
- new Merkle tree repo created, which will now contain our Merkle tree implementations; specs underway (a brief sketch of the basic tree operations follows below);
- Removed all unused modules from the codebase for Logos Storage going forward. Prior to this work, relevant aging PRs were merged into “master”, and a “classic” branch was created, containing a mostly complete state of the old Codex. Going forward, “master” will represent the new vision of Logos Storage;
- “downsizing” PR (removes Marketplace and prover code) merged;
- Began moving the merkletree implementation to a separate repo (https://github.com/logos-storage/nim-merkletree).
- next:
- Finish merkletree move
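For context on what the new Merkle tree repo will cover, the sketch below illustrates the basic operations such an implementation provides: computing a root over leaf hashes, producing an inclusion proof, and verifying it. This is an illustration only; the hash function, leaf encoding, odd-node padding and function names here are assumptions and may well differ from what nim-merkletree actually implements.

```python
# Minimal binary Merkle tree sketch (illustrative; nim-merkletree may use
# different hashing, encoding and padding conventions).
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Fold already-hashed leaves up to a single root, duplicating the last
    node whenever a level has an odd number of entries (one common convention)."""
    level = leaves[:]
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves: list[bytes], index: int) -> list[bytes]:
    """Collect the sibling hashes needed to recompute the root for one leaf."""
    proof, level, i = [], leaves[:], index
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])
        proof.append(level[i + 1 if i % 2 == 0 else i - 1])
        level = [h(level[j] + level[j + 1]) for j in range(0, len(level), 2)]
        i //= 2
    return proof

def verify(leaf: bytes, index: int, proof: list[bytes], root: bytes) -> bool:
    acc = leaf
    for sibling in proof:
        acc = h(acc + sibling) if index % 2 == 0 else h(sibling + acc)
        index //= 2
    return acc == root

leaves = [h(bytes([i])) for i in range(5)]
root = merkle_root(leaves)
assert verify(leaves[3], 3, merkle_proof(leaves, 3), root)
```

The concrete choices (hash function, how datasets map to leaves, padding rules) are exactly the kind of thing the spec drafts mentioned above are meant to pin down.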
Achieved:
- Refactored the implementation as per the API spec.
- Added unit tests for all dataset operations.
Next:
- Add support for managing storage quotas (see the sketch after this list).
- Improve test coverage.
- Integrate with the current Logos Storage code.
- Add multi-threaded I/O support.
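To make the quota item above concrete, here is a minimal, hedged sketch of what quota enforcement around basic dataset operations could look like: writes that would push usage past a configured byte limit are rejected. The class, method names and error type are hypothetical and are not taken from the dataset store API spec.

```python
# Hypothetical in-memory dataset store with a simple byte quota.
# Names and semantics are illustrative, not the actual Logos Storage API.

class QuotaExceededError(Exception):
    pass

class DatasetStore:
    def __init__(self, quota_bytes: int):
        self.quota_bytes = quota_bytes
        self.used_bytes = 0
        self._datasets: dict[str, bytes] = {}

    def put(self, dataset_id: str, data: bytes) -> None:
        """Store (or replace) a dataset, refusing writes that would exceed the quota."""
        delta = len(data) - len(self._datasets.get(dataset_id, b""))
        if self.used_bytes + delta > self.quota_bytes:
            raise QuotaExceededError(
                f"storing {dataset_id!r} needs {delta} more bytes, "
                f"only {self.quota_bytes - self.used_bytes} available"
            )
        self._datasets[dataset_id] = data
        self.used_bytes += delta

    def get(self, dataset_id: str) -> bytes:
        return self._datasets[dataset_id]

    def delete(self, dataset_id: str) -> None:
        self.used_bytes -= len(self._datasets.pop(dataset_id))

store = DatasetStore(quota_bytes=1024)
store.put("manifest-1", b"x" * 512)
try:
    store.put("manifest-2", b"x" * 1024)  # would exceed the quota
except QuotaExceededError as err:
    print("rejected:", err)
```

The multi-threaded I/O item would sit below an interface like this, for example by dispatching block reads and writes to a worker pool; that is out of scope for this sketch.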
Privacy requirements
- achieved:
- Document: An analysis of the file sharing components (work in progress), looking into the components of the file sharing client and identifying what information is leaked during the file sharing process.
- Finalized research on related work on Tor and Tribler. The research outcome was presented during the IFT research call, and the complete research post is to be published on the Vac forum.
- next:
- Continue the analysis on identifying where privacy is required.
- Continue the research into other related anonymous communication protocols.
AnonComm Collaboration
- achieved:
- Ongoing discussion and meetings with the AnonComm team.
- Defined the general scope of our collaboration and future research direction, and started initial research into related work on provider anonymity.
- Better understanding of the different privacy areas and the corresponding solutions available at hand:
- Tor/MixNet at the transport/network layer
- DHT security/privacy for content discovery
- RLN as “spam” prevention in the application layer (see the sketch at the end of this section)
- RLN improvement PoC
- next:
- Continue to refine the scope of collaboration.
- Continue research into provider anonymity using SURBs with the goal of creating an RFC as an outcome.
- Discuss the next steps for the RLN improvement.
- Look at related work on how to pragmatically scale the solution and figure out what the minimal first increment to include in the file-sharing client would be.
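Since RLN appears both as the application-layer spam-prevention option above and in the improvement PoC, the sketch below shows the core idea behind RLN-style rate limiting: every message in an epoch reveals one point on a low-degree polynomial derived from the sender's secret, so exceeding the rate limit reveals enough points to interpolate the secret and slash the sender. The zero-knowledge membership proofs that real RLN adds on top are omitted, and the field, hash-to-field mapping and one-message-per-epoch limit are simplifications chosen for readability.

```python
# Simplified illustration of RLN-style rate limiting (no ZK proofs, toy field).
# With a rate limit of 1 message per epoch, each message exposes a point on a
# degree-1 polynomial A(x) = sk + a1 * x; two messages in the same epoch give
# two points, which is enough to interpolate A(0) = sk and slash the sender.
import hashlib

P = 2**255 - 19  # toy prime modulus, not the field used by real RLN circuits

def hash_to_field(*parts: bytes) -> int:
    return int.from_bytes(hashlib.sha256(b"|".join(parts)).digest(), "big") % P

def share(sk: int, epoch: bytes, message: bytes) -> tuple[int, int]:
    """Return the (x, y) share a sender must attach to a message this epoch."""
    a1 = hash_to_field(sk.to_bytes(32, "big"), epoch)  # epoch-specific slope
    x = hash_to_field(message)
    return x, (sk + a1 * x) % P

def recover_sk(s1: tuple[int, int], s2: tuple[int, int]) -> int:
    """Lagrange-interpolate A(0) = sk from two distinct shares of one epoch."""
    (x1, y1), (x2, y2) = s1, s2
    a1 = (y2 - y1) * pow(x2 - x1, -1, P) % P
    return (y1 - a1 * x1) % P

sk = hash_to_field(b"sender secret")
epoch = b"epoch-42"
s1 = share(sk, epoch, b"first message")
s2 = share(sk, epoch, b"second message")  # exceeds the 1-message/epoch limit
assert recover_sk(s1, s2) == sk           # anyone holding both shares can slash
```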
Integration with Logos Core
- achieved:
- Initiated the UI repository with Nix integration: https://github.com/logos-co/logos-storage-ui
- Improved code organization and callbacks for the Storage module
- Implemented APIs: version and debug
- Started displaying debug information in a right-side panel in the UI (see the sketch at the end of this section)
- next:
- Finalize the debug information in the UI app
- Continue the integration with upload APIs
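As a companion to the debug panel work above, the snippet below sketches how a client could poll a node's version and debug endpoints and render the result the way a side panel might. The base URL, endpoint paths and response fields are placeholders, not the actual Logos Storage API; only the general shape (periodically fetch JSON, display key/value pairs) is the point.

```python
# Hypothetical polling of a node's version/debug endpoints for a debug panel.
# Base URL, paths and JSON fields are placeholders, not the real API.
import json
import time
import urllib.request

BASE_URL = "http://localhost:8080/api/v1"  # assumed local node address

def fetch_json(path: str) -> dict:
    with urllib.request.urlopen(f"{BASE_URL}{path}", timeout=5) as resp:
        return json.load(resp)

def render_debug_panel() -> None:
    """Fetch version and debug info and print it the way a side panel might list it."""
    version = fetch_json("/version")    # e.g. {"version": "...", "revision": "..."}
    debug = fetch_json("/debug/info")   # e.g. peer count, listen addresses, uptime
    print("=== Logos Storage node ===")
    print(f"{'version':>12} : {version.get('version', 'unknown')}")
    for key, value in debug.items():
        print(f"{key:>12} : {value}")

if __name__ == "__main__":
    # Naive refresh loop; a real UI would poll with backoff or subscribe to updates.
    while True:
        render_debug_panel()
        time.sleep(10)
```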