- Desktop app is built with Tauri and Rust.
- See Tauri architecture for how Tauri works.
- All useful global state is persisted to SQLite.
- WebView is built with Solid.
- Loads data from SQLite and passes it to WebView.
- Uses rusqlite crate for all SQLite queries.
- Data sent from Tauri into the WebView is stored in Solid stores and can be used globally across all components of the app.
- If the user is authenticated and has an account, a GraphQL query is sent to the server to fetch the latest changes.
- Changes coming from the server update the SQLite state and thus the UI too.
- User actions made in the app (WebView) send messages to Tauri, which writes everything to SQLite and reads from SQLite wherever needed.
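The WebView → Tauri → SQLite round trip above can be sketched roughly like this. This is a minimal illustration only: the `Msg` enum, the `handle_message` function, and the in-memory map standing in for SQLite are assumptions for the sketch, not the real rusqlite-backed code.

```rust
use std::collections::HashMap;

// Stand-in for the SQLite database (the real app uses rusqlite).
type Db = HashMap<String, String>;

// Messages the WebView side can send to the Tauri side (illustrative names).
enum Msg {
    SaveTopic { name: String, content: String },
    LoadTopic { name: String },
}

// Handles a message: writes go into the store, reads come back as Some(content).
fn handle_message(db: &mut Db, msg: Msg) -> Option<String> {
    match msg {
        Msg::SaveTopic { name, content } => {
            db.insert(name, content);
            None
        }
        Msg::LoadTopic { name } => db.get(&name).cloned(),
    }
}

fn main() {
    let mut db = Db::new();
    handle_message(&mut db, Msg::SaveTopic {
        name: "rust".to_string(),
        content: "# Rust".to_string(),
    });
    let loaded = handle_message(&mut db, Msg::LoadTopic { name: "rust".to_string() });
    println!("{:?}", loaded); // Some("# Rust")
}
```

In the real app the `SaveTopic` arm would run an SQL `INSERT OR REPLACE` through rusqlite instead of touching a map.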
- Tauri/Rust connects to a local folder in the user's OS file system. All topics are persisted as markdown files in that folder.
- markdown-rs crate is used to convert from .md file content to a Topic and vice versa.
- A file watcher is present that listens for any changes made to the folder. If any of the files get modified or deleted, it will update SQLite accordingly too. If a file was deleted, it will do a soft delete of the Topic in SQLite. Users can thus revert .md file deletes.
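The soft-delete behaviour described above can be sketched like this. The `Topic` struct and its `deleted` flag are assumed names for illustration; in the real app this would be an `UPDATE` on a SQLite row (something like `UPDATE topics SET deleted = 1 WHERE name = ?`) rather than an in-memory struct.

```rust
// Illustrative model of a Topic row; field names are assumptions, not the schema.
#[derive(Debug, PartialEq)]
struct Topic {
    name: String,
    deleted: bool,
}

// Called when the watcher sees a .md file disappear: the row is kept,
// only flagged as deleted, so nothing is lost.
fn soft_delete(topic: &mut Topic) {
    topic.deleted = true;
}

// Reverting simply clears the flag; the content was never dropped.
fn revert_delete(topic: &mut Topic) {
    topic.deleted = false;
}

fn main() {
    let mut t = Topic { name: "rust".to_string(), deleted: false };
    soft_delete(&mut t);
    println!("after delete: {:?}", t.deleted); // true
    revert_delete(&mut t);
    println!("after revert: {:?}", t.deleted); // false
}
```

The key design point is that delete is a state transition, not a row removal, which is what makes the revert cheap.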
- If user is authenticated and has an account, the app will optionally sync or publish to the server
- The Rust/Tauri binary will optionally embed a 7B or 13B LLaMA model. Users will be given the choice to download the app with or without the embedded language model.
- LLaMA.rs is used to embed the language model into the binary and provide inference
- Potentially the language model is provided separately as a download, with inference then served via an HTTP server.
- This can also allow sharing LLaMA model inference with other apps. TextSynth Server can potentially be used.
- llm-chain is used to connect to the local LLaMA and make and chain prompts
- The model is continuously fine-tuned on user data.
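For illustration, the prompt chaining that llm-chain provides boils down to feeding each step's output into the next prompt. Here is a plain-Rust sketch of that idea; the `fill` and `run_chain` helpers and the echo "model" are stand-ins invented for this sketch, not llm-chain's actual API.

```rust
// Substitute the running value into a prompt template (illustrative helper).
fn fill(template: &str, input: &str) -> String {
    template.replace("{input}", input)
}

// Run templates in order, feeding each model response forward as the next
// {input}. `model` stands in for a call into the local LLaMA.
fn run_chain(templates: &[&str], seed: &str, model: impl Fn(&str) -> String) -> String {
    templates
        .iter()
        .fold(seed.to_string(), |acc, t| model(&fill(t, &acc)))
}

fn main() {
    // A fake model that just echoes its prompt, to show the data flow.
    let echo = |prompt: &str| prompt.to_string();
    let out = run_chain(
        &["Summarize: {input}", "Extract topics from: {input}"],
        "Rust is a systems language.",
        echo,
    );
    println!("{out}");
}
```

With a real model each step would be an inference call; the chaining logic itself stays this simple.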
- TODO:
- Front end is built with Solid
- All useful global state is persisted to local storage with TinyBase
- Hanko is used for user authentication
- on successful auth, a cookie gets saved and is then used to make authorized GraphQL queries
- GraphQL Mobius is used to make typed GraphQL queries
- All data received back is then saved to Solid stores, which then update the UI
- Depending on the page, it will mostly:
- load data from local storage via TinyBase into Solid stores; the UI updates instantly
- send a GraphQL request to load fresh and missing data
- load it into Solid stores and update the UI
- Users take actions in the website, updating local Solid stores
- on each Solid store update, a GraphQL request is sent to persist the changes to the server
- Server-Sent Events are set up to live-update the stores with data from the server (if there is any)
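The load-local-then-refresh flow above is language-agnostic, so here is a minimal Rust sketch of the pattern for illustration. The `Store` map stands in for TinyBase/Solid stores, and the "server wins" merge rule is an assumption, not a documented decision.

```rust
use std::collections::HashMap;

// Stand-in for the UI store and local cache (really TinyBase + Solid stores).
type Store = HashMap<String, String>;

// Step 1: hydrate the UI store from local storage — instant render.
fn load_local(cache: &Store, store: &mut Store) {
    store.extend(cache.iter().map(|(k, v)| (k.clone(), v.clone())));
}

// Step 2: merge fresh/missing data from the server response; server wins.
fn merge_remote(remote: &Store, store: &mut Store) {
    store.extend(remote.iter().map(|(k, v)| (k.clone(), v.clone())));
}

fn main() {
    let cache = Store::from([("topic:rust".to_string(), "stale".to_string())]);
    let remote = Store::from([
        ("topic:rust".to_string(), "fresh".to_string()),
        ("topic:go".to_string(), "new".to_string()),
    ]);
    let mut store = Store::new();
    load_local(&cache, &mut store); // UI can render immediately
    merge_remote(&remote, &mut store); // UI updates again with fresh data
    println!("{:?}", store.get("topic:rust")); // Some("fresh")
}
```

The point of the pattern is that the user never waits on the network for the first paint; the GraphQL response only upgrades what is already on screen.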
- Grafbase is used to provide a GraphQL access layer to all the API.
- Grafbase is hosted with Cloudflare Workers and has low latency on each API call
- Grafbase allows you to write TS or Rust (via WASM) to run code at the edge
- Grafbase API is set up to do all CRUD operations on top of EdgeDB (creating, reading, updating, deleting data)
- Grafbase is also set up to run any logic that needs to run on the server, such as creating Stripe checkout sessions, processing payments, etc.
- A lot of server code will be written in Go and exposed as GraphQL too.
- Stitched together with Grafbase to provide one GraphQL interface to everything
- Go code is deployed as a Docker container to Google Cloud with proper logging and observability set up
- Built with Expo & Tamagui
- using Tamagui starter
- All useful global state is persisted to SQLite
- by embedding Rust in React Native, sharing most code with the desktop Rust code
- or with TinyBase
- Valtio is used for global React state, emulating Solid stores/signals DX as closely as possible
- If any native module is required, it is written in Swift or Kotlin and added using Expo
- A Google-like scraper is built in Go using Colly
- it watches over many websites and ingests the data into the system
- the data then gets processed and added to the database
- A fine-tuned 70B LLaMA model is deployed with Modal
- the goal of the model is similar to Google: users write free-form queries, and the model finds all the relevant LA notes/links/.. that match the query
- potentially it will be a 70B model + some custom search heuristics
- EdgeDB + pgvector is used as vector database
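As a rough illustration of what the vector search does under the hood, here is a plain-Rust sketch of cosine-similarity ranking. pgvector computes this inside the database; the tiny 2-dimensional embeddings and note names below are made up for the example.

```rust
// Cosine similarity between two embedding vectors (what pgvector's cosine
// distance operator is based on, computed here by hand for illustration).
fn cosine_similarity(a: &[f64], b: &[f64]) -> f64 {
    let dot: f64 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f64 = a.iter().map(|x| x * x).sum::<f64>().sqrt();
    let nb: f64 = b.iter().map(|x| x * x).sum::<f64>().sqrt();
    dot / (na * nb)
}

// Rank note embeddings against a query embedding, most similar first.
fn rank<'a>(query: &[f64], notes: &'a [(&'a str, Vec<f64>)]) -> Vec<&'a str> {
    let mut scored: Vec<(&str, f64)> = notes
        .iter()
        .map(|(name, emb)| (*name, cosine_similarity(query, emb)))
        .collect();
    scored.sort_by(|x, y| y.1.partial_cmp(&x.1).unwrap());
    scored.into_iter().map(|(n, _)| n).collect()
}

fn main() {
    let notes = vec![
        ("rust-notes", vec![1.0, 0.0]),
        ("cooking", vec![0.0, 1.0]),
    ];
    let query = vec![0.9, 0.1]; // embedding of a Rust-flavoured query
    println!("{:?}", rank(&query, &notes)); // rust-notes ranks first
}
```

In production the query text would first be embedded by the model, and Postgres/pgvector would do the ranking with an index instead of a full scan.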
- Monaco Editor for pure text editing
- with various plugins like vim mode
- TipTap for interactive text editing
- Allow switching between Monaco and TipTap easily
- everything serialises to the same structure
- interactive parts can be presented as JSX Solid components
- made using mdxjs-rs
- Resend using React Email
- Almost all logs are collected and sent to Tinybird
- llm-chain locally or in Grafbase WASM resolvers with rust
- can also use LangChain in Python/TS in some services depending on use case
- Figma and Midjourney for graphic designs
- Cloudflare for DNS, website analytics, web asset serving and more.
- Discord for everything
- VitePress serving markdown files to docs.learn-anything.xyz
- Equals is connected to EdgeDB via its read-only Postgres SQL support
- Inlang
- for both solid and react native
- TODO:
- TODO:
- most likely in either Go or Rust
- Inngest for queues, background jobs
- what problems can it solve?
- vite-plugin-ssr
- can be used to improve on SolidStart, or potentially to add Houdini Solid support for GraphQL with it