Compare commits: c3cc94a305...main — 68 commits

SHA1 (author and date columns not captured):
7f4a8dae77, 86dce15d28, 26401f5316, cc97c99422, 59c417c470, 95968f6b47, 6bbedff561, bcdfd8cdf5,
8768b285df, 47565c9e95, 365efe3482, 994f39d308, a31c82d184, c9d0035cc2, 6f86827d6c, f0d7da5141,
e6d7e1a73a, 44a5d01960, 9cf4ba928a, 59e6f7b3b9, 4f98023203, bbdc478e84, 675fc7f805, 141754ca39,
4b173ef164, 3201b5726e, 3c7bdc0c42, 2ebb7fc90d, 72320e2332, b5a94520fe, 9ee3547d5d, ce288ccdb0,
da87f82a66, 90f2c27c2c, 450b77140b, 318373c09c, d55743a9e7, 8ab4924948, 8bd148dc96, 2ab1c855ec,
f67ec5bde6, 112b85f3fb, 86cf624027, 1df3a303f5, 07a5276e40, f3d21f16fb, 5b2a1a652a, 665793668d,
5ccd84c77e, 7721c9d3a2, b41a547589, d122842995, d65d991118, 06ccc337c1, a3f7a19cc2, e019f2d4fb,
22282691e7, bc3652c782, 0a8b863e4b, 0901f5edf0, a1924849d6, fdd5c5fba0, d00ff42e8e, 8cdb9c4381,
3902ad5de3, 0538907674, 90ce41cd9e, 1be21b6c52
AGENTS.md (38 changed lines)

@@ -7,7 +7,7 @@ Unified NixOS flake for three hosts:
 | Host | Role | nixpkgs channel | Activation |
 |------|------|----------------|-----------|
 | `mreow` | Framework 13 AMD AI 300 laptop (niri, greetd, swaylock) | `nixos-unstable` | `./deploy.sh` locally |
-| `yarn` | AMD Zen 5 desktop (niri + Jovian-NixOS Steam deck mode, impermanence) | `nixos-unstable` | pull from CI binary cache |
+| `yarn` | AMD Zen 3 desktop (niri + Jovian-NixOS Steam deck mode, impermanence) | `nixos-unstable` | pull from CI binary cache |
 | `muffin` | AMD Zen 3 server (Caddy, ZFS, agenix, deploy-rs, 25+ services) | `nixos-25.11` | deploy-rs from CI |
 
 One `flake.nix` declares both channels (`nixpkgs` and `nixpkgs-stable`) and composes each host from the correct channel. No single-channel migration is intended.
@@ -36,10 +36,11 @@ lib/
 overlays.nix # jellyfin-exporter, igpu-exporter, reflac, ensureZfsMounts
 patches/nixpkgs/ # applied to nixpkgs-stable for muffin builds
 secrets/
-desktop/ # git-crypt: mreow + yarn share these (wifi, nix-cache-netrc, secureboot.tar, password-hash, disk-password)
+secrets.nix # agenix recipients (who can decrypt each .age)
+desktop/ # agenix *.age (mreow + yarn) + disk-password (install-time only, git-crypt)
 home/ # git-crypt: per-user HM secrets (api keys, steam id)
-server/ # agenix *.age + git-crypt *.nix/*.tar/livekit_keys
-usb-secrets/ # USB-resident agenix identity key (git-crypt inside the repo)
+server/ # agenix *.age + git-crypt *.nix/*.tar/livekit_keys (muffin)
+usb-secrets/ # USB-resident agenix identity for muffin (git-crypt inside the repo)
 ```
 
 **Never read or write files under `secrets/`.** They are encrypted at rest (git-crypt for plaintext, agenix for `.age`). The git-crypt key is delivered to `muffin` at runtime as `/run/agenix/git-crypt-key-nixos.age`.
@@ -89,7 +90,7 @@ If Nix complains about a missing file, `git add` it first — flakes only see tr
 | `common-` | imported by ALL hosts | `common-doas.nix`, `common-nix.nix`, `common-shell-fish.nix` |
 | `desktop-` | imported by mreow + yarn only | `desktop-common.nix`, `desktop-steam.nix`, `desktop-networkmanager.nix` |
 | `server-` | imported by muffin only | `server-security.nix`, `server-power.nix`, `server-impermanence.nix`, `server-lanzaboote-agenix.nix` |
-| *(none)* | host-specific filename-scoped; see file contents | `age-secrets.nix`, `zfs.nix`, `no-rgb.nix` (yarn + muffin) |
+| *(none)* | host-specific filename-scoped; see file contents | `zfs.nix`, `no-rgb.nix` (yarn + muffin) |
 
 New modules: pick the narrowest prefix that's true, then add the import explicitly in the host's `default.nix` (there is no auto-discovery).
 
@@ -117,14 +118,18 @@ New modules: pick the narrowest prefix that's true, then add the import explicit
 ## Secrets
 
 - **git-crypt** covers `secrets/**` per the root `.gitattributes`. Initialized with a single symmetric key checked into `secrets/server/git-crypt-key-nixos.age` (agenix-encrypted to the USB SSH identity).
-- **agenix** decrypts `secrets/server/*.age` at activation into `/run/agenix/` on muffin.
-- **USB identity**: `/mnt/usb-secrets/usb-secrets-key` on muffin; the age identity path is wired in `modules/usb-secrets.nix`.
-- **Encrypting a new agenix secret** uses the SSH public key directly with `age -R`:
+- **agenix** decrypts `*.age` into `/run/agenix/` at activation on every host:
+  - **muffin**: identity is `/mnt/usb-secrets/usb-secrets-key` (ssh-ed25519 on a physical USB). Wired in `modules/usb-secrets.nix`.
+  - **mreow + yarn**: identity is `/var/lib/agenix/tpm-identity` (an `age-plugin-tpm` handle sealed by the host's TPM 2.0). Wired in `modules/desktop-age-secrets.nix`; yarn persists `/var/lib/agenix` through impermanence.
+- **Recipients** are declared in `secrets/secrets.nix`. Desktop secrets are encrypted to the admin SSH key + each host's TPM recipient; server secrets stay encrypted to the muffin USB key.
+- **Bootstrap a new desktop**: run `doas scripts/bootstrap-desktop-tpm.sh` on the host. It generates a TPM-sealed identity at `/var/lib/agenix/tpm-identity` and prints an `age1tag1…` recipient (legacy `age1tpm1…` recipients still decrypt but `age-plugin-tpm` 1.0+ refuses to encrypt to them; `modules/desktop-age-secrets.nix` symlinks `age-plugin-tag → age-plugin-tpm` so rage's plugin dispatch finds the binary under both prefixes). Append it to the `tpm` list in `secrets/secrets.nix` (label as a Nix `# host` comment, not as a trailing word inside the recipient string — rage's bech32 parser rejects the trailing whitespace), run `agenix -r` to re-encrypt, commit, `./deploy.sh switch`.
+- **Encrypting a new server secret** uses the SSH public key directly with `age -R`:
   ```sh
   age -R <(ssh-keygen -y -f secrets/usb-secrets/usb-secrets-key) \
     -o secrets/server/<name>.age \
     /path/to/plaintext
   ```
+  For desktop secrets, prefer `agenix -e secrets/desktop/<name>.age` from a shell with `age-plugin-tpm` on PATH — it reads `secrets/secrets.nix` and encrypts to every recipient listed there.
 - **DO NOT use `ssh-to-age`**. It produces `X25519` recipient stanzas, which the SSH private key on muffin cannot decrypt (it only decrypts `ssh-ed25519` stanzas produced by `age -R` against the SSH pubkey). Mismatched stanzas show up as `age: error: no identity matched any of the recipients` at deploy time.
 - Never read or commit plaintext secrets. Never log secret values.
 
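The ssh-to-age stanza mismatch described above can be checked mechanically: the age v1 header is plain text, so the recipient stanza type is visible in the first lines of a `.age` file. A minimal sketch, using an illustrative sample header (placeholder tag/body, not real key material):

```shell
# The age v1 header is plaintext: each recipient stanza starts with "-> <type>".
# `age -R <ssh-pubkey>` produces "-> ssh-ed25519" stanzas; ssh-to-age-derived
# recipients produce "-> X25519", which the raw SSH key on muffin cannot decrypt.
# Illustrative sample header (placeholder values, not real key material):
header='age-encryption.org/v1
-> ssh-ed25519 UI89Qw placeholderbody
--- placeholdermac'
stanza=$(printf '%s\n' "$header" | sed -n 's/^-> \([^ ]*\).*/\1/p')
echo "recipient stanza type: $stanza"
```

On a real secret, `head -n 2 secrets/server/<name>.age` shows the same stanza line; anything other than `ssh-ed25519` there predicts the "no identity matched" failure at deploy time.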
@@ -191,11 +196,26 @@ lib.mkIf config.services.<service>.enable {
 Existing registrations live in `services/jellyfin/jellyfin-deploy-guard.nix` (REST `/Sessions` via curl+jq) and `services/minecraft-deploy-guard.nix` (Server List Ping via `mcstatus`). Prefer soft-fail on unreachable — a service that's already down has no users to disrupt.
 
+## Deploy finalize (muffin)
+
+`modules/server-deploy-finalize.nix` solves the self-deploy problem: the gitea-actions runner driving CI deploys lives on muffin itself, so a direct `switch-to-configuration switch` restarts the runner mid-activation, killing the SSH session, the CI job, and deploy-rs's magic-rollback handshake. The failure mode is visible as "deploy appears to fail even though the new config landed" (or worse, a rollback storm).
+
+The fix is a two-phase activation wired into `deploy.nodes.muffin.profiles.system.path` in `flake.nix`:
+
+1. `switch-to-configuration boot` — bootloader-only, no service restarts. The runner, SSH session, and magic-rollback survive.
+2. `deploy-finalize` — schedules a detached `systemd-run --on-active=N` transient unit (default 60s). The unit is owned by pid1, so it survives the eventual runner restart. If `/run/booted-system/{kernel,initrd,kernel-modules}` differs from the new profile's, the unit runs `systemctl reboot`; otherwise it runs `switch-to-configuration switch`.
+
+That is, reboot is dynamically gated on kernel/initrd/kernel-modules change. The 60s delay is tuned so the CI job (or manual `./deploy.sh muffin`) has time to emit status/notification steps before the runner is recycled.
+
+Back-to-back deploys supersede each other: each invocation cancels any still-pending `deploy-finalize-*.timer` before scheduling its own. `deploy-finalize --dry-run` prints the decision without scheduling anything — useful when debugging.
+
+Prior art: the 3-path `{kernel,initrd,kernel-modules}` diff is lifted from nixpkgs's `system.autoUpgrade` module (the `allowReboot = true` branch) and was packaged the same way in [obsidiansystems/obelisk#957](https://github.com/obsidiansystems/obelisk/pull/957). nixpkgs#185030 tracks lifting it into `switch-to-configuration` proper but has been stale since 2025-07. The self-deploy `systemd-run` detachment is the proposed fix from [deploy-rs#153](https://github.com/serokell/deploy-rs/issues/153), also unmerged upstream.
+
 ## Technical details
 
 - **Privilege escalation**: `doas` everywhere; `sudo` is disabled on every host.
 - **Shell**: fish. `bash` login shells re-exec into fish via `programs.bash.interactiveShellInit` (see `modules/common-shell-fish.nix`).
-- **Secure boot**: lanzaboote. Desktops extract keys from `secrets/desktop/secureboot.tar`; muffin extracts from an agenix-decrypted tar (see `modules/server-lanzaboote-agenix.nix`).
+- **Secure boot**: lanzaboote. Every host extracts keys from an agenix-decrypted tar at activation — desktops via `modules/desktop-lanzaboote-agenix.nix`, muffin via `modules/server-lanzaboote-agenix.nix`.
 - **Impermanence**: muffin is tmpfs-root with `/persistent` surviving reboots (`modules/server-impermanence.nix`); yarn binds `/home/primary` from `/persistent` (`hosts/yarn/impermanence.nix`).
 - **Disks**: disko.
 - **Binary cache**: muffin runs harmonia; desktops consume it at `https://nix-cache.sigkill.computer`.
 
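The reboot gating in deploy-finalize can be sketched in shell. This is an illustrative reconstruction under stated assumptions, not the actual contents of `modules/server-deploy-finalize.nix`; the three compared paths follow the nixpkgs `system.autoUpgrade` convention cited as prior art.

```shell
# Sketch (assumed shape) of the deploy-finalize decision: reboot only when the
# booted kernel stack differs from the new profile's.
needs_reboot() {
  booted=$1 profile=$2
  for p in kernel initrd kernel-modules; do
    # resolve store-path symlinks before comparing
    if [ "$(readlink -f "$booted/$p")" != "$(readlink -f "$profile/$p")" ]; then
      return 0  # differs -> full reboot required
    fi
  done
  return 1  # identical -> a plain switch is safe
}

# Hypothetical wiring (60s delay matches the default described above):
# if needs_reboot /run/booted-system "$PROFILE"; then
#   systemd-run --on-active=60 systemctl reboot
# else
#   systemd-run --on-active=60 "$PROFILE/bin/switch-to-configuration" switch
# fi
```

Because the `systemd-run` unit is owned by pid1 rather than the SSH session, the decision outlives the runner restart that a direct `switch` would trigger.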
@@ -12,11 +12,11 @@ Browser: Firefox 🦊 (actually [Zen Browser](https://github.com/zen-browser/des
 
 Text Editor: [Doom Emacs](https://github.com/doomemacs/doomemacs)
 
-Terminal: [alacritty](https://github.com/alacritty/alacritty)
+Terminal: [ghostty](https://ghostty.org/)
 
 Shell: [fish](https://fishshell.com/) with the [pure](https://github.com/pure-fish/pure) prompt
 
-WM: [niri](https://github.com/YaLTeR/niri) (KDE on my desktop)
+WM: [niri](https://github.com/YaLTeR/niri)
 
 ### Background
 - Got my background from [here](https://old.reddit.com/r/celestegame/comments/11dtgwg/all_most_of_the_backgrounds_in_celeste_edited/) and used the command `magick input.png -filter Point -resize 2880x1920! output.png` to upscale it with nearest-neighbour (point) sampling
flake.lock (207 changed lines, generated)
@@ -25,6 +25,22 @@
       "type": "github"
     }
   },
+  "android-skills": {
+    "flake": false,
+    "locked": {
+      "lastModified": 1777305465,
+      "narHash": "sha256-9idRHLKyiVx16EhqYOSGPtUy8BOsf8N02mpUAnRSE7U=",
+      "owner": "android",
+      "repo": "skills",
+      "rev": "640c9e3d0d335c186652f412731ddf831659661c",
+      "type": "github"
+    },
+    "original": {
+      "owner": "android",
+      "repo": "skills",
+      "type": "github"
+    }
+  },
   "arr-init": {
     "inputs": {
       "flake-utils": "flake-utils",
@@ -77,7 +93,6 @@
         "llm-agents",
         "flake-parts"
       ],
-      "import-tree": "import-tree",
       "nixpkgs": [
         "llm-agents",
         "nixpkgs"
@@ -92,16 +107,16 @@
       ]
     },
     "locked": {
-      "lastModified": 1776182890,
-      "narHash": "sha256-+/VOe8XGq5klpU+I19D+3TcaR7o+Cwbq67KNF7mcFak=",
-      "owner": "Mic92",
+      "lastModified": 1777369708,
+      "narHash": "sha256-1xW7cRZNsFNPQD+cE0fwnLVStnDth0HSoASEIFeT7uI=",
+      "owner": "nix-community",
       "repo": "bun2nix",
-      "rev": "648d293c51e981aec9cb07ba4268bc19e7a8c575",
+      "rev": "e659e1cc4b8e1b21d0aa85f1c481f9db61ecfa98",
       "type": "github"
     },
     "original": {
-      "owner": "Mic92",
-      "ref": "catalog-support",
+      "owner": "nix-community",
+      "ref": "staging-2.1.0",
       "repo": "bun2nix",
       "type": "github"
     }
@@ -109,11 +124,11 @@
   "cachyos-kernel": {
     "flake": false,
     "locked": {
-      "lastModified": 1776608760,
-      "narHash": "sha256-ehDv8bF7k/2Kf4b8CCoSm51U/MOoFuLsRXqe5wZ57sE=",
+      "lastModified": 1776881435,
+      "narHash": "sha256-j8AobLjMzeKJugseObrVC4O5k7/aZCWoft2sCS3jWYs=",
       "owner": "CachyOS",
       "repo": "linux-cachyos",
-      "rev": "7e06e29005853bbaaa3b1c1067f915d6e0db728a",
+      "rev": "1c61dfd1c3ad7762faa0db8b06c6af6c59cc4340",
       "type": "github"
     },
     "original": {
@@ -125,11 +140,11 @@
   "cachyos-kernel-patches": {
     "flake": false,
     "locked": {
-      "lastModified": 1776792814,
-      "narHash": "sha256-39dlIhz9KxUNQFxGpE9SvCviaOWAivdW0XJM8RnPNmg=",
+      "lastModified": 1777002108,
+      "narHash": "sha256-PIZCIf6xUTOUqLFbEGH0mSwu2O/YfeAmYlgdAbP4dhs=",
       "owner": "CachyOS",
       "repo": "kernel-patches",
-      "rev": "d7d558d0b2e239e27b40bcf1af6fe12e323aa391",
+      "rev": "46476ae2538db486462aef8a9de37d19030cdaf2",
       "type": "github"
     },
     "original": {
@@ -140,11 +155,11 @@
   },
   "crane": {
     "locked": {
-      "lastModified": 1776635034,
-      "narHash": "sha256-OEOJrT3ZfwbChzODfIH4GzlNTtOFuZFWPtW7jIeR8xU=",
+      "lastModified": 1777242778,
+      "narHash": "sha256-VWTeqWeb8Sel/QiWyaPvCa9luAbcGawR+Rw09FJoHz0=",
       "owner": "ipetkov",
       "repo": "crane",
-      "rev": "dc7496d8ea6e526b1254b55d09b966e94673750f",
+      "rev": "ad8b31ad0ba8448bd958d7a5d50d811dc5d271c0",
       "type": "github"
     },
     "original": {
@@ -199,11 +214,11 @@
   "doomemacs": {
     "flake": false,
     "locked": {
-      "lastModified": 1776590253,
-      "narHash": "sha256-wU0gAHaCCX/sTUvbsgSxXAPxb1xEazfu5PClDe3SbXA=",
+      "lastModified": 1777326848,
+      "narHash": "sha256-7ErKUgw6Ch7hP1oBjMSos8xXRD+rxxjaOldRn+TcClo=",
       "owner": "doomemacs",
       "repo": "doomemacs",
-      "rev": "707da6f7e90f26a4e00e5f8f98f29fd08824e71e",
+      "rev": "6be3337b49867bd86f90fe5ca4beeb6b38afaddb",
       "type": "github"
     },
     "original": {
@@ -222,11 +237,11 @@
       ]
     },
     "locked": {
-      "lastModified": 1776849698,
-      "narHash": "sha256-t2I9ZhBuAcaLV1Z65aVd/5BmDFGvyzLY5kpiSedx2uY=",
+      "lastModified": 1777467165,
+      "narHash": "sha256-eY/ZMttiaujiuqA+BpgEMHoZaHE1wFwQ5Zxm60W8HqI=",
       "owner": "nix-community",
       "repo": "emacs-overlay",
-      "rev": "87dff52c245cba0c5103cf89b964e508ed9bb720",
+      "rev": "986e76b9224a9f52525ea92b474f73de9153f7c0",
       "type": "github"
     },
     "original": {
@@ -266,11 +281,11 @@
     },
     "locked": {
       "dir": "pkgs/firefox-addons",
-      "lastModified": 1776830588,
-      "narHash": "sha256-1X4L6+F7DgYTUDah+PDs7IYJiQrb7MwYfateq2fBxGY=",
+      "lastModified": 1777435375,
+      "narHash": "sha256-2WRfJbipnTz+EY3rHRnCoG4kWkzPczb/cLcWwhy/0QA=",
       "owner": "rycee",
       "repo": "nur-expressions",
-      "rev": "f3db83bc13aee22474fab41fa838e50a691dfbc5",
+      "rev": "4d89e8e2c50711ee3fea3a25e662cfa5c6628e07",
       "type": "gitlab"
     },
     "original": {
@@ -484,11 +499,11 @@
       ]
     },
     "locked": {
-      "lastModified": 1776891022,
-      "narHash": "sha256-vEe2f4NEhMvaNDpM1pla4hteaIIGQyAMKUfIBPLasr0=",
+      "lastModified": 1777476904,
+      "narHash": "sha256-EeLoE8n4+QCbteyAsYXxhfr97RFfWL1ga0xwfL6lpKw=",
       "owner": "nix-community",
       "repo": "home-manager",
-      "rev": "508daf831ab8d1b143d908239c39a7d8d39561b2",
+      "rev": "8c8e5389e75a36bee53920de8ee24f017b3ae03e",
       "type": "github"
     },
     "original": {
@@ -541,21 +556,6 @@
       "type": "github"
     }
   },
-  "import-tree": {
-    "locked": {
-      "lastModified": 1763762820,
-      "narHash": "sha256-ZvYKbFib3AEwiNMLsejb/CWs/OL/srFQ8AogkebEPF0=",
-      "owner": "vic",
-      "repo": "import-tree",
-      "rev": "3c23749d8013ec6daa1d7255057590e9ca726646",
-      "type": "github"
-    },
-    "original": {
-      "owner": "vic",
-      "repo": "import-tree",
-      "type": "github"
-    }
-  },
   "jovian-nixos": {
     "inputs": {
       "nix-github-actions": "nix-github-actions",
@@ -564,11 +564,11 @@
       ]
     },
     "locked": {
-      "lastModified": 1776874528,
-      "narHash": "sha256-X4Y2vMbVBuyUQzbZnl72BzpZMYUsWdE78JuDg2ySDxE=",
+      "lastModified": 1777364832,
+      "narHash": "sha256-c9AvQrhfwCAVLG8WdFuNOOx/oSPnkw4WDpIlVJ9Cilk=",
       "owner": "Jovian-Experiments",
       "repo": "Jovian-NixOS",
-      "rev": "4c8ccc482a3665fb4a3b2cadbbe7772fb7cc2629",
+      "rev": "f0813b4aefdca5f66a5de7633902d136a9753f88",
       "type": "github"
     },
     "original": {
@@ -610,11 +610,11 @@
       ]
     },
     "locked": {
-      "lastModified": 1776797459,
-      "narHash": "sha256-utv296Xwk0PwjONe9dsyKx+9Z5xAB70aAsMI//aakpg=",
+      "lastModified": 1777299656,
+      "narHash": "sha256-c0r3xXp2+xFJwkryS+nhyQwoACbFzSt4C1TVs3QMh8E=",
       "owner": "nix-community",
       "repo": "lanzaboote",
-      "rev": "4eda91dd5abd2157a2c7bfb33142fc64da668b0a",
+      "rev": "079c608988c2747db3902c9de033572cd50e8656",
       "type": "github"
     },
     "original": {
@@ -631,11 +631,11 @@
       ]
     },
    "locked": {
-      "lastModified": 1776862155,
-      "narHash": "sha256-EDvbwsGNE/N5ul+9ul1dJP3Ouf72+Ub2C0UMbDWcxyQ=",
+      "lastModified": 1777066729,
+      "narHash": "sha256-f+a+ikbq0VS6RQFf+A6EuVnsWYn2RR3ggRJNkzZgMto=",
       "owner": "TheTom",
       "repo": "llama-cpp-turboquant",
-      "rev": "9e3fb40e8bc0f873ad4d3d8329b17dacff28e4ca",
+      "rev": "11a241d0db78a68e0a5b99fe6f36de6683100f6a",
       "type": "github"
     },
     "original": {
@@ -657,11 +657,11 @@
       "treefmt-nix": "treefmt-nix"
     },
     "locked": {
-      "lastModified": 1776883427,
-      "narHash": "sha256-prHCm++hniRcoqzvWTEFyAiLKT6m+EUVCRaDLrsuEgM=",
+      "lastModified": 1777439951,
+      "narHash": "sha256-1Bs4ZbBayXWicrOrQQn3/BnnqhEy+tQjdFn40wHu1dw=",
       "owner": "numtide",
       "repo": "llm-agents.nix",
-      "rev": "6fd26c9cb50d9549f3791b3d35e4f72f97677103",
+      "rev": "2641c18f5bb9d0b95e81beca1b0415e174d7e650",
       "type": "github"
     },
     "original": {
@@ -704,11 +704,11 @@
       "xwayland-satellite-unstable": "xwayland-satellite-unstable"
     },
     "locked": {
-      "lastModified": 1776879043,
-      "narHash": "sha256-M9RjuowtoqQbFRdQAm2P6GjFwgHjRcnWYcB7ChSjDms=",
+      "lastModified": 1777472199,
+      "narHash": "sha256-gJr/OrHv6s8ANqv915sb69LLThow1u5yAO/ouElVGGM=",
       "owner": "sodiboo",
       "repo": "niri-flake",
-      "rev": "535ebbe038039215a5d1c6c0c67f833409a5be96",
+      "rev": "323a80f2ce4541c595d491acbd15a8800201cbae",
       "type": "github"
     },
     "original": {
@@ -737,11 +737,11 @@
   "niri-unstable": {
     "flake": false,
     "locked": {
-      "lastModified": 1776853441,
-      "narHash": "sha256-mSxfoEs7DiDhMCBzprI/1K7UXzMISuGq0b7T06LVJXE=",
+      "lastModified": 1777468255,
+      "narHash": "sha256-lBZc1UMy+1P1T/E41j3jQrpS7EFI3qegd+ktHZdamIg=",
       "owner": "YaLTeR",
       "repo": "niri",
-      "rev": "74d2b18603366b98ec9045ecf4a632422f472365",
+      "rev": "dd1c3bcb9f1ef416df33ffa22d1d9bcee1398e7d",
       "type": "github"
     },
     "original": {
@@ -761,11 +761,11 @@
       ]
     },
     "locked": {
-      "lastModified": 1776796985,
-      "narHash": "sha256-cNFg3H09sBZl1v9ds6PDHfLCUTDJbefGMSv+WxFs+9c=",
+      "lastModified": 1777227006,
+      "narHash": "sha256-A7GcOXjfo2xmZ3ERgN0j6GcqaVzqIf5zpYQcdfDaMr0=",
       "owner": "xddxdd",
       "repo": "nix-cachyos-kernel",
-      "rev": "ac5956bbceb022998fc1dd0001322f10ef1e6dda",
+      "rev": "0f7e2bea4088227a80502557f6c0e3b74949d6b5",
       "type": "github"
     },
     "original": {
@@ -787,11 +787,11 @@
       "systems": "systems_6"
     },
     "locked": {
-      "lastModified": 1776851701,
-      "narHash": "sha256-tdtOcU2Hz/eLqAhkzUcEocgX0WpjKSbl2SkVjOZGZw0=",
+      "lastModified": 1777381936,
+      "narHash": "sha256-ti3Rfha4cpWy+23yH/8TSMiSVLnt4SQpj64fKzzaT5U=",
       "owner": "marienz",
       "repo": "nix-doom-emacs-unstraightened",
-      "rev": "7ac65a49eec5e3f87d27396e645eddbf9dc626de",
+      "rev": "b342035050387785c6ed7559292c5b6041ece109",
       "type": "github"
     },
     "original": {
@@ -802,11 +802,11 @@
   },
   "nix-flatpak": {
     "locked": {
-      "lastModified": 1776625032,
-      "narHash": "sha256-edvwHiFhgOiwywt6/Iwe+sSn6ybhU3WZGnIoiGcKjfQ=",
+      "lastModified": 1777402031,
+      "narHash": "sha256-6gkfl9y3+ti0Z6dgby8/R4/DRT8sWU0I0TLCIxwWtjk=",
       "owner": "gmodena",
       "repo": "nix-flatpak",
-      "rev": "479e19f1decb390aa5b75cae13ddf87d763c74cc",
+      "rev": "22a3adbe7c5c8c8a10a635d32c9ef7fc01a6e4b8",
       "type": "github"
     },
     "original": {
@@ -846,11 +846,11 @@
       "systems": "systems_7"
     },
     "locked": {
-      "lastModified": 1776828595,
-      "narHash": "sha256-LkFpFnPTK6H0gwyfYezN3kEKHVxjSdPp/tBnrQRFP3E=",
+      "lastModified": 1777434588,
+      "narHash": "sha256-fpAeeLfsGnvUsQkbnsw9vSB1y/O2ZEZORFE5xQ2CqYk=",
       "owner": "Infinidoge",
       "repo": "nix-minecraft",
-      "rev": "28f0f2369655a5910e810c35c698dfaa9ccec692",
+      "rev": "c34d520453c162a216b134ee21c075d4a906a890",
       "type": "github"
     },
     "original": {
@@ -861,11 +861,11 @@
   },
   "nixos-hardware": {
     "locked": {
-      "lastModified": 1776830795,
-      "narHash": "sha256-PAfvLwuHc1VOvsLcpk6+HDKgMEibvZjCNvbM1BJOA7o=",
+      "lastModified": 1776983936,
+      "narHash": "sha256-ZOQyNqSvJ8UdrrqU1p7vaFcdL53idK+LOM8oRWEWh6o=",
       "owner": "NixOS",
       "repo": "nixos-hardware",
-      "rev": "72674a6b5599e844c045ae7449ba91f803d44ebc",
+      "rev": "2096f3f411ce46e88a79ae4eafcfc9df8ed41c61",
       "type": "github"
     },
     "original": {
@@ -877,11 +877,11 @@
   },
   "nixpkgs": {
     "locked": {
-      "lastModified": 1776548001,
-      "narHash": "sha256-ZSK0NL4a1BwVbbTBoSnWgbJy9HeZFXLYQizjb2DPF24=",
+      "lastModified": 1777268161,
+      "narHash": "sha256-bxrdOn8SCOv8tN4JbTF/TXq7kjo9ag4M+C8yzzIRYbE=",
       "owner": "NixOS",
       "repo": "nixpkgs",
-      "rev": "b12141ef619e0a9c1c84dc8c684040326f27cdcc",
+      "rev": "1c3fe55ad329cbcb28471bb30f05c9827f724c76",
       "type": "github"
     },
     "original": {
@@ -937,11 +937,11 @@
   },
   "nixpkgs-stable": {
     "locked": {
-      "lastModified": 1776734388,
-      "narHash": "sha256-vl3dkhlE5gzsItuHoEMVe+DlonsK+0836LIRDnm6MXQ=",
+      "lastModified": 1777077449,
+      "narHash": "sha256-AIiMJiqvGrN4HyLEbKAoCSRRYn0rnlW5VbKNIMIYqm4=",
       "owner": "NixOS",
       "repo": "nixpkgs",
-      "rev": "10e7ad5bbcb421fe07e3a4ad53a634b0cd57ffac",
+      "rev": "a4bf06618f0b5ee50f14ed8f0da77d34ecc19160",
       "type": "github"
     },
     "original": {
@@ -991,11 +991,11 @@
       "noctalia-qs": "noctalia-qs"
     },
     "locked": {
-      "lastModified": 1776888984,
-      "narHash": "sha256-Up2F/eoMuPUsZnPVYdH5TMHe1TBP2Ue1QuWd0vWZoxY=",
+      "lastModified": 1777427472,
+      "narHash": "sha256-kqcfLdxb+CqTroMErCScvx6YQcZYJcf6X+z5I8kBJK8=",
       "owner": "noctalia-dev",
       "repo": "noctalia-shell",
-      "rev": "2c1808f9f8937fc0b82c54af513f7620fec56d71",
+      "rev": "9f8dd48c8df5ab1f7f87ddf9842627e1e5682186",
       "type": "github"
     },
     "original": {
@@ -1014,11 +1014,11 @@
       "treefmt-nix": "treefmt-nix_2"
     },
     "locked": {
-      "lastModified": 1776585574,
-      "narHash": "sha256-j35EWhKoGhKrfcXcAOpoRVgXEPQt41Eukji/h59cnjk=",
+      "lastModified": 1777380063,
+      "narHash": "sha256-q5mWOEICcZzr+KnjIwDHV9EXiBxOC9cnBpxZbDAViU8=",
       "owner": "noctalia-dev",
       "repo": "noctalia-qs",
-      "rev": "75d180c28a9ab4470e980f3d6f706ad6c5213add",
+      "rev": "8742a7a748c43bf44eb6862a8ebd3591ed71502d",
       "type": "github"
     },
     "original": {
@@ -1037,11 +1037,11 @@
       ]
     },
     "locked": {
-      "lastModified": 1775585728,
-      "narHash": "sha256-8Psjt+TWvE4thRKktJsXfR6PA/fWWsZ04DVaY6PUhr4=",
+      "lastModified": 1776796298,
+      "narHash": "sha256-PcRvlWayisPSjd0UcRQbhG8Oqw78AcPE6x872cPRHN8=",
       "owner": "cachix",
       "repo": "pre-commit-hooks.nix",
-      "rev": "580633fa3fe5fc0379905986543fd7495481913d",
+      "rev": "3cfd774b0a530725a077e17354fbdb87ea1c4aad",
       "type": "github"
     },
     "original": {
@@ -1075,6 +1075,7 @@
   "root": {
     "inputs": {
       "agenix": "agenix",
+      "android-skills": "android-skills",
       "arr-init": "arr-init",
       "deploy-rs": "deploy-rs",
       "disko": "disko",
@@ -1133,11 +1134,11 @@
       ]
     },
     "locked": {
-      "lastModified": 1776827647,
-      "narHash": "sha256-sYixYhp5V8jCajO8TRorE4fzs7IkL4MZdfLTKgkPQBk=",
+      "lastModified": 1777432579,
+      "narHash": "sha256-Ce11TStDsqCge2vAAfLKe2+4lDI5cSX5ZYZOuKJBKKQ=",
       "owner": "oxalica",
       "repo": "rust-overlay",
-      "rev": "40e6ccc06e1245a4837cbbd6bdda64e21cc67379",
+      "rev": "3ecb5e6ab380ced3272ef7fcfe398bffbcc0f152",
       "type": "github"
     },
     "original": {
@@ -1190,11 +1191,11 @@
       ]
     },
     "locked": {
-      "lastModified": 1776653059,
-      "narHash": "sha256-K3tWnUj6FXaK95sBUajedutJrFVrOzYhvrQwQjJ0FbU=",
+      "lastModified": 1777415788,
+      "narHash": "sha256-71R6QKZiVT7G3zJK+8Nme3jgPdEHM3810hUlg514joQ=",
       "owner": "nix-community",
       "repo": "srvos",
-      "rev": "4968d2a44c84edfc9a38a2494cc7f85ad2c7122b",
+      "rev": "c5298b54100a89a119faeda69bfabe2383cd0383",
       "type": "github"
     },
     "original": {
@@ -1356,11 +1357,11 @@
   "trackerlist": {
     "flake": false,
     "locked": {
-      "lastModified": 1776809383,
-      "narHash": "sha256-r4V5l+Yk3jxVfZNQk2Ddu8Vlyshd9FWcnGGFyaL4UCw=",
+      "lastModified": 1777414176,
+      "narHash": "sha256-fspwkkNNCmlXy1hrY0oWtLAPCOMw6oCO2pfV0qkExr8=",
       "owner": "ngosang",
       "repo": "trackerslist",
-      "rev": "37d5c0552c25abf50f05cc6b377345e65a588dc2",
+      "rev": "6330998ce573abfd3b585dc99a0532315b203be8",
       "type": "github"
     },
     "original": {
@@ -1524,11 +1525,11 @@
       ]
     },
     "locked": {
-      "lastModified": 1776844129,
-      "narHash": "sha256-DaYSEBVzTvUhTuoVe70NHphoq5JKUHqUhlNlN5XnTuU=",
+      "lastModified": 1777356688,
+      "narHash": "sha256-fOhJpz7QAkBWAAih72CmnIfIN0pHfuZjhZQ/hBLNWxo=",
       "owner": "0xc000022070",
       "repo": "zen-browser-flake",
-      "rev": "90706e6ab801e4fb7bc53343db67583631936192",
+      "rev": "b3c972b3d8537a9cf7a0db96b164c9c3e580884a",
       "type": "github"
     },
     "original": {
flake.nix (71 changed lines)
@@ -82,6 +82,12 @@
   url = "github:ChrisOboe/json2steamshortcut";
   inputs.nixpkgs.follows = "nixpkgs";
 };
+# Google's official agent-skills for Android development (Apache 2.0).
+# Consumed by home/progs/pi.nix and exposed under ~/.omp/agent/skills/.
+android-skills = {
+  url = "github:android/skills";
+  flake = false;
+};
 
 # Server (muffin) — follows nixpkgs-stable
 nix-minecraft = {
@@ -163,6 +169,9 @@
 
 niriPackage = inputs.niri.packages.${system}.niri-unstable;
 
+# --- Desktop-channel pkgs (used by portable homeConfigurations) ---
+desktopPkgs = import nixpkgs { inherit system; };
+
 # --- Server (muffin) plumbing ---
 bootstrapPkgs = import nixpkgs-stable { inherit system; };
 patchedStableSrc = bootstrapPkgs.applyPatches {
@@ -177,17 +186,20 @@
   targetPlatform = system;
   buildPlatform = builtins.currentSystem;
 };
-serviceConfigs = import ./hosts/muffin/service-configs.nix;
+siteConfig = import ./site-config.nix;
+serviceConfigs = import ./hosts/muffin/service-configs.nix { site_config = siteConfig; };
 serverLib = import ./lib {
   inherit inputs;
   lib = nixpkgs-stable.lib;
   pkgs = serverPkgs;
   service_configs = serviceConfigs;
+  site_config = siteConfig;
 };
 testSuite = import ./tests/tests.nix {
   pkgs = serverPkgs;
   lib = serverLib;
   inherit inputs;
+  site_config = siteConfig;
   config = self.nixosConfigurations.muffin.config;
 };
 
@@ -200,6 +212,7 @@
 specialArgs = {
   inherit inputs username hostname;
   niri-package = niriPackage;
+  site_config = siteConfig;
 };
 modules = [
   home-manager.nixosModules.home-manager
@@ -219,6 +232,7 @@
   niri-package = niriPackage;
   homeDirectory = "/home/${username}";
   stateVersion = config.system.stateVersion;
+  site_config = siteConfig;
 };
 home-manager.users.${username} = import ./hosts/${hostname}/home.nix;
 }
@@ -238,6 +252,7 @@
 hostname = "muffin";
 eth_interface = "enp4s0";
 service_configs = serviceConfigs;
+site_config = siteConfig;
 lib = serverLib;
 };
 modules = [
@@ -346,6 +361,9 @@
 (
   { ... }:
   {
+    home-manager.extraSpecialArgs = {
+      site_config = siteConfig;
+    };
     home-manager.users.${username} = import ./hosts/muffin/home.nix;
   }
 )
@@ -364,11 +382,33 @@
 nixosConfigurations = {
   mreow = mkDesktopHost "mreow";
   yarn = mkDesktopHost "yarn";
+  patiodeck = mkDesktopHost "patiodeck";
   muffin = muffinHost;
 };
 
+# Standalone home-manager profile — usable on any x86_64-linux machine
+# with nix installed (NixOS or not). Activate with:
+#   nix run home-manager/master -- switch --flake ".#primary"
+# Ships the shared terminal profile (fish, helix, modern CLI, git).
+homeConfigurations.primary = home-manager.lib.homeManagerConfiguration {
+  pkgs = desktopPkgs;
+  extraSpecialArgs = {
+    site_config = siteConfig;
+  };
+  modules = [
+    ./home/profiles/terminal.nix
+    {
+      home = {
+        username = username;
+        homeDirectory = "/home/${username}";
+        stateVersion = "24.11";
+      };
+    }
+  ];
+};
+
 deploy.nodes.muffin = {
-  hostname = "server-public";
+  hostname = siteConfig.hosts.muffin.alias;
   profiles.system = {
     sshUser = "root";
     user = "root";
@@ -382,7 +422,27 @@
 # want to avoid when the deploy is supposed to be a no-op blocked by
 # the guard. Blocking before the deploy-rs invocation is the only
 # clean way to leave the running system untouched.
-path = deploy-rs.lib.${system}.activate.nixos self.nixosConfigurations.muffin;
+#
+# Activation uses `switch-to-configuration boot` + a detached finalize
+# (modules/server-deploy-finalize.nix) rather than the default
+# `switch`. The gitea-actions runner driving CI deploys lives on
+# muffin itself; a direct `switch` restarts gitea-runner-muffin mid-
+# activation, killing the SSH session, the CI job, and deploy-rs's
+# magic-rollback handshake. `boot` only touches the bootloader — no
+# service restarts — and deploy-finalize schedules a pid1-owned
+# transient unit that runs the real `switch` (or `systemctl reboot`
+# when kernel/initrd/kernel-modules changed) ~60s later, surviving
+# runner restart because it's decoupled from the SSH session.
+path =
+  deploy-rs.lib.${system}.activate.custom self.nixosConfigurations.muffin.config.system.build.toplevel
+    ''
+      # matches activate.nixos's workaround for NixOS/nixpkgs#73404
+      cd /tmp
+
+      $PROFILE/bin/switch-to-configuration boot
+
+      ${nixpkgs-stable.lib.getExe self.nixosConfigurations.muffin.config.system.build.deployFinalize}
+    '';
 };
 };
 
@@ -395,6 +455,11 @@
|
||||
path = test;
|
||||
}) testSuite
|
||||
);
|
||||
|
||||
# Buildenv of every binary in the portable terminal profile. Install
|
||||
# without home-manager via:
|
||||
# nix profile install ".#cli-tools"
|
||||
cli-tools = self.homeConfigurations.primary.config.home.path;
|
||||
}
|
||||
// (serverPkgs.lib.mapAttrs' (name: test: {
|
||||
name = "test-${name}";
|
||||
|
||||
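The detached-finalize activation above boils down to scheduling a transient unit owned by pid1. A minimal sketch of what such a finalize script could look like — the unit name, delay flag, and profile path are illustrative assumptions, not the actual contents of modules/server-deploy-finalize.nix:

```nix
# Hypothetical sketch only. The real module lives in
# modules/server-deploy-finalize.nix; names and flags here are assumed.
{ pkgs, ... }:
{
  system.build.deployFinalize = pkgs.writeShellScriptBin "deploy-finalize" ''
    # systemd-run hands the job to pid1, so it survives the SSH session
    # ending and the gitea runner being restarted by the later switch.
    systemd-run --on-active=60 --unit=deploy-finalize-sketch \
      /nix/var/nix/profiles/system/bin/switch-to-configuration switch
  '';
}
```

Because the transient unit is decoupled from the deploy-rs SSH session, magic-rollback completes against the still-running old system before the real `switch` fires.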
@@ -8,8 +8,7 @@
{
imports = [
./no-gui.nix
# ../progs/ghostty.nix
../progs/alacritty.nix
../progs/ghostty.nix
../progs/emacs.nix
# ../progs/trezor.nix # - broken
../progs/flatpak.nix
@@ -87,6 +86,21 @@

signal-desktop

# alternative GTK signal client; carries six local feature patches
# under patches/flare/ on top of upstream 0.20.4 (typing indicators,
# formatted messages, edited messages, multi-select with delete-for-me,
# in-channel message search, and deleted-message placeholders).
(pkgs.flare-signal.overrideAttrs (old: {
patches = (old.patches or [ ]) ++ [
../../patches/flare/0001-feat-typing-Implement-typing-indicators.patch
../../patches/flare/0002-feat-messages-Implement-formatted-messages.patch
../../patches/flare/0003-feat-messages-Implement-edited-messages.patch
../../patches/flare/0004-feat-messages-Multi-select-messages-and-delete-for-m.patch
../../patches/flare/0005-feat-messages-In-channel-message-search.patch
../../patches/flare/0006-feat-messages-Show-This-message-was-deleted.-placeho.patch
];
}))

# accounting
# gnucash

@@ -227,4 +241,11 @@
uris = [ "qemu:///system" ];
};
};

# macOS-style clipboard aliases — depend on wl-clipboard, so scoped here
# rather than in the shared fish config.
programs.fish.shellAliases = {
pbcopy = "${pkgs.wl-clipboard}/bin/wl-copy";
pbpaste = "${pkgs.wl-clipboard}/bin/wl-paste";
};
}

@@ -59,83 +59,16 @@ let
# jasmin
];

common_tools = with pkgs; [
# hex viewer
hexyl

# find typos in code
typos

# replacements for common posix tools
eza # ls replacement
bat # pretty `cat` clone
delta # viewer for `git` and `diff` output
dust # pretty `du` version
duf # better `df` clone
gping # `ping`... but with a graph!!
tldr # `man` but more straight-forward and simpler
ripgrep # grep, but written in rust, respects .gitignore, and very very fast, command is `rg`
fd # alternative to `find`

# status tools
htop
bottom

# other tools
unzip
wget
killall
file
b3sum

# "A hexadecimal, binary, and ASCII dump utility with color support"
tinyxxd

# networking tool
lsof

# view SMART status of drives
# hardware diagnostics — wanted on dev machines, not part of the shared
# terminal profile (which is meant to be portable to any machine).
hw_diag = with pkgs; [
smartmontools

# adds `sensors` command
lm_sensors

# lspci
pciutils

# convert between various units
units

jq

# DNS things
dig

bun
];

in
{
imports = [
../progs/fish.nix
../progs/helix.nix
../progs/pi.nix
(
{ ... }:
{
nixpkgs.overlays = [
inputs.rust-overlay.overlays.default
];
}
)
];

home.stateVersion = stateVersion;

home.packages =
with pkgs;
lib.concatLists [
[
# dev-only tools. Universal CLI (bat, rg, htop, jq, …) lives in terminal.nix.
dev_tools = with pkgs; [
# python formatter
ruff

@@ -143,23 +76,13 @@ in
hugo
go

# for benchmarking stuff
hyperfine

pfetch-rs
waypipe

sshfs

# nix formatter
nixfmt-tree

# serial viewer
minicom

# "~~matt's~~ my trace route"
mtr

ffmpeg-full

# microcontroller tooling
@@ -189,10 +112,6 @@ in
clang
gdb

git-crypt

imagemagick

nixpkgs-review

nmap
@@ -212,51 +131,52 @@ in
powerstat

yt-dlp
]

# JS runtime
bun

# convert between various units
units
];
in
{
imports = [
./terminal.nix
../progs/pi.nix
(
{ ... }:
{
nixpkgs.overlays = [
inputs.rust-overlay.overlays.default
];
}
)
];

home.stateVersion = stateVersion;

home.packages = lib.concatLists [
rust_pkgs
lsps
java_tools
common_tools
hw_diag
dev_tools
];

# fish aliases that depend on packages only present in this profile.
# Universal aliases (ls/la/ll/lt, git-size) live in home/progs/fish.nix.
programs.fish.shellAliases = {
c = "${lib.getExe pkgs.cargo}";
cr = "${lib.getExe pkgs.cargo} run";
cb = "${lib.getExe pkgs.cargo} build";

gcc-native = "${lib.getExe pkgs.gcc} -Q --help=target -mtune=native -march=native | ${lib.getExe pkgs.gnugrep} -E '^\\s+\\-(mtune|march)=' | ${pkgs.coreutils}/bin/tr -d '[:blank:]'";
};

# https://github.com/flamegraph-rs/flamegraph
home.file.".cargo/config.toml".text = ''
[target.${lib.strings.removeSuffix "-linux" pkgs.stdenv.hostPlatform.system}-unknown-linux-gnu]
linker = "${lib.getExe pkgs.clang}"
rustflags = ["-Clink-arg=-Wl,--no-rosegment"]
'';

# git (self explanatory)
programs.git = {
enable = true;
package = pkgs.git;

lfs.enable = true;

ignores = [ ".sisyphus" ];

settings = {
init = {
# master -> main
defaultBranch = "main";
};
push.autoSetupRemote = true;
user = {
name = "Simon Gardling";
email = "titaniumtown@proton.me";
};
};

# gpg signing keys
signing = {
key = "9AB28AC10ECE533D";
signByDefault = true;
};
};

# better way to view diffs
programs.delta = {
enable = true;
enableGitIntegration = true;
};
}

home/profiles/terminal.nix (new file, 103 lines)
@@ -0,0 +1,103 @@
# Shared terminal-tools profile.
#
# The set of CLI tooling I want available on every machine I use:
#  - mreow + yarn pick this up via home/profiles/no-gui.nix
#  - muffin picks this up via hosts/muffin/home.nix
#  - any non-NixOS machine picks it up via the homeConfigurations output in flake.nix
#
# Scope is intentionally narrow: the daily-driver shell (fish + helix + modern
# CLI replacements + git). No language toolchains, no hardware-specific admin
# tools, no GUI-adjacent utilities — those belong in profiles layered on top.
{
lib,
site_config,
pkgs,
...
}:
{
imports = [
../progs/fish.nix
../progs/helix.nix
];

home.packages = with pkgs; [
# modern CLI replacements for POSIX basics
eza # ls
bat # cat
delta # diff viewer (also wired into git below)
dust # du
duf # df
gping # ping, with a graph
ripgrep # grep, respects .gitignore
fd # find
tldr # man, simpler

# system / process tools
htop
bottom
lsof
file
killall
unzip
tmux
wget

# network
dig
mtr

# text / data
jq
hexyl
tinyxxd
b3sum
typos

# media (handy from a shell, lightweight enough to be universal)
imagemagick

# universal dev-adjacent
git-crypt
hyperfine

# nix
nixfmt-tree

# shell greeter (invoked from fish's interactiveShellInit)
pfetch-rs
];

# Git: mechanical config + identity lives here so `git` works out of the box
# on every machine. Signing defaults to on, but via lib.mkDefault, so machines
# without my GPG key can override `signing.signByDefault = false` without
# fighting priority.
programs.git = {
enable = true;
package = pkgs.git;

lfs.enable = true;

ignores = [ ".sisyphus" ];

settings = {
init.defaultBranch = "main";
push.autoSetupRemote = true;
user = {
name = "Simon Gardling";
email = site_config.contact_email;
};
};

signing = {
format = "openpgp";
key = lib.mkDefault "9AB28AC10ECE533D";
signByDefault = lib.mkDefault true;
};
};

# Pretty diff viewer, wired into git.
programs.delta = {
enable = true;
enableGitIntegration = true;
};
}

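Because the signing options use `lib.mkDefault` (option priority 1000), any host profile that imports terminal.nix can override them with a plain assignment (priority 100) — no `mkForce` needed. A minimal sketch of such a layered host profile, with the relative import path assumed to be from `hosts/<name>/`:

```nix
# Sketch of a host-level home profile layered on the shared terminal profile.
{ ... }:
{
  imports = [ ../../home/profiles/terminal.nix ];

  # A plain assignment (priority 100) wins over the profile's
  # lib.mkDefault (priority 1000), so this machine commits unsigned.
  programs.git.signing.signByDefault = false;
}
```

This is exactly the pattern hosts/muffin/home.nix uses further down in this diff.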
@@ -1,131 +0,0 @@
{ pkgs, ... }:
{
home.sessionVariables = {
TERMINAL = "alacritty";
};

programs.alacritty = {
enable = true;
package = pkgs.alacritty;
settings = {
# some programs can't handle alacritty
env.TERM = "xterm-256color";

window = {
# using a window manager, no decorations needed
decorations = "none";

# semi-transparent
opacity = 0.90;

# padding between the content of the terminal and the edge
padding = {
x = 10;
y = 10;
};

dimensions = {
columns = 80;
lines = 40;
};
};

scrolling = {
history = 1000;
multiplier = 3;
};

font =
let
baseFont = {
family = "JetBrains Mono Nerd Font";
style = "Regular";
};
in
{
size = 12;

normal = baseFont;

bold = baseFont // {
style = "Bold";
};

italic = baseFont // {
style = "Italic";
};

offset.y = 0;
glyph_offset.y = 0;
};

# color scheme
colors =
let
normal = {
black = "0x1b1e28";
red = "0xd0679d";
green = "0x5de4c7";
yellow = "0xfffac2";
blue = "#435c89";
magenta = "0xfcc5e9";
cyan = "0xadd7ff";
white = "0xffffff";
};

bright = {
black = "0xa6accd";
red = normal.red;
green = normal.green;
yellow = normal.yellow;
blue = normal.cyan;
magenta = "0xfae4fc";
cyan = "0x89ddff";
white = normal.white;
};
in
{
inherit normal bright;
primary = {
background = "0x131621";
foreground = bright.black;
};

cursor = {
text = "CellBackground";
cursor = "CellForeground";
};

search =
let
foreground = normal.black;
background = normal.cyan;
in
{
matches = {
inherit foreground background;
};

focused_match = {
inherit foreground background;
};
};

selection = {
text = "CellForeground";
background = "0x303340";
};

vi_mode_cursor = {
text = "CellBackground";
cursor = "CellForeground";
};
};

cursor = {
style = "Underline";
vi_mode_style = "Underline";
};
};
};
}

@@ -50,7 +50,7 @@
(vc-gutter +pretty) ; vcs diff in the fringe
vi-tilde-fringe ; fringe tildes to mark beyond EOB
;;window-select ; visually switch windows
workspaces ; tab emulation, persistence & separate workspaces
;; workspaces ; tab emulation, persistence & separate workspaces
;;zen ; distraction-free coding or writing

:editor

@@ -1,7 +1,12 @@
# Shared fish configuration — imported from home/profiles/terminal.nix, so it
# runs on every host (mreow, yarn, muffin, and any machine using the portable
# homeConfigurations output).
#
# Desktop/dev-specific aliases (cargo, gcc, wl-clipboard) are added from the
# profile that owns their dependencies, not here.
{ pkgs, lib, ... }:
let
eza = "${lib.getExe pkgs.eza} --color=always --group-directories-first";
cargo = "${lib.getExe pkgs.cargo}";
coreutils = "${pkgs.coreutils}/bin";
in
{
@@ -9,21 +14,17 @@ in
enable = true;

interactiveShellInit = ''
#disable greeting
# disable greeting
set fish_greeting

#fixes gnupg password entry
# fixes gnupg password entry
export GPG_TTY=(${coreutils}/tty)

#pfetch on shell start (disable pkgs because of execution time)
# pfetch on shell start (disable pkgs because of execution time)
PF_INFO="ascii title os host kernel uptime memory editor wm" ${lib.getExe pkgs.pfetch-rs}
'';

shellAliases = {
c = cargo;
cr = "${cargo} run";
cb = "${cargo} build";

# from DistroTube's dot files: Changing "ls" to "eza"
ls = "${eza} -al";
la = "${eza} -a";
@@ -38,12 +39,6 @@ in
${coreutils}/sort --numeric-sort --key=2 |
${coreutils}/cut -c 1-12,41- |
${coreutils}/numfmt --field=2 --to=iec-i --suffix=B --padding=7 --round=nearest'';

# aliases for (I think) macos commands
pbcopy = "${pkgs.wl-clipboard}/bin/wl-copy";
pbpaste = "${pkgs.wl-clipboard}/bin/wl-paste";

gcc-native = "${lib.getExe pkgs.gcc} -Q --help=target -mtune=native -march=native | ${lib.getExe pkgs.gnugrep} -E '^\\s+\-(mtune|march)=' | ${coreutils}/tr -d '[:blank:]'";
};

shellInit = ''

@@ -1,12 +1,76 @@
{ pkgs, ... }:
{ ... }:
{
# https://mynixos.com/home-manager/option/programs.ghostty
programs.ghostty = {
enable = true;
enableFishIntegration = true;

# custom palette ported verbatim from the previous alacritty config
# (poimandres-ish). lives in ~/.config/ghostty/themes/poimandres and is
# selected by `theme = "poimandres"` below.
themes.poimandres = {
palette = [
"0=#1b1e28"
"1=#d0679d"
"2=#5de4c7"
"3=#fffac2"
"4=#435c89"
"5=#fcc5e9"
"6=#add7ff"
"7=#ffffff"
"8=#a6accd"
"9=#d0679d"
"10=#5de4c7"
"11=#fffac2"
"12=#add7ff"
"13=#fae4fc"
"14=#89ddff"
"15=#ffffff"
];
background = "131621";
foreground = "a6accd";
cursor-color = "a6accd";
cursor-text = "131621";
selection-background = "303340";
selection-foreground = "a6accd";
};

settings = {
theme = "Adventure";
background-opacity = 0.7;
theme = "poimandres";

# font
font-family = "JetBrainsMono Nerd Font";
font-size = 12;

# window
window-decoration = false;
window-padding-x = 10;
window-padding-y = 10;
window-width = 80;
window-height = 40;

# semi-transparent background
background-opacity = 0.90;

# cursor
cursor-style = "underline";

# always open new windows at $HOME instead of inheriting whatever cwd the
# currently-focused ghostty window has. with gtk-single-instance, the
# focused-window inherit rule otherwise sticks the daemon's first cwd to
# every subsequent niri Mod+T launch.
window-inherit-working-directory = false;
working-directory = "home";

# ssh into hosts that lack ghostty's terminfo: ssh-terminfo auto-installs
# it remotely on first connect (and caches), ssh-env is the fallback that
# downgrades TERM to xterm-256color when the install can't run.
shell-integration-features = "ssh-env,ssh-terminfo";

# keep one daemon alive so subsequent launches (e.g. niri Mod+T) are
# instant instead of paying GTK + wgpu init each time. relies on the
# dbus-activated systemd user service that the HM module wires up.
gtk-single-instance = true;
};
};

@@ -17,7 +17,8 @@
bar = {
position = "top";
floating = true;
backgroundOpacity = 0.93;
backgroundOpacity = 0.0;
useSeparateOpacity = true;
};
general = {
animationSpeed = 1.5;
@@ -32,6 +33,7 @@
};
wallpaper = {
enabled = true;
skipStartupTransition = true;
};
};
};

@@ -34,22 +34,69 @@ let
};
};
};

# Pull Google's official agent-skills (github:android/skills, Apache 2.0).
# The upstream tree nests skills as <category>/<name>/SKILL.md (build/agp/…,
# jetpack-compose/migration/…, performance/r8-analyzer, etc.). omp expects a
# flat layout, so we walk the tree, find every SKILL.md, and mount each
# parent directory at ~/.omp/agent/skills/<basename>/. Every leaf basename
# in upstream is unique, so flattening is lossless. New skills upstream show
# up automatically on `nix flake update --input-name android-skills`.
findSkillDirs =
path:
let
entries = builtins.readDir path;
hasSkillMd = builtins.pathExists (path + "/SKILL.md");
subdirs = lib.filterAttrs (_: t: t == "directory") entries;
recurse = lib.concatLists (lib.mapAttrsToList (n: _: findSkillDirs (path + "/${n}")) subdirs);
in
if hasSkillMd then [ path ] else recurse;

androidSkillFiles = lib.listToAttrs (
map (
dir:
lib.nameValuePair ".omp/agent/skills/${builtins.unsafeDiscardStringContext (baseNameOf dir)}" {
source = dir;
}
) (findSkillDirs inputs.android-skills)
);

# Browser path for the playwright skill body.
playwrightChromium =
let
browsers = pkgs.playwright-driver.browsers;
chromiumDir = builtins.head (
builtins.filter (n: builtins.match "chromium-[0-9]+" n != null) (
builtins.attrNames browsers.passthru.entries
)
);
in
{
browsers = "${browsers}";
chrome = "${browsers}/${chromiumDir}/chrome-linux64/chrome";
};
in
{
home.packages = [
# `bun2nix.hook` sets `patchPhase = bunPatchPhase`, which only runs `patchShebangs` and
# silently ignores the standard `patches` attribute. Apply patches via `prePatch` instead
# so they actually take effect. Tracking: nothing upstream yet.
(inputs.llm-agents.packages.${pkgs.stdenv.hostPlatform.system}.omp.overrideAttrs (old: {
patches = (old.patches or [ ]) ++ [ ];
prePatch = (old.prePatch or "") + ''
patch -p1 < ${../../patches/omp/0001-fix-reasoning_content.patch}
'';
}))
];

home.file = androidSkillFiles // {
# main settings: ~/.omp/agent/config.yml (JSON is valid YAML)
home.file.".omp/agent/config.yml".text = builtins.toJSON ompSettings;
".omp/agent/config.yml".text = builtins.toJSON ompSettings;

# model/provider config: ~/.omp/agent/models.yml
home.file.".omp/agent/models.yml".text = builtins.toJSON ompModels;
".omp/agent/models.yml".text = builtins.toJSON ompModels;

# global instructions loaded at startup
home.file.".omp/agent/AGENTS.md".text = ''
".omp/agent/AGENTS.md".text = ''
You are an intelligent and observant agent.
If instructed to commit, disable gpg signing.
You are on nixOS, if you don't have access to a tool, you can access it via the `nix-shell` command.
@@ -69,9 +116,12 @@ in
## Nix
For using `nix build` append `-L` to get better visibility into the logs.
If you get an error that a file can't be found, always try to `git add` the file before trying other troubleshooting steps.

## Implementation
When sketching out an implementation of something, always look for tools that already exist in the space first before implementing something custom. This is also the case when it comes to submodules and sections of code, I don't want you to implement things in-house when it isn't needed.
'';

home.file.".omp/agent/skills/android-ui/SKILL.md".text = ''
".omp/agent/skills/android-ui/SKILL.md".text = ''
---
name: android-ui
description: Android UI automation via ADB. Use for any Android device interaction, UI testing, screenshot analysis, element coordinate lookup, and gesture automation.
@@ -140,17 +190,7 @@ in

# omp has a built-in browser tool with NixOS auto-detection,
# but this skill provides playwright MCP as a supplementary option
home.file.".omp/agent/skills/playwright/SKILL.md".text =
let
browsers = pkgs.playwright-driver.browsers;
chromiumDir = builtins.head (
builtins.filter (n: builtins.match "chromium-[0-9]+" n != null) (
builtins.attrNames browsers.passthru.entries
)
);
chromiumPath = "${browsers}/${chromiumDir}/chrome-linux64/chrome";
in
''
".omp/agent/skills/playwright/SKILL.md".text = ''
---
name: playwright
description: Browser automation via Playwright MCP. Use as an alternative to the built-in browser tool for Playwright-specific workflows, testing, and web scraping. Chromium is provided by NixOS.
@@ -161,19 +201,20 @@ in
## Browser Setup
Chromium is provided by NixOS. Do NOT attempt to download browsers.

- Chromium path: `${chromiumPath}`
- Browsers path: `${browsers}`
- Chromium path: `${playwrightChromium.chrome}`
- Browsers path: `${playwrightChromium.browsers}`

## Usage
Launch the Playwright MCP server for browser automation:
```bash
npx @playwright/mcp@latest --executable-path "${chromiumPath}" --user-data-dir "${config.home.homeDirectory}/.cache/playwright-mcp"
npx @playwright/mcp@latest --executable-path "${playwrightChromium.chrome}" --user-data-dir "${config.home.homeDirectory}/.cache/playwright-mcp"
```

Set these environment variables if not already set:
```bash
export PLAYWRIGHT_BROWSERS_PATH="${browsers}"
export PLAYWRIGHT_BROWSERS_PATH="${playwrightChromium.browsers}"
export PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD=1
```
'';
};
}

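The findSkillDirs/androidSkillFiles flattening can be checked in isolation. A small sketch assuming a two-skill tree (the `build/agp` and `performance/r8-analyzer` paths are illustrative stand-ins for whatever findSkillDirs returns on the real android-skills input):

```nix
# Illustrative only — skillDirs stands in for findSkillDirs' output.
let
  lib = (import <nixpkgs> { }).lib;
  skillDirs = [ /tmp/skills/build/agp /tmp/skills/performance/r8-analyzer ];
  files = lib.listToAttrs (
    map (
      dir: lib.nameValuePair ".omp/agent/skills/${baseNameOf dir}" { source = dir; }
    ) skillDirs
  );
in
builtins.attrNames files
# → [ ".omp/agent/skills/agp" ".omp/agent/skills/r8-analyzer" ]
```

The category prefix (`build/`, `performance/`) is dropped and only the unique leaf basename survives, which is exactly why the diff's comment insists every leaf basename upstream is unique.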
home/progs/steam-shortcuts.nix (new file, 29 lines)
@@ -0,0 +1,29 @@
# Declarative non-Steam game shortcuts for the Steam library.
# Add entries to the `shortcuts` list to have them appear in Steam's UI.
{
pkgs,
inputs,
lib,
...
}:
{
imports = [
inputs.json2steamshortcut.homeModules.default
];

services.steam-shortcuts = {
enable = true;
overwriteExisting = true;
steamUserId = lib.strings.toInt (
lib.strings.trim (builtins.readFile ../../secrets/home/steam-user-id)
);
shortcuts = [
{
AppName = "Prism Launcher";
Exe = "${pkgs.prismlauncher}/bin/prismlauncher";
Icon = "${pkgs.prismlauncher}/share/icons/hicolor/scalable/apps/org.prismlauncher.PrismLauncher.svg";
Tags = [ "Game" ];
}
];
};
}

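Each list entry follows the same AppName/Exe/Icon/Tags shape. A hypothetical second entry, sketched under the assumption that the launcher's store path exposes a binary at `bin/<name>` like prismlauncher does (Icon is optional here):

```nix
# Hypothetical additional shortcut — the heroic package name is an assumption.
{
  AppName = "Heroic";
  Exe = "${pkgs.heroic}/bin/heroic";
  Tags = [ "Game" ];
}
```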
@@ -68,19 +68,19 @@
"element.envs.net"
"mail.proton.me"
"mail.google.com"
"www.gardling.com"
"www.sigkill.computer"
"projects.fivethirtyeight.com"
"secure.bankofamerica.com"
"billpay-ui.bankofamerica.com"
"plus.pearson.com"
"immich.gardling.com"
"immich.sigkill.computer"
"huggingface.co"
"session.masteringphysics.com"
"brainly.com"
"www.270towin.com"
"phet.colorado.edu"
"8042-1.portal.athenahealth.com"
"torrent.gardling.com"
"torrent.sigkill.computer"
"nssb-p.adm.fit.edu"
"mail.openbenchmarking.org"
"moneroocean.stream"
@@ -89,11 +89,11 @@
"chat.deepseek.com"
"n21.ultipro.com"
"www.egaroucid.nyanyan.dev"
"bitmagnet.gardling.com"
"bitmagnet.sigkill.computer"
"frame.work"
"www.altcancer.net"
"jenkins.jpenilla.xyz"
"soulseek.gardling.com"
"soulseek.sigkill.computer"
"discord.com"
"www.lufthansa.com"
"surveys.hyundaicx.com"

@@ -5,6 +5,7 @@
hostname,
username,
eth_interface,
site_config,
service_configs,
options,
...
@@ -18,13 +19,14 @@
../../modules/zfs.nix
../../modules/server-impermanence.nix
../../modules/usb-secrets.nix
../../modules/age-secrets.nix
../../modules/server-age-secrets.nix
../../modules/server-lanzaboote-agenix.nix
../../modules/no-rgb.nix
../../modules/server-security.nix
../../modules/ntfy-alerts.nix
../../modules/server-power.nix
../../modules/server-deploy-guard.nix
../../modules/server-deploy-finalize.nix

../../services/postgresql.nix
../../services/jellyfin
@@ -79,19 +81,32 @@
];

# Hosts entries for CI/CD deploy targets
networking.hosts."192.168.1.50" = [ "server-public" ];
networking.hosts."192.168.1.223" = [ "desktop" ];
networking.hosts.${site_config.hosts.muffin.ip} = [ site_config.hosts.muffin.alias ];
networking.hosts.${site_config.hosts.yarn.ip} = [ site_config.hosts.yarn.alias ];

# SSH known_hosts for CI runner (pinned host keys)
environment.etc."ci-known-hosts".text = ''
server-public ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFMjgaMnE+zS7tL+m5E7gh9Q9U1zurLdmU0qcmEmaucu
192.168.1.50 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFMjgaMnE+zS7tL+m5E7gh9Q9U1zurLdmU0qcmEmaucu
git.sigkill.computer ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFMjgaMnE+zS7tL+m5E7gh9Q9U1zurLdmU0qcmEmaucu
git.gardling.com ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFMjgaMnE+zS7tL+m5E7gh9Q9U1zurLdmU0qcmEmaucu
'';
# SSH known_hosts for CI runner (pinned host keys). All four names resolve to
# the same muffin host and therefore serve the same host key.
environment.etc."ci-known-hosts".text =
let
key = site_config.hosts.muffin.ssh_host_key;
names = [
site_config.hosts.muffin.alias
site_config.hosts.muffin.ip
"git.${site_config.domain}"
"git.${site_config.old_domain}"
];
in
lib.concatMapStrings (n: "${n} ${key}\n") names;

services.deployGuard.enable = true;

# Detached deploy finalize: see modules/server-deploy-finalize.nix. deploy-rs
# activates in `boot` mode and invokes deploy-finalize to schedule the real
# `switch` (or reboot, when kernel/initrd/kernel-modules changed) 60s later
# as a pid1-owned transient unit. Prevents the self-hosted gitea runner from
# being restarted mid-CI-deploy.
services.deployFinalize.enable = true;

# Disable serial getty on ttyS0 to prevent dmesg warnings
systemd.services."serial-getty@ttyS0".enable = false;

@@ -149,9 +164,6 @@
};
};

# Set your time zone.
time.timeZone = "America/New_York";

hardware.graphics = {
enable = true;
extraPackages = with pkgs; [
@@ -161,35 +173,21 @@
];
};

# Root-facing admin tools only. User-facing CLI (fish, helix, htop, bottom,
# tmux, ripgrep, lsof, wget, pfetch-rs, …) is provided via home-manager in
# home/profiles/terminal.nix — shared with mreow and yarn.
environment.systemPackages = with pkgs; [
helix
lm_sensors
bottom
htop

neofetch

borgbackup
smartmontools

ripgrep

intel-gpu-tools
iotop
iftop

tmux

wget

powertop

lsof

reflac

pfetch-rs

sbctl

# add `skdump`
@@ -197,10 +195,7 @@
];

networking = {
nameservers = [
"1.1.1.1"
"9.9.9.9"
];
nameservers = site_config.dns_servers;

hostName = hostname;
hostId = "0f712d56";
@@ -214,8 +209,7 @@
interfaces.${eth_interface} = {
ipv4.addresses = [
{
address = "192.168.1.50";
# address = "10.1.1.102";
address = site_config.hosts.muffin.ip;
prefixLength = 24;
}
];
@@ -227,8 +221,7 @@
];
};
defaultGateway = {
#address = "10.1.1.1";
address = "192.168.1.1";
address = site_config.lan.gateway;
interface = eth_interface;
};
# TODO! fix this
@@ -240,14 +233,6 @@

users.groups.${service_configs.media_group} = { };

users.users.gitea-runner = {
isSystemUser = true;
group = "gitea-runner";
home = "/var/lib/gitea-runner";
description = "Gitea Actions CI runner";
};
users.groups.gitea-runner = { };

users.users.${username} = {
isNormalUser = true;
extraGroups = [

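The `concatMapStrings` rewrite of ci-known-hosts emits one `name key` line per entry, i.e. standard known_hosts syntax. A sketch with placeholder values (the names and key below are illustrative, not the real host key):

```nix
# Placeholder values — demonstrates only the shape of the generated file.
let
  lib = (import <nixpkgs> { }).lib;
  key = "ssh-ed25519 AAAA-placeholder";
  names = [ "server-public" "192.168.1.50" ];
in
lib.concatMapStrings (n: "${n} ${key}\n") names
# → "server-public ssh-ed25519 AAAA-placeholder\n192.168.1.50 ssh-ed25519 AAAA-placeholder\n"
```

Same output as the four hand-written lines it replaces, but the names and key now come from site_config, so a key rotation or rename happens in one place.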
@@ -1,31 +1,12 @@
{ ... }:
{
pkgs,
lib,
...
}:
{
imports = [
../../home/profiles/terminal.nix
];

home.stateVersion = "24.11";
programs.fish = {
enable = true;

interactiveShellInit = ''
# disable greeting
set fish_greeting

# pfetch on shell start (disable pkgs because of execution time)
PF_INFO="ascii title os host kernel uptime memory editor wm" ${lib.getExe pkgs.pfetch-rs}
'';

shellAliases =
let
eza = "${lib.getExe pkgs.eza} --color=always --group-directories-first";
in
{
# from DistroTube's dot files: Changing "ls" to "eza"
ls = "${eza} -al";
la = "${eza} -a";
ll = "${eza} -l";
lt = "${eza} -aT";
};
};
# Muffin typically doesn't have the GPG key loaded (no agent forwarded,
# no key in the keyring). Unsigned commits here rather than failing silently.
programs.git.signing.signByDefault = false;
}

@@ -1,3 +1,4 @@
{ site_config }:
rec {
  zpool_ssds = "tank";
  zpool_hdds = "hdds";
@@ -195,6 +196,10 @@ rec {
      port = 9563;
      proto = "tcp";
    };
    minecraft_exporter = {
      port = 9567;
      proto = "tcp";
    };
    prometheus_zfs = {
      port = 9134;
      proto = "tcp";
@@ -206,15 +211,9 @@ rec {
    };
  };

  https = {
    certs = services_dir + "/http_certs";
    domain = "sigkill.computer";
    old_domain = "gardling.com"; # Redirect traffic from old domain
  };

  gitea = {
    dir = services_dir + "/gitea";
    domain = "git.${https.domain}";
    domain = "git.${site_config.domain}";
  };

  postgres = {
@@ -278,19 +277,19 @@ rec {

  matrix = {
    dataDir = "/var/lib/continuwuity";
    domain = "matrix.${https.domain}";
    domain = "matrix.${site_config.domain}";
  };

  ntfy = {
    domain = "ntfy.${https.domain}";
    domain = "ntfy.${site_config.domain}";
  };

  mollysocket = {
    domain = "mollysocket.${https.domain}";
    domain = "mollysocket.${site_config.domain}";
  };

  livekit = {
    domain = "livekit.${https.domain}";
    domain = "livekit.${site_config.domain}";
  };

  syncthing = {
@@ -324,12 +323,12 @@ rec {
  };

  firefox_syncserver = {
    domain = "firefox-sync.${https.domain}";
    domain = "firefox-sync.${site_config.domain}";
  };

  grafana = {
    dir = services_dir + "/grafana";
    domain = "grafana.${https.domain}";
    domain = "grafana.${site_config.domain}";
  };

  trilium = {
hosts/patiodeck/default.nix (new file, 38 lines)
@@ -0,0 +1,38 @@
{
  username,
  inputs,
  site_config,
  ...
}:
{
  imports = [
    ../../modules/desktop-common.nix
    ../../modules/desktop-jovian.nix
    ./disk.nix
    ./impermanence.nix

    inputs.impermanence.nixosModules.impermanence
  ];

  networking.hostId = "a1b2c3d4";

  # SSH for remote management from laptop
  services.openssh = {
    enable = true;
    ports = [ 22 ];
    settings = {
      PasswordAuthentication = false;
      PermitRootLogin = "yes";
    };
  };

  users.users.${username}.openssh.authorizedKeys.keys = [
    site_config.ssh_keys.laptop
  ];

  users.users.root.openssh.authorizedKeys.keys = [
    site_config.ssh_keys.laptop
  ];

  jovian.devices.steamdeck.enable = true;
}
hosts/patiodeck/disk.nix (new file, 52 lines)
@@ -0,0 +1,52 @@
{
  disko.devices = {
    disk = {
      main = {
        type = "disk";
        content = {
          type = "gpt";
          partitions = {
            ESP = {
              type = "EF00";
              size = "500M";
              content = {
                type = "filesystem";
                format = "vfat";
                mountpoint = "/boot";
              };
            };
            nix = {
              size = "200G";
              content = {
                type = "filesystem";
                format = "f2fs";
                mountpoint = "/nix";
              };
            };
            persistent = {
              size = "100%";
              content = {
                type = "filesystem";
                format = "f2fs";
                mountpoint = "/persistent";
              };
            };
          };
        };
      };
    };
    nodev = {
      "/" = {
        fsType = "tmpfs";
        mountOptions = [
          "defaults"
          "size=2G"
          "mode=755"
        ];
      };
    };
  };

  fileSystems."/persistent".neededForBoot = true;
  fileSystems."/nix".neededForBoot = true;
}
hosts/patiodeck/home.nix (new file, 8 lines)
@@ -0,0 +1,8 @@
{ ... }:
{
  imports = [
    ../../home/profiles/gui.nix
    ../../home/profiles/desktop.nix
    ../../home/progs/steam-shortcuts.nix
  ];
}
hosts/patiodeck/impermanence.nix (new file, 48 lines)
@@ -0,0 +1,48 @@
{
  username,
  ...
}:
{
  environment.persistence."/persistent" = {
    hideMounts = true;
    directories = [
      "/var/log"
      "/var/lib/systemd/coredump"
      "/var/lib/nixos"
      "/var/lib/systemd/timers"
      # agenix identity sealed by the TPM
      {
        directory = "/var/lib/agenix";
        mode = "0700";
        user = "root";
        group = "root";
      }
    ];

    files = [
      "/etc/ssh/ssh_host_ed25519_key"
      "/etc/ssh/ssh_host_ed25519_key.pub"
      "/etc/ssh/ssh_host_rsa_key"
      "/etc/ssh/ssh_host_rsa_key.pub"
      "/etc/machine-id"
    ];

    users.root = {
      files = [
        ".local/share/fish/fish_history"
      ];
    };
  };

  # bind mount home directory from persistent storage
  fileSystems."/home/${username}" = {
    device = "/persistent/home/${username}";
    fsType = "none";
    options = [ "bind" ];
    neededForBoot = true;
  };

  systemd.tmpfiles.rules = [
    "d /etc 755 root"
  ];
}
@@ -1,21 +1,21 @@
{
  config,
  pkgs,
  lib,
  username,
  inputs,
  site_config,
  ...
}:
{
  imports = [
    ../../modules/desktop-common.nix
    ../../modules/desktop-jovian.nix
    ../../modules/no-rgb.nix
    ./disk.nix
    ./impermanence.nix
    ./vr.nix

    inputs.impermanence.nixosModules.impermanence
    inputs.jovian-nixos.nixosModules.default
  ];

  fileSystems."/media/games" = {
@@ -43,8 +43,8 @@
    };
    ipv4 = {
      method = "manual";
      address1 = "192.168.1.223/24,192.168.1.1";
      dns = "1.1.1.1;9.9.9.9;";
      address1 = "${site_config.hosts.yarn.ip}/24,${site_config.lan.gateway}";
      dns = lib.concatMapStrings (n: "${n};") site_config.dns_servers;
    };
    ipv6.method = "disabled";
  };
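The `lib.concatMapStrings` call above joins every configured DNS server with a trailing semicolon, which is the form NetworkManager's keyfile `dns=` entry expects. A standalone sketch of the same join outside Nix (the two-server list is illustrative, mirroring the old hard-coded value):

```shell
# Illustrative: what `concatMapStrings (n: "${n};") dns_servers`
# evaluates to for a two-entry server list.
out=""
for n in "1.1.1.1" "9.9.9.9"; do
  out="${out}${n};"
done
echo "$out"   # -> 1.1.1.1;9.9.9.9;
```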
@@ -59,12 +59,12 @@
  };

  users.users.${username}.openssh.authorizedKeys.keys = [
    "ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIO4jL6gYOunUlUtPvGdML0cpbKSsPNqQ1jit4E7U1RyH" # laptop
    site_config.ssh_keys.laptop
  ];

  users.users.root.openssh.authorizedKeys.keys = [
    "ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIO4jL6gYOunUlUtPvGdML0cpbKSsPNqQ1jit4E7U1RyH" # laptop
    "ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIC5ZYN6idL/w/mUIfPOH1i+Q/SQXuzAMQUEuWpipx1Pc ci-deploy@muffin"
    site_config.ssh_keys.laptop
    site_config.ssh_keys.ci_deploy
  ];

  programs.steam = {
@@ -82,195 +82,6 @@

  systemd.services.lactd.serviceConfig.ExecStartPre = "${lib.getExe pkgs.bash} -c \"sleep 3s\"";

  # root-level service that applies a pending update. Triggered by
  # steamos-update (via systemctl start) when the user accepts an update.
  # Runs as root so it can write the system profile and boot entry.
  systemd.services.pull-update-apply = {
    description = "Apply pending NixOS update pulled from binary cache";
    serviceConfig = {
      Type = "oneshot";
      ExecStart = pkgs.writeShellScript "pull-update-apply" ''
        set -uo pipefail
        export PATH=${
          pkgs.lib.makeBinPath [
            pkgs.curl
            pkgs.coreutils
            pkgs.nix
          ]
        }
        STORE_PATH=$(curl -sf --max-time 30 "https://nix-cache.sigkill.computer/deploy/yarn" || true)
        if [ -z "$STORE_PATH" ]; then
          echo "server unreachable"
          exit 1
        fi
        echo "applying $STORE_PATH"
        nix-store -r "$STORE_PATH" || { echo "fetch failed"; exit 1; }
        nix-env -p /nix/var/nix/profiles/system --set "$STORE_PATH" || { echo "profile set failed"; exit 1; }
        "$STORE_PATH/bin/switch-to-configuration" boot || { echo "boot entry failed"; exit 1; }
        echo "update applied; reboot required"
      '';
    };
  };

  # Allow primary user to start pull-update-apply.service without a password
  security.polkit.extraConfig = ''
    polkit.addRule(function(action, subject) {
      if (action.id == "org.freedesktop.systemd1.manage-units" &&
          action.lookup("unit") == "pull-update-apply.service" &&
          subject.user == "${username}") {
        return polkit.Result.YES;
      }
    });
  '';

  nixpkgs.config.allowUnfreePredicate =
    pkg:
    builtins.elem (lib.getName pkg) [
      "steamdeck-hw-theme"
      "steam-jupiter-unwrapped"
      "steam"
      "steam-original"
      "steam-unwrapped"
      "steam-run"
    ];

  # Override jovian-stubs to disable steamos-update kernel check
  # This prevents Steam from requesting reboots for "system updates"
  # Steam client updates will still work normally
  nixpkgs.overlays = [
    (
      final: prev:
      let
        deploy-url = "https://nix-cache.sigkill.computer/deploy/yarn";

        steamos-update-script = final.writeShellScript "steamos-update" ''
          export PATH=${
            final.lib.makeBinPath [
              final.curl
              final.coreutils
              final.systemd
            ]
          }

          STORE_PATH=$(curl -sf --max-time 30 "${deploy-url}" || true)

          if [ -z "$STORE_PATH" ]; then
            >&2 echo "[steamos-update] server unreachable"
            exit 7
          fi

          CURRENT=$(readlink -f /nix/var/nix/profiles/system)
          if [ "$CURRENT" = "$STORE_PATH" ]; then
            >&2 echo "[steamos-update] no update available"
            exit 0
          fi

          # check-only mode: just report that an update exists
          if [ "''${1:-}" = "check" ] || [ "''${1:-}" = "--check-only" ]; then
            >&2 echo "[steamos-update] update available"
            exit 0
          fi

          # apply: trigger the root-running systemd service to install the update
          >&2 echo "[steamos-update] applying update..."
          if systemctl start --wait pull-update-apply.service; then
            >&2 echo "[steamos-update] update installed, reboot to apply"
            exit 0
          else
            >&2 echo "[steamos-update] apply failed; see 'journalctl -u pull-update-apply'"
            exit 1
          fi
        '';
      in
      {
        jovian-stubs = prev.stdenv.mkDerivation {
          name = "jovian-stubs";
          dontUnpack = true;
          installPhase = ''
            mkdir -p $out/bin
            ln -s ${steamos-update-script} $out/bin/steamos-update
            # ln -s ${steamos-update-script} $out/bin/steamos-mandatory-update

            # jupiter-initial-firmware-update: no-op (not a real steam deck)
            cat > $out/bin/jupiter-initial-firmware-update << 'STUB'
            #!/bin/sh
            exit 0
            STUB

            # jupiter-biosupdate: no-op (not a real steam deck)
            cat > $out/bin/jupiter-biosupdate << 'STUB'
            #!/bin/sh
            exit 0
            STUB

            # steamos-reboot: reboot the system
            cat > $out/bin/steamos-reboot << 'STUB'
            #!/bin/sh
            >&2 echo "[JOVIAN] $0: stub called with: $*"
            systemctl reboot
            STUB

            # steamos-select-branch: no-op stub
            cat > $out/bin/steamos-select-branch << 'STUB'
            #!/bin/sh
            >&2 echo "[JOVIAN] $0: stub called with: $*"
            exit 0
            STUB

            # steamos-factory-reset-config: no-op stub
            cat > $out/bin/steamos-factory-reset-config << 'STUB'
            #!/bin/sh
            >&2 echo "[JOVIAN] $0: stub called with: $*"
            exit 0
            STUB

            # steamos-firmware-update: no-op stub
            cat > $out/bin/steamos-firmware-update << 'STUB'
            #!/bin/sh
            >&2 echo "[JOVIAN] $0: stub called with: $*"
            exit 0
            STUB

            # pkexec: pass through to real pkexec
            cat > $out/bin/pkexec << 'STUB'
            #!/bin/sh
            exec /run/wrappers/bin/pkexec "$@"
            STUB

            # sudo: strip flags and run the command directly (no escalation).
            # privileged ops are delegated to root systemd services via systemctl.
            cat > $out/bin/sudo << 'STUB'
            #!/bin/sh
            while [ $# -gt 0 ]; do
              case "$1" in
                -*) shift ;;
                *) break ;;
              esac
            done
            exec "$@"
            STUB

            find $out/bin -type f -exec chmod 755 {} +
          '';
        };
      }
    )
  ];

  jovian = {
    devices.steamdeck.enable = false;
    steam = {
      enable = true;
      autoStart = true;
      desktopSession = "niri";
      user = username;
    };
  };

  # Jovian-NixOS requires sddm
  # https://github.com/Jovian-Experiments/Jovian-NixOS/commit/52f140c07493f8bb6cd0773c7e1afe3e1fd1d1fa
  services.displayManager.sddm.wayland.enable = true;

  # Disable gamescope from common.nix to avoid conflict with jovian-nixos
  programs.gamescope.enable = lib.mkForce false;
  # yarn is not a Steam Deck
  jovian.devices.steamdeck.enable = false;
}
@@ -1,15 +1,12 @@
{
  pkgs,
  inputs,
  lib,
  config,
  ...
}:
{
  imports = [
    ../../home/profiles/gui.nix
    ../../home/profiles/desktop.nix
    inputs.json2steamshortcut.homeModules.default
    ../../home/progs/steam-shortcuts.nix
  ];

  home.packages = with pkgs; [
@@ -27,20 +24,4 @@
      obs-pipewire-audio-capture
    ];
  };

  services.steam-shortcuts = {
    enable = true;
    overwriteExisting = true;
    steamUserId = lib.strings.toInt (
      lib.strings.trim (builtins.readFile ../../secrets/home/steam-user-id)
    );
    shortcuts = [
      {
        AppName = "Prism Launcher";
        Exe = "${pkgs.prismlauncher}/bin/prismlauncher";
        Icon = "${pkgs.prismlauncher}/share/icons/hicolor/scalable/apps/org.prismlauncher.PrismLauncher.svg";
        Tags = [ "Game" ];
      }
    ];
  };
}
@@ -12,6 +12,7 @@
      "/var/lib/systemd/coredump"
      "/var/lib/nixos"
      "/var/lib/systemd/timers"
      "/var/lib/bluetooth"
    ];

    files = [
@@ -21,6 +22,12 @@
      "/etc/ssh/ssh_host_rsa_key.pub"
      "/etc/machine-id"
    ];

    users.root = {
      files = [
        ".local/share/fish/fish_history"
      ];
    };
  };

  # Bind mount entire home directory from persistent storage
@@ -31,6 +38,17 @@
    options = [ "bind" ];
    neededForBoot = true;
  };
  # /var/lib/agenix holds the TPM-sealed age identity. agenix decrypts secrets
  # from initrd-nixos-activation-start, which runs *before* impermanence's
  # stage-2 bind mounts. Mount it explicitly with neededForBoot so the
  # identity is in place when activation reads it. (NixOS auto-marks /var/log
  # and /var/lib/nixos as neededForBoot; /var/lib/agenix is not in that set.)
  fileSystems."/var/lib/agenix" = {
    device = "/persistent/var/lib/agenix";
    fsType = "none";
    options = [ "bind" ];
    neededForBoot = true;
  };

  systemd.tmpfiles.rules = [
    "d /etc 755 root"
@@ -2,6 +2,7 @@
  inputs,
  pkgs,
  service_configs,
  site_config,
  lib ? inputs.nixpkgs-stable.lib,
  ...
}:
@@ -195,7 +196,7 @@ lib.extend (
    assert (subdomain != null) != (domain != null);
    { config, ... }:
    let
      vhostDomain = if domain != null then domain else "${subdomain}.${service_configs.https.domain}";
      vhostDomain = if domain != null then domain else "${subdomain}.${site_config.domain}";
      upstream =
        if vpn then
          "${config.vpnNamespaces.wg.namespaceAddress}:${builtins.toString port}"
@@ -75,4 +75,19 @@ final: prev: {
    '';
    meta.mainProgram = "igpu-exporter";
  };

  mc-monitor = prev.buildGoModule rec {
    pname = "mc-monitor";
    version = "0.16.1";
    src = prev.fetchFromGitHub {
      owner = "itzg";
      repo = "mc-monitor";
      rev = version;
      hash = "sha256-/94+Z9FTFOzQHynHiJuaGFiidkOxmM0g/FIpHn+xvJM=";
    };
    vendorHash = "sha256-qq7rIpvGRi3AMnBbi8uAhiPcfSF4McIuqozdtxB5CeQ=";
    # upstream tests probe live Minecraft servers
    doCheck = false;
    meta.mainProgram = "mc-monitor";
  };
}
@@ -2,10 +2,15 @@
  config,
  lib,
  pkgs,
  site_config,
  username,
  ...
}:
{
  # Shared timezone. Plain priority so it wins against srvos's mkDefault "UTC";
  # mreow overrides via lib.mkForce when travelling.
  time.timeZone = site_config.timezone;
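The priority trick above works because the module system resolves conflicting definitions by override priority, where a *lower* number wins: `lib.mkDefault` assigns 1000, a plain definition 100, and `lib.mkForce` 50. A sketch of how the three definitions from the comment would merge across modules (values here are illustrative):

```nix
# Three modules defining the same option; lower priority number wins.
[
  { time.timeZone = lib.mkDefault "UTC"; }        # srvos: priority 1000
  { time.timeZone = site_config.timezone; }       # common.nix: priority 100 — wins here
  # { time.timeZone = lib.mkForce "Asia/Tokyo"; } # mreow travel override: priority 50, would win
]
```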

  # Common Nix daemon settings. Host-specific overrides (binary cache substituters,
  # gc retention) live in the host's default.nix.
  nix = {
@@ -53,8 +58,6 @@
    ];
  };

  services.kmscon.enable = true;

  environment.systemPackages = with pkgs; [
    doas-sudo-shim
  ];
modules/desktop-age-secrets.nix (new file, 91 lines)
@@ -0,0 +1,91 @@
{
  pkgs,
  inputs,
  ...
}:
let
  # age-plugin-tpm 1.0+ defaults to the new age1tag1… (p256tag) recipient
  # encoding and refuses to encrypt to legacy age1tpm1… recipients. rage's
  # plugin dispatch maps recipient prefixes to binaries (`age1tag1…` →
  # `age-plugin-tag`), but nixpkgs only ships `age-plugin-tpm`. Provide a
  # symlink so both prefixes resolve to the same binary.
  age-plugin-tpm-with-tag = pkgs.symlinkJoin {
    name = "age-plugin-tpm-with-tag";
    paths = [ pkgs.age-plugin-tpm ];
    postBuild = ''
      ln -s age-plugin-tpm $out/bin/age-plugin-tag
    '';
  };

  # Wrap rage so the plugin (under both names) is on PATH at activation time.
  rageWithTpm = pkgs.writeShellScriptBin "rage" ''
    export PATH="${age-plugin-tpm-with-tag}/bin:$PATH"
    exec ${pkgs.rage}/bin/rage "$@"
  '';
in
{
  imports = [
    inputs.agenix.nixosModules.default
  ];

  # Expose the plugin + agenix CLI for interactive edits (`agenix -e …`).
  environment.systemPackages = [
    inputs.agenix.packages.${pkgs.system}.default
    pkgs.age-plugin-tpm
  ];

  age.ageBin = "${rageWithTpm}/bin/rage";

  # Primary identity: TPM-sealed key, generated by scripts/bootstrap-desktop-tpm.sh.
  # Fallback identity: admin SSH key. age tries paths in order, so if the TPM
  # is wiped or the board is replaced the SSH key keeps secrets accessible until
  # the TPM is re-bootstrapped. Both are encrypted recipients on every .age file.
  age.identityPaths = [
    "/var/lib/agenix/tpm-identity"
    "/home/primary/.ssh/id_ed25519"
  ];

  # Ensure the identity directory exists before agenix activation so a fresh
  # bootstrap doesn't race the directory creation.
  systemd.tmpfiles.rules = [
    "d /var/lib/agenix 0700 root root -"
  ];

  age.secrets = {
    # Secureboot PKI bundle (db/KEK/PK keys + certs) consumed by lanzaboote
    # via desktop-lanzaboote-agenix.nix at activation time.
    secureboot-tar = {
      file = ../secrets/desktop/secureboot.tar.age;
      mode = "0400";
      owner = "root";
      group = "root";
    };

    # netrc for the private nix binary cache.
    nix-cache-netrc = {
      file = ../secrets/desktop/nix-cache-netrc.age;
      mode = "0400";
      owner = "root";
      group = "root";
    };

    # yescrypt hash for the primary user.
    password-hash = {
      file = ../secrets/desktop/password-hash.age;
      mode = "0400";
      owner = "root";
      group = "root";
    };

    # Master password for oo7-daemon's 'Login' keyring; the unit consumes it
    # via systemd's ImportCredential machinery (see desktop-oo7-daemon.nix).
    # Owner is `primary` so the user-scope systemd unit can LoadCredential it.
    oo7-keyring-password = {
      file = ../secrets/desktop/oo7-keyring-password.age;
      mode = "0400";
      owner = "primary";
      group = "users";
    };
  };
}
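The symlink above works because of how age-style clients resolve plugin recipients: the recipient string's bech32 prefix (everything before its last `1` separator) names the plugin, and the binary looked up on PATH is that name with `age-plugin-` prepended, minus the leading `age1`. A standalone sketch of that name mapping (the recipient data here is made up):

```shell
# Illustrative prefix -> binary mapping for plugin recipients.
recipient="age1tag1q0fakerecipientdata"
hrp="${recipient%1*}"             # bech32 prefix: age1tag
plugin="age-plugin-${hrp#age1}"   # -> age-plugin-tag
echo "$plugin"                    # -> age-plugin-tag
```

So `age1tag1…` recipients dispatch to `age-plugin-tag`, which the `symlinkJoin` aliases back to the shipped `age-plugin-tpm` binary.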
@@ -5,6 +5,7 @@
  lib,
  username,
  inputs,
  site_config,
  niri-package,
  ...
}:
@@ -16,17 +17,20 @@
    ./desktop-vm.nix
    ./desktop-steam.nix
    ./desktop-networkmanager.nix
    ./desktop-age-secrets.nix
    ./desktop-lanzaboote-agenix.nix
    ./desktop-oo7-daemon.nix

    inputs.disko.nixosModules.disko
    inputs.lanzaboote.nixosModules.lanzaboote

    inputs.nixos-hardware.nixosModules.common-cpu-amd-pstate
    inputs.nixos-hardware.nixosModules.common-cpu-amd-zenpower
    inputs.nixos-hardware.nixosModules.common-pc-ssd
  ];

  # allow overclocking (I actually underclock but lol)
  hardware.amdgpu.overdrive.ppfeaturemask = "0xFFFFFFFF";
  # expose amdgpu overdrive sysfs (pp_od_clk_voltage, fan curves, ...) for LACT.
  # nixpkgs default ppfeaturemask (0xfffd7fff) already has the overdrive bit set.
  hardware.amdgpu.overdrive.enable = true;

  # Add niri to display manager session packages
  services.displayManager.sessionPackages = [ niri-package ];
@@ -49,32 +53,41 @@
      mkdir -p /nix/var/nix/profiles/per-user/root/channels
    '';

    # extract all my secureboot keys
    # TODO! proper secrets management
    "secureboot-keys".text = ''
      #!/usr/bin/env sh
      rm -fr ${config.boot.lanzaboote.pkiBundle} || true
      mkdir -p ${config.boot.lanzaboote.pkiBundle}
      ${lib.getExe pkgs.gnutar} xf ${../secrets/desktop/secureboot.tar} -C ${config.boot.lanzaboote.pkiBundle}
      chown -R root:wheel ${config.boot.lanzaboote.pkiBundle}
      chmod -R 500 ${config.boot.lanzaboote.pkiBundle}
    '';
  };

  swapDevices = [ ];

  # Desktop-specific Nix cache — muffin serves it, desktops consume.
  # Base nix settings (optimise, gc, experimental-features) come from common-nix.nix.
  # Base nix settings (optimise, gc, experimental-features) come from common.nix.
  nix.settings = {
    substituters = [ "https://nix-cache.sigkill.computer" ];
    substituters = [ site_config.binary_cache.url ];
    trusted-public-keys = [
      "nix-cache.sigkill.computer-1:ONtQC9gUjL+2yNgMWB68NudPySXhyzJ7I3ra56/NPgk="
      site_config.binary_cache.public_key
    ];
    netrc-file = "${../secrets/desktop/nix-cache-netrc}";
    netrc-file = config.age.secrets.nix-cache-netrc.path;
  };

  # cachyos kernel overlay
  nixpkgs.overlays = [ inputs.nix-cachyos-kernel.overlays.default ];
  nixpkgs.overlays = [
    inputs.nix-cachyos-kernel.overlays.default
    # bluez 5.86 reversed the profile-connect order (regression of cdcd845f87ee).
    # Dual-role devices that advertise both AudioSource (UUID 0x110a) and
    # AudioSink (UUID 0x110b) -- e.g. any A2DP headphone with an HFP mic, like
    # the Bose QC 45 -- now negotiate audio-gateway first and never expose
    # a2dp-sink, with bluetoothd reporting:
    #   src/service.c:btd_service_connect() a2dp-sink profile connect failed
    #   for <addr>: Device or resource busy
    # Cherry-picks bluez/bluez@066a164 "a2dp: connect source profile after
    # sink" (slated for 5.87). FIX: drop overlay when nixpkgs ships >= 5.87.
    # see https://github.com/bluez/bluez/issues/1922
    (_final: prev: {
      bluez = prev.bluez.overrideAttrs (old: {
        patches = (old.patches or [ ]) ++ [
          ../patches/bluez/0001-a2dp-connect-source-after-sink.patch
        ];
      });
    })
  ];

  # kernel options
  boot = {
@@ -181,9 +194,14 @@
      DRM_HISI_HIBMC = lib.mkForce no;
      DRM_APPLETBDRM = lib.mkForce no;

      # intel gpu
      DRM_I915 = lib.mkForce no;
      DRM_XE = lib.mkForce no;
      # legacy AMD IP blocks. hosts are Navi 32 RDNA3 dGPU (7800 XT, yarn,
      # 2023, gfx1101, DCN 3.2) and Krackan Point RDNA 3.5 iGPU (mreow,
      # 2024, gfx1150, DCN 3.5). everything below pre-dates those by a
      # decade. upstream only exposes per-generation toggles for SI and
      # CIK — no switch for VI/Polaris/Vega/Navi1x, those stay in amdgpu.
      DRM_AMDGPU_SI = lib.mkForce no; # Southern Islands / GCN 1 (2012): HD 7950/7970, R9 280/280X, R7 260X
      DRM_AMDGPU_CIK = lib.mkForce no; # Sea Islands / GCN 2 (2013): R9 290/290X/390, Kaveri APUs (A10-7850K), Steam Machine Bonaire
      DRM_AMD_SECURE_DISPLAY = lib.mkForce no; # HDCP region-CRC debugfs helper, needs custom DMCU firmware

      # early-boot framebuffer chain: drop every alternative to amdgpu so
      # the console never transitions simpledrm -> dummy -> amdgpu (visible
@@ -285,6 +303,486 @@
      XZ_DEC_ARM64 = lib.mkForce no;
      XZ_DEC_SPARC = lib.mkForce no;
      XZ_DEC_RISCV = lib.mkForce no;

      # ==== no hardware for any of these on either host ====

      # laptop vendor platform drivers (only FRAMEWORK_LAPTOP is used)
      ACER_WMI = lib.mkForce no;
      ACER_WIRELESS = lib.mkForce no;
      ACERHDF = lib.mkForce no;
      APPLE_GMUX = lib.mkForce no;
      ASUS_LAPTOP = lib.mkForce no;
      ASUS_WMI = lib.mkForce no;
      ASUS_NB_WMI = lib.mkForce no;
      ASUS_ARMOURY = lib.mkForce no;
      ASUS_TF103C_DOCK = lib.mkForce no;
      ASUS_WIRELESS = lib.mkForce no;
      COMPAL_LAPTOP = lib.mkForce no;
      DELL_LAPTOP = lib.mkForce no;
      DELL_RBTN = lib.mkForce no;
      DELL_PC = lib.mkForce no;
      DELL_SMBIOS = lib.mkForce no;
      DELL_SMO8800 = lib.mkForce no;
      DELL_UART_BACKLIGHT = lib.mkForce no;
      DELL_WMI = lib.mkForce no;
      DELL_WMI_AIO = lib.mkForce no;
      DELL_WMI_DDV = lib.mkForce no;
      DELL_WMI_DESCRIPTOR = lib.mkForce no;
      DELL_WMI_LED = lib.mkForce no;
      DELL_WMI_SYSMAN = lib.mkForce no;
      EEEPC_LAPTOP = lib.mkForce no;
      EEEPC_WMI = lib.mkForce no;
      FUJITSU_LAPTOP = lib.mkForce no;
      FUJITSU_ES = lib.mkForce no;
      FUJITSU_TABLET = lib.mkForce no;
      HUAWEI_WMI = lib.mkForce no;
      IBM_ASM = lib.mkForce no;
      IBM_RTL = lib.mkForce no;
      IDEAPAD_LAPTOP = lib.mkForce no;
      LG_LAPTOP = lib.mkForce no;
      MSI_LAPTOP = lib.mkForce no;
      MSI_WMI = lib.mkForce no;
      MSI_EC = lib.mkForce no;
      PANASONIC_LAPTOP = lib.mkForce no;
      SONY_LAPTOP = lib.mkForce no;
      SAMSUNG_LAPTOP = lib.mkForce no;
      TOPSTAR_LAPTOP = lib.mkForce no;
      THINKPAD_ACPI = lib.mkForce no;
      THINKPAD_LMI = lib.mkForce no;
      LENOVO_SE10_WDT = lib.mkForce no;
      LENOVO_SE30_WDT = lib.mkForce no;
      LENOVO_WMI_HOTKEY_UTILITIES = lib.mkForce no;
      LENOVO_WMI_CAMERA = lib.mkForce no;
      LENOVO_YMC = lib.mkForce no;
      LENOVO_WMI_CAPDATA = lib.mkForce no;
      LENOVO_WMI_EVENTS = lib.mkForce no;
      LENOVO_WMI_HELPERS = lib.mkForce no;
      LENOVO_WMI_GAMEZONE = lib.mkForce no;
      LENOVO_WMI_TUNING = lib.mkForce no;
      YOGABOOK = lib.mkForce no;
      YT2_1380 = lib.mkForce no;
      XIAOMI_WMI = lib.mkForce no;
      BARCO_P50_GPIO = lib.mkForce no;
      PC_ENGINES_APU = lib.mkForce no;
      SILICOM_PLATFORM = lib.mkForce no;
      SIEMENS_SIMATIC_IPC_WDT = lib.mkForce no;
      SYSTEM76_ACPI = lib.mkForce no;
      INSPUR_PLATFORM_PROFILE = lib.mkForce no;
      NVIDIA_WMI_EC_BACKLIGHT = lib.mkForce no;

      # legacy filesystems (hosts use vfat/f2fs/tmpfs/fuse; exfat/ntfs3 kept for externals)
      JFS_FS = lib.mkForce no;
      GFS2_FS = lib.mkForce no;
      OCFS2_FS = lib.mkForce no;
      NILFS2_FS = lib.mkForce no;
      AFFS_FS = lib.mkForce no;
      HFS_FS = lib.mkForce no;
      HFSPLUS_FS = lib.mkForce no;
      BEFS_FS = lib.mkForce no;
      JFFS2_FS = lib.mkForce no;
      UBIFS_FS = lib.mkForce no;
      MINIX_FS = lib.mkForce no;
      OMFS_FS = lib.mkForce no;
      ROMFS_FS = lib.mkForce no;
      UFS_FS = lib.mkForce no;
      EROFS_FS = lib.mkForce no;
      ORANGEFS_FS = lib.mkForce no;
      CODA_FS = lib.mkForce no;
      AFS_FS = lib.mkForce no;
      CEPH_FS = lib.mkForce no;
      ZONEFS_FS = lib.mkForce no;
      BCACHE = lib.mkForce no;
      BCACHEFS_FS = lib.mkForce no;
      ECRYPT_FS = lib.mkForce no;
      NFSD = lib.mkForce no;

      # legacy partition tables (only GPT+MBR in use)
      AIX_PARTITION = lib.mkForce no;
      MAC_PARTITION = lib.mkForce no;
      LDM_PARTITION = lib.mkForce no;
      KARMA_PARTITION = lib.mkForce no;
      MINIX_SUBPARTITION = lib.mkForce no;
      SOLARIS_X86_PARTITION = lib.mkForce no;
      BSD_DISKLABEL = lib.mkForce no;
      UNIXWARE_DISKLABEL = lib.mkForce no;
      SYSV68_PARTITION = lib.mkForce no;
      ULTRIX_PARTITION = lib.mkForce no;
      OSF_PARTITION = lib.mkForce no;
      SGI_PARTITION = lib.mkForce no;
      SUN_PARTITION = lib.mkForce no;
      ATARI_PARTITION = lib.mkForce no;
      AMIGA_PARTITION = lib.mkForce no;
      ACORN_PARTITION = lib.mkForce no;

      # legacy net protocols (nothing uses SCTP/RDS/TIPC/SMC or GRE tunnels)
      IP_SCTP = lib.mkForce no;
      RDS = lib.mkForce no;
      TIPC = lib.mkForce no;
      SMC = lib.mkForce no;
      NET_IPIP = lib.mkForce no;
      NET_IPGRE = lib.mkForce no;
      NET_IPGRE_DEMUX = lib.mkForce no;
      NET_IPVTI = lib.mkForce no;

      # legacy PCI sound cards (kept: SND_HDA_* for AMD HDA, SND_SOC_SOF_AMD for ACP)
      SND_ALI5451 = lib.mkForce no;
      SND_ATIIXP = lib.mkForce no;
      SND_ATIIXP_MODEM = lib.mkForce no;
      SND_AU8810 = lib.mkForce no;
      SND_AU8820 = lib.mkForce no;
      SND_AU8830 = lib.mkForce no;
      SND_AW2 = lib.mkForce no;
      SND_AZT3328 = lib.mkForce no;
      SND_BT87X = lib.mkForce no;
      SND_CA0106 = lib.mkForce no;
      SND_CMIPCI = lib.mkForce no;
      SND_OXYGEN = lib.mkForce no;
      SND_CS46XX = lib.mkForce no;
      SND_CTXFI = lib.mkForce no;
      SND_DARLA20 = lib.mkForce no;
      SND_GINA20 = lib.mkForce no;
      SND_LAYLA20 = lib.mkForce no;
      SND_DARLA24 = lib.mkForce no;
      SND_GINA24 = lib.mkForce no;
      SND_LAYLA24 = lib.mkForce no;
      SND_MONA = lib.mkForce no;
      SND_MIA = lib.mkForce no;
      SND_ECHO3G = lib.mkForce no;
      SND_INDIGO = lib.mkForce no;
      SND_INDIGOIO = lib.mkForce no;
      SND_INDIGODJ = lib.mkForce no;
      SND_INDIGOIOX = lib.mkForce no;
      SND_INDIGODJX = lib.mkForce no;
      SND_EMU10K1 = lib.mkForce no;
      SND_EMU10K1X = lib.mkForce no;
      SND_ENS1370 = lib.mkForce no;
      SND_ENS1371 = lib.mkForce no;
      SND_ES1938 = lib.mkForce no;
      SND_ES1968 = lib.mkForce no;
      SND_FM801 = lib.mkForce no;
      SND_HDSP = lib.mkForce no;
      SND_HDSPM = lib.mkForce no;
      SND_ICE1712 = lib.mkForce no;
      SND_ICE1724 = lib.mkForce no;
      SND_INTEL8X0 = lib.mkForce no;
      SND_INTEL8X0M = lib.mkForce no;
      SND_KORG1212 = lib.mkForce no;
      SND_LOLA = lib.mkForce no;
      SND_LX6464ES = lib.mkForce no;
      SND_MAESTRO3 = lib.mkForce no;
      SND_MIXART = lib.mkForce no;
      SND_MPU401 = lib.mkForce no;
      SND_MTS64 = lib.mkForce no;
      SND_NM256 = lib.mkForce no;
      SND_PCXHR = lib.mkForce no;
      SND_PORTMAN2X4 = lib.mkForce no;
      SND_RIPTIDE = lib.mkForce no;
      SND_RME32 = lib.mkForce no;
      SND_RME96 = lib.mkForce no;
      SND_RME9652 = lib.mkForce no;
      SND_SE6X = lib.mkForce no;
      SND_TRIDENT = lib.mkForce no;
      SND_VIA82XX = lib.mkForce no;
      SND_VIRTUOSO = lib.mkForce no;
      SND_VX222 = lib.mkForce no;
      SND_YMFPCI = lib.mkForce no;

      # legacy HDA codecs (kept: REALTEK for ALC269 on Framework + HDMI for amdhdmi)
      SND_HDA_CODEC_ANALOG = lib.mkForce no;
      SND_HDA_CODEC_SIGMATEL = lib.mkForce no;
      SND_HDA_CODEC_VIA = lib.mkForce no;
      SND_HDA_CODEC_CONEXANT = lib.mkForce no;
      SND_HDA_CODEC_CA0110 = lib.mkForce no;
      SND_HDA_CODEC_CA0132 = lib.mkForce no;
      SND_HDA_CODEC_SI3054 = lib.mkForce no;
      SND_HDA_CODEC_CIRRUS = lib.mkForce no;
      SND_HDA_CODEC_CS420X = lib.mkForce no;
      SND_HDA_CODEC_CS421X = lib.mkForce no;
      SND_HDA_CODEC_CS8409 = lib.mkForce no;

      # OSS compat (deprecated)
      SOUND_OSS_CORE = lib.mkForce no;

      # legacy USB HCDs (Zen APUs only have xHCI)
      USB_OHCI_HCD = lib.mkForce no;
      USB_UHCI_HCD = lib.mkForce no;
      USB_C67X00_HCD = lib.mkForce no;
      USB_OXU210HP_HCD = lib.mkForce no;
      USB_ISP116X_HCD = lib.mkForce no;
|
||||
USB_ISP1760 = lib.mkForce no;
|
||||
USB_MAX3421_HCD = lib.mkForce no;
|
||||
USB_SL811_HCD = lib.mkForce no;
|
||||
USB_R8A66597 = lib.mkForce no;
|
||||
USB_XEN_HCD = lib.mkForce no;
|
||||
|
||||
# USB gadget + exotic device drivers
|
||||
USB_GADGET = lib.mkForce no;
|
||||
USB_MICROTEK = lib.mkForce no;
|
||||
USB_USS720 = lib.mkForce no;
|
||||
USB_EMI26 = lib.mkForce no;
|
||||
USB_EMI62 = lib.mkForce no;
|
||||
USB_ADUTUX = lib.mkForce no;
|
||||
USB_SEVSEG = lib.mkForce no;
|
||||
USB_LEGOTOWER = lib.mkForce no;
|
||||
USB_CYPRESS_CY7C63 = lib.mkForce no;
|
||||
USB_CYTHERM = lib.mkForce no;
|
||||
USB_IDMOUSE = lib.mkForce no;
|
||||
USB_APPLEDISPLAY = lib.mkForce no;
|
||||
USB_TRANCEVIBRATOR = lib.mkForce no;
|
||||
USB_CHAOSKEY = lib.mkForce no;
|
||||
USB_TEST = lib.mkForce no;
|
||||
|
||||
# USB mass-storage sub-drivers for legacy flash/camera readers
|
||||
USB_STORAGE_REALTEK = lib.mkForce no;
|
||||
USB_STORAGE_DATAFAB = lib.mkForce no;
|
||||
USB_STORAGE_FREECOM = lib.mkForce no;
|
||||
USB_STORAGE_ISD200 = lib.mkForce no;
|
||||
USB_STORAGE_USBAT = lib.mkForce no;
|
||||
USB_STORAGE_SDDR09 = lib.mkForce no;
|
||||
USB_STORAGE_SDDR55 = lib.mkForce no;
|
||||
USB_STORAGE_JUMPSHOT = lib.mkForce no;
|
||||
USB_STORAGE_ALAUDA = lib.mkForce no;
|
||||
USB_STORAGE_ONETOUCH = lib.mkForce no;
|
||||
USB_STORAGE_KARMA = lib.mkForce no;
|
||||
USB_STORAGE_CYPRESS_ATACB = lib.mkForce no;
|
||||
USB_STORAGE_ENE_UB6250 = lib.mkForce no;
|
||||
|
||||
# wlan vendors (kept: MEDIATEK/INTEL/REALTEK/BROADCOM for mreow+yarn)
|
||||
WLAN_VENDOR_ADMTEK = lib.mkForce no;
|
||||
WLAN_VENDOR_ATMEL = lib.mkForce no;
|
||||
WLAN_VENDOR_CISCO = lib.mkForce no;
|
||||
WLAN_VENDOR_INTERSIL = lib.mkForce no;
|
||||
WLAN_VENDOR_MARVELL = lib.mkForce no;
|
||||
WLAN_VENDOR_MICROCHIP = lib.mkForce no;
|
||||
WLAN_VENDOR_PURELIFI = lib.mkForce no;
|
||||
WLAN_VENDOR_QUANTENNA = lib.mkForce no;
|
||||
WLAN_VENDOR_RALINK = lib.mkForce no;
|
||||
WLAN_VENDOR_RSI = lib.mkForce no;
|
||||
WLAN_VENDOR_SILABS = lib.mkForce no;
|
||||
WLAN_VENDOR_ST = lib.mkForce no;
|
||||
WLAN_VENDOR_TI = lib.mkForce no;
|
||||
WLAN_VENDOR_ZYDAS = lib.mkForce no;
|
||||
|
||||
# ethernet vendors (kept: AMD/INTEL/REALTEK/AQUANTIA/ATHEROS)
|
||||
NET_VENDOR_3COM = lib.mkForce no;
|
||||
NET_VENDOR_ADAPTEC = lib.mkForce no;
|
||||
NET_VENDOR_AGERE = lib.mkForce no;
|
||||
NET_VENDOR_ALACRITECH = lib.mkForce no;
|
||||
NET_VENDOR_ALTEON = lib.mkForce no;
|
||||
NET_VENDOR_AMAZON = lib.mkForce no;
|
||||
NET_VENDOR_ARC = lib.mkForce no;
|
||||
NET_VENDOR_BROADCOM = lib.mkForce no;
|
||||
NET_VENDOR_BROCADE = lib.mkForce no;
|
||||
NET_VENDOR_CADENCE = lib.mkForce no;
|
||||
NET_VENDOR_CAVIUM = lib.mkForce no;
|
||||
NET_VENDOR_CHELSIO = lib.mkForce no;
|
||||
NET_VENDOR_CISCO = lib.mkForce no;
|
||||
NET_VENDOR_CORTINA = lib.mkForce no;
|
||||
NET_VENDOR_DAVICOM = lib.mkForce no;
|
||||
NET_VENDOR_DEC = lib.mkForce no;
|
||||
NET_VENDOR_DLINK = lib.mkForce no;
|
||||
NET_VENDOR_EMULEX = lib.mkForce no;
|
||||
NET_VENDOR_ENGLEDER = lib.mkForce no;
|
||||
NET_VENDOR_EZCHIP = lib.mkForce no;
|
||||
NET_VENDOR_FUJITSU = lib.mkForce no;
|
||||
NET_VENDOR_FUNGIBLE = lib.mkForce no;
|
||||
NET_VENDOR_GOOGLE = lib.mkForce no;
|
||||
NET_VENDOR_HISILICON = lib.mkForce no;
|
||||
NET_VENDOR_HUAWEI = lib.mkForce no;
|
||||
NET_VENDOR_I825XX = lib.mkForce no;
|
||||
NET_VENDOR_ADI = lib.mkForce no;
|
||||
NET_VENDOR_LITEX = lib.mkForce no;
|
||||
NET_VENDOR_MARVELL = lib.mkForce no;
|
||||
NET_VENDOR_META = lib.mkForce no;
|
||||
NET_VENDOR_MICREL = lib.mkForce no;
|
||||
NET_VENDOR_MICROCHIP = lib.mkForce no;
|
||||
NET_VENDOR_MICROSEMI = lib.mkForce no;
|
||||
NET_VENDOR_MICROSOFT = lib.mkForce no;
|
||||
NET_VENDOR_MUCSE = lib.mkForce no;
|
||||
NET_VENDOR_MYRI = lib.mkForce no;
|
||||
NET_VENDOR_NI = lib.mkForce no;
|
||||
NET_VENDOR_NATSEMI = lib.mkForce no;
|
||||
NET_VENDOR_NETRONOME = lib.mkForce no;
|
||||
NET_VENDOR_8390 = lib.mkForce no;
|
||||
NET_VENDOR_NVIDIA = lib.mkForce no;
|
||||
NET_VENDOR_OKI = lib.mkForce no;
|
||||
NET_VENDOR_PACKET_ENGINES = lib.mkForce no;
|
||||
NET_VENDOR_PENSANDO = lib.mkForce no;
|
||||
NET_VENDOR_QLOGIC = lib.mkForce no;
|
||||
NET_VENDOR_QUALCOMM = lib.mkForce no;
|
||||
NET_VENDOR_RDC = lib.mkForce no;
|
||||
NET_VENDOR_RENESAS = lib.mkForce no;
|
||||
NET_VENDOR_ROCKER = lib.mkForce no;
|
||||
NET_VENDOR_SAMSUNG = lib.mkForce no;
|
||||
NET_VENDOR_SEEQ = lib.mkForce no;
|
||||
NET_VENDOR_SILAN = lib.mkForce no;
|
||||
NET_VENDOR_SIS = lib.mkForce no;
|
||||
NET_VENDOR_SOLARFLARE = lib.mkForce no;
|
||||
NET_VENDOR_SMSC = lib.mkForce no;
|
||||
NET_VENDOR_SOCIONEXT = lib.mkForce no;
|
||||
NET_VENDOR_STMICRO = lib.mkForce no;
|
||||
NET_VENDOR_SUN = lib.mkForce no;
|
||||
NET_VENDOR_SYNOPSYS = lib.mkForce no;
|
||||
NET_VENDOR_TEHUTI = lib.mkForce no;
|
||||
NET_VENDOR_TI = lib.mkForce no;
|
||||
NET_VENDOR_VERTEXCOM = lib.mkForce no;
|
||||
NET_VENDOR_VIA = lib.mkForce no;
|
||||
NET_VENDOR_WANGXUN = lib.mkForce no;
|
||||
NET_VENDOR_WIZNET = lib.mkForce no;
|
||||
NET_VENDOR_XILINX = lib.mkForce no;
|
||||
NET_VENDOR_XIRCOM = lib.mkForce no;
|
||||
|
||||
# watchdogs (kept: SP5100_TCO for AMD chipset, WDAT_WDT for ACPI)
|
||||
ACQUIRE_WDT = lib.mkForce no;
|
||||
ADVANTECH_WDT = lib.mkForce no;
|
||||
ADVANTECH_EC_WDT = lib.mkForce no;
|
||||
ALIM1535_WDT = lib.mkForce no;
|
||||
ALIM7101_WDT = lib.mkForce no;
|
||||
CGBC_WDT = lib.mkForce no;
|
||||
EBC_C384_WDT = lib.mkForce no;
|
||||
EXAR_WDT = lib.mkForce no;
|
||||
F71808E_WDT = lib.mkForce no;
|
||||
EUROTECH_WDT = lib.mkForce no;
|
||||
IB700_WDT = lib.mkForce no;
|
||||
WAFER_WDT = lib.mkForce no;
|
||||
I6300ESB_WDT = lib.mkForce no;
|
||||
IE6XX_WDT = lib.mkForce no;
|
||||
ITCO_WDT = lib.mkForce no;
|
||||
IT8712F_WDT = lib.mkForce no;
|
||||
IT87_WDT = lib.mkForce no;
|
||||
HP_WATCHDOG = lib.mkForce no;
|
||||
HPWDT_NMI_DECODE = lib.mkForce no;
|
||||
KEMPLD_WDT = lib.mkForce no;
|
||||
MLX_WDT = lib.mkForce no;
|
||||
NI903X_WDT = lib.mkForce no;
|
||||
NIC7018_WDT = lib.mkForce no;
|
||||
SMSC37B787_WDT = lib.mkForce no;
|
||||
TQMX86_WDT = lib.mkForce no;
|
||||
VIA_WDT = lib.mkForce no;
|
||||
W83627HF_WDT = lib.mkForce no;
|
||||
W83877F_WDT = lib.mkForce no;
|
||||
W83977F_WDT = lib.mkForce no;
|
||||
MACHZ_WDT = lib.mkForce no;
|
||||
SBC_EPX_C3_WATCHDOG = lib.mkForce no;
|
||||
MEN_A21_WDT = lib.mkForce no;
|
||||
DW_WATCHDOG = lib.mkForce no;
|
||||
SOFT_WATCHDOG = lib.mkForce no;
|
||||
XILINX_WATCHDOG = lib.mkForce no;
|
||||
|
||||
# misc dead weight
|
||||
BLK_DEV_DRBD = lib.mkForce no;
|
||||
GREYBUS = lib.mkForce no;
|
||||
SOUNDWIRE_QCOM = lib.mkForce no;
|
||||
SOUNDWIRE_INTEL = lib.mkForce no;
|
||||
MEDIA_RADIO_SUPPORT = lib.mkForce no;
|
||||
|
||||
# net queue disciplines not used on desktop (kept: htb/prio/fifo/fq/fq_codel/cake/bpf/ingress/netem/tbf/mqprio for basic shaping + testing)
|
||||
NET_SCH_CBS = lib.mkForce no;
|
||||
NET_SCH_CHOKE = lib.mkForce no;
|
||||
NET_SCH_CODEL = lib.mkForce no;
|
||||
NET_SCH_DRR = lib.mkForce no;
|
||||
NET_SCH_DUALPI2 = lib.mkForce no;
|
||||
NET_SCH_ETF = lib.mkForce no;
|
||||
NET_SCH_ETS = lib.mkForce no;
|
||||
NET_SCH_FQ_PIE = lib.mkForce no;
|
||||
NET_SCH_GRED = lib.mkForce no;
|
||||
NET_SCH_HFSC = lib.mkForce no;
|
||||
NET_SCH_HHF = lib.mkForce no;
|
||||
NET_SCH_MULTIQ = lib.mkForce no;
|
||||
NET_SCH_PIE = lib.mkForce no;
|
||||
NET_SCH_PLUG = lib.mkForce no;
|
||||
NET_SCH_QFQ = lib.mkForce no;
|
||||
NET_SCH_RED = lib.mkForce no;
|
||||
NET_SCH_SFB = lib.mkForce no;
|
||||
NET_SCH_SFQ = lib.mkForce no;
|
||||
NET_SCH_SKBPRIO = lib.mkForce no;
|
||||
NET_SCH_TAPRIO = lib.mkForce no;
|
||||
NET_SCH_TEQL = lib.mkForce no;
|
||||
|
||||
# battery charger PMIC drivers — all mobile/embedded SoCs, none of these
|
||||
# exist on x86 laptops/desktops (which use ACPI battery + USB-PD via ucsi).
|
||||
# CROS_* are Chromebook-specific; Framework has CrOS EC but not CrOS charging.
|
||||
CHARGER_88PM860X = lib.mkForce no;
|
||||
CHARGER_ADP5061 = lib.mkForce no;
|
||||
CHARGER_AXP20X = lib.mkForce no;
|
||||
CHARGER_BD71828 = lib.mkForce no;
|
||||
CHARGER_BD99954 = lib.mkForce no;
|
||||
CHARGER_BQ2415X = lib.mkForce no;
|
||||
CHARGER_BQ24190 = lib.mkForce no;
|
||||
CHARGER_BQ24257 = lib.mkForce no;
|
||||
CHARGER_BQ24735 = lib.mkForce no;
|
||||
CHARGER_BQ2515X = lib.mkForce no;
|
||||
CHARGER_BQ256XX = lib.mkForce no;
|
||||
CHARGER_BQ257XX = lib.mkForce no;
|
||||
CHARGER_BQ25890 = lib.mkForce no;
|
||||
CHARGER_BQ25980 = lib.mkForce no;
|
||||
CHARGER_CROS_CONTROL = lib.mkForce no;
|
||||
CHARGER_CROS_PCHG = lib.mkForce no;
|
||||
CHARGER_CROS_USBPD = lib.mkForce no;
|
||||
CHARGER_DA9150 = lib.mkForce no;
|
||||
CHARGER_DETECTOR_MAX14656 = lib.mkForce no;
|
||||
CHARGER_GPIO = lib.mkForce no;
|
||||
CHARGER_ISP1704 = lib.mkForce no;
|
||||
CHARGER_LP8727 = lib.mkForce no;
|
||||
CHARGER_LP8788 = lib.mkForce no;
|
||||
CHARGER_LT3651 = lib.mkForce no;
|
||||
CHARGER_LTC4162L = lib.mkForce no;
|
||||
CHARGER_MANAGER = lib.mkForce no;
|
||||
CHARGER_MAX14577 = lib.mkForce no;
|
||||
CHARGER_MAX77650 = lib.mkForce no;
|
||||
CHARGER_MAX77693 = lib.mkForce no;
|
||||
CHARGER_MAX77705 = lib.mkForce no;
|
||||
CHARGER_MAX77976 = lib.mkForce no;
|
||||
CHARGER_MAX8903 = lib.mkForce no;
|
||||
CHARGER_MAX8971 = lib.mkForce no;
|
||||
CHARGER_MAX8997 = lib.mkForce no;
|
||||
CHARGER_MAX8998 = lib.mkForce no;
|
||||
CHARGER_MP2629 = lib.mkForce no;
|
||||
CHARGER_MT6360 = lib.mkForce no;
|
||||
CHARGER_MT6370 = lib.mkForce no;
|
||||
CHARGER_PF1550 = lib.mkForce no;
|
||||
CHARGER_RK817 = lib.mkForce no;
|
||||
CHARGER_RT5033 = lib.mkForce no;
|
||||
CHARGER_RT9455 = lib.mkForce no;
|
||||
CHARGER_RT9467 = lib.mkForce no;
|
||||
CHARGER_RT9471 = lib.mkForce no;
|
||||
CHARGER_RT9756 = lib.mkForce no;
|
||||
CHARGER_SBS = lib.mkForce no;
|
||||
CHARGER_SMB347 = lib.mkForce no;
|
||||
CHARGER_TPS65090 = lib.mkForce no;
|
||||
CHARGER_TPS65217 = lib.mkForce no;
|
||||
CHARGER_TWL4030 = lib.mkForce no;
|
||||
CHARGER_TWL6030 = lib.mkForce no;
|
||||
CHARGER_UCS1002 = lib.mkForce no;
|
||||
CHARGER_WILCO = lib.mkForce no;
|
||||
|
||||
# enterprise storage stack (kept: DM_CRYPT for LUKS, DM_SNAPSHOT/INTEGRITY/VERITY, MD_RAID0/1/10/456 in case)
|
||||
DM_MULTIPATH = lib.mkForce no;
|
||||
DM_MULTIPATH_QL = lib.mkForce no;
|
||||
DM_MULTIPATH_ST = lib.mkForce no;
|
||||
DM_MULTIPATH_HST = lib.mkForce no;
|
||||
DM_MULTIPATH_IOA = lib.mkForce no;
|
||||
DM_VDO = lib.mkForce no;
|
||||
DM_PCACHE = lib.mkForce no;
|
||||
DM_ZONED = lib.mkForce no;
|
||||
DM_LOG_USERSPACE = lib.mkForce no;
|
||||
DM_EBS = lib.mkForce no;
|
||||
DM_ERA = lib.mkForce no;
|
||||
DM_DUST = lib.mkForce no;
|
||||
DM_DELAY = lib.mkForce no;
|
||||
DM_FLAKEY = lib.mkForce no;
|
||||
DM_SWITCH = lib.mkForce no;
|
||||
DM_LOG_WRITES = lib.mkForce no;
|
||||
DM_CLONE = lib.mkForce no;
|
||||
DM_UNSTRIPED = lib.mkForce no;
|
||||
DM_CACHE = lib.mkForce no;
|
||||
DM_WRITECACHE = lib.mkForce no;
|
||||
DM_THIN_PROVISIONING = lib.mkForce no;
|
||||
MD_CLUSTER = lib.mkForce no;
|
||||
MD_LINEAR = lib.mkForce no;
|
||||
SCSI_DH_RDAC = lib.mkForce no;
|
||||
SCSI_DH_HP_SW = lib.mkForce no;
|
||||
SCSI_ENCLOSURE = lib.mkForce no;
|
||||
};
|
||||
}
|
||||
];
|
||||
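Options forced off this way should end up as `# CONFIG_FOO is not set` in the built kernel's `.config` (on a running system: `zgrep CONFIG_IP_SCTP /proc/config.gz`). A minimal, self-contained sketch of that check against a fabricated config fragment (the fragment contents are illustrative, not from the real build):

```shell
# Hypothetical fragment of a built kernel .config; disabled options render
# as "# CONFIG_FOO is not set", enabled ones as CONFIG_FOO=y or =m.
config_fragment='# CONFIG_IP_SCTP is not set
# CONFIG_TIPC is not set
CONFIG_SND_HDA_CODEC_REALTEK=m'

# Verify two of the options forced off above really read as disabled.
for opt in IP_SCTP TIPC; do
  if printf '%s\n' "$config_fragment" | grep -qx "# CONFIG_$opt is not set"; then
    echo "CONFIG_$opt: disabled"
  else
    echo "CONFIG_$opt: STILL ENABLED"
  fi
done
```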
@@ -337,12 +835,6 @@
     "msr"
     "btusb"
   ];
-
-  kernelParams = [
-    # 1gb huge pages
-    "hugepagesz=1G"
-    "hugepages=3"
-  ];
 };

 services = {
@@ -381,9 +873,6 @@
   };
 };

 # EST
 time.timeZone = "America/New_York";

 # Select internationalisation properties.
 i18n.defaultLocale = "en_US.UTF-8";

@@ -419,8 +908,7 @@
     "camera"
     "adbusers"
   ];
-  # TODO! this is really bad :( I should really figure out how to do proper secrets management
-  hashedPasswordFile = "${../secrets/desktop/password-hash}";
+  hashedPasswordFile = config.age.secrets.password-hash.path;
 };

 services.gvfs.enable = true;

40  modules/desktop-jovian.nix  Normal file
@@ -0,0 +1,40 @@
# Jovian-NixOS deck-mode configuration shared by all hosts running Steam
# in gamescope (yarn, patiodeck). Host-specific settings (like
# jovian.devices.steamdeck.enable) stay in the host's default.nix.
{
  lib,
  username,
  inputs,
  ...
}:
{
  imports = [
    ./desktop-steam-update.nix
    inputs.jovian-nixos.nixosModules.default
  ];

  nixpkgs.config.allowUnfreePredicate =
    pkg:
    builtins.elem (lib.getName pkg) [
      "steamdeck-hw-theme"
      "steam-jupiter-unwrapped"
      "steam"
      "steam-original"
      "steam-unwrapped"
      "steam-run"
    ];

  jovian.steam = {
    enable = true;
    autoStart = true;
    desktopSession = "niri";
    user = username;
  };

  # jovian overrides the display manager; sddm is required
  services.displayManager.sddm.wayland.enable = true;

  # desktop-common.nix enables programs.gamescope which conflicts with
  # jovian's own gamescope wrapper
  programs.gamescope.enable = lib.mkForce false;
}
49  modules/desktop-lanzaboote-agenix.nix  Normal file
@@ -0,0 +1,49 @@
{
  config,
  lib,
  pkgs,
  inputs,
  ...
}:
{
  imports = [
    inputs.lanzaboote.nixosModules.lanzaboote
  ];

  boot = {
    loader.systemd-boot.enable = lib.mkForce false;

    lanzaboote = {
      enable = true;
      # sbctl expects the bundle at /var/lib/sbctl; muffin uses /etc/secureboot
      # because it is wiped on every activation there (impermanence) — desktops
      # extract to the historical sbctl path so existing tooling keeps working.
      pkiBundle = "/var/lib/sbctl";
    };
  };

  system.activationScripts = {
    # Extract the secureboot PKI bundle from the agenix-decrypted tar. Mirrors
    # modules/server-lanzaboote-agenix.nix; skip when keys are already present
    # (e.g., disko-install staged them via --extra-files).
    "secureboot-keys" = {
      deps = [ "agenix" ];
      text = ''
        #!/bin/sh
        (
          umask 077
          if [[ -d ${config.boot.lanzaboote.pkiBundle} && -f ${config.boot.lanzaboote.pkiBundle}/db.key ]]; then
            echo "secureboot keys already present, skipping extraction"
          else
            echo "extracting secureboot keys from agenix"
            rm -fr ${config.boot.lanzaboote.pkiBundle} || true
            install -d -o root -g wheel -m 0500 ${config.boot.lanzaboote.pkiBundle}
            ${pkgs.gnutar}/bin/tar xf ${config.age.secrets.secureboot-tar.path} -C ${config.boot.lanzaboote.pkiBundle}
          fi
          chown -R root:wheel ${config.boot.lanzaboote.pkiBundle}
          chmod -R 500 ${config.boot.lanzaboote.pkiBundle}
        )
      '';
    };
  };
}
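The skip-if-present guard above is what makes the activation script idempotent. A minimal sketch of the same logic against a temporary directory (all paths here are stand-ins; the real module runs as root with mode 0500):

```shell
set -eu
work=$(mktemp -d)
bundle="$work/pki"

# Fake the agenix-decrypted tarball: one db.key inside.
mkdir -p "$work/keys"
echo dummy > "$work/keys/db.key"
tar -cf "$work/secureboot.tar" -C "$work/keys" db.key

extract() {
  if [ -d "$bundle" ] && [ -f "$bundle/db.key" ]; then
    echo "secureboot keys already present, skipping extraction"
  else
    echo "extracting secureboot keys"
    rm -rf "$bundle"
    # 0700 here so the sketch works unprivileged; the module uses 0500 as root.
    install -d -m 0700 "$bundle"
    tar -xf "$work/secureboot.tar" -C "$bundle"
  fi
}

extract   # first run: extracts the bundle
extract   # second run: detects db.key and skips
```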
58  modules/desktop-oo7-daemon.nix  Normal file
@@ -0,0 +1,58 @@
# oo7-daemon — the pure-Rust implementation of the org.freedesktop.secrets
# (libsecret) D-Bus interface, written by the same project that ships the
# `oo7` Rust crate that flare uses internally.
#
# Without a secret-service provider on the bus, flare's `oo7::Keyring::new()`
# call fails immediately at startup ("The communication with libsecret
# failed"). Most NixOS desktops solve this by enabling
# `services.gnome.gnome-keyring.enable`, but that drags GNOME plumbing
# we don't otherwise want; oo7-daemon is the lightweight match for niri
# desktops.
#
# The `oo7-server` package ships:
# - libexec/oo7-daemon (the binary)
# - share/dbus-1/services/org.freedesktop.secrets.service
# - share/systemd/user/oo7-daemon.service
#
# We register both with NixOS and start the daemon at user login so
# libsecret clients can find the bus name without depending on D-Bus
# auto-activation. We also alias the unit as
# `dbus-org.freedesktop.secrets.service` so D-Bus activation falls back
# to it cleanly when the daemon has not been started yet (e.g. inside a
# fresh `systemd-run --user` scope).

{ pkgs, ... }:
let
  # 0.6.0 stops at LockedKeyring::open(login) when no keyring file exists,
  # so on first run the auto-created default collection is locked and a
  # client's Unlock() call routes to a prompt that never resolves (no
  # gnome-shell / kwallet / gcr-prompter on a niri desktop). Cherry-pick
  # upstream cf7b9a9 (PR #443) which uses the systemd credential / PAM
  # secret to unlock the new keyring directly. Drop the override when
  # nixpkgs ships an oo7-server release that includes the fix.
  oo7-server = pkgs.oo7-server.overrideAttrs (old: {
    patches = (old.patches or [ ]) ++ [
      ../patches/oo7-server/0001-server-Use-provided-secret-to-unlock-auto-created-de.patch
    ];
  });
in
{
  environment.systemPackages = [ oo7-server ];

  services.dbus.packages = [ oo7-server ];
  systemd.packages = [ oo7-server ];

  systemd.user.services.oo7-daemon = {
    wantedBy = [ "default.target" ];
    aliases = [ "dbus-org.freedesktop.secrets.service" ];
    # Feed the keyring master password through systemd's credential
    # machinery. The upstream unit declares
    # `ImportCredential=oo7.keyring-encryption-password`, which picks up
    # whatever LoadCredential leaves under $CREDENTIALS_DIRECTORY. agenix
    # decrypts the secret to /run/agenix/oo7-keyring-password as the
    # `primary` user, who is also the user this user-scope unit runs as.
    serviceConfig.LoadCredential = [
      "oo7.keyring-encryption-password:/run/agenix/oo7-keyring-password"
    ];
  };
}
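From the daemon's side, the credential plumbing reduces to reading a file named after the credential under `$CREDENTIALS_DIRECTORY`. A purely illustrative sketch (directory, filename, and secret value below are fabricated; a real unit gets the directory from systemd, not mktemp):

```shell
set -eu
# Stand-in for the per-service directory systemd populates from
# LoadCredential=/ImportCredential= before starting the unit.
CREDENTIALS_DIRECTORY=$(mktemp -d)
printf 'hunter2' > "$CREDENTIALS_DIRECTORY/oo7.keyring-encryption-password"
chmod 0400 "$CREDENTIALS_DIRECTORY/oo7.keyring-encryption-password"

# What a credential-aware daemon does at startup: read the secret from
# the directory instead of the environment or a world-readable path.
secret=$(cat "$CREDENTIALS_DIRECTORY/oo7.keyring-encryption-password")
echo "read ${#secret}-byte keyring password from credentials directory"
```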
122  modules/desktop-steam-update.nix  Normal file
@@ -0,0 +1,122 @@
# Binary-cache update mechanism for Jovian-NixOS desktops.
#
# Replaces the upstream holo-update/steamos-update stubs with a script that
# checks the private binary cache for a newer system closure, and provides a
# root-level systemd service to apply it. Steam's deck UI calls
# `steamos-update check` periodically; exit 7 = no update, exit 0 = update
# applied or available.
#
# The deploy endpoint is ${binary_cache_url}/deploy/${hostname} — a plain
# text file containing the /nix/store path of the latest closure, published
# by CI after a successful build.
{
  pkgs,
  lib,
  hostname,
  username,
  site_config,
  ...
}:
let
  deploy-url = "${site_config.binary_cache.url}/deploy/${hostname}";

  steamos-update-script = pkgs.writeShellScript "steamos-update" ''
    export PATH=${
      lib.makeBinPath [
        pkgs.curl
        pkgs.coreutils
        pkgs.systemd
      ]
    }

    STORE_PATH=$(curl -sf --max-time 30 "${deploy-url}" || true)

    if [ -z "$STORE_PATH" ]; then
      >&2 echo "[steamos-update] server unreachable"
      exit 7
    fi

    CURRENT=$(readlink -f /nix/var/nix/profiles/system)
    if [ "$CURRENT" = "$STORE_PATH" ]; then
      >&2 echo "[steamos-update] no update available"
      exit 7
    fi

    # check-only mode: just report that an update exists
    if [ "''${1:-}" = "check" ] || [ "''${1:-}" = "--check-only" ]; then
      >&2 echo "[steamos-update] update available"
      exit 0
    fi

    # apply: trigger the root-running systemd service to install the update
    >&2 echo "[steamos-update] applying update..."
    if systemctl start --wait pull-update-apply.service; then
      >&2 echo "[steamos-update] update installed, reboot to apply"
      exit 0
    else
      >&2 echo "[steamos-update] apply failed; see 'journalctl -u pull-update-apply'"
      exit 1
    fi
  '';
in
{
  nixpkgs.overlays = [
    (_final: prev: {
      jovian-stubs = prev.jovian-stubs.overrideAttrs (old: {
        buildCommand = (old.buildCommand or "") + ''
          install -D -m 755 ${steamos-update-script} $out/bin/holo-update
          install -D -m 755 ${steamos-update-script} $out/bin/steamos-update
        '';
      });
    })
  ];

  systemd.services.pull-update-apply = {
    description = "Apply pending NixOS update pulled from binary cache";
    serviceConfig = {
      Type = "oneshot";
      ExecStart = pkgs.writeShellScript "pull-update-apply" ''
        set -uo pipefail
        export PATH=${
          lib.makeBinPath [
            pkgs.curl
            pkgs.coreutils
            pkgs.nix
          ]
        }

        STORE_PATH=$(curl -sf --max-time 30 "${deploy-url}" || true)
        if [ -z "$STORE_PATH" ]; then
          echo "server unreachable"
          exit 1
        fi

        CURRENT=$(readlink -f /nix/var/nix/profiles/system)
        if [ "$CURRENT" = "$STORE_PATH" ]; then
          echo "already up to date: $STORE_PATH"
          exit 0
        fi

        echo "applying $STORE_PATH (was $CURRENT)"
        nix-store -r --add-root /nix/var/nix/gcroots/pull-update-apply-latest --indirect "$STORE_PATH" \
          || { echo "fetch failed"; exit 1; }
        nix-env -p /nix/var/nix/profiles/system --set "$STORE_PATH" \
          || { echo "profile set failed"; exit 1; }
        "$STORE_PATH/bin/switch-to-configuration" boot \
          || { echo "boot entry failed"; exit 1; }
        echo "update applied; reboot required"
      '';
    };
  };

  # allow the primary user to trigger pull-update-apply without a password
  security.polkit.extraConfig = ''
    polkit.addRule(function(action, subject) {
      if (action.id == "org.freedesktop.systemd1.manage-units" &&
          action.lookup("unit") == "pull-update-apply.service" &&
          subject.user == "${username}") {
        return polkit.Result.YES;
      }
    });
  '';
}
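The check half of `steamos-update` is just "compare the published store path with the current profile symlink". A self-contained sketch of that decision with the HTTP endpoint and the system profile replaced by local temp files (every path below is a stand-in):

```shell
set -eu
work=$(mktemp -d)
work=$(cd "$work" && pwd -P)   # canonicalize so readlink -f comparisons line up
mkdir -p "$work/store-old" "$work/store-new"

# Stand-in for /nix/var/nix/profiles/system, pointing at the old closure.
ln -s "$work/store-old" "$work/profile"
# Stand-in for the ${binary_cache_url}/deploy/${hostname} endpoint: a text
# file holding the newest closure's path (here, a different one).
echo "$work/store-new" > "$work/deploy"

store_path=$(cat "$work/deploy")
current=$(readlink -f "$work/profile")
if [ -z "$store_path" ]; then
  status=7; msg="server unreachable"
elif [ "$current" = "$store_path" ]; then
  status=7; msg="no update available"
else
  status=0; msg="update available"
fi
echo "$msg (exit $status)"
```

With the deploy file pointing at a different store path than the profile, the sketch reports an update available with the exit-0 convention Steam's deck UI expects.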
187  modules/server-deploy-finalize.nix  Normal file
@@ -0,0 +1,187 @@
# Deferred deploy finalize for deploy-rs-driven hosts.
|
||||
#
|
||||
# When deploy-rs activates via `switch-to-configuration switch` and the gitea-
|
||||
# actions runner driving the deploy lives on the same host, the runner unit
|
||||
# gets restarted mid-activation — its definition changes between builds. That
|
||||
# restart kills the SSH session, the CI job, and deploy-rs's magic-rollback
|
||||
# handshake, so CI reports failure even when the deploy itself completed.
|
||||
# This is deploy-rs#153, open since 2022.
|
||||
#
|
||||
# This module breaks the dependency: activation does `switch-to-configuration
|
||||
# boot` (bootloader only, no service restarts), then invokes deploy-finalize
|
||||
# which schedules a detached systemd transient unit that fires `delay` seconds
|
||||
# later with the real `switch` (or `systemctl reboot` when the kernel, initrd,
|
||||
# or kernel-modules changed since boot). The transient unit is owned by pid1,
|
||||
# so it survives the runner's eventual restart — by which time the CI job has
|
||||
# finished reporting.
|
||||
#
|
||||
# Prior art (reboot-or-switch logic, not the self-deploy detachment):
|
||||
# - nixpkgs `system.autoUpgrade` (allowReboot = true branch) is the canonical
|
||||
# source of the 3-path {initrd,kernel,kernel-modules} comparison.
|
||||
# - obsidiansystems/obelisk#957 merged the same snippet into `ob deploy` for
|
||||
# push-based remote deploys — but doesn't need detachment since its deployer
|
||||
# lives on a different machine from the target.
|
||||
# - nixpkgs#185030 tracks lifting this into switch-to-configuration proper.
|
||||
# Stale since 2025-07; until it lands, every downstream reimplements it.
|
||||
#
|
||||
# Bootstrap note: the activation snippet resolves deploy-finalize via
|
||||
# lib.getExe (store path), not via `/run/current-system/sw/bin` — `boot` mode
|
||||
# does not update `/run/current-system`, so the old binary would be resolved.
|
||||
{
|
||||
config,
|
||||
lib,
|
||||
pkgs,
|
||||
...
|
||||
}:
|
||||
let
|
||||
cfg = config.services.deployFinalize;
|
||||
|
||||
finalize = pkgs.writeShellApplication {
|
||||
name = "deploy-finalize";
|
||||
runtimeInputs = [
|
||||
pkgs.coreutils
|
||||
pkgs.systemd
|
||||
];
|
||||
text = ''
|
||||
delay=${toString cfg.delay}
|
||||
profile=/nix/var/nix/profiles/system
|
||||
dry_run=0
|
||||
|
||||
usage() {
|
||||
cat <<EOF
|
||||
Usage: deploy-finalize [--dry-run] [--delay N] [--profile PATH]
|
||||
|
||||
Compares /run/booted-system against PATH (default /nix/var/nix/profiles/system)
|
||||
and schedules either \`systemctl reboot\` (kernel or initrd changed) or
|
||||
\`switch-to-configuration switch\` (services only) via a detached systemd-run
|
||||
timer firing N seconds later.
|
||||
|
||||
Options:
|
||||
--dry-run Print the decision and would-be command without scheduling.
|
||||
--delay N Override the delay in seconds. Default: ${toString cfg.delay}.
|
||||
--profile PATH Override the profile path used for comparison.
|
||||
EOF
|
||||
}
|
||||
|
||||
while [[ $# -gt 0 ]]; do
|
||||
case "$1" in
|
||||
--dry-run) dry_run=1; shift ;;
|
||||
--delay) delay="$2"; shift 2 ;;
|
||||
--profile) profile="$2"; shift 2 ;;
|
||||
-h|--help) usage; exit 0 ;;
|
||||
*)
|
||||
echo "deploy-finalize: unknown option $1" >&2
|
||||
usage >&2
|
||||
exit 2
|
||||
;;
|
||||
esac
|
||||
done
|
||||
|
||||
# Comparing {kernel,initrd,kernel-modules} matches nixpkgs's canonical
|
||||
# `system.autoUpgrade` allowReboot logic. -e (not -f) so a dangling
|
||||
# symlink counts as missing: on a real NixOS profile all three exist,
|
||||
# but defensive: if a profile has bad symlinks we refuse to schedule
|
||||
# rather than scheduling against ghost paths.
|
||||
booted_kernel="$(readlink -e /run/booted-system/kernel 2>/dev/null || true)"
|
||||
booted_initrd="$(readlink -e /run/booted-system/initrd 2>/dev/null || true)"
|
||||
booted_modules="$(readlink -e /run/booted-system/kernel-modules 2>/dev/null || true)"
|
||||
new_kernel="$(readlink -e "$profile/kernel" 2>/dev/null || true)"
|
||||
new_initrd="$(readlink -e "$profile/initrd" 2>/dev/null || true)"
|
||||
new_modules="$(readlink -e "$profile/kernel-modules" 2>/dev/null || true)"
|
||||
|
||||
if [[ -z "$new_kernel" || -z "$new_initrd" || -z "$new_modules" ]]; then
|
||||
echo "deploy-finalize: refusing to schedule — $profile is missing kernel, initrd, or kernel-modules" >&2
|
||||
exit 1
|
||||
fi
|
||||
|
||||
changed=()
|
||||
if [[ -z "$booted_kernel" || -z "$booted_initrd" || -z "$booted_modules" ]]; then
|
||||
# Unreachable on a booted NixOS, but fail closed on reboot.
|
||||
changed+=("/run/booted-system incomplete")
|
||||
fi
|
||||
[[ "$booted_kernel" != "$new_kernel" ]] && changed+=("kernel")
|
||||
[[ "$booted_initrd" != "$new_initrd" ]] && changed+=("initrd")
|
||||
[[ "$booted_modules" != "$new_modules" ]] && changed+=("kernel-modules")
|
||||
|
||||
reboot_needed=0
|
||||
reason=""
|
||||
if [[ ''${#changed[@]} -gt 0 ]]; then
|
||||
reboot_needed=1
|
||||
# Join with commas so the reason reads as e.g. `kernel,initrd changed`.
|
||||
reason="$(IFS=, ; echo "''${changed[*]}") changed"
|
||||
fi
|
||||
|
||||
if [[ "$reboot_needed" == 1 ]]; then
|
||||
action=reboot
|
||||
cmd="systemctl reboot"
|
||||
else
|
||||
action=switch
|
||||
reason="services only"
|
||||
cmd="$profile/bin/switch-to-configuration switch"
|
||||
fi
|
||||
|
||||
      # Nanosecond suffix so back-to-back deploys don't collide on unit names.
      unit="deploy-finalize-$(date +%s%N)"

      printf 'deploy-finalize: booted_kernel=%s\n' "$booted_kernel"
      printf 'deploy-finalize: new_kernel=%s\n' "$new_kernel"
      printf 'deploy-finalize: booted_initrd=%s\n' "$booted_initrd"
      printf 'deploy-finalize: new_initrd=%s\n' "$new_initrd"
      printf 'deploy-finalize: booted_kernel-modules=%s\n' "$booted_modules"
      printf 'deploy-finalize: new_kernel-modules=%s\n' "$new_modules"
      printf 'deploy-finalize: action=%s reason=%s delay=%ss unit=%s\n' \
        "$action" "$reason" "$delay" "$unit"

      if [[ "$dry_run" == 1 ]]; then
        printf 'deploy-finalize: dry-run — not scheduling\n'
        printf 'deploy-finalize: would run: %s\n' "$cmd"
        printf 'deploy-finalize: would schedule: systemd-run --collect --unit=%s --on-active=%s\n' \
          "$unit" "$delay"
        exit 0
      fi

      # Cancel any still-pending finalize timers from an earlier deploy so this
      # invocation is authoritative. Without this a stale timer could fire with
      # the old profile's action (reboot/switch) against the new profile and
      # briefly run new userspace under the old kernel.
      systemctl stop 'deploy-finalize-*.timer' 2>/dev/null || true

      # --on-active arms a transient timer owned by pid1. systemd-run returns
      # once the timer is armed; the SSH session that called us can exit and
      # the gitea-runner can be restarted (by the switch the timer fires)
      # without affecting whether the finalize runs.
      systemd-run \
        --collect \
        --unit="$unit" \
        --description="Finalize NixOS deploy ($action after boot-mode activation)" \
        --on-active="$delay" \
        /bin/sh -c "$cmd"
    '';
  };
in
{
  options.services.deployFinalize = {
    enable = lib.mkEnableOption "deferred deploy finalize (switch or reboot) after boot-mode activation";

    delay = lib.mkOption {
      type = lib.types.ints.positive;
      default = 60;
      description = ''
        Seconds between the deploy-rs activation completing and the scheduled
        finalize firing. Tuned so the CI job (or manual SSH session) has time
        to complete status reporting before the runner is restarted by the
        eventual switch-to-configuration.
      '';
    };
  };

  config = lib.mkIf cfg.enable {
    environment.systemPackages = [ finalize ];

    # Exposed for the deploy-rs activation snippet to reference by /nix/store
    # path via lib.getExe — `boot` mode does not update /run/current-system,
    # so reading through /run/current-system/sw/bin would resolve to the OLD
    # binary on a new-feature rollout or immediately after a rollback.
    system.build.deployFinalize = finalize;
  };
}
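The scheduling pattern the module relies on can be sketched in isolation. This is a hedged dry-run sketch, not the module's actual script: `delay` and `cmd` are illustrative stand-ins for the values the real script computes, and nothing is handed to systemd-run.

```shell
# Dry-run sketch: build a collision-free unit name from a nanosecond
# timestamp, then print the systemd-run invocation that would arm a
# pid1-owned transient timer. Printing only; nothing is scheduled.
delay=60                              # stand-in for the option default
cmd="switch-to-configuration switch"  # stand-in for the real finalize command
unit="deploy-finalize-$(date +%s%N)"  # ns suffix keeps back-to-back deploys unique
printf 'would schedule: systemd-run --collect --unit=%s --on-active=%s %s\n' \
  "$unit" "$delay" "$cmd"
```

Because `--on-active` arms the timer inside pid1 and systemd-run returns as soon as it is armed, the SSH session that invoked the script can exit without cancelling the pending finalize.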
32
patches/bluez/0001-a2dp-connect-source-after-sink.patch
Normal file
@@ -0,0 +1,32 @@
From 066a164a524e4983b850f5659b921cb42f84a0e0 Mon Sep 17 00:00:00 2001
From: Pauli Virtanen <pav@iki.fi>
Date: Mon, 16 Feb 2026 18:17:08 +0200
Subject: [PATCH] a2dp: connect source profile after sink

Since cdcd845f87ee the order in which profiles with the same priority
are connected is the same order as btd_profile_register() is called,
instead of being the opposite order. When initiating connections, we
want to prefer a2dp-sink profile over a2dp-source, as connecting both at
the same time does not work currently.

Add .after_services to specify the order.

Fixes: https://github.com/bluez/bluez/issues/1898
---
 profiles/audio/a2dp.c | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/profiles/audio/a2dp.c b/profiles/audio/a2dp.c
index 7a37003a2b..c7e0fc75c0 100644
--- a/profiles/audio/a2dp.c
+++ b/profiles/audio/a2dp.c
@@ -3769,6 +3769,9 @@ static struct btd_profile a2dp_source_profile = {
 
 	.adapter_probe = a2dp_sink_server_probe,
 	.adapter_remove = a2dp_sink_server_remove,
+
+	/* Connect source after sink, to prefer sink when conflicting */
+	.after_services = BTD_PROFILE_UUID_CB(NULL, A2DP_SINK_UUID),
 };
 
 static struct btd_profile a2dp_sink_profile = {
732
patches/flare/0001-feat-typing-Implement-typing-indicators.patch
Normal file
@@ -0,0 +1,732 @@
From 733ad6e63fa6408e47d87a22cf51a784f5ce103f Mon Sep 17 00:00:00 2001
From: Simon Gardling <titaniumtown@proton.me>
Date: Wed, 29 Apr 2026 19:00:12 -0400
Subject: [PATCH 1/6] feat(typing): Implement typing indicators

- Send TypingMessage Started/Stopped events as the user composes a
  message, including a periodic refresh and an idle-stop timer so the
  indicator follows actual composition activity.
- Display a typing indicator strip above the message input, gated on
  the active channel's is-typing state.
- Add the show-typing-indicators and send-typing-indicators settings,
  exposed through a new preferences group, and honour them both for
  display and outbound events.
- Generalise Channel-level send_message_to_group to accept any
  ContentBody so the new TypingMessage path can reuse it.
---
 CHANGELOG.md                             |   5 +
 data/de.schmidhuberj.Flare.gschema.xml   |   9 +
 data/resources/style.css                 |   6 +
 data/resources/ui/channel_messages.blp   |  33 +++
 data/resources/ui/preferences_window.blp |  15 ++
 src/backend/channel.rs                   |  59 +++++-
 src/backend/manager.rs                   |  43 +++-
 src/backend/manager_thread.rs            |   8 +-
 src/backend/message/mod.rs               |  12 +-
 src/gui/channel_messages.rs              | 251 ++++++++++++++++++++++-
 src/gui/preferences_window.rs            |  23 +++
 11 files changed, 443 insertions(+), 21 deletions(-)

diff --git a/CHANGELOG.md b/CHANGELOG.md
index 20dc578..2bde927 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -6,6 +6,11 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ## [Unreleased]
 
+### Added
+
+- Send typing indicators while composing a message and display them above the message input.
+- Settings to enable or disable sending and showing typing indicators.
+
 ## [0.20.4] - 2026-04-22
 
 ### Fixed
diff --git a/data/de.schmidhuberj.Flare.gschema.xml b/data/de.schmidhuberj.Flare.gschema.xml
index 8a58415..0705a73 100644
--- a/data/de.schmidhuberj.Flare.gschema.xml
+++ b/data/de.schmidhuberj.Flare.gschema.xml
@@ -58,6 +58,15 @@
       <summary>Send a message when the Enter-key is pressed</summary>
     </key>
 
+    <key name="show-typing-indicators" type="b">
+      <default>true</default>
+      <summary>Show typing indicators of other users</summary>
+    </key>
+    <key name="send-typing-indicators" type="b">
+      <default>true</default>
+      <summary>Send typing indicators while composing</summary>
+    </key>
+
     <key name="sort-contacts-by" type="s">
       <default>"firstname"</default>
       <summary>How to sort contacts, e.g with "firstname" or "surname"</summary>
diff --git a/data/resources/style.css b/data/resources/style.css
index dcd0569..00e4783 100644
--- a/data/resources/style.css
+++ b/data/resources/style.css
@@ -13,6 +13,12 @@
   border-top: 1px solid @borders;
 }
 
+.typing-indicator {
+  background-color: @window_bg_color;
+  border-top: 1px solid @borders;
+  min-height: 18px;
+}
+
 .message-list row {
   padding:0;
 }
diff --git a/data/resources/ui/channel_messages.blp b/data/resources/ui/channel_messages.blp
index 53be7ab..7f438e4 100644
--- a/data/resources/ui/channel_messages.blp
+++ b/data/resources/ui/channel_messages.blp
@@ -102,6 +102,39 @@ template $FlChannelMessages: Box {
     }
   }
 
+  // Typing indicator
+  Box typing_indicator {
+    styles [
+      "typing-indicator",
+    ]
+
+    orientation: horizontal;
+    hexpand: true;
+    visible: bind template.show-typing as <bool>;
+
+    Adw.Clamp {
+      maximum-size: 800;
+      tightening-threshold: 600;
+      hexpand: true;
+
+      Label {
+        styles [
+          "caption",
+          "dim-label",
+        ]
+
+        halign: start;
+        ellipsize: end;
+        xalign: 0;
+        margin-start: 12;
+        margin-end: 12;
+        margin-top: 2;
+        margin-bottom: 2;
+        label: bind template.active-channel as <$FlChannel>.typing-label;
+      }
+    }
+  }
+
   Box {
     styles [
       "toolbar",
diff --git a/data/resources/ui/preferences_window.blp b/data/resources/ui/preferences_window.blp
index dd84f74..2068cab 100644
--- a/data/resources/ui/preferences_window.blp
+++ b/data/resources/ui/preferences_window.blp
@@ -66,6 +66,21 @@ template $FlPreferencesWindow: Adw.PreferencesDialog {
         );
       }
     }
+
+    Adw.PreferencesGroup {
+      title: _("Typing Indicators");
+      description: _("Inform other users when you are composing a message and show indicators when they are");
+
+      Adw.SwitchRow row_send_typing_indicators {
+        title: _("Send Typing Indicators");
+        subtitle: _("Notify others while you are composing a message");
+      }
+
+      Adw.SwitchRow row_show_typing_indicators {
+        title: _("Show Typing Indicators");
+        subtitle: _("Display when other users are composing a message");
+      }
+    }
   }
 }
 
diff --git a/src/backend/channel.rs b/src/backend/channel.rs
index 73e82f3..4bb1d38 100644
--- a/src/backend/channel.rs
+++ b/src/backend/channel.rs
@@ -15,8 +15,9 @@ use glib::Bytes;
 use glib::{Object, prelude::Cast};
 
 use libsignal_service::{
-    proto::{DataMessage, GroupContextV2},
+    proto::{DataMessage, GroupContextV2, TypingMessage, typing_message::Action as TypingAction},
     protocol::ServiceId,
+    zkgroup::groups::{GroupMasterKey, GroupSecretParams},
 };
 use presage::model::groups::Group;
 use presage::store::Thread;
@@ -230,6 +231,62 @@ impl Channel {
         self.manager().send_session_reset(uuid, ts).await
     }
 
+    /// Send a typing indicator (started/stopped) to the channel.
+    ///
+    /// Returns `Ok(())` without sending if the user has disabled the
+    /// `send-typing-indicators` setting or the channel has no resolvable peer.
+    pub async fn send_typing(&self, started: bool) -> Result<(), ApplicationError> {
+        // Note-to-self has no useful peer to inform, and routing the
+        // event through `send_message(self_uuid, …)` would fan it out to
+        // every other linked device on the account where flare's own
+        // receive path lights up an "is typing" indicator on its copy of
+        // Note-to-self.
+        if self.is_self() {
+            return Ok(());
+        }
+        let manager = self.manager();
+        if !manager.settings().boolean("send-typing-indicators") {
+            return Ok(());
+        }
+
+        let timestamp = std::time::SystemTime::now()
+            .duration_since(std::time::UNIX_EPOCH)
+            .expect("Time went backwards")
+            .as_millis() as u64;
+
+        let action = if started {
+            TypingAction::Started
+        } else {
+            TypingAction::Stopped
+        };
+
+        let group_id = self
+            .group_context()
+            .and_then(|c| c.master_key)
+            .and_then(|k| <[u8; 32]>::try_from(k).ok())
+            .map(|master_key| {
+                GroupSecretParams::derive_from_master_key(GroupMasterKey::new(master_key))
+                    .get_group_identifier()
+                    .to_vec()
+            });
+
+        let typing = TypingMessage {
+            timestamp: Some(timestamp),
+            action: Some(action as i32),
+            group_id: group_id.clone(),
+        };
+
+        if let Some(group_master_key) = self.group_context().and_then(|c| c.master_key) {
+            manager
+                .send_message_to_group(group_master_key, typing, timestamp)
+                .await?;
+        } else if let Some(uuid) = self.uuid() {
+            manager.send_message(uuid, typing, timestamp).await?;
+        }
+
+        Ok(())
+    }
+
     /// Register a new message with the channel.
     /// This does the following (based on the type of message):
     /// - Add a quote to the message if needed.
diff --git a/src/backend/manager.rs b/src/backend/manager.rs
index c25fba0..eaa41e0 100644
--- a/src/backend/manager.rs
+++ b/src/backend/manager.rs
@@ -8,7 +8,7 @@ use libsignal_service::protocol::DeviceId;
 use libsignal_service::{
     Profile,
     content::ContentBody,
-    proto::{AttachmentPointer, DataMessage, GroupContextV2},
+    proto::{AttachmentPointer, GroupContextV2},
     protocol::ServiceId,
     sender::{AttachmentSpec, AttachmentUploadError},
     websocket::account::DeviceInfo,
@@ -490,20 +490,42 @@ impl Manager {
             Thread::Contact(uuid)
         };
 
+        // Fast path: return the cached channel if we already know it.
+        // Without this, callers that arrive after initial channel discovery
+        // (incoming TypingMessage routing, in particular) would receive a
+        // freshly-built Channel object whose property notifications never
+        // reach widgets bound to the cached one in the UI — typing
+        // indicators on both the header bar and the channel-messages view
+        // would silently never light up.
+        if let Some(cached) = self.imp().channels.borrow().get(&thread).cloned() {
+            return cached;
+        }
+
         let contact = Contact::from_service_address(&uuid, self).await;
         let channel = Channel::from_contact_or_group(contact, group, self).await;
         channel.initialize_avatar().await;
 
-        let mut known_channels = self.imp().channels.borrow_mut();
-        known_channels.entry(thread).or_insert_with(|| {
-            log::trace!("Got a contact from the storage");
+        // Another task may have inserted the same thread while we were
+        // awaiting; pick whichever is already there or insert ours.
+        let stored = {
+            let mut known = self.imp().channels.borrow_mut();
+            known
+                .entry(thread)
+                .or_insert_with(|| {
+                    log::trace!("Got a contact from the storage");
+                    channel.clone()
+                })
+                .clone()
+        };
+
+        if stored == channel {
             self.emit_by_name::<()>("channel", &[&channel]);
-            channel.clone()
-        });
+        }
 
-        // No need to initialize avatar or last messages in here, will be done when initializing contacts.
+        // No need to initialize avatar or last messages in here, will be
+        // done when initializing contacts.
 
-        channel
+        stored
     }
 
     pub fn channel_from_thread(&self, thread: Thread) -> Option<Channel> {
@@ -737,14 +759,15 @@ impl Manager {
     pub(super) async fn send_message_to_group(
         &self,
         group_key: Vec<u8>,
-        message: DataMessage,
+        message: impl Into<ContentBody>,
         timestamp: u64,
     ) -> Result<(), ApplicationError> {
         log::trace!("`Manager::send_message_to_group` start");
+        let body = message.into();
         let internal = self.internal();
         let r = tspawn!(async move {
             internal
-                .send_message_to_group(group_key, message.clone(), timestamp)
+                .send_message_to_group(group_key, body.clone(), timestamp)
                 .await
         })
         .await
diff --git a/src/backend/manager_thread.rs b/src/backend/manager_thread.rs
index 1f6a885..cba62ae 100644
--- a/src/backend/manager_thread.rs
+++ b/src/backend/manager_thread.rs
@@ -21,7 +21,7 @@ use libsignal_service::{
     configuration::SignalServers,
     content::ContentBody,
     prelude::{ProfileKey, Uuid, phonenumber},
-    proto::{AttachmentPointer, DataMessage, GroupContextV2},
+    proto::{AttachmentPointer, GroupContextV2},
     protocol::ServiceId,
     sender::{AttachmentSpec, AttachmentUploadError},
     websocket::account::DeviceInfo,
@@ -65,7 +65,7 @@ enum Command {
     ),
     SendMessageToGroup(
         Vec<u8>,
-        Box<DataMessage>,
+        Box<ContentBody>,
         u64,
         oneshot::Sender<Result<(), Error>>,
     ),
@@ -353,7 +353,7 @@ impl ManagerThread {
     pub async fn send_message_to_group(
         &self,
         group_key: Vec<u8>,
-        message: DataMessage,
+        message: impl Into<ContentBody>,
         timestamp: u64,
     ) -> Result<(), Error> {
         let (sender, receiver) = oneshot::channel();
@@ -361,7 +361,7 @@ impl ManagerThread {
             .clone()
             .send(Command::SendMessageToGroup(
                 group_key,
-                Box::new(message),
+                Box::new(message.into()),
                 timestamp,
                 sender,
             ))
diff --git a/src/backend/message/mod.rs b/src/backend/message/mod.rs
index 11ccd7c..74952ac 100644
--- a/src/backend/message/mod.rs
+++ b/src/backend/message/mod.rs
@@ -270,14 +270,16 @@ impl Message {
             // Typing messages.
             // Note that they are currently only implemented for contacts, this requires upstream updates to fix.
             ContentBody::TypingMessage(t) => {
+                // Both group and contact branches stay cache-only: we only
+                // surface typing for conversations the user already knows
+                // about. Going through `channel_from_uuid_or_group` here
+                // would mint a new Channel object on the first typing
+                // event from a stranger and add them to the sidebar with
+                // no actual messages.
                 let channel = if let Some(id) = &t.group_id {
                     manager.channel_from_group_id(id)
                 } else {
-                    Some(
-                        manager
-                            .channel_from_uuid_or_group(metadata.sender, &None)
-                            .await,
-                    )
+                    manager.channel_from_thread(presage::store::Thread::Contact(metadata.sender))
                 };
 
                 let Some(channel) = channel else {
diff --git a/src/gui/channel_messages.rs b/src/gui/channel_messages.rs
index 0e8ae4e..c6684fc 100644
--- a/src/gui/channel_messages.rs
+++ b/src/gui/channel_messages.rs
@@ -5,6 +5,16 @@ use crate::ApplicationError;
 
 const MESSAGES_REQUEST_LOAD: usize = 10;
 
+/// Re-send the `Started` typing event at this interval so the receiver
+/// does not let the indicator expire while the user keeps composing.
+/// Must stay strictly below `TYPING_NOTIFICATION_DURATION_SECONDS` in
+/// `crate::backend::channel`.
+const TYPING_REFRESH_SECONDS: u32 = 8;
+
+/// Send `Stopped` if no buffer change has happened in this many seconds.
+/// Mirrors how Signal apps treat composition pauses as the end of typing.
+const TYPING_IDLE_SECONDS: u32 = 5;
+
 glib::wrapper! {
     /// [ChannelMessages] is the right pane displaying the list of messages and the entry-bar.
     pub struct ChannelMessages(ObjectSubclass<imp::ChannelMessages>)
@@ -103,6 +113,200 @@ impl ChannelMessages {
         ));
     }
 
+    /// Connect the `show-typing-indicators` setting so the typing indicator
+    /// updates immediately when the user toggles the preference.
+    fn setup_typing_settings(&self) {
+        self.manager().settings().connect_changed(
+            Some("show-typing-indicators"),
+            clone!(
+                #[weak(rename_to = s)]
+                self,
+                move |_, _| s.refresh_show_typing()
+            ),
+        );
+    }
+
+    /// Re-evaluate `show-typing` for the current channel based on the channel's
+    /// `is-typing` state and the user's `show-typing-indicators` setting.
+    fn refresh_show_typing(&self) {
+        // The active-channel bind runs during template init before the
+        // manager bind, so `self.manager()` (typed as Manager, not
+        // Option<Manager>) would panic here. Read the manager directly so
+        // a null intermediate state is harmless: with no manager we don't
+        // know the user's preference, so default to showing the indicator.
+        let allowed = self
+            .imp()
+            .manager
+            .borrow()
+            .as_ref()
+            .is_none_or(|m| m.settings().boolean("show-typing-indicators"));
+        let typing = self
+            .active_channel()
+            .map(|c| c.is_typing())
+            .unwrap_or(false);
+        self.set_show_typing(allowed && typing);
+    }
+
+    /// Wire the `show-typing` property to the active channel's `is-typing`.
+    /// Called whenever the active channel changes.
+    fn setup_typing_indicator(&self) {
+        self.refresh_show_typing();
+
+        // Disconnect the handler we attached on the previous active
+        // channel so we don't accumulate one per channel switch.
+        if let Some((prev_channel, handler)) = self.imp().typing_handler.take() {
+            prev_channel.disconnect(handler);
+        }
+
+        if let Some(channel) = self.active_channel() {
+            let handler = channel.connect_notify_local(
+                Some("is-typing"),
+                clone!(
+                    #[weak(rename_to = s)]
+                    self,
+                    move |_, _| s.refresh_show_typing()
+                ),
+            );
+            self.imp()
+                .typing_handler
+                .replace(Some((channel, handler)));
+        }
+    }
+
+    /// Send a `Started` typing event for the active channel.
+    ///
+    /// Schedules a periodic refresh so the receiver does not let the
+    /// indicator expire while the user is still composing.
+    fn send_typing_started(&self) {
+        let imp = self.imp();
+        let Some(channel) = self.active_channel() else {
+            return;
+        };
+        if !self.manager().settings().boolean("send-typing-indicators") {
+            return;
+        }
+
+        // Mark this channel as the current typing target so a later channel
+        // switch can still emit a matching `Stopped` event.
+        imp.typing_target.replace(Some(channel.clone()));
+
+        // Refresh `Started` periodically while the user keeps composing.
+        let needs_initial = !imp.sending_typing.replace(true);
+
+        if let Some(source) = imp.typing_refresh.borrow_mut().take() {
+            source.remove();
+        }
+        let refresh = glib::timeout_add_seconds_local(
+            TYPING_REFRESH_SECONDS,
+            clone!(
+                #[weak(rename_to = s)]
+                self,
+                #[upgrade_or]
+                glib::ControlFlow::Break,
+                move || {
+                    if !s.imp().sending_typing.get() {
+                        return glib::ControlFlow::Break;
+                    }
+                    s.dispatch_send_typing(true);
+                    glib::ControlFlow::Continue
+                }
+            ),
+        );
+        imp.typing_refresh.replace(Some(refresh));
+
+        if needs_initial {
+            self.dispatch_send_typing(true);
+        }
+    }
+
+    /// Send a `Stopped` typing event for the channel that was last targeted.
+    fn send_typing_stopped(&self) {
+        let imp = self.imp();
+        if let Some(source) = imp.typing_refresh.borrow_mut().take() {
+            source.remove();
+        }
+        if let Some(source) = imp.typing_idle.borrow_mut().take() {
+            source.remove();
+        }
+        if !imp.sending_typing.replace(false) {
+            // Nothing to do — we never told anyone we were typing.
+            imp.typing_target.replace(None);
+            return;
+        }
+        let Some(channel) = imp.typing_target.replace(None) else {
+            return;
+        };
+        gspawn!(async move {
+            if let Err(e) = channel.send_typing(false).await {
+                log::warn!("Failed to send `Stopped` typing event: {e}");
+            }
+        });
+    }
+
+    /// Dispatch the actual `Started` typing event to whichever channel is
+    /// currently considered the typing target.
+    fn dispatch_send_typing(&self, started: bool) {
+        let Some(channel) = self.imp().typing_target.borrow().clone() else {
+            return;
+        };
+        gspawn!(async move {
+            if let Err(e) = channel.send_typing(started).await {
+                log::warn!("Failed to send typing event (started={started}): {e}");
+            }
+        });
+    }
+
+    /// Connect the text entry's buffer so we can emit `Started`/`Stopped`
+    /// typing events as the user composes a message.
+    fn setup_typing_send(&self) {
+        let buffer = self.imp().text_entry.buffer();
+        let handler = buffer.connect_changed(clone!(
+            #[weak(rename_to = s)]
+            self,
+            move |buf| {
+                let (start, end) = buf.bounds();
+                if start == end {
+                    s.send_typing_stopped();
+                    return;
+                }
+                // Both the Started event and the idle-stop timer are
+                // outbound-typing-only behaviours; if the user has
+                // disabled outgoing typing, do nothing and don't churn
+                // a timer per keystroke.
+                let allowed = s
+                    .imp()
+                    .manager
+                    .borrow()
+                    .as_ref()
+                    .is_none_or(|m| m.settings().boolean("send-typing-indicators"));
+                if !allowed {
+                    return;
+                }
+                s.send_typing_started();
+                s.reset_typing_idle_timer();
+            }
+        ));
+        self.imp().typing_buffer_handler.replace(Some(handler));
+    }
+
+    /// Schedule a one-shot timer that sends `Stopped` if the user lets the
+    /// composition idle for more than `TYPING_IDLE_SECONDS` seconds.
+    fn reset_typing_idle_timer(&self) {
+        let imp = self.imp();
+        if let Some(source) = imp.typing_idle.borrow_mut().take() {
+            source.remove();
+        }
+        let source = glib::timeout_add_seconds_local_once(
+            TYPING_IDLE_SECONDS,
+            clone!(
+                #[weak(rename_to = s)]
+                self,
+                move || s.send_typing_stopped()
+            ),
+        );
+        imp.typing_idle.replace(Some(source));
+    }
+
     pub async fn clear_messages(&self) -> Result<(), ApplicationError> {
         if let Some(channel) = self.active_channel() {
             channel.clear_messages().await?;
@@ -165,9 +369,36 @@ pub mod imp {
         filling_screen: Cell<bool>,
         #[property(get = Self::has_attachments)]
         has_attachments: PhantomData<bool>,
+        #[property(get, set)]
+        show_typing: Cell<bool>,
+
+        /// Whether we currently believe the user is composing a message in the
+        /// active channel and have informed the peer with a `Started` event.
+        pub(super) sending_typing: Cell<bool>,
+        /// Channel to which we last sent a `Started` typing event, kept so we
+        /// can send a matching `Stopped` even after the active channel changes.
+        pub(super) typing_target: RefCell<Option<Channel>>,
+        /// Periodic refresh of the `Started` typing event so it does not
+        /// expire on the receiver side while the user is still composing.
+        pub(super) typing_refresh: RefCell<Option<glib::SourceId>>,
+        /// One-shot timer that emits `Stopped` after a stretch of no
+        /// further buffer changes.
+        pub(super) typing_idle: RefCell<Option<glib::SourceId>>,
+        /// Notify handler installed on the active channel's `is-typing`
+        /// property so we can disconnect it before re-attaching when the
+        /// active channel changes.
+        pub(super) typing_handler: RefCell<Option<(Channel, glib::SignalHandlerId)>>,
+        /// Notify + selection-changed handlers installed on the active
+        /// channel by `setup_selection_listener`, kept so we can disconnect
+        /// them before re-attaching on the next channel change.
+        pub(super) selection_handlers: RefCell<Vec<(Channel, glib::SignalHandlerId)>>,
+        /// Buffer change handler that drives the typing-send logic; we
+        /// block it while restoring a draft so loading a draft does not
+        /// transmit a Started typing event.
+        pub(super) typing_buffer_handler: RefCell<Option<glib::SignalHandlerId>>,
 
         #[property(get, set = Self::set_manager, type = Manager)]
-        manager: RefCell<Option<Manager>>,
+        pub(super) manager: RefCell<Option<Manager>>,
     }
 
     #[gtk::template_callbacks]
@@ -181,10 +412,16 @@ pub mod imp {
             self.manager.replace(man);
             if initialized {
                 self.obj().setup_send_on_enter();
+                self.obj().setup_typing_settings();
+                self.obj().setup_typing_send();
             }
         }
 
         fn set_active_channel(&self, chan: Option<Channel>) {
+            // Inform the previous channel we have stopped typing before we
+            // forget about it.
+            self.obj().send_typing_stopped();
+
             if let Some(active_chan) = self.active_channel.borrow().as_ref() {
                 active_chan.set_property("draft", self.text_entry.text());
             }
@@ -195,6 +432,7 @@ pub mod imp {
             }
 
             self.obj().focus_input();
+            self.obj().setup_typing_indicator();
         }
 
         #[template_callback(function)]
@@ -501,7 +739,18 @@ pub mod imp {
                     s.obj().set_reply_message(None::<TextMessage>);
                     if let Some(channel) = s.active_channel.borrow().as_ref() {
                         let draft = channel.property("draft");
+                        // Block the typing buffer-changed handler so
+                        // restoring a stored draft does not transmit
+                        // a Started typing event to the peer.
+                        let buffer = s.text_entry.buffer();
+                        let handler_guard = s.typing_buffer_handler.borrow();
+                        if let Some(handler) = handler_guard.as_ref() {
+                            buffer.block_signal(handler);
+                        }
                         s.text_entry.set_text(draft);
+                        if let Some(handler) = handler_guard.as_ref() {
+                            buffer.unblock_signal(handler);
+                        }
                     };
                 }
             ),
diff --git a/src/gui/preferences_window.rs b/src/gui/preferences_window.rs
index 8137af7..b2b6405 100644
--- a/src/gui/preferences_window.rs
+++ b/src/gui/preferences_window.rs
@@ -78,6 +78,11 @@ pub mod imp {
         #[template_child]
         row_send_on_enter: TemplateChild<adw::SwitchRow>,
 
+        #[template_child]
+        row_send_typing_indicators: TemplateChild<adw::SwitchRow>,
+        #[template_child]
+        row_show_typing_indicators: TemplateChild<adw::SwitchRow>,
+
         settings: Settings,
     }
 
@@ -173,6 +178,22 @@ pub mod imp {
             .bind("send-on-enter", &self.row_send_on_enter.get(), "active")
             .flags(SettingsBindFlags::DEFAULT)
             .build();
+        self.settings
+            .bind(
+                "send-typing-indicators",
+                &self.row_send_typing_indicators.get(),
+                "active",
+            )
+            .flags(SettingsBindFlags::DEFAULT)
+            .build();
+        self.settings
+            .bind(
+                "show-typing-indicators",
+                &self.row_show_typing_indicators.get(),
+                "active",
+            )
+            .flags(SettingsBindFlags::DEFAULT)
+            .build();
     }
 }
 
@@ -194,6 +215,8 @@ pub mod imp {
             row_background: TemplateChild::default(),
             row_messages_selectable: TemplateChild::default(),
             row_send_on_enter: TemplateChild::default(),
+            row_send_typing_indicators: TemplateChild::default(),
+            row_show_typing_indicators: TemplateChild::default(),
         }
     }

-- 
2.53.0
@@ -0,0 +1,622 @@
From 45b21cee00bfc5545aea6fbc9a4f991cfd781cff Mon Sep 17 00:00:00 2001
From: Simon Gardling <titaniumtown@proton.me>
Date: Wed, 29 Apr 2026 19:13:52 -0400
Subject: [PATCH 2/6] feat(messages): Implement formatted messages

- Display Signal BodyRange styles (bold, italic, strikethrough,
spoiler, monospace) on incoming messages by translating them into
pango attributes alongside the existing mention rendering, making
the offset accounting work for mention substitutions and
surrogate-pair text alike.
- Parse a markdown-style formatting syntax on outbound messages and
send the resulting BodyRanges with the cleaned body text. The
parser lives in its own module with unit tests covering the
supported markers, nesting, unmatched markers, and non-BMP UTF-16
offsets.
- Update the message-input tooltip to surface the supported markers.
---
CHANGELOG.md | 2 +
data/resources/ui/channel_messages.blp | 2 +-
src/backend/message/formatting.rs | 287 +++++++++++++++++++++++++
src/backend/message/mod.rs | 2 +
src/backend/message/text_message.rs | 200 +++++++++++++----
5 files changed, 447 insertions(+), 46 deletions(-)
create mode 100644 src/backend/message/formatting.rs

diff --git a/CHANGELOG.md b/CHANGELOG.md
index 2bde927..50cd5f5 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -10,6 +10,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

- Send typing indicators while composing a message and display them above the message input.
- Settings to enable or disable sending and showing typing indicators.
+- Render formatted message styles (bold, italic, strikethrough, spoiler, monospace) on incoming messages.
+- Send formatted messages with markdown-style markers (`**bold**`, `*italic*`, `~~strike~~`, `||spoiler||`, `` `monospace` ``).

## [0.20.4] - 2026-04-22

diff --git a/data/resources/ui/channel_messages.blp b/data/resources/ui/channel_messages.blp
index 7f438e4..6c3948f 100644
--- a/data/resources/ui/channel_messages.blp
+++ b/data/resources/ui/channel_messages.blp
@@ -301,7 +301,7 @@ template $FlChannelMessages: Box {
activate => $send_message() swapped;
paste-file => $paste_file() swapped;
paste-texture => $paste_texture() swapped;
- tooltip-text: C_("tooltip", "Message input");
+ tooltip-text: C_("tooltip", "Message input. Use **bold**, *italic*, ~~strike~~, ||spoiler|| or `monospace` to format text.");
}

Button button_send {
diff --git a/src/backend/message/formatting.rs b/src/backend/message/formatting.rs
new file mode 100644
index 0000000..5a1d596
--- /dev/null
+++ b/src/backend/message/formatting.rs
@@ -0,0 +1,287 @@
+//! Lightweight markdown-style formatting parser for outgoing messages.
+//!
+//! Supported syntax (mirroring the way Signal Desktop and iOS render
+//! formatted messages):
+//!
+//! - `**text**` for bold
+//! - `*text*` or `_text_` for italic
+//! - `~~text~~` for strikethrough
+//! - `||text||` for spoiler
+//! - `` `text` `` for monospace
+//!
+//! Parsing is forgiving: any marker without a matching counterpart is left
+//! verbatim in the resulting text. Markers may nest as long as the inner
+//! marker is a different kind from the outer one.
+//!
+//! The function returns the cleaned message body plus the corresponding
+//! `BodyRange`s with offsets in UTF-16 code units, as required by the
+//! Signal protocol.
+
+use std::collections::HashMap;
+
+use libsignal_service::proto::BodyRange;
+use libsignal_service::proto::body_range::{AssociatedValue, Style as BodyRangeStyle};
+
+#[derive(Clone, Copy, Debug, Hash, Eq, PartialEq)]
+enum Marker {
+ Bold,
+ Italic,
+ Strikethrough,
+ Spoiler,
+ Monospace,
+}
+
+impl Marker {
+ fn style(self) -> BodyRangeStyle {
+ match self {
+ Marker::Bold => BodyRangeStyle::Bold,
+ Marker::Italic => BodyRangeStyle::Italic,
+ Marker::Strikethrough => BodyRangeStyle::Strikethrough,
+ Marker::Spoiler => BodyRangeStyle::Spoiler,
+ Marker::Monospace => BodyRangeStyle::Monospace,
+ }
+ }
+}
+
+/// Try to consume a marker starting at `chars[i]` and return its kind plus
+/// the number of characters that make up the marker token.
+fn detect_marker(chars: &[char], i: usize) -> Option<(Marker, usize)> {
+ let cur = *chars.get(i)?;
+ let next = chars.get(i + 1).copied();
+ match (cur, next) {
+ ('*', Some('*')) => Some((Marker::Bold, 2)),
+ ('~', Some('~')) => Some((Marker::Strikethrough, 2)),
+ ('|', Some('|')) => Some((Marker::Spoiler, 2)),
+ ('*', _) | ('_', _) => Some((Marker::Italic, 1)),
+ ('`', _) => Some((Marker::Monospace, 1)),
+ _ => None,
+ }
+}
+
+#[derive(Debug, Clone, Copy)]
+struct MatchedSpan {
+ marker: Marker,
+ open_pos: usize,
+ close_pos: usize,
+ marker_len: usize,
+}
+
+/// Walk the character stream left-to-right and pair markers of the same
+/// kind. The first occurrence opens a span, the next occurrence of the same
+/// kind closes it; markers without a partner are simply ignored.
+fn detect_matched_markers(chars: &[char]) -> Vec<MatchedSpan> {
+ let mut open: HashMap<Marker, (usize, usize)> = HashMap::new();
+ let mut spans: Vec<MatchedSpan> = Vec::new();
+ let mut i = 0;
+ while i < chars.len() {
+ if let Some((marker, len)) = detect_marker(chars, i) {
+ if let Some((open_pos, marker_len)) = open.remove(&marker) {
+ spans.push(MatchedSpan {
+ marker,
+ open_pos,
+ close_pos: i,
+ marker_len,
+ });
+ } else {
+ open.insert(marker, (i, len));
+ }
+ i += len;
+ } else {
+ i += 1;
+ }
+ }
+ spans
+}
+
+/// Parse markdown-style formatting markers in `input` and produce the cleaned
+/// text plus the corresponding Signal [BodyRange]s with UTF-16 offsets.
+///
+/// Empty matched spans (e.g. `**` followed immediately by `**`) are dropped.
+pub fn parse_formatting(input: &str) -> (String, Vec<BodyRange>) {
+ let chars: Vec<char> = input.chars().collect();
+ let spans = detect_matched_markers(&chars);
+
+ if spans.is_empty() {
+ return (input.to_owned(), Vec::new());
+ }
+
+ // Mark which character positions are part of a matched marker token and
+ // therefore must be removed from the cleaned output.
+ let mut skip = vec![false; chars.len()];
+ for sp in &spans {
+ for k in sp.open_pos..(sp.open_pos + sp.marker_len).min(chars.len()) {
+ skip[k] = true;
+ }
+ for k in sp.close_pos..(sp.close_pos + sp.marker_len).min(chars.len()) {
+ skip[k] = true;
+ }
+ }
+
+ // Build the cleaned output and a per-input-char map into the output's
+ // UTF-16 code-unit offset.
+ let mut output = String::with_capacity(input.len());
+ let mut input_to_output_utf16 = vec![0u32; chars.len() + 1];
+ let mut utf16_count: u32 = 0;
+ for (i, c) in chars.iter().enumerate() {
+ input_to_output_utf16[i] = utf16_count;
+ if !skip[i] {
+ output.push(*c);
+ utf16_count += c.len_utf16() as u32;
+ }
+ }
+ input_to_output_utf16[chars.len()] = utf16_count;
+
+ let mut ranges: Vec<BodyRange> = Vec::with_capacity(spans.len());
+ for sp in spans {
+ let start = input_to_output_utf16[sp.open_pos + sp.marker_len];
+ let end = input_to_output_utf16[sp.close_pos];
+ if end <= start {
+ continue;
+ }
+ ranges.push(BodyRange {
+ start: Some(start),
+ length: Some(end - start),
+ associated_value: Some(AssociatedValue::Style(sp.marker.style() as i32)),
+ });
+ }
+
+ // Sort by start so the final ranges are stable for tests and for
+ // downstream consumers that expect ordered ranges.
+ ranges.sort_by_key(|r| r.start);
+
+ (output, ranges)
+}
+
+#[cfg(test)]
+mod tests {
+ use super::*;
+
+ fn ranges_summary(ranges: &[BodyRange]) -> Vec<(u32, u32, BodyRangeStyle)> {
+ ranges
+ .iter()
+ .map(|r| {
+ let style = match r.associated_value {
+ Some(AssociatedValue::Style(s)) => {
+ BodyRangeStyle::try_from(s).unwrap_or(BodyRangeStyle::None)
+ }
+ _ => BodyRangeStyle::None,
+ };
+ (r.start.unwrap_or(0), r.length.unwrap_or(0), style)
+ })
+ .collect()
+ }
+
+ #[test]
+ fn no_markers() {
+ let (text, ranges) = parse_formatting("hello world");
+ assert_eq!(text, "hello world");
+ assert!(ranges.is_empty());
+ }
+
+ #[test]
+ fn bold() {
+ let (text, ranges) = parse_formatting("**bold**");
+ assert_eq!(text, "bold");
+ assert_eq!(ranges_summary(&ranges), vec![(0, 4, BodyRangeStyle::Bold)]);
+ }
+
+ #[test]
+ fn italic_asterisk() {
+ let (text, ranges) = parse_formatting("*italic*");
+ assert_eq!(text, "italic");
+ assert_eq!(
+ ranges_summary(&ranges),
+ vec![(0, 6, BodyRangeStyle::Italic)]
+ );
+ }
+
+ #[test]
+ fn italic_underscore() {
+ let (text, ranges) = parse_formatting("_italic_");
+ assert_eq!(text, "italic");
+ assert_eq!(
+ ranges_summary(&ranges),
+ vec![(0, 6, BodyRangeStyle::Italic)]
+ );
+ }
+
+ #[test]
+ fn strikethrough() {
+ let (text, ranges) = parse_formatting("~~strike~~");
+ assert_eq!(text, "strike");
+ assert_eq!(
+ ranges_summary(&ranges),
+ vec![(0, 6, BodyRangeStyle::Strikethrough)]
+ );
+ }
+
+ #[test]
+ fn spoiler() {
+ let (text, ranges) = parse_formatting("||hidden||");
+ assert_eq!(text, "hidden");
+ assert_eq!(
+ ranges_summary(&ranges),
+ vec![(0, 6, BodyRangeStyle::Spoiler)]
+ );
+ }
+
+ #[test]
+ fn monospace() {
+ let (text, ranges) = parse_formatting("`code`");
+ assert_eq!(text, "code");
+ assert_eq!(
+ ranges_summary(&ranges),
+ vec![(0, 4, BodyRangeStyle::Monospace)]
+ );
+ }
+
+ #[test]
+ fn bold_and_italic_nested() {
+ let (text, ranges) = parse_formatting("**bold *italic***");
+ assert_eq!(text, "bold italic");
+ let summary = ranges_summary(&ranges);
+ assert!(summary.contains(&(0, 11, BodyRangeStyle::Bold)));
+ assert!(summary.contains(&(5, 6, BodyRangeStyle::Italic)));
+ }
+
+ #[test]
+ fn unmatched_open_left_literal() {
+ let (text, ranges) = parse_formatting("**only one start");
+ assert_eq!(text, "**only one start");
+ assert!(ranges.is_empty());
+ }
+
+ #[test]
+ fn surrounding_text_preserved() {
+ let (text, ranges) = parse_formatting("hello **world**!");
+ assert_eq!(text, "hello world!");
+ assert_eq!(ranges_summary(&ranges), vec![(6, 5, BodyRangeStyle::Bold)]);
+ }
+
+ #[test]
+ fn multiple_pairs() {
+ let (text, ranges) = parse_formatting("**a**b**c**");
+ assert_eq!(text, "abc");
+ let summary = ranges_summary(&ranges);
+ assert_eq!(summary.len(), 2);
+ assert_eq!(summary[0], (0, 1, BodyRangeStyle::Bold));
+ assert_eq!(summary[1], (2, 1, BodyRangeStyle::Bold));
+ }
+
+ #[test]
+ fn empty_pair_dropped() {
+ let (text, ranges) = parse_formatting("****");
+ assert_eq!(text, "");
+ assert!(ranges.is_empty());
+ }
+
+ #[test]
+ fn utf16_offsets_for_non_bmp() {
+ // Character "𝟚" (U+1D7DA) is a non-BMP codepoint occupying two
+ // UTF-16 code units, so a Bold range over a string containing it
+ // must reflect that in its `length`.
+ let (text, ranges) = parse_formatting("**𝟚**");
+ assert_eq!(text, "𝟚");
+ assert_eq!(ranges_summary(&ranges), vec![(0, 2, BodyRangeStyle::Bold)]);
+ }
+}
diff --git a/src/backend/message/mod.rs b/src/backend/message/mod.rs
index 74952ac..4e0f584 100644
--- a/src/backend/message/mod.rs
+++ b/src/backend/message/mod.rs
@@ -1,12 +1,14 @@
mod call_message;
mod deletion_message;
mod display_message;
+mod formatting;
mod reaction_message;
mod text_message;

pub use call_message::{CallMessage, CallMessageType};
pub use deletion_message::DeletionMessage;
pub use display_message::{DisplayMessage, DisplayMessageExt};
+pub use formatting::parse_formatting;
pub use reaction_message::ReactionMessage;
pub use text_message::TextMessage;

diff --git a/src/backend/message/text_message.rs b/src/backend/message/text_message.rs
index a9adb04..c06bcfa 100644
--- a/src/backend/message/text_message.rs
+++ b/src/backend/message/text_message.rs
@@ -2,9 +2,9 @@ use crate::prelude::*;

use libsignal_service::content::Reaction;
use libsignal_service::proto::DataMessage;
-use libsignal_service::proto::body_range::AssociatedValue;
+use libsignal_service::proto::body_range::{AssociatedValue, Style as BodyRangeStyle};
use libsignal_service::proto::data_message::Delete;
-use pango::{AttrColor, AttrList};
+use pango::{AttrColor, AttrInt, AttrList, AttrString, Style as PangoStyle, Weight};

use crate::backend::timeline::{TimelineItem, TimelineItemExt};
use crate::backend::{Attachment, Channel, Contact};
@@ -19,6 +19,48 @@ gtk::glib::wrapper! {
const MENTION_CHAR: char = '@';
const MENTION_COLOR: (u16, u16, u16) = (0, 0, u16::MAX);

+/// Convert a Signal [BodyRangeStyle] into the pango attributes that render
+/// the same visual style. Spoilers are approximated as a black-on-black
+/// span as pango has no native spoiler primitive.
+fn style_to_pango_attrs(
+ style: BodyRangeStyle,
+ start_byte: u32,
+ end_byte: u32,
+) -> Vec<pango::Attribute> {
+ fn span<A: Into<pango::Attribute>>(attr: A, start: u32, end: u32) -> pango::Attribute {
+ let mut attr: pango::Attribute = attr.into();
+ attr.set_start_index(start);
+ attr.set_end_index(end);
+ attr
+ }
+
+ match style {
+ BodyRangeStyle::Bold => vec![span(
+ AttrInt::new_weight(Weight::Bold),
+ start_byte,
+ end_byte,
+ )],
+ BodyRangeStyle::Italic => vec![span(
+ AttrInt::new_style(PangoStyle::Italic),
+ start_byte,
+ end_byte,
+ )],
+ BodyRangeStyle::Strikethrough => {
+ vec![span(AttrInt::new_strikethrough(true), start_byte, end_byte)]
+ }
+ BodyRangeStyle::Monospace => vec![span(
+ AttrString::new_family("monospace"),
+ start_byte,
+ end_byte,
+ )],
+ BodyRangeStyle::Spoiler => vec![
+ span(AttrColor::new_foreground(0, 0, 0), start_byte, end_byte),
+ span(AttrColor::new_background(0, 0, 0), start_byte, end_byte),
+ ],
+ BodyRangeStyle::None => Vec::new(),
+ }
+}
+
impl TextMessage {
pub fn from_text_channel_sender<S: AsRef<str>>(
text: S,
@@ -65,14 +107,16 @@ impl TextMessage {
.build();

let text_owned = text.as_ref().to_owned();
- let body = if text_owned.is_empty() {
- None
+ let (body, body_ranges) = if text_owned.is_empty() {
+ (None, Vec::new())
} else {
- Some(text_owned)
+ let (cleaned, ranges) = super::parse_formatting(&text_owned);
+ (Some(cleaned), ranges)
};

let message = DataMessage {
body,
+ body_ranges,
timestamp: Some(timestamp),
..Default::default()
};
@@ -245,10 +289,17 @@ impl TextMessage {
self.notify_body();
}

- /// Formats the message body based on its ranges, e.g. to insert mention names.
+ /// Format the message body based on its body ranges.
+ ///
+ /// This both substitutes mentions with the resolved participant name and
+ /// applies styling (bold, italic, monospace, strikethrough, spoiler) as
+ /// pango attributes on the resulting text.
///
- /// Returns the resulting strings and an [AttrList] that can be used in labels to highlight areas.
- /// Be carefull when editing this function and note that Signal uses UTF-16 byte offsets, while Rust uses UTF-8 byte offsets.
+ /// Note that Signal uses UTF-16 byte offsets, while Rust strings use
+ /// UTF-8. The implementation maintains an explicit per-utf16-index
+ /// mapping into the resulting UTF-8 string so that styles applied to a
+ /// range that survives a mention substitution still land on the right
+ /// bytes.
async fn format_body(&self) -> (Option<String>, AttrList) {
let Some(body) = self.internal_data().and_then(|m| m.body) else {
return (None, AttrList::new());
@@ -264,53 +315,112 @@ impl TextMessage {

let channel = self.channel();

- // Sort by growing start index
+ // Sort by growing start index so mention substitutions happen left-to-right.
ranges.sort_unstable_by_key(|r| r.start());

- let attrs = AttrList::new();
-
- // Signal (Java) uses UTF-16 body and therefore also UTF-16 offsets, while Flare (Rust) uses UTF-8. Need to convert.
- let body_utf16: Vec<u16> = body.encode_utf16().collect();
-
- let mut result_utf8 = String::new();
- let mut index_utf16 = 0;
- let mut index_utf8 = 0;
- for r in ranges {
- let start = r.start() as usize;
- let end = start + r.length() as usize;
- let uuid = match r.associated_value {
+ // Resolve mention names asynchronously up front so the rest of the
+ // formatting can be a synchronous walk.
+ let mut mentions: Vec<(usize, usize, String)> = Vec::new();
+ for r in &ranges {
+ let uuid = match &r.associated_value {
Some(AssociatedValue::MentionAci(u)) => u.parse().ok(),
Some(AssociatedValue::MentionAciBinary(u)) => {
- u.try_into().ok().map(Uuid::from_bytes)
+ u.clone().try_into().ok().map(Uuid::from_bytes)
}
_ => None,
};
- let Some(uuid) = uuid else {
+ if let Some(uuid) = uuid {
+ let start = r.start() as usize;
+ let end = (r.start() + r.length()) as usize;
+ let name = format!(
+ "{}{}",
+ MENTION_CHAR,
+ channel.participant_by_uuid(uuid).await.title()
+ );
+ mentions.push((start, end, name));
+ }
+ }
+ // Mentions cannot overlap each other; ensure the iterator order is stable.
+ mentions.sort_unstable_by_key(|(s, _, _)| *s);
+
+ let body_utf16: Vec<u16> = body.encode_utf16().collect();
+ let attrs = AttrList::new();
+
+ // Build the result string while constructing a per-utf16-index map
+ // into the resulting UTF-8 byte offsets.
+ let mut byte_at: Vec<usize> = Vec::with_capacity(body_utf16.len() + 1);
+ let mut result_utf8 = String::new();
+ let mut mention_iter = mentions.into_iter().peekable();
+
+ let mut i = 0;
+ while i < body_utf16.len() {
+ // Inject mention substitutions at their start position.
+ if mention_iter
+ .peek()
+ .is_some_and(|(m_start, _, _)| *m_start == i)
+ {
+ let (m_start, m_end, name) = mention_iter.next().expect("peeked entry to exist");
+ let mention_byte_start = result_utf8.len();
+ // Mark every UTF-16 index inside the mention span as the start
+ // of the substituted text. Indices >= m_end will be filled by
+ // subsequent iterations.
+ for _ in m_start..m_end {
+ byte_at.push(mention_byte_start);
+ }
+ result_utf8.push_str(&name);
+
+ let mut highlight =
+ AttrColor::new_foreground(MENTION_COLOR.0, MENTION_COLOR.1, MENTION_COLOR.2);
+ highlight.set_start_index(mention_byte_start as u32);
+ highlight.set_end_index(result_utf8.len() as u32);
+ attrs.insert(highlight);
+
+ i = m_end.min(body_utf16.len());
continue;
- };
- let name = format!(
- "{}{}",
- MENTION_CHAR,
- channel.participant_by_uuid(uuid).await.title()
- );
- let to_add_body = String::from_utf16_lossy(&body_utf16[index_utf16..start]);
- result_utf8.push_str(&to_add_body);
- result_utf8.push_str(&name);
- index_utf16 = end;
-
- let index_start_highlight = index_utf8 + to_add_body.len();
- index_utf8 += to_add_body.len() + name.len();
- let index_end_highlight = index_utf8;
-
- let (red, green, blue) = MENTION_COLOR;
- let mut highlight = AttrColor::new_foreground(red, green, blue);
- highlight.set_start_index(index_start_highlight as u32);
- highlight.set_end_index(index_end_highlight as u32);
- attrs.insert(highlight);
+ }
+
+ byte_at.push(result_utf8.len());
+ let unit = body_utf16[i];
+ if (0xD800..=0xDBFF).contains(&unit) && i + 1 < body_utf16.len() {
+ // High surrogate: consume the pair as one codepoint.
+ let pair = [unit, body_utf16[i + 1]];
+ let decoded = char::decode_utf16(pair.iter().copied())
+ .next()
+ .and_then(|r| r.ok())
+ .unwrap_or('\u{FFFD}');
+ result_utf8.push(decoded);
+ byte_at.push(result_utf8.len());
+ i += 2;
+ } else {
+ let decoded = char::decode_utf16([unit].iter().copied())
+ .next()
+ .and_then(|r| r.ok())
+ .unwrap_or('\u{FFFD}');
+ result_utf8.push(decoded);
+ i += 1;
+ }
}
+ byte_at.push(result_utf8.len());

- if index_utf16 < body_utf16.len() {
- result_utf8.push_str(&String::from_utf16_lossy(&body_utf16[index_utf16..]))
+ // Apply style ranges using the byte-offset map.
+ for r in ranges {
+ let Some(AssociatedValue::Style(s)) = r.associated_value else {
+ continue;
+ };
+ let style = match BodyRangeStyle::try_from(s) {
+ Ok(BodyRangeStyle::None) | Err(_) => continue,
+ Ok(other) => other,
+ };
+ let start_utf16 = (r.start() as usize).min(byte_at.len() - 1);
+ let end_utf16 = ((r.start() + r.length()) as usize).min(byte_at.len() - 1);
+ if start_utf16 >= end_utf16 {
+ continue;
+ }
+ let start_byte = byte_at[start_utf16] as u32;
+ let end_byte = byte_at[end_utf16] as u32;
+ for attr in style_to_pango_attrs(style, start_byte, end_byte) {
+ attrs.insert(attr);
+ }
}

(Some(result_utf8), attrs)
--
2.53.0

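As an aside between the two patches: the marker-pairing and offset accounting that patch 2 describes can be sketched independently of the Signal types. The following is a simplified illustration, not the patch's code — it uses plain `(start, length, style)` tuples in place of `libsignal_service`'s protobuf `BodyRange`, but mirrors the same first-occurrence-opens/next-occurrence-closes pairing and the UTF-16 code-unit offsets:

```rust
use std::collections::HashMap;

#[derive(Clone, Copy, Debug, Hash, Eq, PartialEq)]
enum Style { Bold, Italic, Strikethrough, Spoiler, Monospace }

/// Detect a marker token at `chars[i]`; returns the style and token length.
fn detect(chars: &[char], i: usize) -> Option<(Style, usize)> {
    let cur = *chars.get(i)?;
    let next = chars.get(i + 1).copied();
    match (cur, next) {
        ('*', Some('*')) => Some((Style::Bold, 2)),
        ('~', Some('~')) => Some((Style::Strikethrough, 2)),
        ('|', Some('|')) => Some((Style::Spoiler, 2)),
        ('*', _) | ('_', _) => Some((Style::Italic, 1)),
        ('`', _) => Some((Style::Monospace, 1)),
        _ => None,
    }
}

/// Strip matched markers and return the clean text plus
/// (start, length, style) ranges in UTF-16 code units.
fn parse(input: &str) -> (String, Vec<(u32, u32, Style)>) {
    let chars: Vec<char> = input.chars().collect();
    // Pair markers: first occurrence opens, next of the same kind closes.
    let mut open: HashMap<Style, (usize, usize)> = HashMap::new();
    let mut spans = Vec::new();
    let mut i = 0;
    while i < chars.len() {
        if let Some((style, len)) = detect(&chars, i) {
            if let Some((start, mlen)) = open.remove(&style) {
                spans.push((style, start, i, mlen));
            } else {
                open.insert(style, (i, len));
            }
            i += len;
        } else {
            i += 1;
        }
    }
    // Mark matched marker characters for removal from the output.
    let mut skip = vec![false; chars.len()];
    for &(_, s, e, mlen) in &spans {
        for k in s..s + mlen { skip[k] = true; }
        for k in e..e + mlen { skip[k] = true; }
    }
    // Build the output while mapping each input char to its UTF-16 offset.
    let mut out = String::new();
    let mut map = vec![0u32; chars.len() + 1];
    let mut u16s = 0u32;
    for (k, c) in chars.iter().enumerate() {
        map[k] = u16s;
        if !skip[k] { out.push(*c); u16s += c.len_utf16() as u32; }
    }
    map[chars.len()] = u16s;
    // Translate spans to (start, length, style), dropping empty ones.
    let mut ranges: Vec<_> = spans.iter()
        .map(|&(st, s, e, mlen)| (map[s + mlen], map[e] - map[s + mlen], st))
        .filter(|&(_, l, _)| l > 0)
        .collect();
    ranges.sort_by_key(|r| r.0);
    (out, ranges)
}
```

For example, `parse("hello **world**!")` yields `("hello world!", [(6, 5, Style::Bold)])`, the same offsets as the `surrounding_text_preserved` unit test in the patch above.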
919
patches/flare/0003-feat-messages-Implement-edited-messages.patch
Normal file
@@ -0,0 +1,919 @@
From 2437960d0fe4daf512ae77fd99bba77e86c70ce9 Mon Sep 17 00:00:00 2001
From: Simon Gardling <titaniumtown@proton.me>
Date: Wed, 29 Apr 2026 19:33:06 -0400
Subject: [PATCH 3/6] feat(messages): Implement edited messages

- Receive incoming EditMessage (1-1 and sync) and replace the body,
body_ranges, and attachments of the targeted message in place. The
receive path uses an EditMessageItem wrapper that mirrors the
DeletionMessage flow.
- Send EditMessage from the message context menu: an Edit action loads
the original body into the input bar and a dedicated indicator
takes the place of the reply hint while editing. Submitting the
edited text dispatches an EditMessage to the channel via a new
Channel::send_internal_content helper that forwards any ContentBody
to the right send path.
- Display an 'edited' label in the message indicators when the local
copy of a message has been replaced by an edit.
---
CHANGELOG.md | 1 +
data/resources/ui/channel_messages.blp | 89 +++++++++++++++++++
data/resources/ui/components/indicators.blp | 10 +++
data/resources/ui/message_item.blp | 10 +++
src/backend/channel.rs | 94 ++++++++++++++++++++-
src/backend/message/edit_message_item.rs | 66 +++++++++++++++
src/backend/message/formatting.rs | 11 ++-
src/backend/message/mod.rs | 71 ++++++++++++++++
src/backend/message/text_message.rs | 62 ++++++++++++++
src/gui/channel_messages.rs | 67 +++++++++++++++
src/gui/components/indicators.rs | 2 +
src/gui/components/item_row.rs | 58 ++++++-------
src/gui/message_item.rs | 27 ++++++
13 files changed, 530 insertions(+), 38 deletions(-)
create mode 100644 src/backend/message/edit_message_item.rs

diff --git a/CHANGELOG.md b/CHANGELOG.md
index 50cd5f5..0338ed8 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -12,6 +12,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

- Settings to enable or disable sending and showing typing indicators.
- Render formatted message styles (bold, italic, strikethrough, spoiler, monospace) on incoming messages.
- Send formatted messages with markdown-style markers (`**bold**`, `*italic*`, `~~strike~~`, `||spoiler||`, `` `monospace` ``).
+- Display incoming edited messages with an `edited` indicator and edit your own sent messages from their context menu.

## [0.20.4] - 2026-04-22

diff --git a/data/resources/ui/channel_messages.blp b/data/resources/ui/channel_messages.blp
index 6c3948f..f3d2348 100644
--- a/data/resources/ui/channel_messages.blp
+++ b/data/resources/ui/channel_messages.blp
@@ -238,6 +238,95 @@ template $FlChannelMessages: Box {
}
}

+ // Editing indicator
+ Box {
+ vexpand-set: true;
+
+ styles [
+ "currently-replied-box",
+ ]
+
+ visible: bind $is_some(template.editing-message) as <bool>;
+
+ Image {
+ styles [
+ "accent",
+ ]
+
+ icon-name: "document-edit-symbolic";
+ width-request: 34;
+ }
+
+ Separator {
+ styles [
+ "accent",
+ ]
+
+ margin-start: 6;
+ width-request: 2;
+ }
+
+ Grid {
+ hexpand: true;
+ margin-start: 12;
+ margin-end: 12;
+
+ Label {
+ styles [
+ "heading",
+ ]
+
+ halign: start;
+ label: _("Editing message");
+ wrap: true;
+ wrap-mode: word_char;
+ lines: 1;
+ ellipsize: end;
+
+ layout {
+ row: 0;
+ column: 0;
+ }
+ }
+
+ Label {
+ styles [
+ "message-text",
+ ]
+
+ halign: fill;
+ label: bind template.editing-message as <$FlTextMessage>.body;
+ wrap: true;
+ wrap-mode: word_char;
+ lines: 2;
+ ellipsize: end;
+ xalign: 0;
+
+ layout {
+ row: 1;
+ column: 0;
+ }
+ }
+ }
+
+ Button {
+ accessibility {
+ label: C_("accessibility", "Cancel editing");
+ }
+
+ tooltip-text: C_("tooltip", "Cancel editing");
+
+ styles [
+ "flat",
+ "circular",
+ ]
+
+ valign: center;
+ clicked => $cancel_edit() swapped;
+ icon-name: "window-close-symbolic";
+ }
+ }
+
// Box for attachments
Box {
vexpand-set: true;
diff --git a/data/resources/ui/components/indicators.blp b/data/resources/ui/components/indicators.blp
index f6c51f6..977f1c4 100644
--- a/data/resources/ui/components/indicators.blp
+++ b/data/resources/ui/components/indicators.blp
@@ -8,6 +8,16 @@ template $FlMessageIndicators {
halign: end;
valign: end;

+ Label edited_label {
+ styles [
+ "dim-label",
+ "caption",
+ ]
+
+ visible: bind template.edited;
+ label: _("edited");
+ }
+
Label message_info_label {
styles [
"dim-label",
diff --git a/data/resources/ui/message_item.blp b/data/resources/ui/message_item.blp
index 82c018b..2c21b8b 100644
--- a/data/resources/ui/message_item.blp
+++ b/data/resources/ui/message_item.blp
@@ -16,6 +16,14 @@ menu message-menu {
icon: "mail-reply-sender-symbolic";
}

+ item {
+ label: _("Edit");
+ action: "msg.edit";
+ verb-icon: "document-edit-symbolic";
+ icon: "document-edit-symbolic";
+ hidden-when: "action-disabled";
+ }
+
item {
label: _("Delete");
action: "msg.delete";
@@ -243,6 +251,7 @@ template $FlMessageItem: $ContextMenuBin {
valign: end;
halign: end;
timestamp: bind $format_time_human(template.message as <$FlTextMessage>.datetime) as <string>;
+ edited: bind template.message as <$FlTextMessage>.is-edited;
}
}

@@ -253,6 +262,7 @@ template $FlMessageItem: $ContextMenuBin {
indicators: $FlMessageIndicators timestamp {
timestamp: bind $format_time_human(template.message as <$FlTextMessage>.datetime) as <string>;
visible: bind $not_empty(template.message as <$FlTextMessage>.body) as <bool>;
+ edited: bind template.message as <$FlTextMessage>.is-edited;
};
}

diff --git a/src/backend/channel.rs b/src/backend/channel.rs
index 4bb1d38..94fa6bb 100644
--- a/src/backend/channel.rs
+++ b/src/backend/channel.rs
@@ -1,8 +1,8 @@
use crate::backend::{
Contact, Manager, Message,
message::{
- DeletionMessage, DisplayMessage, DisplayMessageExt, MessageExt, ReactionMessage,
- TextMessage,
+ DeletionMessage, DisplayMessage, DisplayMessageExt, EditMessageItem, MessageExt,
+ ReactionMessage, TextMessage,
},
timeline::{TimelineItem, TimelineItemExt},
};
@@ -336,6 +336,17 @@ impl Channel {
message.react(reaction);
}
}
+
+ // Apply pending edits queued while the original was unloaded.
+ // Edits are stored ordered by ascending edit-timestamp so
+ // applying them in sequence converges on the latest content.
+ let pending = self.imp().pending_edits.borrow_mut().remove(&id);
+ if let Some(edits) = pending {
+ log::trace!("Applying {} pending edit(s) to message {id}", edits.len());
+ for edit in edits {
+ message.apply_edit(edit).await;
+ }
+ }
}

// Apply reactions or store them.
@@ -420,6 +431,46 @@ impl Channel {
log::trace!("Deletion message aimed at a unloaded message. Will be ignored");
}
}
+
+ // Edit messages: replace the original message's body with the new
+ // content and remember that the message was edited.
+ if let Some(edit_item) = message.dynamic_cast_ref::<EditMessageItem>() {
+ let Some(target_ts) = edit_item.target_timestamp() else {
+ log::warn!("Got an EditMessage without a target timestamp; ignoring.");
+ return Ok(());
+ };
+ let Some(new_data) = edit_item.edit().and_then(|e| e.data_message) else {
+ log::warn!("Got an EditMessage without a data_message; ignoring.");
+ return Ok(());
+ };
+ crate::trace!(
+ "Channel {} got an edit message targeting timestamp: {}",
+ self.title(),
+ target_ts
+ );
+ let edited_msg = self
+ .imp()
+ .timeline
+ .borrow()
+ .get_by_timestamp(target_ts)
+ .and_then(|o| o.dynamic_cast::<TextMessage>().ok());
+ if let Some(edited_msg) = edited_msg {
+ edited_msg.apply_edit(new_data).await;
+ self.notify("last-message");
+ } else {
+ log::trace!(
+ "Edit target {target_ts} not loaded yet; queueing for when it lands."
+ );
+ let edit_ts = new_data.timestamp.unwrap_or(0);
+ let mut pending = self.imp().pending_edits.borrow_mut();
+ let entry = pending.entry(target_ts).or_default();
+ let to_insert = entry
+ .binary_search_by_key(&edit_ts, |d| d.timestamp.unwrap_or(0))
+ .unwrap_or_else(|e| e);
+ entry.insert(to_insert, new_data);
+ }
+ }
+
Ok(())
}

@@ -482,6 +533,38 @@ impl Channel {
Ok(())
}

+ /// Send an arbitrary [ContentBody] to this channel, dispatching to the
+ /// appropriate single-contact or group send path.
+ pub(super) async fn send_internal_content(
+ &self,
+ body: impl Into<libsignal_service::content::ContentBody>,
+ timestamp: u64,
+ ) -> Result<(), crate::ApplicationError> {
+ let manager = self.manager();
+ let body = body.into();
+ let receiver_contact = self
+ .imp()
+ .contact
+ .borrow()
+ .as_ref()
+ .and_then(|c| c.address());
+
+ if let Some(contact) = receiver_contact {
+ manager.send_message(contact, body, timestamp).await?;
+ } else {
+ let group_master_key = self
+ .imp()
+ .group_context
+ .borrow()
+ .as_ref()
+ .and_then(|c| c.master_key.clone());
+ if let Some(key) = group_master_key {
+ manager.send_message_to_group(key, body, timestamp).await?;
+ }
+ }
+ Ok(())
+ }
+
/// Send a message to the channel and add it to the channel.
pub async fn send_message(&self, msg: Message) -> Result<(), crate::ApplicationError> {
msg.mark_as_read();
@@ -749,7 +832,7 @@ mod imp {

use gdk::Paintable;

- use libsignal_service::proto::GroupContextV2;
+ use libsignal_service::proto::{DataMessage, GroupContextV2};
use presage::model::groups::Group;

#[derive(Default, glib::Properties)]
|
||||
@@ -762,6 +845,11 @@ mod imp {
|
||||
pub(super) participants: RefCell<Vec<Contact>>,
|
||||
|
||||
pub(super) pending_reactions: RefCell<HashMap<u64, Vec<ReactionMessage>>>,
|
||||
+ /// Edits whose target message is not yet in the timeline. Cold
|
||||
+ /// scrollback walks newest-first, so an EditMessage may arrive in
|
||||
+ /// `do_new_message` before its original TextMessage has been
|
||||
+ /// loaded; the original picks up any queued edits when it lands.
|
||||
+ pub(super) pending_edits: RefCell<HashMap<u64, Vec<DataMessage>>>,
|
||||
pub(super) typing: RefCell<HashMap<Uuid, TypingNotification>>,
|
||||
|
||||
#[property(name = "avatar", get = Self::avatar)]
|
diff --git a/src/backend/message/edit_message_item.rs b/src/backend/message/edit_message_item.rs
new file mode 100644
index 0000000..9655f50
--- /dev/null
+++ b/src/backend/message/edit_message_item.rs
@@ -0,0 +1,66 @@
+use crate::backend::timeline::TimelineItem;
+use crate::backend::{Channel, Contact};
+use crate::prelude::*;
+
+use libsignal_service::proto::EditMessage;
+
+use super::{Manager, Message};
+
+gtk::glib::wrapper! {
+ /// An incoming edit-message wrapper carrying the new content for a
+ /// previously-sent message identified by its sent-timestamp.
+ pub struct EditMessageItem(ObjectSubclass<imp::EditMessageItem>) @extends Message, TimelineItem;
+}
+
+impl EditMessageItem {
+ pub fn from_edit(
+ sender: &Contact,
+ channel: &Channel,
+ timestamp: u64,
+ manager: &Manager,
+ edit: EditMessage,
+ ) -> Self {
+ let s: Self = Object::builder::<Self>()
+ .property("sender", sender)
+ .property("channel", channel)
+ .property("timestamp", timestamp)
+ .property("manager", manager)
+ .build();
+ s.imp().edit.swap(&RefCell::new(Some(edit)));
+ s
+ }
+
+ /// Sent-timestamp of the message this edit replaces.
+ pub fn target_timestamp(&self) -> Option<u64> {
+ self.edit().and_then(|e| e.target_sent_timestamp)
+ }
+
+ /// The new [DataMessage] payload that replaces the targeted message's
+ /// content.
+ pub fn edit(&self) -> Option<EditMessage> {
+ self.imp().edit.borrow().clone()
+ }
+}
+
+mod imp {
+ use crate::backend::{Message, message::MessageImpl, timeline::TimelineItemImpl};
+ use crate::prelude::*;
+
+ use libsignal_service::proto::EditMessage;
+
+ #[derive(Default)]
+ pub struct EditMessageItem {
+ pub(super) edit: RefCell<Option<EditMessage>>,
+ }
+
+ #[glib::object_subclass]
+ impl ObjectSubclass for EditMessageItem {
+ const NAME: &'static str = "FlEditMessageItem";
+ type Type = super::EditMessageItem;
+ type ParentType = Message;
+ }
+
+ impl TimelineItemImpl for EditMessageItem {}
+ impl MessageImpl for EditMessageItem {}
+ impl ObjectImpl for EditMessageItem {}
+}
diff --git a/src/backend/message/formatting.rs b/src/backend/message/formatting.rs
index 5a1d596..ed12a85 100644
--- a/src/backend/message/formatting.rs
+++ b/src/backend/message/formatting.rs
@@ -108,13 +108,12 @@ pub fn parse_formatting(input: &str) -> (String, Vec<BodyRange>) {
// Mark which character positions are part of a matched marker token and
// therefore must be removed from the cleaned output.
let mut skip = vec![false; chars.len()];
+ let total = chars.len();
for sp in &spans {
- for k in sp.open_pos..(sp.open_pos + sp.marker_len).min(chars.len()) {
- skip[k] = true;
- }
- for k in sp.close_pos..(sp.close_pos + sp.marker_len).min(chars.len()) {
- skip[k] = true;
- }
+ let open_end = (sp.open_pos + sp.marker_len).min(total);
+ skip[sp.open_pos..open_end].fill(true);
+ let close_end = (sp.close_pos + sp.marker_len).min(total);
+ skip[sp.close_pos..close_end].fill(true);
}

// Build the cleaned output and a per-input-char map into the output's
diff --git a/src/backend/message/mod.rs b/src/backend/message/mod.rs
index 4e0f584..f3a0537 100644
--- a/src/backend/message/mod.rs
+++ b/src/backend/message/mod.rs
@@ -1,6 +1,7 @@
mod call_message;
mod deletion_message;
mod display_message;
+mod edit_message_item;
mod formatting;
mod reaction_message;
mod text_message;
@@ -8,6 +9,7 @@ mod text_message;
pub use call_message::{CallMessage, CallMessageType};
pub use deletion_message::DeletionMessage;
pub use display_message::{DisplayMessage, DisplayMessageExt};
+pub use edit_message_item::EditMessageItem;
pub use formatting::parse_formatting;
pub use reaction_message::ReactionMessage;
pub use text_message::TextMessage;
@@ -253,6 +255,75 @@ impl Message {
.upcast(),
)
}
+ // An edit-message replacing the body of an earlier message.
+ ContentBody::EditMessage(edit) => {
+ let Some(data_message) = edit.data_message.as_ref() else {
+ log::warn!("Got an EditMessage without a data_message; ignoring.");
+ return None;
+ };
+ let channel = manager
+ .channel_from_uuid_or_group(metadata.sender, &data_message.group_v2)
+ .await;
+ let contact = channel
+ .participant_by_uuid(metadata.sender.raw_uuid())
+ .await;
+ if contact.is_blocked() {
+ log::debug!("Got message from a blocked contact. Ignoring");
+ return None;
+ }
+ log::trace!("Got an edit message");
+ Some(
+ EditMessageItem::from_edit(
+ &contact,
+ &channel,
+ timestamp,
+ manager,
+ edit.clone(),
+ )
+ .upcast(),
+ )
+ }
+ // An edit-message sent from another device of the same account.
+ ContentBody::SynchronizeMessage(SyncMessage {
+ sent:
+ Some(
+ sent @ Sent {
+ edit_message: Some(edit),
+ ..
+ },
+ ),
+ ..
+ }) => {
+ let Some(data_message) = edit.data_message.as_ref() else {
+ log::warn!("Got a sync EditMessage without a data_message; ignoring.");
+ return None;
+ };
+ let channel = manager
+ .channel_from_uuid_or_group(
+ sent.parse_destination_service_id()
+ .unwrap_or(metadata.sender),
+ &data_message.group_v2,
+ )
+ .await;
+ let contact = channel
+ .participant_by_uuid(metadata.sender.raw_uuid())
+ .await;
+ if contact.is_blocked() {
+ log::debug!("Got message from a blocked contact. Ignoring");
+ return None;
+ }
+ log::trace!("Got an edit message (sync)");
+ Some(
+ EditMessageItem::from_edit(
+ &contact,
+ &channel,
+ timestamp,
+ manager,
+ edit.clone(),
+ )
+ .upcast(),
+ )
+ }
// Call message.
ContentBody::CallMessage(c) => {
// TODO: Group calls?
diff --git a/src/backend/message/text_message.rs b/src/backend/message/text_message.rs
index c06bcfa..ff7aaaa 100644
--- a/src/backend/message/text_message.rs
+++ b/src/backend/message/text_message.rs
@@ -199,6 +199,66 @@ impl TextMessage {
self.set_property("is-deleted", true);
}

+ /// Replace the message's body with `new_data` and mark the message as
+ /// edited so the UI can surface this to the user.
+ pub async fn apply_edit(&self, new_data: DataMessage) {
+ let new_body = new_data.body.clone();
+ let new_body_ranges = new_data.body_ranges.clone();
+ let edit_attachments = new_data.attachments.clone();
+ if let Some(data) = self.internal_data_mut().as_mut() {
+ data.body = new_body;
+ data.body_ranges = new_body_ranges;
+ // Replacing attachments matches Signal Desktop's behaviour;
+ // a typical edit only changes the body but the protocol allows
+ // updating the attachments as well.
+ if !edit_attachments.is_empty() {
+ data.attachments = edit_attachments;
+ }
+ }
+ self.set_property("is-edited", true);
+ self.prepare_format_body().await;
+ }
+
+ /// Send an edit for this message, replacing its body with `text`.
+ pub async fn send_edit<S: AsRef<str>>(&self, text: S) -> Result<(), crate::ApplicationError> {
+ let target_sent_timestamp = Some(self.timestamp());
+ let send_timestamp = std::time::SystemTime::now()
+ .duration_since(std::time::UNIX_EPOCH)
+ .expect("Time went backwards")
+ .as_millis() as u64;
+
+ let cleaned = text.as_ref().to_owned();
+ let (body, body_ranges) = if cleaned.is_empty() {
+ (None, Vec::new())
+ } else {
+ let (body, ranges) = super::parse_formatting(&cleaned);
+ (Some(body), ranges)
+ };
+
+ // Carry forward the original message's structural fields (quote,
+ // attachments, expire_timer, sticker, group_v2, etc.) so peers do
+ // not see them cleared when applying the edit. Only body,
+ // body_ranges, and timestamp differ.
+ let mut inner = self.internal_data().unwrap_or_default();
+ inner.body = body;
+ inner.body_ranges = body_ranges;
+ inner.timestamp = Some(send_timestamp);
+
+ let edit = libsignal_service::proto::EditMessage {
+ target_sent_timestamp,
+ data_message: Some(inner.clone()),
+ };
+
+ self.channel()
+ .send_internal_content(edit, send_timestamp)
+ .await?;
+
+ // Mirror the change locally.
+ self.apply_edit(inner).await;
+ self.channel().notify("last-message");
+ Ok(())
+ }
+
/// Send a reaction for a message and apply it.
pub async fn send_reaction<S: AsRef<str>>(
&self,
@@ -462,6 +522,8 @@ mod imp {
pub(super) message_attributes: RefCell<AttrList>,
#[property(get, set)]
pub(super) is_deleted: RefCell<bool>,
+ #[property(get, set)]
+ pub(super) is_edited: RefCell<bool>,
}

impl TextMessage {
diff --git a/src/gui/channel_messages.rs b/src/gui/channel_messages.rs
index c6684fc..b929957 100644
--- a/src/gui/channel_messages.rs
+++ b/src/gui/channel_messages.rs
@@ -2,6 +2,7 @@ use crate::prelude::*;
use gio::SettingsBindFlags;

use crate::ApplicationError;
+use crate::backend::message::TextMessage;

const MESSAGES_REQUEST_LOAD: usize = 10;

@@ -24,6 +25,19 @@ glib::wrapper! {
}

impl ChannelMessages {
+ /// Begin editing `msg`: load its body into the text entry, mark it as
+ /// the editing target, and clear any pending reply.
+ pub fn start_editing(&self, msg: Option<TextMessage>) {
+ if let Some(msg) = msg.as_ref() {
+ self.set_reply_message(None::<TextMessage>);
+ self.imp()
+ .text_entry
+ .set_text(msg.body().unwrap_or_default());
+ }
+ self.set_editing_message(msg);
+ self.imp().text_entry.grab_focus();
+ }
+
pub fn focus_input(&self) {
self.imp().text_entry.grab_focus();
}
@@ -360,6 +374,8 @@ pub mod imp {
active_channel: RefCell<Option<Channel>>,
#[property(get, set, nullable)]
reply_message: RefCell<Option<TextMessage>>,
+ #[property(get, set, nullable)]
+ editing_message: RefCell<Option<TextMessage>>,

#[property(get, set, default = true)]
sticky: Cell<bool>,
@@ -482,6 +498,13 @@ pub mod imp {
self.obj().set_reply_message(None::<TextMessage>);
}

+ #[template_callback]
+ fn cancel_edit(&self) {
+ log::trace!("Unsetting editing message");
+ self.obj().set_editing_message(None::<TextMessage>);
+ self.text_entry.clear();
+ }
+
#[template_callback]
fn remove_attachments(&self) {
log::trace!("Unsetting attachments");
@@ -587,6 +610,33 @@ pub mod imp {
};
self.obj().notify("has-attachments");

+ // If we are editing an existing message, send an EditMessage
+ // instead of constructing a new one.
+ if let Some(target) = self.obj().editing_message() {
+ self.obj().set_editing_message(None::<TextMessage>);
+ if text.is_empty() {
+ log::warn!("Refusing to send an empty edit; dropping the change.");
+ return;
+ }
+ let obj = self.obj();
+ gspawn!(clone!(
+ #[strong]
+ obj,
+ async move {
+ if let Err(e) = target.send_edit(text).await {
+ let root = obj
+ .root()
+ .expect("`ChannelMessages` to have a root")
+ .dynamic_cast::<crate::gui::Window>()
+ .expect("Root of `ChannelMessages` to be a `Window`.");
+ let dialog = ErrorDialog::new(&e, &root);
+ dialog.present(Some(&root));
+ }
+ }
+ ));
+ return;
+ }
+
if text.is_empty() && attachments.is_empty() {
log::warn!("Got requested to send empty message, skipping");
}
@@ -685,6 +735,22 @@ pub mod imp {
}
),
);
+ widget.connect_local(
+ "edit",
+ false,
+ clone!(
+ #[weak]
+ obj,
+ #[upgrade_or_default]
+ move |args| {
+ let msg = args[1].get::<Option<TextMessage>>().expect(
+ "Type of signal `edit` of `ItemRow` to be `TextMessage`.",
+ );
+ obj.start_editing(msg);
+ None
+ }
+ ),
+ );
let list_item = object.downcast_ref::<gtk::ListItem>().unwrap();
list_item.set_activatable(false);
list_item.set_selectable(false);
@@ -737,6 +803,7 @@ pub mod imp {
self,
move |_, _| {
s.obj().set_reply_message(None::<TextMessage>);
+ s.obj().set_editing_message(None::<TextMessage>);
if let Some(channel) = s.active_channel.borrow().as_ref() {
let draft = channel.property("draft");
// Block the typing buffer-changed handler so
diff --git a/src/gui/components/indicators.rs b/src/gui/components/indicators.rs
index ce38221..4356607 100644
--- a/src/gui/components/indicators.rs
+++ b/src/gui/components/indicators.rs
@@ -26,6 +26,8 @@ mod imp {
pub struct MessageIndicators {
#[property(get, set)]
pub(super) timestamp: RefCell<String>,
+ #[property(get, set)]
+ pub(super) edited: Cell<bool>,
//TODO: Implement sending state
//#[template_child]
//pub(super) sending_state_icon: TemplateChild<gtk::Image>,
diff --git a/src/gui/components/item_row.rs b/src/gui/components/item_row.rs
index b2c20d3..538b1bb 100644
--- a/src/gui/components/item_row.rs
+++ b/src/gui/components/item_row.rs
@@ -1,5 +1,3 @@
-use glib::SignalHandlerId;
-
use crate::prelude::*;

use crate::{
@@ -27,23 +25,25 @@ impl ItemRow {
fn timeline_item_to_widget(&self, item: &TimelineItem) -> Option<gtk::Widget> {
if let Some(message) = item.dynamic_cast_ref::<TextMessage>() {
let widget = MessageItem::new(message);
- let handler = widget.connect_local(
- "reply",
- false,
- clone!(
- #[weak(rename_to = s)]
- self,
- #[upgrade_or_default]
- move |args| {
- let msg = args[1]
- .get::<TextMessage>()
- .expect("Type of signal `reply` of `MessageItem` to be `TextMessage`.");
- s.emit_by_name::<()>("reply", &[&msg]);
- None
- }
- ),
- );
- self.set_handler(handler);
+ for signal in ["reply", "edit"] {
+ let handler = widget.connect_local(
+ signal,
+ false,
+ clone!(
+ #[weak(rename_to = s)]
+ self,
+ #[upgrade_or_default]
+ move |args| {
+ let msg = args[1]
+ .get::<TextMessage>()
+ .expect("Type of signal of `MessageItem` to be `TextMessage`.");
+ s.emit_by_name::<()>(signal, &[&msg]);
+ None
+ }
+ ),
+ );
+ self.imp().handlers.borrow_mut().push(handler);
+ }
Some(widget.dynamic_cast().unwrap())
} else if let Some(message) = item.dynamic_cast_ref::<CallMessage>() {
let widget = CallMessageItem::new(message);
@@ -53,12 +53,6 @@ impl ItemRow {
None
}
}
-
- /// Set the pending handler of the ItemRow.
- /// At most one handler may be pending.
- fn set_handler(&self, handler: SignalHandlerId) {
- self.imp().handler.replace(Some(handler));
- }
}

mod imp {
@@ -77,7 +71,7 @@ mod imp {
#[derive(Debug, Default, CompositeTemplate)]
#[template(resource = "/ui/components/item_row.ui")]
pub struct ItemRow {
- pub(super) handler: RefCell<Option<SignalHandlerId>>,
+ pub(super) handlers: RefCell<Vec<SignalHandlerId>>,
}

#[glib::object_subclass]
@@ -116,12 +110,15 @@ mod imp {
.get::<Option<TimelineItem>>()
.expect("ItemRow to only get TimelineItem");

- if let Some(handler) = self.handler.take() {
+ let handlers = self.handlers.take();
+ if !handlers.is_empty() {
if let Some(child) = obj.child() {
- child.disconnect(handler);
+ for handler in handlers {
+ child.disconnect(handler);
+ }
} else {
log::warn!(
- "A handler was set for an item row, but no child registered. This should not happen."
+ "Handlers were set for an item row, but no child registered. This should not happen."
);
}
}
@@ -143,6 +140,9 @@ mod imp {
Signal::builder("reply")
.param_types([TextMessage::static_type()])
.build(),
+ Signal::builder("edit")
+ .param_types([TextMessage::static_type()])
+ .build(),
]
});
SIGNALS.as_ref()
diff --git a/src/gui/message_item.rs b/src/gui/message_item.rs
index 21f504a..59d2778 100644
--- a/src/gui/message_item.rs
+++ b/src/gui/message_item.rs
@@ -94,6 +94,14 @@ impl MessageItem {
s.get_pressed_attachment().imp().open();
}
));
+ let action_edit = SimpleAction::new("edit", None);
+ action_edit.connect_activate(clone!(
+ #[weak(rename_to = s)]
+ self,
+ move |_, _| {
+ s.imp().handle_edit();
+ }
+ ));

let actions = SimpleActionGroup::new();
self.insert_action_group("msg", Some(&actions));
@@ -101,6 +109,7 @@ impl MessageItem {
actions.add_action(&action_delete);
actions.add_action(&action_copy);
actions.add_action(&action_download);
+ actions.add_action(&action_edit);
actions.add_action(&action_open);

self.bind_property("message", &action_delete, "enabled")
@@ -108,6 +117,13 @@ impl MessageItem {
.sync_create()
.build();

+ self.bind_property("message", &action_edit, "enabled")
+ .transform_to(|_, msg: Option<TextMessage>| {
+ msg.map(|m| m.sender().is_self() && m.body().is_some())
+ })
+ .sync_create()
+ .build();
+
self.bind_property("pressed-attachment", &action_download, "enabled")
.transform_to(|_, att: Option<Attachment>| Some(att.is_some()))
.sync_create()
@@ -550,6 +566,14 @@ pub mod imp {
gspawn!(async move { msg.delete().await });
}

+ #[template_callback]
+ pub(super) fn handle_edit(&self) {
+ let obj = self.obj();
+ let msg = obj.message();
+ crate::trace!("Editing a message: {}", msg.body().unwrap_or_default());
+ obj.emit_by_name::<()>("edit", &[&msg]);
+ }
+
// Signal uses the old unicode for the heart emoji, which is recognized as a black heart by gtk. This function converts it to the standard red heart
#[template_callback(function)]
pub(super) fn fix_emoji(emoji: Option<String>) -> Option<String> {
@@ -637,6 +661,9 @@ pub mod imp {
Signal::builder("reply")
.param_types([TextMessage::static_type()])
.build(),
+ Signal::builder("edit")
+ .param_types([TextMessage::static_type()])
+ .build(),
]
});
SIGNALS.as_ref()
--
2.53.0

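The out-of-order queueing in the channel.rs hunk above (`binary_search_by_key` plus ordered insert into `pending_edits`) can be sketched in isolation. `PendingEdits` and `Edit` below are illustrative stand-ins, not types from the patch; the real code queues `DataMessage`s keyed by the target's sent-timestamp:

```rust
use std::collections::HashMap;

/// Illustrative stand-in for the patch's `DataMessage`: only the fields
/// the queueing logic touches.
#[derive(Clone, Debug, PartialEq)]
struct Edit {
    timestamp: Option<u64>, // edit-send timestamp
    body: String,           // replacement content
}

/// Pending edits keyed by the target message's sent-timestamp, kept in
/// ascending edit-timestamp order, mirroring `pending_edits`.
#[derive(Default)]
struct PendingEdits {
    map: HashMap<u64, Vec<Edit>>,
}

impl PendingEdits {
    /// Insert preserving ascending edit-timestamp order, even when edits
    /// arrive out of order (cold scrollback walks newest-first).
    fn queue(&mut self, target_ts: u64, edit: Edit) {
        let entry = self.map.entry(target_ts).or_default();
        let ts = edit.timestamp.unwrap_or(0);
        // Err(pos) from binary_search is the sorted insertion point.
        let pos = entry
            .binary_search_by_key(&ts, |e| e.timestamp.unwrap_or(0))
            .unwrap_or_else(|p| p);
        entry.insert(pos, edit);
    }

    /// Drain the queue for a freshly loaded message; applying the drained
    /// edits in sequence converges on the newest content.
    fn take(&mut self, target_ts: u64) -> Vec<Edit> {
        self.map.remove(&target_ts).unwrap_or_default()
    }
}

fn main() {
    let mut pending = PendingEdits::default();
    // Edits arrive newest-first, as during scrollback.
    pending.queue(100, Edit { timestamp: Some(300), body: "third".into() });
    pending.queue(100, Edit { timestamp: Some(150), body: "second".into() });
    let edits = pending.take(100);
    // Applying in order ends on the latest body.
    assert_eq!(edits.last().map(|e| e.body.as_str()), Some("third"));
    println!("{:?}", edits.iter().map(|e| &e.body).collect::<Vec<_>>());
}
```

The `unwrap_or(0)` default matches the patch's choice: an edit with no timestamp sorts first, so any timestamped edit applied after it still wins.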
@@ -0,0 +1,674 @@
From 250b11530e8d29a42707dde8ff3dd516e0073863 Mon Sep 17 00:00:00 2001
From: Simon Gardling <titaniumtown@proton.me>
Date: Wed, 29 Apr 2026 19:53:22 -0400
Subject: [PATCH 4/6] feat(messages): Multi-select messages and delete for me

- Add a 'Select' action to the message context menu that puts the
channel into selection mode and pre-selects the message. While in
selection mode, every message item shows a check button, and a
toolbar appears offering a destructive 'Delete for me' action
plus a cancel button.
- Track selection state with a transient `selected` property on
`Message` and a `selection-mode` property on `Channel`. A new
`selection-changed` signal lets the channel-messages view update
the selection summary without polling.
- Add `Channel::delete_messages_locally` plus the matching
`Manager::delete_messages_locally` and a new
`Timeline::remove_by_timestamp` helper. The action only purges the
local copy and never sends a remote deletion.
---
CHANGELOG.md | 1 +
data/resources/style.css | 14 +++
data/resources/ui/channel_messages.blp | 60 ++++++++++
data/resources/ui/message_item.blp | 20 ++++
src/backend/channel.rs | 25 +++++
src/backend/manager.rs | 32 ++++++
src/backend/message/mod.rs | 2 +
src/backend/timeline/mod.rs | 16 +++
src/gui/channel_messages.rs | 129 ++++++++++++++++++++++++-
src/gui/message_item.rs | 119 +++++++++++++++++++++++
10 files changed, 417 insertions(+), 1 deletion(-)

diff --git a/CHANGELOG.md b/CHANGELOG.md
index 0338ed8..47ec77a 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -13,6 +13,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
- Render formatted message styles (bold, italic, strikethrough, spoiler, monospace) on incoming messages.
- Send formatted messages with markdown-style markers (`**bold**`, `*italic*`, `~~strike~~`, `||spoiler||`, `` `monospace` ``).
- Display incoming edited messages with an `edited` indicator and edit your own sent messages from their context menu.
+- Multi-select messages from their context menu and delete the selection locally with a single action.

## [0.20.4] - 2026-04-22

diff --git a/data/resources/style.css b/data/resources/style.css
index 00e4783..1c0cdfd 100644
--- a/data/resources/style.css
+++ b/data/resources/style.css
@@ -19,6 +19,20 @@
min-height: 18px;
}

+.message-item.in-selection-mode .avatar-other {
+ opacity: 0;
+}
+
+.message-item.in-selection-mode.selected .message-bubble {
+ outline: 2px solid @accent_color;
+}
+
+.selection-toolbar {
+ padding: 6px 12px;
+ background-color: @window_bg_color;
+ border-top: 1px solid @borders;
+}
+
.message-list row {
padding:0;
}
diff --git a/data/resources/ui/channel_messages.blp b/data/resources/ui/channel_messages.blp
index f3d2348..eb927f8 100644
--- a/data/resources/ui/channel_messages.blp
+++ b/data/resources/ui/channel_messages.blp
@@ -135,6 +135,66 @@ template $FlChannelMessages: Box {
}
}

+ // Selection toolbar (shown when in multi-select mode).
+ Box selection_toolbar {
+ styles [
+ "selection-toolbar",
+ ]
+
+ orientation: horizontal;
+ spacing: 12;
+ hexpand: true;
+ visible: bind template.active-channel as <$FlChannel>.selection-mode;
+
+ Button {
+ accessibility {
+ label: C_("accessibility", "Cancel selection");
+ }
+
+ tooltip-text: C_("tooltip", "Cancel selection");
+
+ styles [
+ "flat",
+ "circular",
+ ]
+
+ valign: center;
+ clicked => $cancel_selection() swapped;
+ icon-name: "window-close-symbolic";
+ }
+
+ Label {
+ hexpand: true;
+ halign: start;
+ label: bind template.selection-summary;
+ }
+
+ Button {
+ styles [
+ "destructive-action",
+ "pill",
+ ]
+
+ valign: center;
+ clicked => $delete_selection() swapped;
+ sensitive: bind template.has-selection;
+
+ Box {
+ orientation: horizontal;
+ spacing: 6;
+
+ Image {
+ icon-name: "user-trash-symbolic";
+ }
+
+ Label {
+ label: _("Delete for me");
+ }
+ }
+ }
+ }
+
+
Box {
styles [
"toolbar",
diff --git a/data/resources/ui/message_item.blp b/data/resources/ui/message_item.blp
index 2c21b8b..ba3fd23 100644
--- a/data/resources/ui/message_item.blp
+++ b/data/resources/ui/message_item.blp
@@ -24,6 +24,13 @@ menu message-menu {
hidden-when: "action-disabled";
}

+ item {
+ label: _("Select");
+ action: "msg.select";
+ verb-icon: "checkbox-checked-symbolic";
+ icon: "checkbox-checked-symbolic";
+ }
+
item {
label: _("Delete");
action: "msg.delete";
@@ -87,6 +94,19 @@ template $FlMessageItem: $ContextMenuBin {
}
}

+ CheckButton selection_check {
+ visible: bind template.message as <$FlTextMessage>.channel as <$FlChannel>.selection-mode;
+ active: bind template.message as <$FlTextMessage>.selected;
+ valign: center;
+ can-target: false;
+ can-focus: false;
+
+ layout {
+ row: 0;
+ column: 0;
+ }
+ }
+
Adw.Spinner {
visible: bind template.message as <$FlTextMessage>.pending;
tooltip-text: _("This message is currently being sent");
diff --git a/src/backend/channel.rs b/src/backend/channel.rs
index 94fa6bb..0fbc51d 100644
--- a/src/backend/channel.rs
+++ b/src/backend/channel.rs
@@ -199,6 +199,28 @@ impl Channel {
Ok(())
}

+ /// Delete a set of messages locally only ("Delete for me"). Removes them
+ /// from the encrypted local store and from the in-memory timeline.
+ pub async fn delete_messages_locally(
+ &self,
+ timestamps: Vec<u64>,
+ ) -> Result<(), ApplicationError> {
+ if timestamps.is_empty() {
+ return Ok(());
+ }
+ let purged = self
+ .manager()
+ .delete_messages_locally(self, timestamps)
+ .await?;
+ let timeline = self.imp().timeline.borrow();
+ for ts in &purged {
+ timeline.remove_by_timestamp(*ts);
+ }
+ drop(timeline);
+ self.notify("last-message");
+ Ok(())
+ }
+
pub(super) fn group_context(&self) -> Option<GroupContextV2> {
self.imp().group_context.borrow().clone()
}
@@ -861,6 +883,8 @@ mod imp {
pub(super) draft: RefCell<String>,
#[property(get, set)]
pub(super) is_active: RefCell<bool>,
+ #[property(get, set)]
+ pub(super) selection_mode: RefCell<bool>,

#[property(get = Self::last_message)]
pub(super) last_message: PhantomData<Option<DisplayMessage>>,
@@ -1000,6 +1024,7 @@ mod imp {
Signal::builder("message")
.param_types([DisplayMessage::static_type()])
.build(),
+ Signal::builder("selection-changed").build(),
]
});
SIGNALS.as_ref()
diff --git a/src/backend/manager.rs b/src/backend/manager.rs
index eaa41e0..0964681 100644
--- a/src/backend/manager.rs
+++ b/src/backend/manager.rs
@@ -210,6 +210,38 @@ impl Manager {
Ok(())
}

+ /// Delete a set of messages locally only ("Delete for me"). The remote
+ /// peer is not informed; this only purges the local copy from storage.
+ ///
+ /// Returns the timestamps that were actually purged from the store; the
+ /// caller should mirror only those into the in-memory timeline so a
+ /// per-message store failure does not leave the on-disk state and the
+ /// UI permanently out of sync.
+ pub async fn delete_messages_locally(
+ &self,
+ channel: &Channel,
+ timestamps: Vec<u64>,
+ ) -> Result<Vec<u64>, ApplicationError> {
+ let thread = channel.thread();
+ let mut store = self.store();
+ let purged = tspawn!(async move {
+ let mut purged = Vec::with_capacity(timestamps.len());
+ for ts in timestamps {
+ match store.delete_message(&thread, ts).await {
+ // Both "row deleted" and "row was already absent" mean
+ // the store no longer holds this message, so it is safe
+ // for the timeline to drop it.
+ Ok(_) => purged.push(ts),
+ Err(e) => log::warn!("Failed to locally delete message {ts}: {e}"),
+ }
+ }
+ Ok::<Vec<u64>, ApplicationError>(purged)
+ })
+ .await
+ .expect("Failed to spawn tokio")?;
+ Ok(purged)
+ }
+
pub async fn submit_recaptcha_challenge<S: AsRef<str>>(
&self,
token: S,
||||
diff --git a/src/backend/message/mod.rs b/src/backend/message/mod.rs
|
||||
index f3a0537..eba08ec 100644
|
||||
--- a/src/backend/message/mod.rs
|
||||
+++ b/src/backend/message/mod.rs
|
||||
@@ -518,6 +518,8 @@ mod imp {
|
||||
pub(super) pending: RefCell<bool>,
|
||||
#[property(get, set)]
|
||||
pub(super) error: RefCell<bool>,
|
||||
+ #[property(get, set)]
|
||||
+ pub(super) selected: RefCell<bool>,
|
||||
|
||||
pub(super) data: RefCell<Option<DataMessage>>,
|
||||
|
||||
diff --git a/src/backend/timeline/mod.rs b/src/backend/timeline/mod.rs
|
||||
index 1ce6a24..18dd436 100644
|
||||
--- a/src/backend/timeline/mod.rs
|
||||
+++ b/src/backend/timeline/mod.rs
|
||||
@@ -44,6 +44,22 @@ impl Timeline {
|
||||
self.items_changed(0, len as u32, 0);
|
||||
}
|
||||
|
||||
+ /// Remove the item with the given timestamp from the timeline, if any.
|
||||
+ /// Returns whether an item was actually removed.
|
||||
+ pub fn remove_by_timestamp(&self, timestamp: u64) -> bool {
|
||||
+ let mut list = self.imp().list.borrow_mut();
|
||||
+ let position = list.binary_search_by_key(&timestamp, |i| i.timestamp());
+ match position {
+ Ok(idx) => {
+ list.remove(idx);
+ drop(list);
+ self.items_changed(idx as u32, 1, 0);
+ true
+ }
+ Err(_) => false,
+ }
+ }
+
pub fn get_by_timestamp(&self, timestamp: u64) -> Option<TimelineItem> {
let current_items = self.imp().list.borrow();
let index = current_items.binary_search_by_key(&timestamp, |i| i.timestamp());
diff --git a/src/gui/channel_messages.rs b/src/gui/channel_messages.rs
index b929957..747b36a 100644
--- a/src/gui/channel_messages.rs
+++ b/src/gui/channel_messages.rs
@@ -2,7 +2,8 @@ use crate::prelude::*;
use gio::SettingsBindFlags;

use crate::ApplicationError;
-use crate::backend::message::TextMessage;
+use crate::backend::message::{DisplayMessage, TextMessage};
+use crate::backend::timeline::TimelineItemExt;

const MESSAGES_REQUEST_LOAD: usize = 10;

@@ -38,6 +39,52 @@ impl ChannelMessages {
self.imp().text_entry.grab_focus();
}

+ /// Collect timestamps of every currently-selected message in the active
+ /// channel.
+ pub fn collect_selected_timestamps(&self) -> Vec<u64> {
+ let Some(channel) = self.active_channel() else {
+ return Vec::new();
+ };
+ channel
+ .timeline()
+ .iter_forwards()
+ .filter(|i| i.is::<DisplayMessage>())
+ .filter_map(|i| i.dynamic_cast::<DisplayMessage>().ok())
+ .filter(|m| m.property::<bool>("selected"))
+ .map(|m| m.timestamp())
+ .collect()
+ }
+
+ /// Exit selection mode, clearing all per-message selection state.
+ pub fn exit_selection_mode(&self) {
+ let Some(channel) = self.active_channel() else {
+ return;
+ };
+ for item in channel.timeline().iter_forwards() {
+ if let Some(msg) = item.dynamic_cast_ref::<DisplayMessage>()
+ && msg.property::<bool>("selected")
+ {
+ msg.set_property("selected", false);
+ }
+ }
+ channel.set_selection_mode(false);
+ self.refresh_selection_summary();
+ }
+
+ /// Walk the timeline, count how many messages are selected, and update
+ /// the displayed selection summary plus the `has-selection` flag.
+ pub fn refresh_selection_summary(&self) {
+ let count = self.collect_selected_timestamps().len() as u32;
+ let summary = if count == 0 {
+ gettextrs::gettext("Select messages to delete for yourself")
+ } else {
+ gettextrs::ngettext("{} message selected", "{} messages selected", count)
+ .replace("{}", &count.to_string())
+ };
+ self.set_selection_summary(summary);
+ self.set_has_selection(count > 0);
+ }
+
pub fn focus_input(&self) {
self.imp().text_entry.grab_focus();
}
@@ -187,6 +234,45 @@ impl ChannelMessages {
}
}

+ /// Wire the selection summary so the toolbar reflects how many messages
+ /// are selected. Called whenever the active channel changes.
+ fn setup_selection_listener(&self) {
+ self.refresh_selection_summary();
+
+ // Disconnect handlers attached on the previous active channel so we
+ // don't accumulate one per channel switch.
+ for (prev_channel, handler) in self.imp().selection_handlers.take() {
+ prev_channel.disconnect(handler);
+ }
+
+ if let Some(channel) = self.active_channel() {
+ let mut handlers = self.imp().selection_handlers.borrow_mut();
+ let h = channel.connect_local(
+ "selection-changed",
+ false,
+ clone!(
+ #[weak(rename_to = s)]
+ self,
+ #[upgrade_or_default]
+ move |_| {
+ s.refresh_selection_summary();
+ None
+ }
+ ),
+ );
+ handlers.push((channel.clone(), h));
+ let h = channel.connect_notify_local(
+ Some("selection-mode"),
+ clone!(
+ #[weak(rename_to = s)]
+ self,
+ move |_, _| s.refresh_selection_summary()
+ ),
+ );
+ handlers.push((channel, h));
+ }
+ }
+
/// Send a `Started` typing event for the active channel.
///
/// Schedules a periodic refresh so the receiver does not let the
@@ -387,6 +473,10 @@ pub mod imp {
has_attachments: PhantomData<bool>,
#[property(get, set)]
show_typing: Cell<bool>,
+ #[property(get, set)]
+ selection_summary: RefCell<String>,
+ #[property(get, set)]
+ has_selection: Cell<bool>,

/// Whether we currently believe the user is composing a message in the
/// active channel and have informed the peer with a `Started` event.
@@ -437,6 +527,7 @@ pub mod imp {
// Inform the previous channel we have stopped typing before we
// forget about it.
self.obj().send_typing_stopped();
+ self.obj().exit_selection_mode();

if let Some(active_chan) = self.active_channel.borrow().as_ref() {
active_chan.set_property("draft", self.text_entry.text());
@@ -449,6 +540,7 @@ pub mod imp {

self.obj().focus_input();
self.obj().setup_typing_indicator();
+ self.obj().setup_selection_listener();
}

#[template_callback(function)]
@@ -505,6 +597,41 @@ pub mod imp {
self.text_entry.clear();
}

+ #[template_callback]
+ fn cancel_selection(&self) {
+ self.obj().exit_selection_mode();
+ }
+
+ #[template_callback]
+ fn delete_selection(&self) {
+ let obj = self.obj();
+ let Some(channel) = obj.active_channel() else {
+ return;
+ };
+ let timestamps = obj.collect_selected_timestamps();
+ obj.exit_selection_mode();
+ if timestamps.is_empty() {
+ return;
+ }
+ gspawn!(clone!(
+ #[strong]
+ channel,
+ #[strong]
+ obj,
+ async move {
+ if let Err(e) = channel.delete_messages_locally(timestamps).await {
+ let root = obj
+ .root()
+ .expect("`ChannelMessages` to have a root")
+ .dynamic_cast::<crate::gui::Window>()
+ .expect("Root of `ChannelMessages` to be a `Window`.");
+ let dialog = ErrorDialog::new(&e, &root);
+ dialog.present(Some(&root));
+ }
+ }
+ ));
+ }
+
#[template_callback]
fn remove_attachments(&self) {
log::trace!("Unsetting attachments");
diff --git a/src/gui/message_item.rs b/src/gui/message_item.rs
index 59d2778..d88306f 100644
--- a/src/gui/message_item.rs
+++ b/src/gui/message_item.rs
@@ -34,6 +34,7 @@ impl MessageItem {
s.setup_text();
s.setup_requires_attention();
s.setup_pending_and_error();
+ s.setup_selection();
s
}

@@ -102,6 +103,16 @@ impl MessageItem {
s.imp().handle_edit();
}
));
+ let action_select = SimpleAction::new("select", None);
+ action_select.connect_activate(clone!(
+ #[weak(rename_to = s)]
+ self,
+ move |_, _| {
+ let msg = s.message();
+ msg.channel().set_selection_mode(true);
+ msg.set_property("selected", true);
+ }
+ ));

let actions = SimpleActionGroup::new();
self.insert_action_group("msg", Some(&actions));
@@ -111,6 +122,7 @@ impl MessageItem {
actions.add_action(&action_download);
actions.add_action(&action_edit);
actions.add_action(&action_open);
+ actions.add_action(&action_select);

self.bind_property("message", &action_delete, "enabled")
.transform_to(|_, msg: Option<TextMessage>| msg.map(|m| m.sender().is_self()))
@@ -236,6 +248,22 @@ impl MessageItem {
self.imp().msg_menu.popup();
}

+ /// Connect a notify handler on `target` and remember its handler id so
+ /// we can disconnect it when the MessageItem is disposed; without this,
+ /// closures keep accumulating on long-lived Channel/Message objects as
+ /// the ListView recycles widgets across the timeline.
+ fn track_notify_local<F>(&self, target: &impl IsA<glib::Object>, name: &str, f: F)
+ where
+ F: Fn(&glib::Object, &glib::ParamSpec) + 'static,
+ {
+ let target_obj = target.clone().upcast::<glib::Object>();
+ let handler = target_obj.connect_notify_local(Some(name), f);
+ self.imp()
+ .tracked_handlers
+ .borrow_mut()
+ .push((target_obj, handler));
+ }
+
/// Set whether this item should show its header.
pub fn set_show_header(&self) {
let visible = self.message().show_header() || self.property("force-show-header");
@@ -330,6 +358,81 @@ impl MessageItem {
message.notify("pending");
message.notify("error");
}
+
+ /// Wire the message item's selection-mode CSS class so it visually
+ /// reflects the channel's `selection-mode` and the message's `selected`
+ /// state.
+ pub fn setup_selection(&self) {
+ let message = self.message();
+ let channel = message.channel();
+ // The closure only updates visual state. Resetting `selected` on
+ // exit lives in `ChannelMessages::exit_selection_mode` (the only
+ // path that flips `selection-mode` back off), which guards the
+ // write so it does not bounce off glib's autogen notify-always
+ // setter. Doing the write here unconditionally would re-enter the
+ // notify::selected handler and recurse via the selection-changed
+ // signal we emit from it — visible as a cpu-bound spin on first
+ // load of a long timeline like Note to self.
+ let update = clone!(
+ #[weak(rename_to = s)]
+ self,
+ move || {
+ let chan = s.message().channel();
+ let in_mode = chan.selection_mode();
+ if in_mode {
+ s.add_css_class("in-selection-mode");
+ } else {
+ s.remove_css_class("in-selection-mode");
+ }
+ if s.message().property::<bool>("selected") && in_mode {
+ s.add_css_class("selected");
+ } else {
+ s.remove_css_class("selected");
+ }
+ }
+ );
+ self.track_notify_local(&channel, "selection-mode", {
+ let update = update.clone();
+ move |_, _| update()
+ });
+ self.track_notify_local(&message, "selected", {
+ let update = update.clone();
+ let weak = self.downgrade();
+ move |_, _| {
+ update();
+ if let Some(s) = weak.upgrade() {
+ s.message()
+ .channel()
+ .emit_by_name::<()>("selection-changed", &[]);
+ }
+ }
+ });
+
+ // While the channel is in selection mode, primary-button clicks
+ // anywhere on the row toggle the message's `selected` flag and the
+ // gesture claims the event sequence so child widgets (label links,
+ // attachments, the popover trigger, the check button) do not also
+ // act on the click. The check button itself is `can-target: false`
+ // in the template so its visual state is driven purely by the bind
+ // to `message.selected` rather than its own toggled signal.
+ let click = gtk::GestureClick::builder()
+ .button(gdk::BUTTON_PRIMARY)
+ .propagation_phase(gtk::PropagationPhase::Capture)
+ .build();
+ click.connect_pressed(clone!(
+ #[weak(rename_to = s)]
+ self,
+ move |gesture, _, _, _| {
+ if s.message().channel().selection_mode() {
+ let cur: bool = s.message().property("selected");
+ s.message().set_property("selected", !cur);
+ gesture.set_state(gtk::EventSequenceState::Claimed);
+ }
+ }
+ ));
+ self.add_controller(click);
+ update();
+ }
}

pub mod imp {
@@ -360,6 +463,8 @@ pub mod imp {
#[template_child]
pub(super) avatar: TemplateChild<adw::Avatar>,
#[template_child]
+ pub(super) selection_check: TemplateChild<gtk::CheckButton>,
+ #[template_child]
pub(super) header: TemplateChild<gtk::Label>,
#[template_child]
pub(super) reactions: TemplateChild<gtk::Label>,
@@ -399,6 +504,14 @@ pub mod imp {
shows_media_loading: PhantomData<bool>,
#[property(get = Self::has_reaction)]
has_reaction: PhantomData<bool>,
+
+ /// Handlers we attached on long-lived objects (the message and the
+ /// channel). Channel outlives every MessageItem and the timeline
+ /// holds messages across list-view widget recycling, so without an
+ /// explicit disconnect each MessageItem we ever build leaves a
+ /// no-op closure attached to its message and channel forever.
+ pub(super) tracked_handlers:
+ RefCell<Vec<(glib::Object, glib::SignalHandlerId)>>,
}

#[glib::object_subclass]
@@ -668,6 +781,12 @@ pub mod imp {
});
SIGNALS.as_ref()
}
+
+ fn dispose(&self) {
+ for (target, handler) in self.tracked_handlers.take() {
+ target.disconnect(handler);
+ }
+ }
}

impl WidgetImpl for MessageItem {}
--
2.53.0

406
patches/flare/0005-feat-messages-In-channel-message-search.patch
Normal file
@@ -0,0 +1,406 @@
From 91731979312b65e9b59b2dc58be0067bc5f9f206 Mon Sep 17 00:00:00 2001
From: Simon Gardling <titaniumtown@proton.me>
Date: Wed, 29 Apr 2026 19:58:54 -0400
Subject: [PATCH 5/6] feat(messages): In-channel message search

- Add a SearchBar above the message list that searches the
currently-loaded timeline using a case-insensitive substring match
against the message body. Bind it to a new
channel-messages.toggle-search action wired to Ctrl+Shift+F.
- Surface a match counter (current/total) and previous/next buttons
next to the entry, plus reuse the existing flash_requires_attention
helper to scroll to and briefly highlight the focused match.
- Reset matches when the bar closes or the active channel changes.
---
CHANGELOG.md | 1 +
data/resources/ui/channel_messages.blp | 55 +++++++
data/resources/ui/shortcuts.blp | 5 +
src/gui/channel_messages.rs | 203 ++++++++++++++++++++++++-
src/gui/window.rs | 11 ++
5 files changed, 274 insertions(+), 1 deletion(-)

diff --git a/CHANGELOG.md b/CHANGELOG.md
index 47ec77a..16880cd 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -14,6 +14,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
- Send formatted messages with markdown-style markers (`**bold**`, `*italic*`, `~~strike~~`, `||spoiler||`, `` `monospace` ``).
- Display incoming edited messages with an `edited` indicator and edit your own sent messages from their context menu.
- Multi-select messages from their context menu and delete the selection locally with a single action.
+- In-channel message search (Ctrl+Shift+F) over the loaded timeline with prev/next navigation and a match counter.

## [0.20.4] - 2026-04-22

diff --git a/data/resources/ui/channel_messages.blp b/data/resources/ui/channel_messages.blp
index eb927f8..a166f7b 100644
--- a/data/resources/ui/channel_messages.blp
+++ b/data/resources/ui/channel_messages.blp
@@ -43,6 +43,61 @@ template $FlChannelMessages: Box {
hexpand: true;
orientation: vertical;

+ // In-channel message search.
+ SearchBar search_bar {
+ key-capture-widget: scrolled_window;
+ search-mode-enabled: bind template.search-active bidirectional;
+
+ Adw.Clamp {
+ maximum-size: 600;
+
+ Box {
+ orientation: horizontal;
+ spacing: 6;
+
+ SearchEntry search_entry {
+ hexpand: true;
+ placeholder-text: _("Search loaded messages");
+ search-changed => $on_search_query_changed() swapped;
+ previous-match => $on_search_previous() swapped;
+ next-match => $on_search_next() swapped;
+ stop-search => $on_search_stop() swapped;
+ }
+
+ Label {
+ styles [
+ "caption",
+ "dim-label",
+ ]
+
+ label: bind template.search-summary;
+ }
+
+ Button {
+ icon-name: "go-up-symbolic";
+ tooltip-text: C_("tooltip", "Previous match");
+ sensitive: bind template.has-matches;
+ clicked => $on_search_previous() swapped;
+
+ styles [
+ "flat",
+ ]
+ }
+
+ Button {
+ icon-name: "go-down-symbolic";
+ tooltip-text: C_("tooltip", "Next match");
+ sensitive: bind template.has-matches;
+ clicked => $on_search_next() swapped;
+
+ styles [
+ "flat",
+ ]
+ }
+ }
+ }
+ }
+
Overlay {
[overlay]
Adw.Spinner {
diff --git a/data/resources/ui/shortcuts.blp b/data/resources/ui/shortcuts.blp
index ed2a959..79339cc 100644
--- a/data/resources/ui/shortcuts.blp
+++ b/data/resources/ui/shortcuts.blp
@@ -58,5 +58,10 @@ Adw.ShortcutsDialog help_overlay {
title: C_("shortcut window", "Load more messages");
accelerator: "<Ctrl>l";
}
+
+ Adw.ShortcutsItem {
+ title: C_("shortcut window", "Search messages in current channel");
+ accelerator: "<Ctrl><Shift>f";
+ }
}
}
diff --git a/src/gui/channel_messages.rs b/src/gui/channel_messages.rs
index 747b36a..6dd8e84 100644
--- a/src/gui/channel_messages.rs
+++ b/src/gui/channel_messages.rs
@@ -2,7 +2,7 @@ use crate::prelude::*;
use gio::SettingsBindFlags;

use crate::ApplicationError;
-use crate::backend::message::{DisplayMessage, TextMessage};
+use crate::backend::message::{DisplayMessage, DisplayMessageExt, TextMessage};
use crate::backend::timeline::TimelineItemExt;

const MESSAGES_REQUEST_LOAD: usize = 10;
@@ -85,6 +85,163 @@ impl ChannelMessages {
self.set_has_selection(count > 0);
}

+ /// Connect the `search-active` property so the entry is focused when
+ /// the bar opens and the matches are cleared when it closes.
+ fn setup_search(&self) {
+ self.connect_notify_local(
+ Some("search-active"),
+ clone!(
+ #[weak(rename_to = s)]
+ self,
+ move |_, _| {
+ if s.search_active() {
+ s.imp().search_entry.grab_focus();
+ s.attach_search_message_listener();
+ } else {
+ s.detach_search_message_listener();
+ s.imp().search_entry.set_text("");
+ s.imp().search_matches.replace(Vec::new());
+ s.imp().search_index.set(0);
+ s.set_has_matches(false);
+ s.set_search_summary(String::new());
+ }
+ }
+ ),
+ );
+ }
+
+ /// While the search bar is open, watch the active channel for new
+ /// messages so the result set stays in sync with the timeline. Only
+ /// one handler is alive at a time; `detach_search_message_listener` or
+ /// the next `attach` call clears the previous one.
+ fn attach_search_message_listener(&self) {
+ self.detach_search_message_listener();
+ let Some(channel) = self.active_channel() else {
+ return;
+ };
+ let handler = channel.connect_local(
+ "message",
+ false,
+ clone!(
+ #[weak(rename_to = s)]
+ self,
+ #[upgrade_or_default]
+ move |_| {
+ let query = s.imp().search_entry.text().to_string();
+ s.refresh_search(&query);
+ None
+ }
+ ),
+ );
+ self.imp()
+ .search_message_handler
+ .replace(Some((channel, handler)));
+ }
+
+ fn detach_search_message_listener(&self) {
+ if let Some((channel, handler)) = self.imp().search_message_handler.take() {
+ channel.disconnect(handler);
+ }
+ }
+
+ /// Re-run the loaded-message search using `query`. The search is
+ /// a case-insensitive substring match against the message body.
+ pub fn refresh_search(&self, query: &str) {
+ let imp = self.imp();
+ let Some(channel) = self.active_channel() else {
+ imp.search_matches.replace(Vec::new());
+ imp.search_index.set(0);
+ self.set_has_matches(false);
+ self.set_search_summary(String::new());
+ return;
+ };
+
+ let trimmed = query.trim();
+ if trimmed.is_empty() {
+ imp.search_matches.replace(Vec::new());
+ imp.search_index.set(0);
+ self.set_has_matches(false);
+ self.set_search_summary(String::new());
+ return;
+ }
+
+ let needle = trimmed.to_lowercase();
+ let matches: Vec<u64> = channel
+ .timeline()
+ .iter_forwards()
+ .filter_map(|i| i.dynamic_cast::<TextMessage>().ok())
+ .filter(|m| {
+ m.body()
+ .map(|b| b.to_lowercase().contains(&needle))
+ .unwrap_or(false)
+ })
+ .map(|m| m.timestamp())
+ .collect();
+
+ let total = matches.len();
+ imp.search_matches.replace(matches);
+ // Snap to the latest match (most recent in time) by default so the
+ // user lands at the bottom of the conversation, matching how the
+ // existing scroll-to-unread heuristic works.
+ imp.search_index.set(total.saturating_sub(1));
+ self.set_has_matches(total > 0);
+ self.update_search_summary();
+ self.flash_current_search_match();
+ }
+
+ /// Move the search cursor to the next or previous match and flash that
+ /// message.
+ pub fn goto_search_match(&self, forwards: bool) {
+ let imp = self.imp();
+ let total = imp.search_matches.borrow().len();
+ if total == 0 {
+ return;
+ }
+ let current = imp.search_index.get();
+ let next = if forwards {
+ (current + 1) % total
+ } else if current == 0 {
+ total - 1
+ } else {
+ current - 1
+ };
+ imp.search_index.set(next);
+ self.update_search_summary();
+ self.flash_current_search_match();
+ }
+
+ fn update_search_summary(&self) {
+ let imp = self.imp();
+ let total = imp.search_matches.borrow().len();
+ let summary = if total == 0 {
+ String::new()
+ } else {
+ // Translators: e.g. "3 of 12" indicating the focused match
+ // index out of total search matches.
+ gettextrs::gettext("{current} of {total}")
+ .replace("{current}", &(imp.search_index.get() + 1).to_string())
+ .replace("{total}", &total.to_string())
+ };
+ self.set_search_summary(summary);
+ }
+
+ fn flash_current_search_match(&self) {
+ let imp = self.imp();
+ let matches = imp.search_matches.borrow();
+ let Some(timestamp) = matches.get(imp.search_index.get()).copied() else {
+ return;
+ };
+ drop(matches);
+ let Some(channel) = self.active_channel() else {
+ return;
+ };
+ if let Some(item) = channel.timeline().get_by_timestamp(timestamp)
+ && let Some(msg) = item.dynamic_cast_ref::<DisplayMessage>()
+ {
+ msg.flash_requires_attention();
+ }
+ }
+
pub fn focus_input(&self) {
self.imp().text_entry.grab_focus();
}
@@ -446,6 +603,8 @@ pub mod imp {
#[template_child]
pub(super) text_entry: TemplateChild<TextEntry>,
#[template_child]
+ pub(super) search_entry: TemplateChild<gtk::SearchEntry>,
+ #[template_child]
pub(super) list_view: TemplateChild<gtk::ListView>,
#[template_child]
no_channels_page: TemplateChild<adw::StatusPage>,
@@ -478,6 +637,25 @@ pub mod imp {
#[property(get, set)]
has_selection: Cell<bool>,

+ #[property(get, set)]
+ search_active: Cell<bool>,
+ #[property(get, set)]
+ search_summary: RefCell<String>,
+ #[property(get, set)]
+ has_matches: Cell<bool>,
+
+ /// Cached timestamps of every message currently matching the search
+ /// query, in chronological order.
+ pub(super) search_matches: RefCell<Vec<u64>>,
+ /// Index into `search_matches` of the currently focused match.
+ pub(super) search_index: Cell<usize>,
+ /// Handler installed on the active channel's `message` signal
+ /// while the search bar is open, so newly-arrived messages are
+ /// folded into the match set without the user re-running the
+ /// search by hand.
+ pub(super) search_message_handler:
+ RefCell<Option<(Channel, glib::SignalHandlerId)>>,
+
/// Whether we currently believe the user is composing a message in the
/// active channel and have informed the peer with a `Started` event.
pub(super) sending_typing: Cell<bool>,
@@ -520,6 +698,7 @@ pub mod imp {
self.obj().setup_send_on_enter();
self.obj().setup_typing_settings();
self.obj().setup_typing_send();
+ self.obj().setup_search();
}
}

@@ -528,6 +707,7 @@ pub mod imp {
// forget about it.
self.obj().send_typing_stopped();
self.obj().exit_selection_mode();
+ self.obj().set_search_active(false);

if let Some(active_chan) = self.active_channel.borrow().as_ref() {
active_chan.set_property("draft", self.text_entry.text());
@@ -820,6 +1000,27 @@ pub mod imp {
}
}

+ #[template_callback]
+ fn on_search_query_changed(&self) {
+ let query = self.search_entry.text().to_string();
+ self.obj().refresh_search(&query);
+ }
+
+ #[template_callback]
+ fn on_search_previous(&self) {
+ self.obj().goto_search_match(false);
+ }
+
+ #[template_callback]
+ fn on_search_next(&self) {
+ self.obj().goto_search_match(true);
+ }
+
+ #[template_callback]
+ fn on_search_stop(&self) {
+ self.obj().set_search_active(false);
+ }
+
#[template_callback]
fn handle_row_activated(&self, row: gtk::ListBoxRow) {
if let Ok(msg) = row
diff --git a/src/gui/window.rs b/src/gui/window.rs
index 6335f3d..ce097ce 100644
--- a/src/gui/window.rs
+++ b/src/gui/window.rs
@@ -20,6 +20,7 @@ impl Window {
app.set_accels_for_action("window.close", &["<Control>q"]);
app.set_accels_for_action("channel-messages.activate-input", &["<Control>i"]);
app.set_accels_for_action("channel-messages.load-more", &["<Control>l"]);
+ app.set_accels_for_action("channel-messages.toggle-search", &["<Control><Shift>f"]);
for i in 1..=9 {
app.set_accels_for_action(
&format!("channel-list.activate-channel({i})"),
@@ -531,10 +532,20 @@ pub mod imp {
channel_messages.load_more();
}
));
+ let action_toggle_search = SimpleAction::new("toggle-search", None);
+ action_toggle_search.connect_activate(clone!(
+ #[strong(rename_to = channel_messages)]
+ self.channel_messages,
+ move |_, _| {
+ let active = !channel_messages.search_active();
+ channel_messages.set_search_active(active);
+ }
+ ));
let actions = SimpleActionGroup::new();
obj.insert_action_group("channel-messages", Some(&actions));
actions.add_action(&action_activate_input);
actions.add_action(&action_load_more);
+ actions.add_action(&action_toggle_search);

// Channel list actions.

--
2.53.0

@@ -0,0 +1,175 @@
From 46765e848362129bb2d0fc34b2047e6cb2555258 Mon Sep 17 00:00:00 2001
From: Simon Gardling <titaniumtown@proton.me>
Date: Thu, 30 Apr 2026 04:25:07 -0400
Subject: [PATCH 6/6] feat(messages): Show 'This message was deleted.'
placeholder

Upstream hides the whole MessageItem when is-deleted is true via a
top-level visible bind on the template root. Replace that with a
Signal-Desktop-style behaviour: the row stays in the timeline, but the
bubble's regular content (header, quote, attachments, label, popover
trigger) and any reactions are hidden, and a single italic-dim
placeholder label takes their place.

The two pieces of imperative state set in code (media_overlay and the
floating timestamp_img indicator over media-only messages) are reset
in a small setup_deleted helper that subscribes to is-deleted, since
they are not reachable through bindings.
---
data/resources/style.css | 6 ++++++
data/resources/ui/message_item.blp | 29 +++++++++++++++++++++----
src/gui/message_item.rs | 34 ++++++++++++++++++++++++++++++
3 files changed, 65 insertions(+), 4 deletions(-)

diff --git a/data/resources/style.css b/data/resources/style.css
index 1c0cdfd..3b0de9a 100644
--- a/data/resources/style.css
+++ b/data/resources/style.css
@@ -9,6 +9,12 @@
background-color: @bubble_bg_color;
}

+/* Deletion placeholder shown in place of remotely-deleted messages. */
+.deleted-message {
+ font-style: italic;
+ opacity: 0.6;
+}
+
.message-input-bar {
border-top: 1px solid @borders;
}
diff --git a/data/resources/ui/message_item.blp b/data/resources/ui/message_item.blp
index ba3fd23..49002a5 100644
--- a/data/resources/ui/message_item.blp
+++ b/data/resources/ui/message_item.blp
@@ -71,8 +71,6 @@ template $FlMessageItem: $ContextMenuBin {
"message-item",
]

- visible: bind $not(template.message as <$FlTextMessage>.is-deleted) as <bool>;
-
Grid {
column-spacing: 12;
row-spacing: 12;
@@ -149,11 +147,32 @@ template $FlMessageItem: $ContextMenuBin {
"message-bubble",
]

+ // Deletion placeholder "This message was deleted." — visible only when
|
||||
+ // the message has been remotely deleted; hides the rest of the bubble's
|
||||
+ // content and any reactions/attachments via the .deleted CSS class on
|
||||
+ // the message-item.
|
||||
+ Label deleted_label {
|
||||
+ styles [
|
||||
+ "deleted-message",
|
||||
+ ]
|
||||
+
|
||||
+ label: _("This message was deleted.");
|
||||
+ visible: bind template.message as <$FlTextMessage>.is-deleted;
|
||||
+ halign: start;
|
||||
+ xalign: 0;
|
||||
+
|
||||
+ layout {
|
||||
+ row: 0;
|
||||
+ column: 0;
|
||||
+ }
|
||||
+ }
|
||||
+
|
||||
Label header {
|
||||
styles [
|
||||
"heading",
|
||||
]
|
||||
|
||||
+ visible: bind $not(template.message as <$FlTextMessage>.is-deleted) as <bool>;
|
||||
label: bind template.message as <$FlTextMessage>.sender as <$FlContact>.title;
|
||||
hexpand: true;
|
||||
halign: start;
|
||||
@@ -177,7 +196,7 @@ template $FlMessageItem: $ContextMenuBin {
|
||||
"quote",
|
||||
]
|
||||
|
||||
- visible: bind $is_some(template.message as <$FlTextMessage>.quote) as <bool>;
|
||||
+ visible: bind $and($is_some(template.message as <$FlTextMessage>.quote) as <bool>, $not(template.message as <$FlTextMessage>.is-deleted) as <bool>) as <bool>;
|
||||
|
||||
Label {
|
||||
styles [
|
||||
@@ -219,6 +238,7 @@ template $FlMessageItem: $ContextMenuBin {
|
||||
}
|
||||
|
||||
Box box_attachments {
|
||||
+ visible: bind $not(template.message as <$FlTextMessage>.is-deleted) as <bool>;
|
||||
layout {
|
||||
row: 2;
|
||||
column: 0;
|
||||
@@ -276,6 +296,7 @@ template $FlMessageItem: $ContextMenuBin {
|
||||
}
|
||||
|
||||
$FlMessageLabel label_message {
|
||||
+ visible: bind $not(template.message as <$FlTextMessage>.is-deleted) as <bool>;
|
||||
label: bind $markup_urls(template.message as <$FlTextMessage>.body) as <string>;
|
||||
attributes: bind template.message as <$FlTextMessage>.message-attributes;
|
||||
|
||||
@@ -306,7 +327,7 @@ template $FlMessageItem: $ContextMenuBin {
|
||||
]
|
||||
|
||||
label: bind $fix_emoji(template.message as <$FlTextMessage>.reactions) as <string>;
|
||||
- visible: bind template.has-reaction;
|
||||
+ visible: bind $and(template.has-reaction, $not(template.message as <$FlTextMessage>.is-deleted) as <bool>) as <bool>;
|
||||
wrap-mode: word;
|
||||
justify: left;
|
||||
vexpand: false;
|
||||
diff --git a/src/gui/message_item.rs b/src/gui/message_item.rs
|
||||
index d88306f..33479da 100644
|
||||
--- a/src/gui/message_item.rs
|
||||
+++ b/src/gui/message_item.rs
|
||||
@@ -35,6 +35,7 @@ impl MessageItem {
|
||||
s.setup_requires_attention();
|
||||
s.setup_pending_and_error();
|
||||
s.setup_selection();
|
||||
+ s.setup_deleted();
|
||||
s
|
||||
}
|
||||
|
||||
@@ -325,6 +326,39 @@ impl MessageItem {
|
||||
message.notify("requires-attention");
|
||||
}
|
||||
|
||||
+ /// Reflect the message's `is-deleted` state in the UI.
|
||||
+ ///
|
||||
+ /// Upstream simply hid the row entirely; we instead keep the row but
|
||||
+ /// show a `"This message was deleted."` placeholder (handled in the
|
||||
+ /// blueprint) and clean up the bits that the deletion pseudo-message
|
||||
+ /// can't reach via the bind layer: the media overlay (whose visibility
|
||||
+ /// is set imperatively in `set_message`) and the standalone
|
||||
+ /// `timestamp_img` indicator that floats over media-only messages.
|
||||
+ pub fn setup_deleted(&self) {
|
||||
+ let message = self.message();
|
||||
+ let apply = clone!(
|
||||
+ #[weak(rename_to = s)]
|
||||
+ self,
|
||||
+ move || {
|
||||
+ if s.message().is_deleted() {
|
||||
+ s.add_css_class("deleted");
|
||||
+ s.imp().media_overlay.set_visible(false);
|
||||
+ s.imp().timestamp_img.set_visible(false);
|
||||
+ } else {
|
||||
+ // Symmetric reset for any future code path that flips
|
||||
+ // is-deleted back off (e.g. an unsend/restore flow).
|
||||
+ // Today nothing does, but the asymmetry is fragile.
|
||||
+ s.remove_css_class("deleted");
|
||||
+ }
|
||||
+ }
|
||||
+ );
|
||||
+ self.track_notify_local(&message, "is-deleted", {
|
||||
+ let apply = apply.clone();
|
||||
+ move |_, _| apply()
|
||||
+ });
|
||||
+ apply();
|
||||
+ }
|
||||
+
|
||||
pub fn setup_pending_and_error(&self) {
|
||||
let message = self.message();
|
||||
message.connect_notify_local(
|
||||
--
|
||||
2.53.0
|
||||
|
||||
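The visibility rules that the blueprint binds in PATCH 6 encode can be distilled into a small pure function. This is an illustrative sketch only: `MessageState`, `bubbleVisibility`, and the field names are invented for this note, not the Flare API; the real logic lives in the GTK bind expressions above.

```typescript
// Hypothetical mirror of the blueprint binds: given a message's state,
// decide which parts of the bubble are shown. The deletion placeholder
// wins over everything else; quote and reactions additionally need their
// own preconditions.
interface MessageState {
  isDeleted: boolean;
  hasQuote: boolean;
  hasReaction: boolean;
}

function bubbleVisibility(m: MessageState) {
  return {
    deletedLabel: m.isDeleted,               // "This message was deleted."
    header: !m.isDeleted,
    quote: m.hasQuote && !m.isDeleted,       // $and($is_some(quote), $not(is-deleted))
    attachments: !m.isDeleted,
    messageLabel: !m.isDeleted,
    reactions: m.hasReaction && !m.isDeleted, // $and(has-reaction, $not(is-deleted))
  };
}
```

Note the asymmetry the commit message calls out: everything except `deletedLabel` is a plain `!isDeleted` gate, which is why the two imperative widgets (`media_overlay`, `timestamp_img`) need the separate `setup_deleted` subscription rather than a bind.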
804
patches/omp/0001-fix-reasoning_content.patch
Normal file
@@ -0,0 +1,804 @@
From e145b627cffb6907e6bde348f1318f48acba3801 Mon Sep 17 00:00:00 2001
From: sonhyrd <son.hong.do@hyrd.ai>
Date: Mon, 27 Apr 2026 00:00:18 +0700
Subject: [PATCH 1/5] fix(ai/providers): cover opencode-go reasoning tool-call
 history

---
 .../providers/openai-completions-compat.ts    | 12 +++--
 .../ai/src/providers/openai-completions.ts    |  4 +-
 .../ai/test/openai-completions-compat.test.ts | 51 +++++++++++++++----
 3 files changed, 49 insertions(+), 18 deletions(-)

diff --git a/packages/ai/src/providers/openai-completions-compat.ts b/packages/ai/src/providers/openai-completions-compat.ts
index 69f4811c8..c777f312b 100644
--- a/packages/ai/src/providers/openai-completions-compat.ts
+++ b/packages/ai/src/providers/openai-completions-compat.ts
@@ -107,12 +107,14 @@ export function detectOpenAICompat(model: Model<"openai-completions">, resolvedB
     reasoningContentField: "reasoning_content",
     // Backends that 400 follow-up requests when prior assistant tool-call turns lack `reasoning_content`:
     // - Kimi: documented invariant on its native API and via OpenCode-Go.
-    // - Any reasoning-capable model reached through OpenRouter: DeepSeek V4 Pro and similar enforce
-    //   this server-side whenever the request is in thinking mode. We can't translate Anthropic's
-    //   redacted/encrypted reasoning into DeepSeek's plaintext form, so cross-provider continuations
-    //   rely on a placeholder — see `convertMessages` for the placeholder injection.
+    // - Reasoning-capable models reached through OpenRouter or OpenCode-Go: DeepSeek V4 Pro and
+    //   similar enforce this server-side whenever the request is in thinking mode.
+    //   We can't translate Anthropic's redacted/encrypted reasoning into DeepSeek's plaintext form, so
+    //   cross-provider continuations rely on a placeholder — see `convertMessages` for injection rules.
     requiresReasoningContentForToolCalls:
-      isKimiModel || ((provider === "openrouter" || baseUrl.includes("openrouter.ai")) && Boolean(model.reasoning)),
+      isKimiModel ||
+      ((provider === "openrouter" || baseUrl.includes("openrouter.ai") || provider === "opencode-go" ||
+        baseUrl.includes("opencode.ai/zen/go")) && Boolean(model.reasoning)),
     requiresAssistantContentForToolCalls: isKimiModel,
     openRouterRouting: undefined,
     vercelGatewayRouting: undefined,
diff --git a/packages/ai/src/providers/openai-completions.ts b/packages/ai/src/providers/openai-completions.ts
index 3785af106..70f2e3b63 100644
--- a/packages/ai/src/providers/openai-completions.ts
+++ b/packages/ai/src/providers/openai-completions.ts
@@ -1213,8 +1213,8 @@ export function convertMessages(
   // Inject a `reasoning_content` placeholder on assistant tool-call turns when the backend
   // rejects history without it. The compat flag captures the rule:
   // - Kimi (native or via OpenCode-Go): chat completion endpoint demands the field.
-  // - Reasoning models reached through OpenRouter (e.g. DeepSeek V4 Pro): the underlying
-  //   provider's thinking-mode validator demands it on every prior assistant turn. omp
+  // - Reasoning models reached through OpenRouter or OpenCode-Go (e.g. DeepSeek V4 Pro):
+  //   the upstream thinking-mode validator demands it on every prior assistant turn. omp
   //   cannot synthesize real reasoning when the conversation was warmed up by another
   //   provider whose reasoning is redacted/encrypted (Anthropic) or simply absent, so we
   //   emit a placeholder. Real captured reasoning, when present, is preserved earlier via
diff --git a/packages/ai/test/openai-completions-compat.test.ts b/packages/ai/test/openai-completions-compat.test.ts
index 6fc3ca9af..6d60ba5e4 100644
--- a/packages/ai/test/openai-completions-compat.test.ts
+++ b/packages/ai/test/openai-completions-compat.test.ts
@@ -283,23 +283,59 @@ describe("openai-completions compatibility", () => {
 });

 describe("kimi model detection via detectCompat", () => {
-  function kimiOpenCodeModel(id: string): Model<"openai-completions"> {
+  function openCodeGoModel(id: string, reasoning = true): Model<"openai-completions"> {
     return {
       ...getBundledModel("openai", "gpt-4o-mini"),
       api: "openai-completions",
       provider: "opencode-go",
       baseUrl: "https://opencode.ai/zen/go/v1",
       id,
-      reasoning: true,
+      reasoning,
     };
   }

+  function kimiOpenCodeModel(id: string): Model<"openai-completions"> {
+    return openCodeGoModel(id, true);
+  }
+
   it("requires reasoning_content for tool calls on kimi-k2.5 (opencode-go)", () => {
     const compat = detectCompat(kimiOpenCodeModel("kimi-k2.5"));
     expect(compat.requiresReasoningContentForToolCalls).toBe(true);
     expect(compat.requiresAssistantContentForToolCalls).toBe(true);
   });

+  it("requires reasoning_content for tool calls on reasoning DeepSeek models via opencode-go", () => {
+    const compat = detectCompat(openCodeGoModel("deepseek-v4-pro", true));
+    expect(compat.requiresReasoningContentForToolCalls).toBe(true);
+    expect(compat.requiresAssistantContentForToolCalls).toBe(false);
+  });
+
+  it("injects reasoning_content placeholder for reasoning DeepSeek tool-call turns via opencode-go", () => {
+    const model = openCodeGoModel("deepseek-v4-pro", true);
+    const compat = detectCompat(model);
+    const toolCallMessage: AssistantMessage = {
+      role: "assistant",
+      content: [{ type: "toolCall", id: "call_ds_go", name: "web_search", arguments: { query: "hi" } }],
+      api: model.api,
+      provider: model.provider,
+      model: model.id,
+      usage: {
+        input: 0,
+        output: 0,
+        cacheRead: 0,
+        cacheWrite: 0,
+        totalTokens: 0,
+        cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0, total: 0 },
+      },
+      stopReason: "toolUse",
+      timestamp: Date.now(),
+    };
+    const messages = convertMessages(model, { messages: [toolCallMessage] }, compat);
+    const assistant = messages.find(m => m.role === "assistant");
+    expect(assistant).toBeDefined();
+    expect(Reflect.get(assistant as object, "reasoning_content")).toBe(".");
+  });
+
   it("injects reasoning_content placeholder when assistant with tool calls has no reasoning field", () => {
     const model = kimiOpenCodeModel("kimi-k2.5");
     const compat = detectCompat(model);
@@ -338,15 +374,8 @@ describe("kimi model detection via detectCompat", () => {
     expect((reasoningContent as string).length).toBeGreaterThan(0);
   });

-  it("does not inject reasoning_content when model is not kimi", () => {
-    const model: Model<"openai-completions"> = {
-      ...getBundledModel("openai", "gpt-4o-mini"),
-      api: "openai-completions",
-      provider: "opencode-go",
-      baseUrl: "https://opencode.ai/zen/go/v1",
-      id: "some-other-model",
-    };
-    const compat = detectCompat(model);
+  it("does not require reasoning_content when opencode-go model is not reasoning-capable", () => {
+    const compat = detectCompat(openCodeGoModel("some-other-model", false));
     expect(compat.requiresReasoningContentForToolCalls).toBe(false);
   });

From 70eda0132d7ff48314cbf2dc9560339f0a765d9e Mon Sep 17 00:00:00 2001
From: sonhyrd <son.hong.do@hyrd.ai>
Date: Mon, 27 Apr 2026 00:08:04 +0700
Subject: [PATCH 2/5] fix(ai/providers): generalize opencode reasoning_content
 gating

---
 .../providers/openai-completions-compat.ts    |  14 +-
 .../ai/src/providers/openai-completions.ts    |   4 +-
 .../ai/test/openai-completions-compat.test.ts | 160 ++++++++----------
 3 files changed, 82 insertions(+), 96 deletions(-)

diff --git a/packages/ai/src/providers/openai-completions-compat.ts b/packages/ai/src/providers/openai-completions-compat.ts
index c777f312b..b4825a31c 100644
--- a/packages/ai/src/providers/openai-completions-compat.ts
+++ b/packages/ai/src/providers/openai-completions-compat.ts
@@ -54,6 +54,8 @@ export function detectOpenAICompat(model: Model<"openai-completions">, resolvedB
   const isKimiModel = model.id.includes("moonshotai/kimi") || /^kimi[-.]/i.test(model.id);
   const isAlibaba = provider === "alibaba-coding-plan" || baseUrl.includes("dashscope");
   const isQwen = model.id.toLowerCase().includes("qwen");
+  const isOpenRouter = provider === "openrouter" || baseUrl.includes("openrouter.ai");
+  const isOpenCode = provider === "opencode-zen" || provider === "opencode-go" || baseUrl.includes("opencode.ai/zen");

   const isNonStandard =
     isCerebras ||
@@ -99,22 +101,20 @@
     requiresMistralToolIds: isMistral,
     thinkingFormat: isZai
       ? "zai"
-      : provider === "openrouter" || baseUrl.includes("openrouter.ai")
+      : isOpenRouter
         ? "openrouter"
         : isAlibaba || isQwen
           ? "qwen"
           : "openai",
     reasoningContentField: "reasoning_content",
     // Backends that 400 follow-up requests when prior assistant tool-call turns lack `reasoning_content`:
-    // - Kimi: documented invariant on its native API and via OpenCode-Go.
-    // - Reasoning-capable models reached through OpenRouter or OpenCode-Go: DeepSeek V4 Pro and
-    //   similar enforce this server-side whenever the request is in thinking mode.
+    // - Kimi: documented invariant on its native API and via OpenCode.
+    // - Reasoning-capable models reached through OpenRouter or OpenCode (Zen/Go): DeepSeek V4 Pro,
+    //   Kimi, and similar models can enforce this server-side whenever the request is in thinking mode.
     //   We can't translate Anthropic's redacted/encrypted reasoning into DeepSeek's plaintext form, so
     //   cross-provider continuations rely on a placeholder — see `convertMessages` for injection rules.
     requiresReasoningContentForToolCalls:
-      isKimiModel ||
-      ((provider === "openrouter" || baseUrl.includes("openrouter.ai") || provider === "opencode-go" ||
-        baseUrl.includes("opencode.ai/zen/go")) && Boolean(model.reasoning)),
+      isKimiModel || ((isOpenRouter || isOpenCode) && Boolean(model.reasoning)),
     requiresAssistantContentForToolCalls: isKimiModel,
     openRouterRouting: undefined,
     vercelGatewayRouting: undefined,
diff --git a/packages/ai/src/providers/openai-completions.ts b/packages/ai/src/providers/openai-completions.ts
index 70f2e3b63..e25aeffb3 100644
--- a/packages/ai/src/providers/openai-completions.ts
+++ b/packages/ai/src/providers/openai-completions.ts
@@ -1212,8 +1212,8 @@ export function convertMessages(
     (assistantMsg as any).reasoning_text !== undefined;
   // Inject a `reasoning_content` placeholder on assistant tool-call turns when the backend
   // rejects history without it. The compat flag captures the rule:
-  // - Kimi (native or via OpenCode-Go): chat completion endpoint demands the field.
-  // - Reasoning models reached through OpenRouter or OpenCode-Go (e.g. DeepSeek V4 Pro):
+  // - Kimi (native or via OpenCode Zen/Go): chat completion endpoint demands the field.
+  // - Reasoning models reached through OpenRouter or OpenCode Zen/Go (e.g. DeepSeek V4 Pro):
   //   the upstream thinking-mode validator demands it on every prior assistant turn. omp
   //   cannot synthesize real reasoning when the conversation was warmed up by another
   //   provider whose reasoning is redacted/encrypted (Anthropic) or simply absent, so we
diff --git a/packages/ai/test/openai-completions-compat.test.ts b/packages/ai/test/openai-completions-compat.test.ts
index 6d60ba5e4..c743dd246 100644
--- a/packages/ai/test/openai-completions-compat.test.ts
+++ b/packages/ai/test/openai-completions-compat.test.ts
@@ -282,105 +282,91 @@ describe("openai-completions compatibility", () => {
   });
 });

-describe("kimi model detection via detectCompat", () => {
-  function openCodeGoModel(id: string, reasoning = true): Model<"openai-completions"> {
+describe("opencode reasoning-content compatibility via detectCompat", () => {
+  type OpenCodeProvider = "opencode-go" | "opencode-zen";
+
+  function openCodeModel(provider: OpenCodeProvider, id: string, reasoning = true): Model<"openai-completions"> {
+    const baseUrl = provider === "opencode-go" ? "https://opencode.ai/zen/go/v1" : "https://opencode.ai/zen/v1";
     return {
       ...getBundledModel("openai", "gpt-4o-mini"),
       api: "openai-completions",
-      provider: "opencode-go",
-      baseUrl: "https://opencode.ai/zen/go/v1",
+      provider,
+      baseUrl,
       id,
       reasoning,
     };
   }

-  function kimiOpenCodeModel(id: string): Model<"openai-completions"> {
-    return openCodeGoModel(id, true);
-  }
-
-  it("requires reasoning_content for tool calls on kimi-k2.5 (opencode-go)", () => {
-    const compat = detectCompat(kimiOpenCodeModel("kimi-k2.5"));
-    expect(compat.requiresReasoningContentForToolCalls).toBe(true);
-    expect(compat.requiresAssistantContentForToolCalls).toBe(true);
-  });
-
-  it("requires reasoning_content for tool calls on reasoning DeepSeek models via opencode-go", () => {
-    const compat = detectCompat(openCodeGoModel("deepseek-v4-pro", true));
-    expect(compat.requiresReasoningContentForToolCalls).toBe(true);
-    expect(compat.requiresAssistantContentForToolCalls).toBe(false);
-  });
-
-  it("injects reasoning_content placeholder for reasoning DeepSeek tool-call turns via opencode-go", () => {
-    const model = openCodeGoModel("deepseek-v4-pro", true);
-    const compat = detectCompat(model);
-    const toolCallMessage: AssistantMessage = {
-      role: "assistant",
-      content: [{ type: "toolCall", id: "call_ds_go", name: "web_search", arguments: { query: "hi" } }],
-      api: model.api,
-      provider: model.provider,
-      model: model.id,
-      usage: {
-        input: 0,
-        output: 0,
-        cacheRead: 0,
-        cacheWrite: 0,
-        totalTokens: 0,
-        cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0, total: 0 },
-      },
-      stopReason: "toolUse",
-      timestamp: Date.now(),
+  it.each(["opencode-go", "opencode-zen"] as const)(
+    "requires reasoning_content for tool calls on kimi-k2.5 via %s",
+    provider => {
+      const compat = detectCompat(openCodeModel(provider, "kimi-k2.5", true));
+      expect(compat.requiresReasoningContentForToolCalls).toBe(true);
+      expect(compat.requiresAssistantContentForToolCalls).toBe(true);
+    },
+  );
+
+  it.each(["opencode-go", "opencode-zen"] as const)(
+    "requires reasoning_content for tool calls on reasoning DeepSeek models via %s",
+    provider => {
+      const compat = detectCompat(openCodeModel(provider, "deepseek-v4-pro", true));
+      expect(compat.requiresReasoningContentForToolCalls).toBe(true);
+      expect(compat.requiresAssistantContentForToolCalls).toBe(false);
+    },
+  );
+
+  it("requires reasoning_content when custom openai provider targets opencode zen baseUrl", () => {
+    const model: Model<"openai-completions"> = {
+      ...getBundledModel("openai", "gpt-4o-mini"),
+      api: "openai-completions",
+      provider: "openai",
+      baseUrl: "https://opencode.ai/zen/v1",
+      id: "deepseek-v4-pro",
+      reasoning: true,
     };
-    const messages = convertMessages(model, { messages: [toolCallMessage] }, compat);
-    const assistant = messages.find(m => m.role === "assistant");
-    expect(assistant).toBeDefined();
-    expect(Reflect.get(assistant as object, "reasoning_content")).toBe(".");
-  });
-
-  it("injects reasoning_content placeholder when assistant with tool calls has no reasoning field", () => {
-    const model = kimiOpenCodeModel("kimi-k2.5");
     const compat = detectCompat(model);
-    const toolCallMessage: AssistantMessage = {
-      role: "assistant",
-      content: [
-        // Thinking returned as plain text (as kimi-k2.5 on opencode-go does)
-        { type: "text", text: "Let me research this." },
-        {
-          type: "toolCall",
-          id: "call_abc123",
-          name: "web_search",
-          arguments: { query: "beads gastownhall" },
-        },
-      ],
-      api: model.api,
-      provider: model.provider,
-      model: model.id,
-      usage: {
-        input: 0,
-        output: 0,
-        cacheRead: 0,
-        cacheWrite: 0,
-        totalTokens: 0,
-        cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0, total: 0 },
-      },
-      stopReason: "toolUse",
-      timestamp: Date.now(),
-    };
-    const messages = convertMessages(model, { messages: [toolCallMessage] }, compat);
-    const assistant = messages.find(m => m.role === "assistant");
-    expect(assistant).toBeDefined();
-    const reasoningContent = Reflect.get(assistant as object, "reasoning_content");
-    expect(reasoningContent).toBeDefined();
-    expect(typeof reasoningContent).toBe("string");
-    expect((reasoningContent as string).length).toBeGreaterThan(0);
-  });
-
-  it("does not require reasoning_content when opencode-go model is not reasoning-capable", () => {
-    const compat = detectCompat(openCodeGoModel("some-other-model", false));
-    expect(compat.requiresReasoningContentForToolCalls).toBe(false);
+    expect(compat.requiresReasoningContentForToolCalls).toBe(true);
   });

-  it.each(["kimi-k2.5", "kimi-k1.5", "kimi-k2-5"])("matches kimi model id: %s", id => {
-    const compat = detectCompat(kimiOpenCodeModel(id));
+  it.each(["opencode-go", "opencode-zen"] as const)(
+    "injects reasoning_content placeholder for reasoning DeepSeek tool-call turns via %s",
+    provider => {
+      const model = openCodeModel(provider, "deepseek-v4-pro", true);
+      const compat = detectCompat(model);
+      const toolCallMessage: AssistantMessage = {
+        role: "assistant",
+        content: [{ type: "toolCall", id: `call_ds_${provider}`, name: "web_search", arguments: { query: "hi" } }],
+        api: model.api,
+        provider: model.provider,
+        model: model.id,
+        usage: {
+          input: 0,
+          output: 0,
+          cacheRead: 0,
+          cacheWrite: 0,
+          totalTokens: 0,
+          cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0, total: 0 },
+        },
+        stopReason: "toolUse",
+        timestamp: Date.now(),
+      };
+      const messages = convertMessages(model, { messages: [toolCallMessage] }, compat);
+      const assistant = messages.find(m => m.role === "assistant");
+      expect(assistant).toBeDefined();
+      expect(Reflect.get(assistant as object, "reasoning_content")).toBe(".");
+    },
+  );
+
+  it.each(["opencode-go", "opencode-zen"] as const)(
+    "does not require reasoning_content when %s model is not reasoning-capable",
+    provider => {
+      const compat = detectCompat(openCodeModel(provider, "some-other-model", false));
+      expect(compat.requiresReasoningContentForToolCalls).toBe(false);
+    },
+  );
+
+  it.each(["kimi-k2.5", "kimi-k1.5", "kimi-k2-5"])("matches kimi model id pattern via opencode-zen: %s", id => {
+    const compat = detectCompat(openCodeModel("opencode-zen", id, true));
     expect(compat.requiresReasoningContentForToolCalls).toBe(true);
   });

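The gating rule that PATCH 2 converges on can be stated as one standalone predicate. A minimal sketch, assuming a flattened input shape: `CompatInput` and its fields are illustrative stand-ins for the real `Model` object that `detectOpenAICompat` inspects, but the boolean logic mirrors the final form of the patch (`isKimiModel || ((isOpenRouter || isOpenCode) && reasoning)`).

```typescript
// Illustrative input shape; the real code reads these off a Model object.
interface CompatInput {
  provider: string;
  baseUrl: string;
  modelId: string;
  reasoning: boolean;
}

function requiresReasoningContentForToolCalls(m: CompatInput): boolean {
  // Kimi ids match either the OpenRouter-style prefix or a bare kimi-* / kimi.* id.
  const isKimi = m.modelId.includes("moonshotai/kimi") || /^kimi[-.]/i.test(m.modelId);
  const isOpenRouter = m.provider === "openrouter" || m.baseUrl.includes("openrouter.ai");
  // "opencode.ai/zen" covers both the Zen and the Zen/Go base URLs.
  const isOpenCode =
    m.provider === "opencode-zen" ||
    m.provider === "opencode-go" ||
    m.baseUrl.includes("opencode.ai/zen");
  // Kimi always demands the field; other models only when they are
  // reasoning-capable AND reached through OpenRouter or OpenCode.
  return isKimi || ((isOpenRouter || isOpenCode) && m.reasoning);
}
```

Note that matching on the base URL as well as the provider id is what makes the "custom openai provider targets opencode zen baseUrl" test case pass: the rule keys off where the request actually goes, not just how the provider is labelled.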
From 76c1fe9ee083836ecca43900fefc458c8cf4c4fb Mon Sep 17 00:00:00 2001
From: sonhyrd <son.hong.do@hyrd.ai>
Date: Mon, 27 Apr 2026 00:14:27 +0700
Subject: [PATCH 3/5] test(ai): restore non-kimi coverage while adding
 opencode-zen cases

---
 .../ai/test/openai-completions-compat.test.ts | 215 +++++++++++++-----
 1 file changed, 154 insertions(+), 61 deletions(-)

diff --git a/packages/ai/test/openai-completions-compat.test.ts b/packages/ai/test/openai-completions-compat.test.ts
index c743dd246..8b8cef393 100644
--- a/packages/ai/test/openai-completions-compat.test.ts
+++ b/packages/ai/test/openai-completions-compat.test.ts
@@ -282,38 +282,56 @@ describe("openai-completions compatibility", () => {
   });
 });

-describe("opencode reasoning-content compatibility via detectCompat", () => {
-  type OpenCodeProvider = "opencode-go" | "opencode-zen";
+describe("kimi model detection via detectCompat", () => {
+  function openCodeGoModel(id: string, reasoning = true): Model<"openai-completions"> {
+    return {
+      ...getBundledModel("openai", "gpt-4o-mini"),
+      api: "openai-completions",
+      provider: "opencode-go",
+      baseUrl: "https://opencode.ai/zen/go/v1",
+      id,
+      reasoning,
+    };
+  }

-  function openCodeModel(provider: OpenCodeProvider, id: string, reasoning = true): Model<"openai-completions"> {
-    const baseUrl = provider === "opencode-go" ? "https://opencode.ai/zen/go/v1" : "https://opencode.ai/zen/v1";
+  function openCodeZenModel(id: string, reasoning = true): Model<"openai-completions"> {
     return {
       ...getBundledModel("openai", "gpt-4o-mini"),
       api: "openai-completions",
-      provider,
-      baseUrl,
+      provider: "opencode-zen",
+      baseUrl: "https://opencode.ai/zen/v1",
       id,
       reasoning,
     };
   }

-  it.each(["opencode-go", "opencode-zen"] as const)(
-    "requires reasoning_content for tool calls on kimi-k2.5 via %s",
-    provider => {
-      const compat = detectCompat(openCodeModel(provider, "kimi-k2.5", true));
-      expect(compat.requiresReasoningContentForToolCalls).toBe(true);
-      expect(compat.requiresAssistantContentForToolCalls).toBe(true);
-    },
-  );
-
-  it.each(["opencode-go", "opencode-zen"] as const)(
-    "requires reasoning_content for tool calls on reasoning DeepSeek models via %s",
-    provider => {
-      const compat = detectCompat(openCodeModel(provider, "deepseek-v4-pro", true));
-      expect(compat.requiresReasoningContentForToolCalls).toBe(true);
-      expect(compat.requiresAssistantContentForToolCalls).toBe(false);
-    },
-  );
+  function kimiOpenCodeModel(id: string): Model<"openai-completions"> {
+    return openCodeGoModel(id, true);
+  }
+
+  it("requires reasoning_content for tool calls on kimi-k2.5 (opencode-go)", () => {
+    const compat = detectCompat(kimiOpenCodeModel("kimi-k2.5"));
+    expect(compat.requiresReasoningContentForToolCalls).toBe(true);
+    expect(compat.requiresAssistantContentForToolCalls).toBe(true);
+  });
+
+  it("requires reasoning_content for tool calls on kimi-k2.5 (opencode-zen)", () => {
+    const compat = detectCompat(openCodeZenModel("kimi-k2.5", true));
+    expect(compat.requiresReasoningContentForToolCalls).toBe(true);
+    expect(compat.requiresAssistantContentForToolCalls).toBe(true);
+  });
+
+  it("requires reasoning_content for tool calls on reasoning DeepSeek models via opencode-go", () => {
+    const compat = detectCompat(openCodeGoModel("deepseek-v4-pro", true));
+    expect(compat.requiresReasoningContentForToolCalls).toBe(true);
+    expect(compat.requiresAssistantContentForToolCalls).toBe(false);
+  });
+
+  it("requires reasoning_content for tool calls on reasoning DeepSeek models via opencode-zen", () => {
+    const compat = detectCompat(openCodeZenModel("deepseek-v4-pro", true));
+    expect(compat.requiresReasoningContentForToolCalls).toBe(true);
+    expect(compat.requiresAssistantContentForToolCalls).toBe(false);
+  });

   it("requires reasoning_content when custom openai provider targets opencode zen baseUrl", () => {
     const model: Model<"openai-completions"> = {
@@ -328,45 +346,120 @@ describe("opencode reasoning-content compatibility via detectCompat", () => {
     expect(compat.requiresReasoningContentForToolCalls).toBe(true);
   });

-  it.each(["opencode-go", "opencode-zen"] as const)(
-    "injects reasoning_content placeholder for reasoning DeepSeek tool-call turns via %s",
-    provider => {
-      const model = openCodeModel(provider, "deepseek-v4-pro", true);
-      const compat = detectCompat(model);
-      const toolCallMessage: AssistantMessage = {
-        role: "assistant",
-        content: [{ type: "toolCall", id: `call_ds_${provider}`, name: "web_search", arguments: { query: "hi" } }],
-        api: model.api,
-        provider: model.provider,
-        model: model.id,
-        usage: {
-          input: 0,
-          output: 0,
-          cacheRead: 0,
-          cacheWrite: 0,
-          totalTokens: 0,
-          cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0, total: 0 },
+  it("injects reasoning_content placeholder for reasoning DeepSeek tool-call turns via opencode-go", () => {
+    const model = openCodeGoModel("deepseek-v4-pro", true);
+    const compat = detectCompat(model);
+    const toolCallMessage: AssistantMessage = {
+      role: "assistant",
+      content: [{ type: "toolCall", id: "call_ds_go", name: "web_search", arguments: { query: "hi" } }],
+      api: model.api,
+      provider: model.provider,
+      model: model.id,
+      usage: {
+        input: 0,
+        output: 0,
+        cacheRead: 0,
+        cacheWrite: 0,
+        totalTokens: 0,
+        cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0, total: 0 },
+      },
+      stopReason: "toolUse",
+      timestamp: Date.now(),
+    };
+    const messages = convertMessages(model, { messages: [toolCallMessage] }, compat);
+    const assistant = messages.find(m => m.role === "assistant");
+    expect(assistant).toBeDefined();
+    expect(Reflect.get(assistant as object, "reasoning_content")).toBe(".");
+  });
+
+  it("injects reasoning_content placeholder for reasoning DeepSeek tool-call turns via opencode-zen", () => {
+    const model = openCodeZenModel("deepseek-v4-pro", true);
+    const compat = detectCompat(model);
+    const toolCallMessage: AssistantMessage = {
+      role: "assistant",
+      content: [{ type: "toolCall", id: "call_ds_zen", name: "web_search", arguments: { query: "hi" } }],
+      api: model.api,
+      provider: model.provider,
+      model: model.id,
+      usage: {
+        input: 0,
+        output: 0,
+        cacheRead: 0,
+        cacheWrite: 0,
+        totalTokens: 0,
+        cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0, total: 0 },
+      },
+      stopReason: "toolUse",
+      timestamp: Date.now(),
+    };
+    const messages = convertMessages(model, { messages: [toolCallMessage] }, compat);
+    const assistant = messages.find(m => m.role === "assistant");
+    expect(assistant).toBeDefined();
+    expect(Reflect.get(assistant as object, "reasoning_content")).toBe(".");
+  });
+
+  it("injects reasoning_content placeholder when assistant with tool calls has no reasoning field", () => {
+    const model = kimiOpenCodeModel("kimi-k2.5");
+    const compat = detectCompat(model);
+    const toolCallMessage: AssistantMessage = {
|
||||
+ role: "assistant",
|
||||
+ content: [
|
||||
+ // Thinking returned as plain text (as kimi-k2.5 on opencode-go does)
|
||||
+ { type: "text", text: "Let me research this." },
|
||||
+ {
|
||||
+ type: "toolCall",
|
||||
+ id: "call_abc123",
|
||||
+ name: "web_search",
|
||||
+ arguments: { query: "beads gastownhall" },
|
||||
},
|
||||
- stopReason: "toolUse",
|
||||
- timestamp: Date.now(),
|
||||
- };
|
||||
- const messages = convertMessages(model, { messages: [toolCallMessage] }, compat);
|
||||
- const assistant = messages.find(m => m.role === "assistant");
|
||||
- expect(assistant).toBeDefined();
|
||||
- expect(Reflect.get(assistant as object, "reasoning_content")).toBe(".");
|
||||
- },
|
||||
- );
|
||||
-
|
||||
- it.each(["opencode-go", "opencode-zen"] as const)(
|
||||
- "does not require reasoning_content when %s model is not reasoning-capable",
|
||||
- provider => {
|
||||
- const compat = detectCompat(openCodeModel(provider, "some-other-model", false));
|
||||
- expect(compat.requiresReasoningContentForToolCalls).toBe(false);
|
||||
- },
|
||||
- );
|
||||
-
|
||||
- it.each(["kimi-k2.5", "kimi-k1.5", "kimi-k2-5"])("matches kimi model id pattern via opencode-zen: %s", id => {
|
||||
- const compat = detectCompat(openCodeModel("opencode-zen", id, true));
|
||||
+ ],
|
||||
+ api: model.api,
|
||||
+ provider: model.provider,
|
||||
+ model: model.id,
|
||||
+ usage: {
|
||||
+ input: 0,
|
||||
+ output: 0,
|
||||
+ cacheRead: 0,
|
||||
+ cacheWrite: 0,
|
||||
+ totalTokens: 0,
|
||||
+ cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0, total: 0 },
|
||||
+ },
|
||||
+ stopReason: "toolUse",
|
||||
+ timestamp: Date.now(),
|
||||
+ };
|
||||
+ const messages = convertMessages(model, { messages: [toolCallMessage] }, compat);
|
||||
+ const assistant = messages.find(m => m.role === "assistant");
|
||||
+ expect(assistant).toBeDefined();
|
||||
+ const reasoningContent = Reflect.get(assistant as object, "reasoning_content");
|
||||
+ expect(reasoningContent).toBeDefined();
|
||||
+ expect(typeof reasoningContent).toBe("string");
|
||||
+ expect((reasoningContent as string).length).toBeGreaterThan(0);
|
||||
+ });
|
||||
+
|
||||
+ it("does not inject reasoning_content when model is not kimi", () => {
|
||||
+ const model: Model<"openai-completions"> = {
|
||||
+ ...getBundledModel("openai", "gpt-4o-mini"),
|
||||
+ api: "openai-completions",
|
||||
+ provider: "opencode-go",
|
||||
+ baseUrl: "https://opencode.ai/zen/go/v1",
|
||||
+ id: "some-other-model",
|
||||
+ };
|
||||
+ const compat = detectCompat(model);
|
||||
+ expect(compat.requiresReasoningContentForToolCalls).toBe(false);
|
||||
+ });
|
||||
+
|
||||
+ it("does not require reasoning_content when opencode-go model is not reasoning-capable", () => {
|
||||
+ const compat = detectCompat(openCodeGoModel("some-other-model", false));
|
||||
+ expect(compat.requiresReasoningContentForToolCalls).toBe(false);
|
||||
+ });
|
||||
+
|
||||
+ it("does not require reasoning_content when opencode-zen model is not reasoning-capable", () => {
|
||||
+ const compat = detectCompat(openCodeZenModel("some-other-model", false));
|
||||
+ expect(compat.requiresReasoningContentForToolCalls).toBe(false);
|
||||
+ });
|
||||
+
|
||||
+ it.each(["kimi-k2.5", "kimi-k1.5", "kimi-k2-5"])("matches kimi model id: %s", id => {
|
||||
+ const compat = detectCompat(kimiOpenCodeModel(id));
|
||||
expect(compat.requiresReasoningContentForToolCalls).toBe(true);
|
||||
});
|
||||
|
||||
|
||||
From 9c7a8958c682b16990504500551827320508087d Mon Sep 17 00:00:00 2001
From: sonhyrd <son.hong.do@hyrd.ai>
Date: Mon, 27 Apr 2026 00:29:48 +0700
Subject: [PATCH 4/5] fix(ai/providers): gate reasoning_content stubs on
 deepseek models

---
 .../providers/openai-completions-compat.ts         |  7 ++--
 packages/ai/src/providers/openai-completions.ts    |  4 +--
 packages/ai/test/openai-completions-compat.test.ts | 36 +++++++++++++++++++
 3 files changed, 42 insertions(+), 5 deletions(-)

diff --git a/packages/ai/src/providers/openai-completions-compat.ts b/packages/ai/src/providers/openai-completions-compat.ts
index b4825a31c..bba1cef70 100644
--- a/packages/ai/src/providers/openai-completions-compat.ts
+++ b/packages/ai/src/providers/openai-completions-compat.ts
@@ -54,6 +54,7 @@ export function detectOpenAICompat(model: Model<"openai-completions">, resolvedB
   const isKimiModel = model.id.includes("moonshotai/kimi") || /^kimi[-.]/i.test(model.id);
   const isAlibaba = provider === "alibaba-coding-plan" || baseUrl.includes("dashscope");
   const isQwen = model.id.toLowerCase().includes("qwen");
+  const isDeepSeekModel = model.id.toLowerCase().includes("deepseek");
   const isOpenRouter = provider === "openrouter" || baseUrl.includes("openrouter.ai");
   const isOpenCode = provider === "opencode-zen" || provider === "opencode-go" || baseUrl.includes("opencode.ai/zen");

@@ -109,12 +110,12 @@ export function detectOpenAICompat(model: Model<"openai-completions">, resolvedB
     reasoningContentField: "reasoning_content",
     // Backends that 400 follow-up requests when prior assistant tool-call turns lack `reasoning_content`:
     // - Kimi: documented invariant on its native API and via OpenCode.
-    // - Reasoning-capable models reached through OpenRouter or OpenCode (Zen/Go): DeepSeek V4 Pro,
-    //   Kimi, and similar models can enforce this server-side whenever the request is in thinking mode.
+    // - DeepSeek reasoning models reached through OpenRouter or OpenCode (Zen/Go): enforced when
+    //   thinking mode is enabled on those model families.
     // We can't translate Anthropic's redacted/encrypted reasoning into DeepSeek's plaintext form, so
     // cross-provider continuations rely on a placeholder — see `convertMessages` for injection rules.
     requiresReasoningContentForToolCalls:
-      isKimiModel || ((isOpenRouter || isOpenCode) && Boolean(model.reasoning)),
+      isKimiModel || (isDeepSeekModel && (isOpenRouter || isOpenCode) && Boolean(model.reasoning)),
     requiresAssistantContentForToolCalls: isKimiModel,
     openRouterRouting: undefined,
     vercelGatewayRouting: undefined,
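The gating rule this hunk introduces can be sketched as a standalone predicate; `CompatInput` and the function name here are illustrative stand-ins, not the package's real API:

```typescript
// Sketch of the compat gate from the hunk above (illustrative names).
interface CompatInput {
  id: string;
  provider: string;
  baseUrl: string;
  reasoning?: boolean;
}

function requiresReasoningContentForToolCalls(m: CompatInput): boolean {
  const isKimi = m.id.includes("moonshotai/kimi") || /^kimi[-.]/i.test(m.id);
  const isDeepSeek = m.id.toLowerCase().includes("deepseek");
  const isOpenRouter = m.provider === "openrouter" || m.baseUrl.includes("openrouter.ai");
  const isOpenCode =
    m.provider === "opencode-zen" ||
    m.provider === "opencode-go" ||
    m.baseUrl.includes("opencode.ai/zen");
  // Kimi always needs the field; DeepSeek only when reasoning-capable AND
  // reached through OpenRouter or OpenCode. Other reasoning models no longer
  // trigger the stub, which is the point of this patch.
  return isKimi || (isDeepSeek && (isOpenRouter || isOpenCode) && Boolean(m.reasoning));
}
```

The fix narrows the old predicate, which stubbed `reasoning_content` for every reasoning model on those gateways, to the DeepSeek family only.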
diff --git a/packages/ai/src/providers/openai-completions.ts b/packages/ai/src/providers/openai-completions.ts
index e25aeffb3..89a997a0f 100644
--- a/packages/ai/src/providers/openai-completions.ts
+++ b/packages/ai/src/providers/openai-completions.ts
@@ -1213,8 +1213,8 @@ export function convertMessages(
   // Inject a `reasoning_content` placeholder on assistant tool-call turns when the backend
   // rejects history without it. The compat flag captures the rule:
   // - Kimi (native or via OpenCode Zen/Go): chat completion endpoint demands the field.
-  // - Reasoning models reached through OpenRouter or OpenCode Zen/Go (e.g. DeepSeek V4 Pro):
-  //   the upstream thinking-mode validator demands it on every prior assistant turn. omp
+  // - DeepSeek reasoning models reached through OpenRouter or OpenCode Zen/Go: the upstream
+  //   thinking-mode validator demands it on every prior assistant turn. omp
   // cannot synthesize real reasoning when the conversation was warmed up by another
   // provider whose reasoning is redacted/encrypted (Anthropic) or simply absent, so we
   // emit a placeholder. Real captured reasoning, when present, is preserved earlier via
diff --git a/packages/ai/test/openai-completions-compat.test.ts b/packages/ai/test/openai-completions-compat.test.ts
index 8b8cef393..c083c2151 100644
--- a/packages/ai/test/openai-completions-compat.test.ts
+++ b/packages/ai/test/openai-completions-compat.test.ts
@@ -333,6 +333,29 @@ describe("kimi model detection via detectCompat", () => {
     expect(compat.requiresAssistantContentForToolCalls).toBe(false);
   });

+  it("does not require reasoning_content for non-DeepSeek reasoning models via opencode-go", () => {
+    const compat = detectCompat(openCodeGoModel("glm-5", true));
+    expect(compat.requiresReasoningContentForToolCalls).toBe(false);
+  });
+
+  it("does not require reasoning_content for non-DeepSeek reasoning models via opencode-zen", () => {
+    const compat = detectCompat(openCodeZenModel("glm-5", true));
+    expect(compat.requiresReasoningContentForToolCalls).toBe(false);
+  });
+
+  it("does not require reasoning_content when custom openai provider targets opencode zen baseUrl with non-DeepSeek model", () => {
+    const model: Model<"openai-completions"> = {
+      ...getBundledModel("openai", "gpt-4o-mini"),
+      api: "openai-completions",
+      provider: "openai",
+      baseUrl: "https://opencode.ai/zen/v1",
+      id: "glm-5",
+      reasoning: true,
+    };
+    const compat = detectCompat(model);
+    expect(compat.requiresReasoningContentForToolCalls).toBe(false);
+  });
+
   it("requires reasoning_content when custom openai provider targets opencode zen baseUrl", () => {
     const model: Model<"openai-completions"> = {
       ...getBundledModel("openai", "gpt-4o-mini"),
@@ -453,6 +476,19 @@ describe("kimi model detection via detectCompat", () => {
     expect(compat.requiresReasoningContentForToolCalls).toBe(false);
   });

+  it("does not require reasoning_content for non-DeepSeek reasoning models via openrouter", () => {
+    const model: Model<"openai-completions"> = {
+      ...getBundledModel("openai", "gpt-4o-mini"),
+      api: "openai-completions",
+      provider: "openrouter",
+      baseUrl: "https://openrouter.ai/api/v1",
+      id: "openai/gpt-4.1-mini",
+      reasoning: true,
+    };
+    const compat = detectCompat(model);
+    expect(compat.requiresReasoningContentForToolCalls).toBe(false);
+  });
+
   it("does not require reasoning_content when opencode-zen model is not reasoning-capable", () => {
     const compat = detectCompat(openCodeZenModel("some-other-model", false));
     expect(compat.requiresReasoningContentForToolCalls).toBe(false);

From 53a03286cf658bb4aeab67dad3246b7ba80cf244 Mon Sep 17 00:00:00 2001
From: sonhyrd <son.hong.do@hyrd.ai>
Date: Mon, 27 Apr 2026 00:52:22 +0700
Subject: [PATCH 5/5] fix(ai/providers): set content when reasoning placeholder
 is injected

---
 packages/ai/src/providers/openai-completions.ts    | 3 ++-
 packages/ai/test/openai-completions-compat.test.ts | 2 ++
 2 files changed, 4 insertions(+), 1 deletion(-)

diff --git a/packages/ai/src/providers/openai-completions.ts b/packages/ai/src/providers/openai-completions.ts
index 89a997a0f..b490e254e 100644
--- a/packages/ai/src/providers/openai-completions.ts
+++ b/packages/ai/src/providers/openai-completions.ts
@@ -1206,7 +1206,7 @@ export function convertMessages(
   }

   const toolCalls = msg.content.filter(b => b.type === "toolCall") as ToolCall[];
-  const hasReasoningField =
+  let hasReasoningField =
     (assistantMsg as any).reasoning_content !== undefined ||
     (assistantMsg as any).reasoning !== undefined ||
     (assistantMsg as any).reasoning_text !== undefined;
@@ -1227,6 +1227,7 @@ export function convertMessages(
   if (toolCalls.length > 0 && stubsReasoningContent && !hasReasoningField) {
     const reasoningField = compat.reasoningContentField ?? "reasoning_content";
     (assistantMsg as any)[reasoningField] = ".";
+    hasReasoningField = true;
   }
   if (toolCalls.length > 0) {
     assistantMsg.tool_calls = toolCalls.map((tc, toolCallIndex) => {
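The injection rule after patches 4/5 and 5/5 can be sketched in isolation; the wire-message shape below is an assumption for illustration, not the package's actual types:

```typescript
// Hedged sketch of the placeholder injection described above. The message
// shape is an assumed simplification, not the real AssistantMessage type.
type AssistantWireMessage = {
  role: "assistant";
  content?: string;
  tool_calls?: unknown[];
  [field: string]: unknown;
};

function stubReasoning(
  msg: AssistantWireMessage,
  compat: { stubsReasoningContent: boolean; reasoningContentField?: string },
): AssistantWireMessage {
  const hasToolCalls = (msg.tool_calls?.length ?? 0) > 0;
  // Any of the known reasoning field spellings counts as "already present".
  const hasReasoningField =
    msg.reasoning_content !== undefined ||
    msg.reasoning !== undefined ||
    msg.reasoning_text !== undefined;
  if (hasToolCalls && compat.stubsReasoningContent && !hasReasoningField) {
    const field = compat.reasoningContentField ?? "reasoning_content";
    msg[field] = "."; // placeholder: real cross-provider reasoning may be redacted or absent
    msg.content = msg.content ?? ""; // keep content set alongside the stub (PATCH 5/5)
  }
  return msg;
}
```

Only tool-call turns that lack every reasoning field get the stub; turns with real captured reasoning are left untouched.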
diff --git a/packages/ai/test/openai-completions-compat.test.ts b/packages/ai/test/openai-completions-compat.test.ts
index c083c2151..8efae899a 100644
--- a/packages/ai/test/openai-completions-compat.test.ts
+++ b/packages/ai/test/openai-completions-compat.test.ts
@@ -393,6 +393,7 @@ describe("kimi model detection via detectCompat", () => {
     const assistant = messages.find(m => m.role === "assistant");
     expect(assistant).toBeDefined();
     expect(Reflect.get(assistant as object, "reasoning_content")).toBe(".");
+    expect(Reflect.get(assistant as object, "content")).toBe("");
   });

   it("injects reasoning_content placeholder for reasoning DeepSeek tool-call turns via opencode-zen", () => {
@@ -419,6 +420,7 @@ describe("kimi model detection via detectCompat", () => {
     const assistant = messages.find(m => m.role === "assistant");
     expect(assistant).toBeDefined();
     expect(Reflect.get(assistant as object, "reasoning_content")).toBe(".");
+    expect(Reflect.get(assistant as object, "content")).toBe("");
   });

   it("injects reasoning_content placeholder when assistant with tool calls has no reasoning field", () => {
@@ -0,0 +1,115 @@
From cf7b9a9fc53023cbaca5a128ece32d76cafe95d5 Mon Sep 17 00:00:00 2001
From: Oscar Cowdery Lack <oscar.cowderylack@gmail.com>
Date: Mon, 30 Mar 2026 00:05:49 +1100
Subject: [PATCH] server: Use provided secret to unlock auto-created default
 keyring (#443)

If a secret is provided by PAM or systemd credentials, then it should be
used to unlock the default keyring when creating it for the first time,
not just when discovering existing keyrings.
---
 src/service/mod.rs | 36 +++++++++++++++++++++++++-----------
 src/tests.rs       |  4 +++-
 2 files changed, 28 insertions(+), 12 deletions(-)

diff --git a/src/service/mod.rs b/src/service/mod.rs
index bfbe16d..44e55c2 100644
--- a/src/service/mod.rs
+++ b/src/service/mod.rs
@@ -415,10 +415,10 @@ impl Service {
             .await?;

         // Discover existing keyrings
-        let discovered_keyrings = service.discover_keyrings(secret).await?;
+        let discovered_keyrings = service.discover_keyrings(secret.clone()).await?;

         service
-            .initialize(connection, discovered_keyrings, true)
+            .initialize(connection, discovered_keyrings, secret, true)
             .await?;

         // Start PAM listener
@@ -458,7 +458,7 @@ impl Service {
         )
         .await?;

-        let default_keyring = if let Some(secret) = secret {
+        let default_keyring = if let Some(secret) = secret.clone() {
             vec![(
                 "Login".to_owned(),
                 oo7::dbus::Service::DEFAULT_COLLECTION.to_owned(),
@@ -469,7 +469,7 @@ impl Service {
         };

         service
-            .initialize(connection, default_keyring, false)
+            .initialize(connection, default_keyring, secret, false)
             .await?;
         Ok(service)
     }
@@ -686,6 +686,7 @@ impl Service {
         &self,
         connection: zbus::Connection,
         mut discovered_keyrings: Vec<(String, String, Keyring)>, // (name, alias, keyring)
+        secret: Option<Secret>,
         auto_create_default: bool,
     ) -> Result<(), Error> {
         self.connection.set(connection.clone()).unwrap();
@@ -701,19 +702,32 @@ impl Service {
         if !has_default && auto_create_default {
             tracing::info!("No default collection found, creating 'Login' keyring");

-            let locked_keyring = LockedKeyring::open(Self::LOGIN_ALIAS)
-                .await
-                .inspect_err(|e| {
-                    tracing::error!("Failed to create default Login keyring: {}", e);
-                })?;
+            let keyring = if let Some(secret) = secret {
+                UnlockedKeyring::open(Self::LOGIN_ALIAS, secret)
+                    .await
+                    .map(Keyring::Unlocked)
+            } else {
+                LockedKeyring::open(Self::LOGIN_ALIAS)
+                    .await
+                    .map(Keyring::Locked)
+            };
+
+            let keyring = keyring.inspect_err(|e| {
+                tracing::error!("Failed to create default Login keyring: {}", e);
+            })?;

+            let is_locked = if keyring.is_locked() {
+                "locked"
+            } else {
+                "unlocked"
+            };
             discovered_keyrings.push((
                 "Login".to_owned(),
                 oo7::dbus::Service::DEFAULT_COLLECTION.to_owned(),
-                Keyring::Locked(locked_keyring),
+                keyring,
             ));

-            tracing::info!("Created default 'Login' collection (locked)");
+            tracing::info!("Created default 'Login' collection ({})", is_locked);
         }

         // Set up discovered collections
diff --git a/src/tests.rs b/src/tests.rs
index 16aa0bb..07fb27c 100644
--- a/src/tests.rs
+++ b/src/tests.rs
@@ -254,7 +254,9 @@ impl TestServiceSetup {
             .await?;

         let discovered = service.discover_keyrings(secret.clone()).await?;
-        service.initialize(server_conn, discovered, false).await?;
+        service
+            .initialize(server_conn, discovered, secret.clone(), false)
+            .await?;

         #[cfg(any(feature = "gnome_native_crypto", feature = "gnome_openssl_crypto"))]
         let mock_prompter = {
--
2.53.0

scripts/bootstrap-desktop-tpm.sh (54 lines, new executable file)
@@ -0,0 +1,54 @@
#!/usr/bin/env bash
# Bootstrap the age-plugin-tpm identity for a desktop host (mreow / yarn).
#
# Produces a TPM-sealed age identity at /var/lib/agenix/tpm-identity and
# prints the recipient string to add to secrets/secrets.nix.
#
# Usage:
#   doas scripts/bootstrap-desktop-tpm.sh
#
# After running:
#   1. Append the printed recipient to the `tpm` list in secrets/secrets.nix.
#   2. Re-encrypt: nix-shell -p age-plugin-tpm rage --run \
#        'agenix -r -i ~/.ssh/id_ed25519'
#   3. Commit + ./deploy.sh switch.

set -euo pipefail

if [[ $EUID -ne 0 ]]; then
  echo "this script must run as root (access to /dev/tpmrm0 + /var/lib/agenix)" >&2
  exit 1
fi

host=$(hostname -s)
id_file=/var/lib/agenix/tpm-identity

install -d -m 0700 -o root -g root /var/lib/agenix

if [[ -f "$id_file" ]]; then
  echo "existing identity found at $id_file — preserving"
else
  echo "generating TPM-sealed age identity..."
  nix-shell -p age-plugin-tpm --run "age-plugin-tpm --generate -o $id_file"
  chmod 0400 "$id_file"
  chown root:root "$id_file"
fi

# Read the recipient directly from the identity file header — no TPM
# round-trip needed, no nix run, no set -e hazards.
recipient=$(grep '^# Recipient:' "$id_file" | awk '{print $3}')
if [[ -z "$recipient" ]]; then
  echo "failed to read recipient from $id_file" >&2
  exit 1
fi

cat <<EOF

recipient for $host:
  "$recipient $host"

next steps (run on a workstation with git-crypt unlocked):
  1. edit secrets/secrets.nix and add the line above to the \`tpm\` list.
  2. re-encrypt: nix-shell -p age-plugin-tpm rage --run 'agenix -r -i ~/.ssh/id_ed25519'
  3. git commit + ./deploy.sh switch
EOF
secrets/desktop/nix-cache-netrc.age      (binary, new file, not shown)
secrets/desktop/oo7-keyring-password.age (binary, new file, not shown)
secrets/desktop/password-hash.age        (binary, new file, not shown)
secrets/desktop/secureboot.tar.age       (binary, new file, not shown)
secrets/secrets.nix                      (binary, new file, not shown)
@@ -13,6 +13,7 @@ let

   curl = "${pkgs.curl}/bin/curl";
   jq = "${pkgs.jq}/bin/jq";
+  shuf = "${pkgs.coreutils}/bin/shuf";

   # Max items to search per cycle per category (missing + cutoff) per app
   maxPerCycle = 5;
@@ -54,10 +55,16 @@
       local label="$2"

       local series_ids
-      series_ids=$(${curl} -sf --max-time 30 \
+      # Fetch the full wanted list, dedupe to seriesIds, then randomly
+      # sample maxPerCycle. Sonarr's wanted endpoint returns one record
+      # per episode, so a small pageSize collapses to a single seriesId
+      # whenever any one show dominates the alphabetical head of the
+      # backlog -- which starves every other show indefinitely.
+      series_ids=$(${curl} -sf --max-time 60 \
         -H "X-Api-Key: $SONARR_KEY" \
-        "${sonarrUrl}/api/v3/wanted/$endpoint?page=1&pageSize=${builtins.toString maxPerCycle}&monitored=true&sortKey=title&sortDirection=ascending&includeSeries=true" \
-        | ${jq} -r '[.records[].seriesId] | unique | .[] // empty')
+        "${sonarrUrl}/api/v3/wanted/$endpoint?page=1&pageSize=5000&monitored=true" \
+        | ${jq} -r '[.records[].seriesId] | unique | .[]' \
+        | ${shuf} -n ${builtins.toString maxPerCycle})

       if [ -z "$series_ids" ]; then
         echo "sonarr: no $label items"
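The sampling change above (dedupe per-episode records to series ids, then take a random sample instead of the alphabetical head) can be sketched outside the Nix/shell context; TypeScript here for illustration, and the function name is invented:

```typescript
// Illustrative sketch of the backlog-sampling fix: one record per episode is
// collapsed to unique series ids, then a random subset is taken (like
// `jq unique | shuf -n`), so no single show can monopolize a search cycle.
function sampleSeriesIds(records: { seriesId: number }[], maxPerCycle: number): number[] {
  const unique = [...new Set(records.map(r => r.seriesId))];
  // Fisher-Yates shuffle, then take the first maxPerCycle ids.
  for (let i = unique.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [unique[i], unique[j]] = [unique[j], unique[i]];
  }
  return unique.slice(0, maxPerCycle);
}
```

With the old per-page query, a show with hundreds of wanted episodes at the top of the alphabet filled every page of size `maxPerCycle`, so the deduped result was a single id each cycle.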
@@ -173,6 +173,19 @@ in
         ];
       }
       { name = "HDTV-720p"; }
+      # SD fallback for shows that predate HD or whose only seeded
+      # public-tracker copies are 480p/DVD/SDTV. Sonarr will still
+      # upgrade to WEB/Bluray (cutoff above) when an HD release
+      # surfaces.
+      {
+        name = "SD";
+        qualities = [
+          "WEBDL-480p"
+          "WEBRip-480p"
+          "DVD"
+          "SDTV"
+        ];
+      }
     ];
   }
 ];
@@ -57,6 +57,19 @@ def get_qbit_torrents(qbit_client, category: str) -> dict[str, dict]:
     return {t["hash"].upper(): t for t in torrents}


+def is_complete(torrent: dict) -> bool:
+    """True iff the torrent's payload is fully on disk.
+
+    A torrent that was once imported can later end up at progress < 1 if the
+    files were deleted or qBittorrent was reset and the torrent was re-added.
+    Those entries must NOT be reported as abandoned-safe: their reported size
+    is the metadata size, not what is actually on disk, so the reclaim figure
+    would be a fiction and a 'safe to delete' verdict could kill a re-grab in
+    progress.
+    """
+    return float(torrent.get("progress", 0)) >= 1.0
+
+
 def gib(size_bytes: int) -> str:
     return f"{size_bytes / 1073741824:.1f}"

@@ -133,6 +146,12 @@ def find_movie_abandoned(radarr, qbit_movies):
         torrent = qbit_movies.get(ahash)
         if torrent is None:
             continue
+        # Skip torrents whose payload is not fully on disk: their reported size
+        # is metadata, not actual on-disk bytes, so flagging them as
+        # abandoned-safe would lie about the reclaim and could disrupt a
+        # re-download in progress.
+        if not is_complete(torrent):
+            continue

         mid = hash_to_movie.get(ahash)
         movie = radarr_movies.get(mid) if mid else None
@@ -211,6 +230,8 @@ def find_tv_abandoned(sonarr, qbit_tvshows):
         torrent = qbit_tvshows.get(ahash)
         if torrent is None:
             continue
+        if not is_complete(torrent):
+            continue

         status = "SAFE"
         notes = []
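The guard added above reduces to one rule: only torrents with progress >= 1 count toward the reclaim figure. A sketch (TypeScript for illustration; the module itself is Python, and these names are invented):

```typescript
// Illustrative sketch of the completeness guard; types and names are invented.
interface TorrentInfo {
  hash: string;
  size: number;     // bytes as reported by the client (metadata size if incomplete)
  progress: number; // 0..1
}

// Incomplete payloads report metadata size rather than on-disk bytes, so they
// are excluded instead of inflating the "safe to reclaim" total.
function reclaimableBytes(candidates: TorrentInfo[]): number {
  return candidates
    .filter(t => t.progress >= 1.0)
    .reduce((sum, t) => sum + t.size, 0);
}
```

This also keeps a re-grab in progress from being flagged as safe to delete.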
@@ -2,6 +2,7 @@
   config,
   lib,
   pkgs,
+  site_config,
   service_configs,
   ...
 }:
@@ -25,7 +26,7 @@
     configurePostgres = true;
     config = {
       # Refer to https://github.com/dani-garcia/vaultwarden/blob/main/.env.template
-      DOMAIN = "https://bitwarden.${service_configs.https.domain}";
+      DOMAIN = "https://bitwarden.${site_config.domain}";
       SIGNUPS_ALLOWED = false;

       ROCKET_ADDRESS = "127.0.0.1";
@@ -34,7 +35,7 @@
     };
   };

-  services.caddy.virtualHosts."bitwarden.${service_configs.https.domain}".extraConfig = ''
+  services.caddy.virtualHosts."bitwarden.${site_config.domain}".extraConfig = ''
     encode zstd gzip

     reverse_proxy :${toString config.services.vaultwarden.config.ROCKET_PORT} {
@@ -1,5 +1,6 @@
 {
   config,
+  site_config,
   service_configs,
   pkgs,
   lib,
@@ -42,8 +43,8 @@ let
     '';
   };

-  newDomain = service_configs.https.domain;
-  oldDomain = service_configs.https.old_domain;
+  newDomain = site_config.domain;
+  oldDomain = site_config.old_domain;
 in
 {
   imports = [
@@ -54,7 +55,7 @@ in

   services.caddy = {
     enable = true;
-    email = "titaniumtown@proton.me";
+    email = site_config.contact_email;

     # Build with Njalla DNS provider for DNS-01 ACME challenges (wildcard certs)
     package = pkgs.caddy.withPlugins {
@@ -146,8 +147,9 @@ in
       # defaults: maxretry=5, findtime=10m, bantime=10m

       # Ignore local network IPs - NAT hairpinning causes all LAN traffic to
-      # appear from the router IP (192.168.1.1). Banning it blocks all internal access.
-      ignoreip = "127.0.0.1/8 ::1 192.168.1.0/24";
+      # appear from the router IP (site_config.lan.gateway). Banning it
+      # blocks all internal access.
+      ignoreip = "127.0.0.1/8 ::1 ${site_config.lan.cidr}";
     };
     filter.Definition = {
       # Only match 401s where an Authorization header was actually sent.
@@ -2,6 +2,7 @@
   config,
   lib,
   pkgs,
+  site_config,
   service_configs,
   inputs,
   ...
@@ -32,7 +33,7 @@ let
   };
 in
 {
-  services.caddy.virtualHosts."senior-project.${service_configs.https.domain}".extraConfig = ''
+  services.caddy.virtualHosts."senior-project.${site_config.domain}".extraConfig = ''
     root * ${hugoWebsite}
     file_server browse
   '';
@@ -34,6 +34,14 @@
     };
   };

+  users.users.gitea-runner = {
+    isSystemUser = true;
+    group = "gitea-runner";
+    home = "/var/lib/gitea-runner";
+    description = "Gitea Actions CI runner";
+  };
+  users.groups.gitea-runner = { };
+
   # Override DynamicUser to use our static gitea-runner user, and ensure
   # the runner doesn't start before the co-located gitea instance is ready
   # (upstream can't assume locality, so this dependency is ours to add).
@@ -49,6 +49,32 @@
     };
   };

+  # Hide repo Actions/workflow details from anonymous visitors. Gitea's own
+  # REQUIRE_SIGNIN_VIEW=expensive does not cover /{user}/{repo}/actions, and
+  # the API auth chain (routers/api/v1/api.go buildAuthGroup) deliberately
+  # omits `auth_service.Session`, so an /api/v1/user probe would 401 even
+  # for logged-in browser sessions. We gate at Caddy instead: forward_auth
+  # probes a lightweight *web-UI* endpoint that does accept session cookies,
+  # and Gitea's own reqSignIn middleware answers 303 to /user/login for
+  # anonymous callers which we rewrite to preserve the original URL.
+  # Workflow status badges stay public so README links keep rendering.
+  services.caddy.virtualHosts.${service_configs.gitea.domain}.extraConfig = ''
+    @repoActionsNotBadge {
+      path_regexp ^/[^/]+/[^/]+/actions(/.*)?$
+      not path_regexp ^/[^/]+/[^/]+/actions/workflows/[^/]+/badge\.svg$
+    }
+    handle @repoActionsNotBadge {
+      forward_auth :${toString service_configs.ports.private.gitea.port} {
+        uri /user/stopwatches
+
+        @unauthorized status 302 303
+        handle_response @unauthorized {
+          redir * /user/login?redirect_to={uri} 302
+        }
+      }
+    }
+  '';
+
   services.postgresql = {
     ensureDatabases = [ config.services.gitea.user ];
     ensureUsers = [
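The comment block above describes the gate's decision flow; a sketch in TypeScript (illustrative only; the real gate is the Caddy config in the hunk, and `probeStatus` stands in for what `GET /user/stopwatches` returned for the caller's session):

```typescript
// Sketch of the forward_auth decision described above. Hypothetical function;
// the actual implementation lives entirely in Caddy config.
function gateRepoActions(path: string, probeStatus: number): "pass" | "redirect-to-login" {
  const isActions = /^\/[^/]+\/[^/]+\/actions(\/.*)?$/.test(path);
  const isBadge = /^\/[^/]+\/[^/]+\/actions\/workflows\/[^/]+\/badge\.svg$/.test(path);
  if (!isActions || isBadge) return "pass"; // badges and non-Actions paths stay public
  // probeStatus: 200 for a signed-in session cookie; 302/303 (Gitea's reqSignIn)
  // for anonymous callers, which the gate rewrites into a login redirect that
  // preserves the original URL.
  return probeStatus === 200 ? "pass" : "redirect-to-login";
}
```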
@@ -687,6 +687,188 @@ let
          overrides = [ ];
        };
      }

      # -- Row 6: Minecraft --
      {
        id = 14;
        type = "stat";
        title = "Minecraft Players";
        gridPos = { h = 8; w = 6; x = 0; y = 40; };
        datasource = promDs;
        targets = [
          {
            datasource = promDs;
            expr = "sum(minecraft_status_players_online_count) or vector(0)";
            refId = "A";
          }
        ];
        fieldConfig = {
          defaults = {
            thresholds = {
              mode = "absolute";
              steps = [
                { color = "green"; value = null; }
                { color = "yellow"; value = 3; }
                { color = "red"; value = 6; }
              ];
            };
          };
          overrides = [ ];
        };
        options = {
          reduceOptions = {
            calcs = [ "lastNotNull" ];
            fields = "";
            values = false;
          };
          colorMode = "value";
          graphMode = "area";
        };
      }
      {
        id = 15;
        type = "stat";
        title = "Minecraft Server";
        gridPos = { h = 8; w = 6; x = 6; y = 40; };
        datasource = promDs;
        targets = [
          {
            datasource = promDs;
            expr = "max(minecraft_status_healthy) or vector(0)";
            refId = "A";
          }
        ];
        fieldConfig = {
          defaults = {
            mappings = [
              {
                type = "value";
                options = {
                  "0" = { text = "Offline"; color = "red"; index = 0; };
                  "1" = { text = "Online"; color = "green"; index = 1; };
                };
              }
            ];
            thresholds = {
              mode = "absolute";
              steps = [
                { color = "red"; value = null; }
                { color = "green"; value = 1; }
              ];
            };
          };
          overrides = [ ];
        };
        options = {
          reduceOptions = {
            calcs = [ "lastNotNull" ];
            fields = "";
            values = false;
          };
          colorMode = "value";
          graphMode = "none";
        };
      }
      {
        id = 16;
        type = "timeseries";
        title = "Minecraft Player Activity";
        gridPos = { h = 8; w = 12; x = 12; y = 40; };
        datasource = promDs;
        targets = [
          {
            datasource = promDs;
            expr = "sum(minecraft_status_players_online_count) or vector(0)";
            legendFormat = "Online players";
            refId = "A";
          }
          {
            datasource = promDs;
            expr = "max(minecraft_status_players_max_count) or vector(0)";
            legendFormat = "Max players";
            refId = "B";
          }
        ];
        fieldConfig = {
          defaults = {
            unit = "short";
            min = 0;
            decimals = 0;
            color.mode = "palette-classic";
            custom = {
              lineWidth = 2;
              fillOpacity = 15;
              spanNulls = true;
            };
          };
          overrides = [
            {
              matcher = { id = "byFrameRefID"; options = "B"; };
              properties = [
                { id = "custom.lineStyle"; value = { fill = "dash"; dash = [ 8 4 ]; }; }
                { id = "custom.fillOpacity"; value = 0; }
                { id = "custom.lineWidth"; value = 1; }
              ];
            }
          ];
        };
      }
    ];
  };
in

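The stat panels above lean on Grafana's threshold-step semantics: a value takes the color of the last step whose boundary it meets, with `value = null` as the base step. A minimal Python sketch of that lookup (a hypothetical helper for illustration, not part of this repo):

```python
def threshold_color(value, steps):
    """Return the color of the last step whose boundary is <= value.

    `steps` mirrors a panel's fieldConfig thresholds: (boundary, color)
    pairs sorted ascending, where a boundary of None is the base step.
    """
    color = None
    for boundary, step_color in steps:
        if boundary is None or value >= boundary:
            color = step_color
    return color


# Same steps as the "Minecraft Players" stat panel above.
players_steps = [(None, "green"), (3, "yellow"), (6, "red")]
```

So 0-2 players render green, 3-5 yellow, and 6+ red, matching the panel's step list.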
@@ -10,6 +10,9 @@ let
  jellyfinExporterPort = service_configs.ports.private.jellyfin_exporter.port;
  qbitExporterPort = service_configs.ports.private.qbittorrent_exporter.port;
  igpuExporterPort = service_configs.ports.private.igpu_exporter.port;
  minecraftExporterPort = service_configs.ports.private.minecraft_exporter.port;
  minecraftServerName = service_configs.minecraft.server_name;
  minecraftServerPort = service_configs.ports.public.minecraft.port;
in
{
  # -- Jellyfin Prometheus Exporter --

@@ -109,4 +112,45 @@ in
      REFRESH_PERIOD_MS = "30000";
    };
  };

  # -- Minecraft Prometheus Exporter --
  # itzg/mc-monitor queries the local server via SLP on each scrape and exposes
  # minecraft_status_{healthy,response_time_seconds,players_online_count,players_max_count}.
  # mc-monitor binds to 0.0.0.0 (no listen-address flag); the firewall keeps
  # 9567 internal and IPAddressAllow pins the socket to loopback as defense-in-depth.
  systemd.services.minecraft-exporter =
    lib.mkIf (config.services.grafana.enable && config.services.minecraft-servers.enable)
      {
        description = "Prometheus exporter for Minecraft (mc-monitor SLP)";
        after = [
          "network.target"
          "minecraft-server-${minecraftServerName}.service"
        ];
        wantedBy = [ "multi-user.target" ];
        serviceConfig = {
          ExecStart = "${lib.getExe pkgs.mc-monitor} export-for-prometheus";
          Restart = "on-failure";
          RestartSec = "10s";
          DynamicUser = true;
          NoNewPrivileges = true;
          ProtectSystem = "strict";
          ProtectHome = true;
          PrivateTmp = true;
          MemoryDenyWriteExecute = true;
          RestrictAddressFamilies = [
            "AF_INET"
            "AF_INET6"
          ];
          IPAddressAllow = [
            "127.0.0.0/8"
            "::1/128"
          ];
          IPAddressDeny = "any";
        };
        environment = {
          EXPORT_SERVERS = "127.0.0.1:${toString minecraftServerPort}";
          EXPORT_PORT = toString minecraftExporterPort;
          TIMEOUT = "5s";
        };
      };
}

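The exporter publishes the `minecraft_status_*` gauges in the Prometheus text exposition format, which is what the scrape job added below consumes. A sketch of what parsing that payload looks like on the consuming side (the sample payload and label names here are illustrative, not captured output):

```python
def parse_prom_text(payload):
    """Parse a Prometheus text-format payload into {metric_name: value}.

    Skips HELP/TYPE comment lines and discards labels, keeping the last
    sample per metric name. Illustrative only -- real scrapers keep labels.
    """
    metrics = {}
    for line in payload.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        name_part, _, value = line.rpartition(" ")
        name = name_part.split("{", 1)[0]
        metrics[name] = float(value)
    return metrics


# Fabricated scrape body in the shape mc-monitor's gauges take.
sample = """\
# TYPE minecraft_status_healthy gauge
minecraft_status_healthy{server_host="127.0.0.1"} 1
minecraft_status_players_online_count{server_host="127.0.0.1"} 2
"""
```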
@@ -95,6 +95,12 @@ in
          { targets = [ "127.0.0.1:${toString service_configs.ports.private.igpu_exporter.port}" ]; }
        ];
      }
      {
        job_name = "minecraft";
        static_configs = [
          { targets = [ "127.0.0.1:${toString service_configs.ports.private.minecraft_exporter.port}" ]; }
        ];
      }
      {
        job_name = "zfs";
        static_configs = [

@@ -1,4 +1,5 @@
{
  site_config,
  service_configs,
  inputs,
  pkgs,

@@ -9,7 +10,7 @@ let
    inputs.ytbn-graphing-software.packages.${pkgs.stdenv.targetPlatform.system}.web;
in
{
  services.caddy.virtualHosts."graphing.${service_configs.https.domain}".extraConfig = ''
  services.caddy.virtualHosts."graphing.${site_config.domain}".extraConfig = ''
    root * ${graphing-calculator}
    file_server browse
  '';

@@ -1,6 +1,7 @@
{
  config,
  lib,
  site_config,
  service_configs,
  ...
}:

@@ -19,7 +20,7 @@

  # serve latest deploy store paths (unauthenticated — just a path string)
  # CI writes to /var/lib/nix-deploy/<hostname> after building
  services.caddy.virtualHosts."nix-cache.${service_configs.https.domain}".extraConfig = ''
  services.caddy.virtualHosts."nix-cache.${site_config.domain}".extraConfig = ''
    handle_path /deploy/* {
      root * /var/lib/nix-deploy
      file_server

@@ -38,6 +38,7 @@ class JellyfinQBittorrentMonitor:
        stream_bitrate_headroom=1.1,
        webhook_port=0,
        webhook_bind="127.0.0.1",
        gateway_ip=None,
    ):
        self.jellyfin_url = jellyfin_url
        self.qbittorrent_url = qbittorrent_url

@@ -77,6 +78,15 @@ class JellyfinQBittorrentMonitor:
            ipaddress.ip_network("fe80::/10"),  # IPv6 link-local
        ]

        # Hairpin marker. When a LAN client reaches Jellyfin via the public
        # hostname, the router NAT-loopbacks the packet and SNATs the source
        # to itself — the session arrives looking local but still costs WAN
        # bandwidth. Sessions whose source equals the gateway must therefore
        # NOT be skipped. None disables the check (pre-hairpin-aware behavior).
        if gateway_ip is None:
            gateway_ip = self._discover_default_gateway()
        self.gateway_ip = gateway_ip

    def is_local_ip(self, ip_address: str) -> bool:
        """Check if an IP address is from a local network"""
        try:

@@ -86,6 +96,39 @@ class JellyfinQBittorrentMonitor:
            logger.warning(f"Invalid IP address format: {ip_address}")
            return True  # Treat invalid IPs as local for safety

    def _discover_default_gateway(self) -> str | None:
        """Read the IPv4 default gateway from /proc/net/route, or None."""
        try:
            with open("/proc/net/route") as f:
                next(f)  # skip header
                for line in f:
                    fields = line.split()
                    if len(fields) < 8 or fields[1] != "00000000":
                        continue
                    flags = int(fields[3], 16)
                    if not flags & 0x2:  # RTF_GATEWAY
                        continue
                    gw_bytes = bytes.fromhex(fields[2])[::-1]  # little-endian
                    if len(gw_bytes) != 4:
                        continue
                    return ".".join(str(b) for b in gw_bytes)
        except (OSError, ValueError) as e:
            logger.warning(f"Could not autodetect default gateway: {e}")
        return None

    def is_skippable(self, ip_address: str) -> bool:
        """True iff this source IP can be ignored when deciding to throttle.

        Truly LAN-direct sessions are skippable (no WAN cost). Hairpin-NAT'd
        LAN sessions arrive with the LAN gateway as their source — those still
        cost WAN bandwidth and must NOT be skipped.
        """
        if not self.is_local_ip(ip_address):
            return False
        if self.gateway_ip and ip_address == self.gateway_ip:
            return False
        return True

    def signal_handler(self, signum, frame):
        logger.info("Received shutdown signal, cleaning up...")
        self.running = False

@@ -164,7 +207,7 @@ class JellyfinQBittorrentMonitor:
            if (
                "NowPlayingItem" in session
                and not session.get("PlayState", {}).get("IsPaused", True)
                and not self.is_local_ip(session.get("RemoteEndPoint", ""))
                and not self.is_skippable(session.get("RemoteEndPoint", ""))
            ):
                item = session["NowPlayingItem"]
                item_type = item.get("Type", "").lower()

@@ -354,6 +397,9 @@ class JellyfinQBittorrentMonitor:
        logger.info(f"Default stream bitrate: {self.default_stream_bitrate} bps")
        logger.info(f"Minimum torrent speed: {self.min_torrent_speed} KB/s")
        logger.info(f"Stream bitrate headroom: {self.stream_bitrate_headroom}x")
        logger.info(
            f"LAN gateway (hairpin marker): {self.gateway_ip or 'none / autodetect failed'}"
        )
        if self.webhook_port:
            logger.info(f"Webhook receiver: {self.webhook_bind}:{self.webhook_port}")

@@ -484,6 +530,7 @@ if __name__ == "__main__":
    stream_bitrate_headroom = float(os.getenv("STREAM_BITRATE_HEADROOM", "1.1"))
    webhook_port = int(os.getenv("WEBHOOK_PORT", "0"))
    webhook_bind = os.getenv("WEBHOOK_BIND", "127.0.0.1")
    gateway_ip = os.getenv("LAN_GATEWAY_IP") or None

    monitor = JellyfinQBittorrentMonitor(
        jellyfin_url=jellyfin_url,

@@ -499,6 +546,7 @@ if __name__ == "__main__":
        stream_bitrate_headroom=stream_bitrate_headroom,
        webhook_port=webhook_port,
        webhook_bind=webhook_bind,
        gateway_ip=gateway_ip,
    )

    monitor.run()

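The gateway autodetection above hinges on `/proc/net/route` storing addresses as little-endian hex. The same decoding can be exercised standalone against a fabricated route-table line (the table contents below are made up for illustration):

```python
def parse_default_gateway(route_text):
    """Extract the IPv4 default gateway from /proc/net/route-style text.

    Mirrors the monitor's logic: destination 00000000 plus the RTF_GATEWAY
    flag (0x2) marks the default route; the gateway field is little-endian
    hex, so the bytes are reversed before dotted-quad formatting.
    """
    for line in route_text.splitlines()[1:]:  # skip the header row
        fields = line.split()
        if len(fields) < 8 or fields[1] != "00000000":
            continue
        if not int(fields[3], 16) & 0x2:  # RTF_GATEWAY not set
            continue
        gw = bytes.fromhex(fields[2])[::-1]  # stored little-endian
        if len(gw) == 4:
            return ".".join(str(b) for b in gw)
    return None


# Fabricated table: one default route via 192.168.1.1 (0101A8C0 reversed).
sample_route = (
    "Iface\tDestination\tGateway\tFlags\tRefCnt\tUse\tMetric\tMask\n"
    "eth0\t00000000\t0101A8C0\t0003\t0\t0\t100\t00000000\n"
)
```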
@@ -1,6 +1,7 @@
{
  pkgs,
  config,
  site_config,
  service_configs,
  lib,
  ...

@@ -24,7 +25,7 @@
    inherit (service_configs.jellyfin) dataDir cacheDir;
  };

  services.caddy.virtualHosts."jellyfin.${service_configs.https.domain}".extraConfig = ''
  services.caddy.virtualHosts."jellyfin.${site_config.domain}".extraConfig = ''
    reverse_proxy :${builtins.toString service_configs.ports.private.jellyfin.port} {
      # Disable response buffering for streaming. Caddy's default partial
      # buffering delays fMP4-HLS segments and direct-play responses where

@@ -1,5 +1,6 @@
{
  pkgs,
  site_config,
  service_configs,
  config,
  inputs,

@@ -24,7 +25,7 @@ in
  # "Invalid API Key" warning has no client IP, and behind Caddy the
  # llama-server access log only sees 127.0.0.1. Caddy's JSON log has
  # the real client IP via request.remote_ip.
  services.caddy.virtualHosts."llm.${service_configs.https.domain}".extraConfig = ''
  services.caddy.virtualHosts."llm.${site_config.domain}".extraConfig = ''
    log {
      output file /var/log/caddy/access-llama-cpp.log
      format json

@@ -52,8 +53,8 @@ in
      # defaults: maxretry=5, findtime=10m, bantime=10m

      # NAT hairpinning sends LAN traffic via the router IP. Don't ban
      # 192.168.1.0/24 or we lock ourselves out.
      ignoreip = "127.0.0.1/8 ::1 192.168.1.0/24";
      # our LAN or we lock ourselves out.
      ignoreip = "127.0.0.1/8 ::1 ${site_config.lan.cidr}";
    };
    filter.Definition = {
      failregex = ''^.*"remote_ip":"<HOST>".*"status":401.*$'';

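The `failregex` above matches Caddy's JSON access log on 401 responses. Its behavior can be checked in plain Python with `<HOST>` expanded to a capturing group the way fail2ban substitutes it (the IPv4-only stand-in group and the sample log lines below are simplifications, not fail2ban's actual `<HOST>` template):

```python
import re

# fail2ban replaces <HOST> with an address-capturing group; a simplified
# IPv4-only stand-in is used here for illustration.
FAILREGEX = r'^.*"remote_ip":"(?P<host>\d{1,3}(?:\.\d{1,3}){3})".*"status":401.*$'


def banned_host(log_line):
    """Return the offending IP if the line matches the 401 failregex, else None."""
    m = re.match(FAILREGEX, log_line)
    return m.group("host") if m else None


# Fabricated Caddy JSON log lines.
hit = '{"remote_ip":"203.0.113.9","uri":"/v1/chat","status":401}'
miss = '{"remote_ip":"203.0.113.9","uri":"/v1/chat","status":200}'
```

Only the 401 line yields a candidate for banning; successful requests never match, which is why the jail can run with default maxretry against a busy endpoint.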
@@ -1,13 +1,14 @@
{
  config,
  lib,
  site_config,
  service_configs,
  ...
}:
{
  services.coturn = {
    enable = true;
    realm = service_configs.https.domain;
    realm = site_config.domain;
    use-auth-secret = true;
    static-auth-secret-file = config.age.secrets.coturn-auth-secret.path;
    listening-port = service_configs.ports.public.coturn.port;

@@ -1,5 +1,6 @@
{
  config,
  site_config,
  service_configs,
  lib,
  ...

@@ -23,7 +24,7 @@

    settings.global = {
      port = [ service_configs.ports.private.matrix.port ];
      server_name = service_configs.https.domain;
      server_name = site_config.domain;
      allow_registration = true;
      registration_token_file = config.age.secrets.matrix-reg-token.path;

@@ -43,14 +44,14 @@
      # TURN server config (coturn)
      turn_secret_file = config.age.secrets.matrix-turn-secret.path;
      turn_uris = [
        "turn:${service_configs.https.domain}?transport=udp"
        "turn:${service_configs.https.domain}?transport=tcp"
        "turn:${site_config.domain}?transport=udp"
        "turn:${site_config.domain}?transport=tcp"
      ];
      turn_ttl = 86400;
    };
  };

  services.caddy.virtualHosts.${service_configs.https.domain}.extraConfig = lib.mkBefore ''
  services.caddy.virtualHosts.${site_config.domain}.extraConfig = lib.mkBefore ''
    header /.well-known/matrix/* Content-Type application/json
    header /.well-known/matrix/* Access-Control-Allow-Origin *
    respond /.well-known/matrix/server `{"m.server": "${service_configs.matrix.domain}:${builtins.toString service_configs.ports.public.https.port}"}`

@@ -1,5 +1,6 @@
{
  pkgs,
  site_config,
  service_configs,
  lib,
  config,

@@ -177,7 +178,7 @@
  };

  services.caddy.virtualHosts = lib.mkIf (config.services.caddy.enable) {
    "map.${service_configs.https.domain}".extraConfig = ''
    "map.${site_config.domain}".extraConfig = ''
      root * ${service_configs.minecraft.parent_dir}/${service_configs.minecraft.server_name}/squaremap/web
      file_server browse
    '';

@@ -2,6 +2,7 @@
  config,
  lib,
  pkgs,
  site_config,
  username,
  ...
}:

@@ -25,14 +26,13 @@
  ];

  users.users.${username}.openssh.authorizedKeys.keys = [
    "ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIO4jL6gYOunUlUtPvGdML0cpbKSsPNqQ1jit4E7U1RyH" # laptop
    "ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBJjT5QZ3zRDb+V6Em20EYpSEgPW5e/U+06uQGJdraxi" # desktop
    site_config.ssh_keys.laptop
  ];

  # used for deploying configs to server
  users.users.root.openssh.authorizedKeys.keys =
    config.users.users.${username}.openssh.authorizedKeys.keys
    ++ [
      "ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIC5ZYN6idL/w/mUIfPOH1i+Q/SQXuzAMQUEuWpipx1Pc ci-deploy@muffin"
      site_config.ssh_keys.ci_deploy
    ];
}

62	site-config.nix	Normal file
@@ -0,0 +1,62 @@
# Site-wide constants shared across all three hosts and home-manager profiles.
#
# This file is pure data — no package refs, no module config. Import it from
# flake.nix and pass it as the `site_config` specialArg (and extraSpecialArg for
# home-manager). Callers read values; they do not set them.
#
# Adding a value: only add if it's used by ≥2 hosts/modules. Host-specific
# single-use values stay in the host's default.nix. Muffin-only service
# infrastructure (ports, zpool names, hugepage budgets) stays in
# hosts/muffin/service-configs.nix.
rec {
  # --- Identity ---
  domain = "sigkill.computer";
  old_domain = "gardling.com"; # served by muffin via permanent redirect (services/caddy/caddy.nix)
  contact_email = "titaniumtown@proton.me";

  # All three hosts run on the same timezone. Override per-host via
  # lib.mkForce when travelling (see hosts/mreow/default.nix for the pattern).
  timezone = "America/New_York";

  # --- Binary cache (muffin serves via harmonia, desktops consume) ---
  binary_cache = {
    url = "https://nix-cache.${domain}";
    public_key = "nix-cache.${domain}-1:ONtQC9gUjL+2yNgMWB68NudPySXhyzJ7I3ra56/NPgk=";
  };

  # --- LAN topology ---
  dns_servers = [
    "1.1.1.1"
    "9.9.9.9"
  ];

  lan = {
    cidr = "192.168.1.0/24";
    gateway = "192.168.1.1";
  };

  # Per-host network info. mreow is laptop-on-DHCP so it has no entry.
  hosts = {
    muffin = {
      ip = "192.168.1.50";
      # Canonical alias used by deploy.sh, CI workflows, and borg backup target.
      # Resolves via /etc/hosts on muffin and the desktops' NetworkManager DNS.
      alias = "server-public";
      # SSH host key — same key is served for every alias muffin answers to
      # (server-public, the IP, git.${domain}, git.${old_domain}).
      ssh_host_key = "ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFMjgaMnE+zS7tL+m5E7gh9Q9U1zurLdmU0qcmEmaucu";
    };
    yarn = {
      ip = "192.168.1.223";
      alias = "desktop";
    };
  };

  # --- SSH pubkeys ---
  # One line per key, referenced by name from services/ssh.nix (muffin) and
  # hosts/yarn/default.nix. Rotating a key means changing it here, nowhere else.
  ssh_keys = {
    laptop = "ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIO4jL6gYOunUlUtPvGdML0cpbKSsPNqQ1jit4E7U1RyH";
    ci_deploy = "ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIC5ZYN6idL/w/mUIfPOH1i+Q/SQXuzAMQUEuWpipx1Pc ci-deploy@muffin";
  };
}

196	tests/deploy-finalize.nix	Normal file
@@ -0,0 +1,196 @@
# Test for modules/server-deploy-finalize.nix.
#
# Covers the decision and scheduling logic with fabricated profile directories,
# since spawning a second booted NixOS toplevel to diff kernels is too heavy for
# a runNixOSTest. We rely on the shellcheck pass baked into writeShellApplication
# to catch syntax regressions in the script itself.
{
  lib,
  pkgs,
  inputs,
  ...
}:
pkgs.testers.runNixOSTest {
  name = "deploy-finalize";

  node.specialArgs = {
    inherit inputs lib;
    username = "testuser";
  };

  nodes.machine =
    { ... }:
    {
      imports = [
        ../modules/server-deploy-finalize.nix
      ];

      services.deployFinalize = {
        enable = true;
        # Shorter default in the test to make expected-substring assertions
        # stable and reinforce that the option is wired through.
        delay = 15;
      };
    };

  testScript = ''
    start_all()
    machine.wait_for_unit("multi-user.target")

    # Test fixtures: fabricated profile trees whose kernel/initrd/kernel-modules
    # symlinks are under test control. `readlink -e` requires the targets to
    # exist, so we point at real files in /tmp rather than non-existent paths.
    machine.succeed(
        "mkdir -p /tmp/profile-same /tmp/profile-changed-kernel "
        "/tmp/profile-changed-initrd /tmp/profile-changed-modules "
        "/tmp/profile-missing /tmp/fake-targets"
    )
    machine.succeed(
        "touch /tmp/fake-targets/alt-kernel /tmp/fake-targets/alt-initrd "
        "/tmp/fake-targets/alt-modules"
    )

    booted_kernel = machine.succeed("readlink -e /run/booted-system/kernel").strip()
    booted_initrd = machine.succeed("readlink -e /run/booted-system/initrd").strip()
    booted_modules = machine.succeed("readlink -e /run/booted-system/kernel-modules").strip()

    def link_profile(path, kernel, initrd, modules):
        machine.succeed(f"ln -sf {kernel} {path}/kernel")
        machine.succeed(f"ln -sf {initrd} {path}/initrd")
        machine.succeed(f"ln -sf {modules} {path}/kernel-modules")

    # profile-same: matches booted exactly → should choose `switch`.
    link_profile("/tmp/profile-same", booted_kernel, booted_initrd, booted_modules)
    machine.succeed("mkdir -p /tmp/profile-same/bin")
    machine.succeed(
        "ln -sf /run/current-system/bin/switch-to-configuration "
        "/tmp/profile-same/bin/switch-to-configuration"
    )

    # profile-changed-kernel: kernel differs only → should choose `reboot`.
    link_profile(
        "/tmp/profile-changed-kernel",
        "/tmp/fake-targets/alt-kernel",
        booted_initrd,
        booted_modules,
    )

    # profile-changed-initrd: initrd differs only → should choose `reboot`.
    link_profile(
        "/tmp/profile-changed-initrd",
        booted_kernel,
        "/tmp/fake-targets/alt-initrd",
        booted_modules,
    )

    # profile-changed-modules: kernel-modules differs only → should choose `reboot`.
    # Catches the obelisk PR / nixpkgs auto-upgrade case where modules rebuild
    # against the same kernel but ABI-incompatible.
    link_profile(
        "/tmp/profile-changed-modules",
        booted_kernel,
        booted_initrd,
        "/tmp/fake-targets/alt-modules",
    )

    # profile-missing: no kernel/initrd/kernel-modules → should fail closed.

    with subtest("dry-run against identical profile selects switch"):
        rc, out = machine.execute(
            "deploy-finalize --dry-run --profile /tmp/profile-same 2>&1"
        )
        assert rc == 0, f"rc={rc}\n{out}"
        assert "action=switch" in out, out
        assert "services only" in out, out
        assert "dry-run — not scheduling" in out, out
        assert "would run: /tmp/profile-same/bin/switch-to-configuration switch" in out, out
        assert "would schedule: systemd-run" in out, out

    with subtest("dry-run against changed-kernel profile selects reboot"):
        rc, out = machine.execute(
            "deploy-finalize --dry-run --profile /tmp/profile-changed-kernel 2>&1"
        )
        assert rc == 0, f"rc={rc}\n{out}"
        assert "action=reboot" in out, out
        assert "reason=kernel changed" in out, out
        assert "systemctl reboot" in out, out

    with subtest("dry-run against changed-initrd profile selects reboot"):
        rc, out = machine.execute(
            "deploy-finalize --dry-run --profile /tmp/profile-changed-initrd 2>&1"
        )
        assert rc == 0, f"rc={rc}\n{out}"
        assert "action=reboot" in out, out
        assert "reason=initrd changed" in out, out

    with subtest("dry-run against changed-modules profile selects reboot"):
        rc, out = machine.execute(
            "deploy-finalize --dry-run --profile /tmp/profile-changed-modules 2>&1"
        )
        assert rc == 0, f"rc={rc}\n{out}"
        assert "action=reboot" in out, out
        assert "reason=kernel-modules changed" in out, out

    with subtest("dry-run against empty profile fails closed with rc=1"):
        rc, out = machine.execute(
            "deploy-finalize --dry-run --profile /tmp/profile-missing 2>&1"
        )
        assert rc == 1, f"rc={rc}\n{out}"
        assert "missing kernel, initrd, or kernel-modules" in out, out

    with subtest("--delay override is reflected in output"):
        rc, out = machine.execute(
            "deploy-finalize --dry-run --delay 7 --profile /tmp/profile-same 2>&1"
        )
        assert rc == 0, f"rc={rc}\n{out}"
        assert "delay=7s" in out, out

    with subtest("configured default delay from module option is used"):
        rc, out = machine.execute(
            "deploy-finalize --dry-run --profile /tmp/profile-same 2>&1"
        )
        assert rc == 0, f"rc={rc}\n{out}"
        # module option delay=15 in nodes.machine above.
        assert "delay=15s" in out, out

    with subtest("unknown option rejected with rc=2"):
        rc, out = machine.execute("deploy-finalize --bogus 2>&1")
        assert rc == 2, f"rc={rc}\n{out}"
        assert "unknown option --bogus" in out, out

    with subtest("non-dry run arms a transient systemd timer"):
        # Long delay so the timer doesn't fire during the test. We stop it
        # explicitly afterwards.
        rc, out = machine.execute(
            "deploy-finalize --delay 3600 --profile /tmp/profile-same 2>&1"
        )
        assert rc == 0, f"scheduling rc={rc}\n{out}"
        # Confirm exactly one transient timer is active.
        timers = machine.succeed(
            "systemctl list-units --type=timer --no-legend 'deploy-finalize-*.timer' "
            "--state=waiting | awk 'NF{print $1}'"
        ).strip().splitlines()
        assert len(timers) == 1, f"expected exactly one pending timer, got {timers}"
        assert timers[0].startswith("deploy-finalize-"), timers

    with subtest("back-to-back scheduling cancels the previous timer"):
        # The previous subtest left one timer armed. Schedule again; the old
        # one should be stopped before the new unit name is created.
        machine.succeed("sleep 1")  # ensure a distinct unit-name timestamp
        rc, out = machine.execute(
            "deploy-finalize --delay 3600 --profile /tmp/profile-same 2>&1"
        )
        assert rc == 0, f"second-schedule rc={rc}\n{out}"
        timers = machine.succeed(
            "systemctl list-units --type=timer --no-legend 'deploy-finalize-*.timer' "
            "--state=waiting | awk 'NF{print $1}'"
        ).strip().splitlines()
        assert len(timers) == 1, f"expected only the new timer, got {timers}"

    # Clean up so the test's shutdown path is quiet.
    machine.succeed(
        "systemctl stop 'deploy-finalize-*.timer' 'deploy-finalize-*.service' "
        "2>/dev/null || true"
    )
  '';
}

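The test above exercises deploy-finalize's core decision rule: reboot iff the new profile's kernel, initrd, or kernel-modules resolve to a different store path than the booted system, and fail closed when any of them is missing. A Python sketch of that rule (a hypothetical helper mirroring the shell script's logic, with made-up store paths):

```python
def finalize_action(booted, profile):
    """Decide ("switch", reason) vs ("reboot", reason), failing closed.

    `booted` and `profile` map component name -> resolved store path (what
    `readlink -e` would return), or None when the symlink is missing.
    """
    components = ("kernel", "initrd", "kernel-modules")
    if any(profile.get(c) is None for c in components):
        raise ValueError("missing kernel, initrd, or kernel-modules")
    for c in components:
        if profile[c] != booted[c]:
            return ("reboot", f"{c} changed")
    return ("switch", "services only")


# Fabricated store paths standing in for readlink -e results.
booted = {
    "kernel": "/nix/store/aaa-kernel",
    "initrd": "/nix/store/bbb-initrd",
    "kernel-modules": "/nix/store/ccc-modules",
}
```

Checking kernel-modules separately is what catches the case the test calls out: modules rebuilt against the same kernel version but with an incompatible ABI still force a reboot.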
@@ -12,10 +12,10 @@
  ...
}:
let
  baseServiceConfigs = import ../hosts/muffin/service-configs.nix;
  baseSiteConfig = import ../site-config.nix;
  baseServiceConfigs = import ../hosts/muffin/service-configs.nix { site_config = baseSiteConfig; };
  testServiceConfigs = lib.recursiveUpdate baseServiceConfigs {
    zpool_ssds = "";
    https.domain = "test.local";
  };

  alwaysOk = pkgs.writeShellApplication {

@@ -5,7 +5,10 @@
  ...
}:
let
  baseServiceConfigs = import ../../hosts/muffin/service-configs.nix;
  baseSiteConfig = import ../../site-config.nix;
  baseServiceConfigs = import ../../hosts/muffin/service-configs.nix {
    site_config = baseSiteConfig;
  };
  testServiceConfigs = lib.recursiveUpdate baseServiceConfigs {
    zpool_ssds = "";
    gitea = {

@@ -5,10 +5,12 @@
  ...
}:
let
  baseServiceConfigs = import ../../hosts/muffin/service-configs.nix;
  baseSiteConfig = import ../../site-config.nix;
  baseServiceConfigs = import ../../hosts/muffin/service-configs.nix {
    site_config = baseSiteConfig;
  };
  testServiceConfigs = lib.recursiveUpdate baseServiceConfigs {
    zpool_ssds = "";
    https.domain = "test.local";
    ports.private.immich = {
      port = 2283;
      proto = "tcp";

@@ -5,10 +5,12 @@
  ...
}:
let
  baseServiceConfigs = import ../../hosts/muffin/service-configs.nix;
  baseSiteConfig = import ../../site-config.nix;
  baseServiceConfigs = import ../../hosts/muffin/service-configs.nix {
    site_config = baseSiteConfig;
  };
  testServiceConfigs = lib.recursiveUpdate baseServiceConfigs {
    zpool_ssds = "";
    https.domain = "test.local";
    jellyfin = {
      dataDir = "/var/lib/jellyfin";
      cacheDir = "/var/cache/jellyfin";

@@ -33,6 +35,7 @@ let
    (import ../../services/jellyfin/jellyfin.nix {
      inherit config pkgs;
      lib = testLib;
      site_config = baseSiteConfig;
      service_configs = testServiceConfigs;
    })
  ];

@@ -5,10 +5,12 @@
  ...
}:
let
  baseServiceConfigs = import ../../hosts/muffin/service-configs.nix;
  baseSiteConfig = import ../../site-config.nix;
  baseServiceConfigs = import ../../hosts/muffin/service-configs.nix {
    site_config = baseSiteConfig;
  };
  testServiceConfigs = lib.recursiveUpdate baseServiceConfigs {
    zpool_ssds = "";
    https.domain = "test.local";
  };

  testLib = lib.extend (

@@ -28,6 +30,7 @@ let
    (import ../../services/bitwarden.nix {
      inherit config pkgs;
      lib = testLib;
      site_config = baseSiteConfig;
      service_configs = testServiceConfigs;
    })
  ];

220
tests/gitea-hide-actions.nix
Normal file
220
tests/gitea-hide-actions.nix
Normal file
@@ -0,0 +1,220 @@
|
||||
{
|
||||
config,
|
||||
lib,
|
||||
pkgs,
|
||||
...
|
||||
}:
|
||||
let
|
||||
baseSiteConfig = import ../site-config.nix;
|
||||
baseServiceConfigs = import ../hosts/muffin/service-configs.nix {
|
||||
site_config = baseSiteConfig;
|
||||
};
|
||||
testServiceConfigs = lib.recursiveUpdate baseServiceConfigs {
|
||||
zpool_ssds = "";
|
||||
gitea = {
|
||||
dir = "/var/lib/gitea";
|
||||
# `:80` makes Caddy bind all hosts on HTTP port 80 with no Host-header
|
||||
# matching — simplest path to a reachable vhost inside the test VM
|
||||
# where there is no ACME / DNS and no TLS terminator.
|
||||
domain = ":80";
|
||||
};
|
||||
ports.private.gitea = {
|
||||
port = 3000;
|
||||
proto = "tcp";
|
||||
};
|
||||
};
|
||||
|
||||
testLib = lib.extend (
|
||||
final: prev: {
|
||||
serviceMountWithZpool =
|
||||
serviceName: zpool: dirs:
|
||||
{ ... }:
|
||||
{ };
|
||||
serviceFilePerms = serviceName: tmpfilesRules: { ... }: { };
|
||||
}
|
||||
);
|
||||
|
||||
giteaModule =
|
||||
{ config, pkgs, ... }:
|
||||
{
|
||||
imports = [
|
||||
(import ../services/gitea/gitea.nix {
|
||||
inherit config pkgs;
|
||||
lib = testLib;
|
||||
service_configs = testServiceConfigs;
|
||||
})
|
||||
];
|
||||
};
|
||||
in
|
||||
pkgs.testers.runNixOSTest {
|
||||
name = "gitea-hide-actions";
|
||||
|
||||
nodes = {
|
||||
server =
|
||||
{
|
||||
config,
|
||||
lib,
|
||||
pkgs,
|
||||
...
|
||||
}:
|
||||
{
|
||||
imports = [
|
||||
../modules/server-security.nix
|
||||
giteaModule
|
||||
];
|
||||
|
||||
# The shared gitea.nix module derives DOMAIN/ROOT_URL from the
|
||||
# `service_configs.gitea.domain` string, which here is the full URL
|
||||
# `http://server`. Override to valid bare values so Gitea doesn't
|
||||
# get a malformed ROOT_URL like `https://http://server`.
|
||||
services.gitea.settings = {
|
||||
server = {
|
||||
DOMAIN = lib.mkForce "server";
|
||||
ROOT_URL = lib.mkForce "http://server/";
|
||||
};
|
||||
# Tests talk HTTP, so drop the Secure flag — otherwise curl's cookie
|
||||
# jar holds the session cookie but never sends it back.
|
||||
session.COOKIE_SECURE = lib.mkForce false;
|
||||
};
|
||||
services.caddy = {
|
||||
enable = true;
|
||||
# No DNS / ACME in the VM test network — serve plain HTTP.
|
||||
globalConfig = ''
|
||||
auto_https off
|
||||
'';
|
||||
};
|
||||
|
||||
services.postgresql.enable = true;
|
||||
|
||||
# Stub out zfs/mount ordering added by the real serviceMountWithZpool.
|
||||
systemd.services."gitea-mounts".enable = lib.mkForce false;
|
||||
systemd.services.gitea = {
|
||||
wants = lib.mkForce [ ];
|
||||
after = lib.mkForce [ "postgresql.service" ];
|
||||
requires = lib.mkForce [ ];
|
||||
};
|
||||
|
||||
networking.firewall.allowedTCPPorts = [
|
||||
80
|
||||
3000
|
||||
];
|
||||
};
|
||||
|
||||
client =
|
||||
{ pkgs, ... }:
|
||||
{
|
||||
environment.systemPackages = [ pkgs.curl ];
|
||||
};
|
||||
};
|
||||
|
||||
testScript = ''
  import re

  start_all()
  server.wait_for_unit("postgresql.service")
  server.wait_for_unit("gitea.service")
  server.wait_for_unit("caddy.service")
  server.wait_for_open_port(3000)
  server.wait_for_open_port(80)

  server.succeed(
      "su -l gitea -s /bin/sh -c '${pkgs.gitea}/bin/gitea admin user create "
      "--username testuser --password testpassword "
      "--email test@test.local --must-change-password=false "
      "--work-path /var/lib/gitea'"
  )

  def curl(args, cookies=None):
      cookie_args = f"-b {cookies} " if cookies else ""
      cmd = (
          "curl -4 -s -o /dev/null "
          f"-w '%{{http_code}}|%{{redirect_url}}' {cookie_args}{args}"
      )
      return client.succeed(cmd).strip()

  def login():
      # Gitea's POST /user/login requires a _csrf token and expects the
      # matching session cookie already set. Fetch the login form first
      # to harvest both, then submit credentials with the same cookie jar.
      client.succeed("rm -f /tmp/cookies.txt")
      html = client.succeed(
          "curl -4 -s -c /tmp/cookies.txt http://server/user/login"
      )
      match = re.search(r'name="_csrf"\s+value="([^"]+)"', html)
      assert match, f"CSRF token not found in login form: {html[:500]!r}"
      csrf = match.group(1)
      # -L so we follow the post-login redirect; the session cookie is
      # rewritten by Gitea on successful login to carry uid.
      client.succeed(
          "curl -4 -s -L -o /dev/null "
          "-b /tmp/cookies.txt -c /tmp/cookies.txt "
          f"--data-urlencode '_csrf={csrf}' "
          "--data-urlencode 'user_name=testuser' "
          "--data-urlencode 'password=testpassword' "
          "http://server/user/login"
      )
      # Sanity-check the session by hitting the gated probe directly —
      # the post-login cookie jar MUST drive /user/stopwatches to 200.
      probe = client.succeed(
          "curl -4 -s -o /dev/null -w '%{http_code}' "
          "-b /tmp/cookies.txt http://server/user/stopwatches"
      ).strip()
      assert probe == "200", f"session auth probe expected 200, got {probe!r}"
      return "/tmp/cookies.txt"

  with subtest("Anonymous /{user}/{repo}/actions redirects to login"):
      result = curl("http://server/foo/bar/actions")
      code, _, redir = result.partition("|")
      print(f"anon /foo/bar/actions -> {result!r}")
      assert code == "302", f"expected 302, got {code!r} (full: {result!r})"
      assert "/user/login" in redir, f"expected login redirect, got {redir!r}"
      assert "redirect_to=" in redir, f"expected redirect_to param, got {redir!r}"
      assert "/foo/bar/actions" in redir, (
          f"expected original URL preserved in redirect_to, got {redir!r}"
      )

  with subtest("Anonymous deep /actions paths also redirect"):
      for path in [
          "/foo/bar/actions/",
          "/foo/bar/actions/runs/1",
          "/foo/bar/actions/workflows/build.yaml",
      ]:
          result = curl(f"http://server{path}")
          code, _, redir = result.partition("|")
          print(f"anon {path} -> {result!r}")
          assert code == "302", f"{path}: expected 302, got {code!r}"
          assert "/user/login" in redir, f"{path}: expected login redirect, got {redir!r}"

  with subtest("Anonymous workflow badge stays public"):
      result = curl("http://server/foo/bar/actions/workflows/ci.yaml/badge.svg")
      code, _, redir = result.partition("|")
      print(f"anon badge -> {result!r}")
      assert code != "302" or "/user/login" not in redir, (
          f"badge path should not redirect to login, got {result!r}"
      )

  cookies = login()

  with subtest("Session-authenticated /{user}/{repo}/actions reaches Gitea"):
      result = curl(
          "http://server/testuser/nonexistent/actions", cookies=cookies
      )
      code, _, redir = result.partition("|")
      print(f"auth /testuser/nonexistent/actions -> {result!r}")
      # Gitea returns 404 for the missing repo — the key assertion is that
      # Caddy's gate forwarded the request instead of redirecting to login.
      assert not (code == "302" and "/user/login" in redir), (
          f"session-authed actions request was intercepted by login gate: {result!r}"
      )

  with subtest("Anonymous /explore/repos is served without gating"):
      result = curl("http://server/explore/repos")
      code, _, _ = result.partition("|")
      print(f"anon /explore/repos -> {result!r}")
      assert code == "200", f"expected 200 for public explore page, got {result!r}"

  with subtest("Anonymous /{user}/{repo} (non-actions) is not login-gated"):
      result = curl("http://server/foo/bar")
      code, _, redir = result.partition("|")
      print(f"anon /foo/bar -> {result!r}")
      assert not (code == "302" and "/user/login" in redir), (
          f"non-actions repo path should not redirect to login: {result!r}"
      )
'';
}
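The login() helper's CSRF harvest can be exercised on its own. A minimal sketch using the same regex; the HTML fragment and the `extract_csrf` name are illustrative stand-ins, not captured Gitea output:

```python
import re

def extract_csrf(html: str) -> str:
    # Same pattern login() uses to pull the hidden _csrf field out of the
    # rendered login form before POSTing credentials.
    match = re.search(r'name="_csrf"\s+value="([^"]+)"', html)
    assert match, "CSRF token not found in login form"
    return match.group(1)

# Illustrative form fragment (not real Gitea output):
form = '<input type="hidden" name="_csrf" value="abc123" />'
print(extract_csrf(form))  # → abc123
```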

@@ -428,6 +428,73 @@ pkgs.testers.runNixOSTest {
        local_playback["PositionTicks"] = 50000000
        server.succeed(f"curl -sf -X POST 'http://localhost:8096/Sessions/Playing/Stopped' -d '{json.dumps(local_playback)}' -H 'Content-Type:application/json' -H 'X-Emby-Authorization:{local_auth}, Token={local_token}'")

    with subtest("Hairpin'd LAN session (source IP = configured gateway) DOES throttle"):
        # Simulates a LAN client reaching Jellyfin via the public hostname:
        # the router SNATs the source to itself, so Jellyfin sees the gateway
        # IP and IsInLocalNetwork=True even though WAN bandwidth is in play.
        # We use 127.0.0.1 as the "gateway" in this VM because the localhost
        # curl below produces source 127.0.0.1 from Jellyfin's view.
        server.succeed("systemctl stop monitor-test || true")
        time.sleep(1)
        server.succeed(f"""
            systemd-run --unit=monitor-hairpin \
              --setenv=JELLYFIN_URL=http://localhost:8096 \
              --setenv=JELLYFIN_API_KEY={token} \
              --setenv=QBITTORRENT_URL=http://localhost:8080 \
              --setenv=CHECK_INTERVAL=1 \
              --setenv=STREAMING_START_DELAY=1 \
              --setenv=STREAMING_STOP_DELAY=1 \
              --setenv=TOTAL_BANDWIDTH_BUDGET=50000000 \
              --setenv=SERVICE_BUFFER=2000000 \
              --setenv=DEFAULT_STREAM_BITRATE=10000000 \
              --setenv=MIN_TORRENT_SPEED=100 \
              --setenv=LAN_GATEWAY_IP=127.0.0.1 \
              {python} {monitor}
        """)
        time.sleep(2)
        assert not is_throttled(), "Should start unthrottled (no streams yet)"

        hairpin_auth = 'MediaBrowser Client="Hairpin Client", DeviceId="hairpin-2222", Device="HairpinDevice", Version="1.0"'
        hairpin_auth_result = json.loads(server.succeed(
            f"curl -sf -X POST 'http://localhost:8096/Users/AuthenticateByName' -d '@${jfLib.payloads.auth}' -H 'Content-Type:application/json' -H 'X-Emby-Authorization:{hairpin_auth}'"
        ))
        hairpin_token = hairpin_auth_result["AccessToken"]

        hairpin_playback = {
            "ItemId": movie_id,
            "MediaSourceId": media_source_id,
            "PlaySessionId": "test-play-session-hairpin",
            "CanSeek": True,
            "IsPaused": False,
        }
        server.succeed(f"curl -sf -X POST 'http://localhost:8096/Sessions/Playing' -d '{json.dumps(hairpin_playback)}' -H 'Content-Type:application/json' -H 'X-Emby-Authorization:{hairpin_auth}, Token={hairpin_token}'")
        time.sleep(3)
        assert is_throttled(), "Hairpin'd session (source=gateway) should throttle even though source is RFC1918"

        # Cleanup: stop the playback and the override-monitor, restore the normal one.
        hairpin_playback["PositionTicks"] = 50000000
        server.succeed(f"curl -sf -X POST 'http://localhost:8096/Sessions/Playing/Stopped' -d '{json.dumps(hairpin_playback)}' -H 'Content-Type:application/json' -H 'X-Emby-Authorization:{hairpin_auth}, Token={hairpin_token}'")
        time.sleep(2)
        assert not is_throttled(), "Should unthrottle after hairpin'd playback stops"

        server.succeed("systemctl stop monitor-hairpin || true")
        time.sleep(1)
        server.succeed(f"""
            systemd-run --unit=monitor-test \
              --setenv=JELLYFIN_URL=http://localhost:8096 \
              --setenv=JELLYFIN_API_KEY={token} \
              --setenv=QBITTORRENT_URL=http://localhost:8080 \
              --setenv=CHECK_INTERVAL=1 \
              --setenv=STREAMING_START_DELAY=1 \
              --setenv=STREAMING_STOP_DELAY=1 \
              --setenv=TOTAL_BANDWIDTH_BUDGET=50000000 \
              --setenv=SERVICE_BUFFER=2000000 \
              --setenv=DEFAULT_STREAM_BITRATE=10000000 \
              --setenv=MIN_TORRENT_SPEED=100 \
              {python} {monitor}
        """)
        time.sleep(2)
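The hairpin case above hinges on one classification rule. A hedged sketch of how such a monitor might decide that a session consumes WAN bandwidth; the function and parameter names are assumptions, not the monitor's actual code:

```python
import ipaddress

def counts_against_wan_budget(session_ip: str, lan_gateway_ip: str) -> bool:
    # A source equal to the router's own address means the traffic was
    # hairpin-NATed through the public hostname, so it rides the WAN link
    # even though the address itself is private (or loopback, in the VM).
    if session_ip == lan_gateway_ip:
        return True
    # Otherwise, private sources are genuine LAN clients and stay exempt.
    return not ipaddress.ip_address(session_ip).is_private

print(counts_against_wan_budget("127.0.0.1", "127.0.0.1"))       # → True
print(counts_against_wan_budget("192.168.1.50", "192.168.1.1"))  # → False
print(counts_against_wan_budget("8.8.8.8", "192.168.1.1"))       # → True
```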

    # === WEBHOOK TESTS ===
    #
    # Configure the Jellyfin Webhook plugin to target the monitor, then verify

@@ -589,7 +656,7 @@ pkgs.testers.runNixOSTest {
    server.succeed("systemctl restart jellyfin.service")
    server.wait_for_unit("jellyfin.service")
    server.wait_for_open_port(8096)
-   server.wait_until_succeeds("curl -sf http://localhost:8096/health | grep -q Healthy", timeout=60)
+   server.wait_until_succeeds("curl -sf http://localhost:8096/health | grep -q Healthy", timeout=180)

    # During Jellyfin restart, monitor can't reach Jellyfin
    # After restart, sessions are cleared - monitor should eventually unthrottle

@@ -645,7 +712,7 @@ pkgs.testers.runNixOSTest {
    server.succeed("systemctl start jellyfin.service")
    server.wait_for_unit("jellyfin.service")
    server.wait_for_open_port(8096)
-   server.wait_until_succeeds("curl -sf http://localhost:8096/health | grep -q Healthy", timeout=60)
+   server.wait_until_succeeds("curl -sf http://localhost:8096/health | grep -q Healthy", timeout=180)

    # After Jellyfin comes back, sessions are gone - should unthrottle
    time.sleep(3)

@@ -6,10 +6,10 @@
  ...
}:
let
-  baseServiceConfigs = import ../hosts/muffin/service-configs.nix;
+  baseSiteConfig = import ../site-config.nix;
+  baseServiceConfigs = import ../hosts/muffin/service-configs.nix { site_config = baseSiteConfig; };
  testServiceConfigs = lib.recursiveUpdate baseServiceConfigs {
    zpool_ssds = "";
    https.domain = "test.local";
    minecraft.parent_dir = "/var/lib/minecraft";
    minecraft.memory = rec {
      heap_size_m = 1000;

@@ -31,6 +31,7 @@ testPkgs.testers.runNixOSTest {

  node.specialArgs = {
    inherit inputs lib;
    site_config = baseSiteConfig;
    service_configs = testServiceConfigs;
    username = "testuser";
  };

@@ -13,6 +13,7 @@ in
  minecraftTest = handleTest ./minecraft.nix;
  jellyfinQbittorrentMonitorTest = handleTest ./jellyfin-qbittorrent-monitor.nix;
  deployGuardTest = handleTest ./deploy-guard.nix;
  deployFinalizeTest = handleTest ./deploy-finalize.nix;
  filePermsTest = handleTest ./file-perms.nix;

  # fail2ban tests

@@ -40,4 +41,7 @@ in

  # gitea runner test
  giteaRunnerTest = handleTest ./gitea-runner.nix;

  # gitea actions visibility gate test
  giteaHideActionsTest = handleTest ./gitea-hide-actions.nix;
}

@@ -52,6 +52,7 @@ let
    SINGLE_CROSS = "A" * 38 + "0C"    # movieId=7 single import AND older import for movieId=8
    SINGLE8_NEW = "A" * 38 + "0D"     # movieId=8, newer import → keeper (not in qBit)
    QUEUED_MOV = "A" * 38 + "0E"      # in Radarr queue, not in history
    INPROGRESS_MOV = "A" * 38 + "0F"  # movieId=10, older import, currently re-downloading

    # TV
    UNMANAGED_TV = "B" * 38 + "01"

@@ -62,13 +63,17 @@ let
    REPACK = "B" * 38 + "06"             # episodeId=300, newer import → active
    REMOVED_TV = "B" * 38 + "07"         # episodeId=400, older import (series removed)
    REMOVED_TV_NEW = "B" * 38 + "08"     # episodeId=400, newer import (not in qBit)
    INPROGRESS_TV = "B" * 38 + "09"      # episodeId=500, older import, currently re-downloading
    INPROGRESS_TV_NEW = "B" * 38 + "0A"  # episodeId=500, newer import (not in qBit)
    INPROGRESS_MOV_NEW = "A" * 38 + "10" # movieId=10, newer import (not in qBit)

-   def make_torrent(h, name, size, added_on, state="uploading"):
+   def make_torrent(h, name, size, added_on, state="uploading", progress=1.0):
        return {
            "hash": h.lower(),
            "name": name,
            "size": size,
            "state": state,
            "progress": progress,
            "added_on": added_on,
            "content_path": f"/downloads/{name}",
        }

@@ -84,6 +89,9 @@ let
        make_torrent(LARGER_OLD, "Larger.Movie.2024", 10_737_418_240, 1704067206),
        make_torrent(SINGLE_CROSS, "SingleCross.Movie.2024", 4_000_000_000, 1704067207),
        make_torrent(QUEUED_MOV, "Queued.Movie.2024", 2_000_000_000, 1704067208),
        # In-progress re-download: hash matches an old import, but data is
        # not yet on disk. Must NOT be flagged as abandoned (regression).
        make_torrent(INPROGRESS_MOV, "InProgress.Movie.2024", 8_000_000_000, 1704067209, state="downloading", progress=0.05),
    ],
    "tvshows": [
        make_torrent(UNMANAGED_TV, "Unmanaged.Show.S01E01", 1_000_000_000, 1704067200),

@@ -92,6 +100,7 @@ let
        make_torrent(NEW_TV, "New.Show.S01E01", 1_200_000_000, 1704067203),
        make_torrent(SEASON_PACK, "Season.Pack.S02", 5_000_000_000, 1704067204),
        make_torrent(REMOVED_TV, "Removed.Show.S01E01", 900_000_000, 1704067205),
        make_torrent(INPROGRESS_TV, "InProgress.Show.S01E01", 1_500_000_000, 1704067209, state="downloading", progress=0.05),
    ],
}

@@ -115,6 +124,9 @@ let
    {"movieId": 7, "downloadId": SINGLE_CROSS, "eventType": "downloadFolderImported", "date": "2024-03-01T00:00:00Z"},
    {"movieId": 8, "downloadId": SINGLE_CROSS, "eventType": "downloadFolderImported", "date": "2024-01-01T00:00:00Z"},
    {"movieId": 8, "downloadId": SINGLE8_NEW, "eventType": "downloadFolderImported", "date": "2024-06-01T00:00:00Z"},
    # In-progress re-download regression case for movies
    {"movieId": 10, "downloadId": INPROGRESS_MOV, "eventType": "downloadFolderImported", "date": "2024-01-01T00:00:00Z"},
    {"movieId": 10, "downloadId": INPROGRESS_MOV_NEW, "eventType": "downloadFolderImported", "date": "2024-06-01T00:00:00Z"},
]

RADARR_MOVIES = [

@@ -126,6 +138,7 @@ let
    {"id": 6, "hasFile": True, "movieFile": {"size": 5_368_709_120, "quality": {"quality": {"name": "Bluray-720p"}}}},
    {"id": 7, "hasFile": True, "movieFile": {"size": 4_000_000_000, "quality": {"quality": {"name": "Bluray-1080p"}}}},
    {"id": 8, "hasFile": True, "movieFile": {"size": 5_000_000_000, "quality": {"quality": {"name": "Remux-1080p"}}}},
    {"id": 10, "hasFile": True, "movieFile": {"size": 8_000_000_000, "quality": {"quality": {"name": "Remux-2160p"}}}},
]

# ── Sonarr mock data ──────────────────────────────────────────────────

@@ -148,6 +161,9 @@ let
    # Removed series scenario
    {"episodeId": 400, "seriesId": 99, "downloadId": REMOVED_TV, "eventType": "downloadFolderImported", "date": "2024-01-01T00:00:00Z"},
    {"episodeId": 400, "seriesId": 99, "downloadId": REMOVED_TV_NEW, "eventType": "downloadFolderImported", "date": "2024-06-01T00:00:00Z"},
    # In-progress re-download regression case for TV
    {"episodeId": 500, "seriesId": 1, "downloadId": INPROGRESS_TV, "eventType": "downloadFolderImported", "date": "2024-01-01T00:00:00Z"},
    {"episodeId": 500, "seriesId": 1, "downloadId": INPROGRESS_TV_NEW, "eventType": "downloadFolderImported", "date": "2024-06-01T00:00:00Z"},
]
SONARR_HISTORY_ALL = SONARR_HISTORY_PAGE1 + SONARR_HISTORY_PAGE2

@@ -319,14 +335,14 @@ pkgs.testers.runNixOSTest {
    with subtest("Detects unmanaged movie torrent"):
        assert "Unmanaged.Movie.2024" in unmanaged_section, \
            "Should detect unmanaged movie"
-       assert "1 unmanaged / 9 total" in unmanaged_section, \
-           "Should show 1 unmanaged movie out of 9"
+       assert "1 unmanaged / 10 total" in unmanaged_section, \
+           "Should show 1 unmanaged movie out of 10"

    with subtest("Detects unmanaged TV torrent"):
        assert "Unmanaged.Show.S01E01" in unmanaged_section, \
            "Should detect unmanaged TV show"
-       assert "1 unmanaged / 6 total" in unmanaged_section, \
-           "Should show 1 unmanaged TV show out of 6"
+       assert "1 unmanaged / 7 total" in unmanaged_section, \
+           "Should show 1 unmanaged TV show out of 7"

    with subtest("Empty category shows zero counts"):
        assert "0 unmanaged / 0 total" in unmanaged_section, \

@@ -380,6 +396,16 @@ pkgs.testers.runNixOSTest {
    assert "SingleCross.Movie.2024" not in abandoned_section, \
        "Hash that is sole import for movieId=7 must be in keeper set, not abandoned"

    with subtest("In-progress re-download not abandoned (incomplete payload regression)"):
        # A torrent whose hash matches an old downloadFolderImported entry but
        # whose data is not currently on disk (progress < 1.0) must not be
        # reported as abandoned: its size is metadata, not reclaimable bytes,
        # and a SAFE verdict could disrupt a re-download in progress.
        assert "InProgress.Movie.2024" not in abandoned_section, \
            "In-progress movie re-download must not appear as abandoned"
        assert "InProgress.Show.S01E01" not in abandoned_section, \
            "In-progress TV re-download must not appear as abandoned"
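A sketch of the rule this regression pins down, with invented field names mirroring the mock torrents; the real checker's logic does not appear in this diff, so treat the filter itself as an assumption:

```python
# Hypothetical filter: a superseded torrent only counts as abandoned
# (reclaimable bytes on disk) once its payload is fully downloaded.
def abandoned(torrents, superseded_hashes):
    return [
        t for t in torrents
        if t["hash"] in superseded_hashes and t["progress"] >= 1.0
    ]

done = {"hash": "aa" * 20, "name": "Old.Movie.2024", "progress": 1.0}
redl = {"hash": "bb" * 20, "name": "InProgress.Movie.2024", "progress": 0.05}
hits = abandoned([done, redl], {done["hash"], redl["hash"]})
print([t["name"] for t in hits])  # → ['Old.Movie.2024']
```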

    with subtest("Removed movie triggers REVIEW status"):
        assert "Removed.Movie.2024" in abandoned_section, \
            "Should detect abandoned torrent for removed movie"