MoreRSS

Jacky Wong

Born in Yangzhou in 1997; front-end developer and music producer. Author of CWGI, 只言, R2 Uploader, and other projects.


RSS preview of Jacky Wong's blog

Probably the Last Blog Engine Migration

2026-03-18 08:00:00

The timeline is worth writing down:

- 2017: PHP
- 2018: Jekyll
- 2019: Hexo
- 2024: Astro
- 2026: Self-Built

This didn't happen suddenly. Over the past few months, if you could see this blog repository's commit history, you'd notice I've mostly been deleting things: unnecessary styles, unnecessary dependencies, unnecessary middle layers. Last week I even ripped out Tailwind (in support of layoffs 🤡).

By the end of all that deleting, I realized the biggest layer was still there: the framework itself. Having taken subtraction this far, patching around the framework no longer made sense, so I removed the framework too.

So my blog moved from Astro to a self-built engine, running on Bun.

## Performance

Running the same build for the Astro version and the new engine on the same machine:

| Metric | Astro | Self-Built | Change |
| --- | --- | --- | --- |
| node_modules | 461 MB | 253 MB | -45% |
| Build time | 12.1s | 1.6s | -87% |
| Build output | 1.9 MB | 1.7 MB | -8% |
| Homepage size (gzip) | 20.4 KB | 11.2 KB | -45% |
| Homepage size (brotli) | 17.9 KB | 9.6 KB | -46% |
| Homepage file count | 5 | 3 | -40% |

These numbers say something very direct: Astro was doing a lot of work my blog never needed.

The dependency list in `package.json` shrank quite a bit too. `dependencies + devDependencies` went from `25` entries down to `12`; counting only runtime `dependencies`, from `12` down to `3`, two of which aren't even front-end related.

The new engine is actually simple: `markdown-it` parses Markdown, `shiki` handles code highlighting, template functions assemble the pages, `Bun.serve()` powers local development, and a build script emits static files. No Vite, no Rollup, no hydration, and no extra content system.

Another very practical change: the build output is finally predictable. The homepage used to look like just one page, but behind it sat an island runtime, renderer chunks, and shared chunks, so the real size was never obvious. Now it's much more direct: the homepage is plainly three files, HTML, CSS, and JS, with no other runtime hiding behind them.

This rewrite also fixed something that was always awkward in the Astro era: `atom.xml`. MDX content could never flow naturally into the feed, so the feed stayed a separately maintained side branch: custom components needed manual conversion, HTML needed extra sanitization, URLs needed extra fixing. Now the content itself is Markdown, the feed consumes Markdown directly, and only custom content blocks fall back to a Markdown-friendly version. How a page renders and how the feed degrades are decided by the same interpreter, instead of the content having one set of logic while RSS quietly grows another.

## Motivation

I can feel it more and more clearly: AI is already changing the cost structure of abstraction layers. Much of the engineering value frameworks provide is, in a low-complexity scenario like a blog, no longer the bargain it used to be.

This refactor itself was mostly done by Codex. It took roughly 3 hours to rewrite the whole site from scratch; at the source level, roughly `6888` lines added and `6344` lines removed. That made me rethink the value of frameworks. The old trade made sense: give up a little performance for a more maintainable engineering structure. Template systems, component models, routing conventions, content schemas all fundamentally exist to help humans understand and modify code more reliably.

But AI coding breaks that balance. For a coding agent like Codex, a hand-written HTML template function is no harder to understand than an Astro component. It can read straight down through the Markdown, templates, styles, and scripts, then change the specific piece. Many abstractions that existed to lower the human maintenance cost are less necessary in this setting.

That doesn't mean frameworks are useless. On the contrary, with AI around, I think the problems frameworks should actually solve have become clearer: not inventing a fancier template syntax, but nailing down boundaries, constraints, validation, caching, and output organization. AI learns syntax sugar quickly; unclear boundaries, unpredictable outputs, and fallbacks held together by patches are the real problems. For complex applications, multi-person collaboration, and long-lived products, frameworks are still worth it.

## Finally

So this time probably really is the last migration. The system is now simple enough that it's no longer worth fiddling with; what actually deserves effort next is the blog's content.
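The "same interpreter decides both the page and the feed" idea is easy to sketch. Here is a hypothetical Python toy (the real engine is TypeScript on Bun; the block types and function names below are invented for illustration):

```python
# Hypothetical toy: one renderer decides both page output and feed fallback.
# Block types and names are made up; the real engine differs.
def render_block(block: dict, target: str) -> str:
    if block["type"] == "video":
        if target == "page":
            return f'<video src="{block["src"]}" controls></video>'
        # Feed target: degrade to a Markdown-friendly link instead of raw HTML.
        return f'[video]({block["src"]})'
    return block["text"]

def render(blocks: list[dict], target: str) -> str:
    return "\n".join(render_block(b, target) for b in blocks)

blocks = [
    {"type": "text", "text": "Hello."},
    {"type": "video", "src": "https://example.com/demo.mp4"},
]
print(render(blocks, "page"))
print(render(blocks, "feed"))
```

Because the page and the feed share one `render_block`, a new custom block only needs its fallback defined once, and the feed can't silently drift away from the page.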
Finally, please enjoy this gorgeous build output ⚡: ![video](https://blog-r2.jw1.dev/53qnuAYFbbQPx7bO.mp4)

Desktop notifications for Codex CLI and Claude Code

2026-03-10 08:00:00

## Context

This setup was tested on my own machine with:

- `Codex CLI 0.113.0`
- `Claude Code 2.1.72`
- `macOS 26.3.1 (25D2128)`
- `arm64` Apple Silicon

![A macOS notification from Codex CLI with the subtitle Notification setup]()

---

## Start with Claude Code’s official setup

`Claude Code` already documents the two parts you need:

- terminal notifications and terminal integration
- hooks for `Notification` and `Stop`
- [Hooks reference](https://code.claude.com/docs/en/hooks)
- [Hooks guide](https://code.claude.com/docs/en/hooks-guide)
- [Terminal config](https://code.claude.com/docs/en/terminal-config)

That is the right place to start. On macOS, the most obvious first implementation is also the simplest one: a tiny `osascript` wrapper.

File: `$HOME/.claude/notify-osascript.sh`

```bash
#!/bin/bash
set -euo pipefail

MESSAGE="${1:-Claude Code needs your attention}"
osascript -e "display notification \"$MESSAGE\" with title \"Claude Code\"" >/dev/null 2>&1 || true
```

And wire it into Claude’s hooks:

```json
{
  "hooks": {
    "Stop": [
      {
        "hooks": [
          {
            "type": "command",
            "command": "$HOME/.claude/notify-osascript.sh 'Task completed'"
          }
        ]
      }
    ],
    "Notification": [
      {
        "matcher": "",
        "hooks": [
          {
            "type": "command",
            "command": "$HOME/.claude/notify-osascript.sh 'Claude Code needs your attention'"
          }
        ]
      }
    ]
  }
}
```

This worked, but only technically.

- Clicking the notification did not cleanly bring me back to the terminal app.
- There was no grouping, so notifications piled up.
- Once terminal-native notifications entered the picture, especially in `Ghostty`, duplicate alerts got annoying.

That was the point where `terminal-notifier` became the better base layer.
## Why I switched to terminal-notifier

The official repo is here:

- [julienXX/terminal-notifier](https://github.com/julienXX/terminal-notifier)

Install it with Homebrew:

```bash
brew install terminal-notifier
```

Then verify it:

```bash
which terminal-notifier
terminal-notifier -help | head
```

The three features that made it worth switching:

- `-activate`, so clicking the notification can bring my terminal app to the front
- `-group`, so I can keep one live notification per project instead of stacking old ones
- better control over subtitle, sound, and macOS notification-center behavior

---

## A shared notification helper

Before touching either tool, create one shared helper:

```bash
mkdir -p "$HOME/.local/bin"
```

File: `$HOME/.local/bin/mac-notify.sh`

```bash
#!/bin/bash
set -euo pipefail

TITLE="${1:?title is required}"
MESSAGE="${2:-}"
SUBTITLE="${3:-}"
GROUP="${4:-}"
SOUND="${5:-Submarine}"

case "${TERM_PROGRAM:-}" in
  ghostty) BUNDLE_ID="com.mitchellh.ghostty" ;;
  iTerm.app) BUNDLE_ID="com.googlecode.iterm2" ;;
  Apple_Terminal) BUNDLE_ID="com.apple.Terminal" ;;
  vscode) BUNDLE_ID="com.microsoft.VSCode" ;;
  cursor) BUNDLE_ID="com.todesktop.230313mzl4w4u92" ;;
  zed) BUNDLE_ID="dev.zed.Zed" ;;
  *) BUNDLE_ID="" ;;
esac

TERMINAL_NOTIFIER=""
if [ -x /opt/homebrew/bin/terminal-notifier ]; then
  TERMINAL_NOTIFIER="/opt/homebrew/bin/terminal-notifier"
elif command -v terminal-notifier >/dev/null 2>&1; then
  TERMINAL_NOTIFIER="$(command -v terminal-notifier)"
fi

if [ -n "$TERMINAL_NOTIFIER" ]; then
  ARGS=( -title "$TITLE" -message "$MESSAGE" -sound "$SOUND" )
  if [ -n "$SUBTITLE" ]; then
    ARGS+=(-subtitle "$SUBTITLE")
  fi
  if [ -n "$GROUP" ]; then
    ARGS+=(-group "$GROUP")
  fi
  if [ -n "$BUNDLE_ID" ]; then
    ARGS+=(-activate "$BUNDLE_ID")
  fi
  "$TERMINAL_NOTIFIER" "${ARGS[@]}"
  exit 0
fi

SAFE_MESSAGE="${MESSAGE//\\/\\\\}"
SAFE_MESSAGE="${SAFE_MESSAGE//\"/\\\"}"
SAFE_SUBTITLE="${SUBTITLE//\\/\\\\}"
SAFE_SUBTITLE="${SAFE_SUBTITLE//\"/\\\"}"
osascript -e "display notification \"$SAFE_MESSAGE\" with title \"$TITLE\" subtitle \"$SAFE_SUBTITLE\" sound name \"$SOUND\"" >/dev/null 2>&1 || true
```

Make it executable:

```bash
chmod +x "$HOME/.local/bin/mac-notify.sh"
```

I scope notification groups by tool and project, not by message. That gives me one live `Claude Code` notification and one live `Codex CLI` notification per repo instead of a growing stack.

### How click-to-focus works

The key line is:

```bash
-activate "$BUNDLE_ID"
```

`terminal-notifier` accepts a macOS bundle id and activates that app when the notification is clicked. I map the common values from `TERM_PROGRAM`:

- `com.mitchellh.ghostty`
- `com.googlecode.iterm2`
- `com.apple.Terminal`
- `com.microsoft.VSCode`
- `com.todesktop.230313mzl4w4u92` for Cursor
- `dev.zed.Zed`

This does not target one exact split or tab. It just brings the app to the front, which is good enough for this workflow.

---

## Claude Code: attention notifications and completion notifications

I split notifications into two categories:

- `Notification`: Claude needs me to do something, like approve a permission request or answer a prompt
- `Stop`: the main agent finished responding

### Claude notification script

File: `$HOME/.claude/notify.sh`

```bash
#!/bin/bash
set -euo pipefail

MESSAGE="${1:-Claude Code needs your attention}"
PROJECT_DIR="${PWD:-$HOME}"
PROJECT_NAME="$(basename "$PROJECT_DIR")"
[ "$PROJECT_NAME" = "/" ] && PROJECT_NAME="Home"
PROJECT_HASH="$(printf '%s' "$PROJECT_DIR" | shasum -a 1 | awk '{print $1}' | cut -c1-12)"
GROUP="claude-code:${PROJECT_HASH}"

"$HOME/.local/bin/mac-notify.sh" "Claude Code" "$MESSAGE" "$PROJECT_NAME" "$GROUP"
```

```bash
chmod +x "$HOME/.claude/notify.sh"
```

### Claude hooks configuration

File: `$HOME/.claude/settings.json`

```json
{
  "hooks": {
    "Stop": [
      {
        "hooks": [
          {
            "type": "command",
            "command": "$HOME/.claude/notify.sh 'Task completed'"
          }
        ]
      }
    ],
    "Notification": [
      {
        "matcher": "permission_prompt",
        "hooks": [
          {
            "type": "command",
            "command": "$HOME/.claude/notify.sh 'Permission needed'"
          }
        ]
      },
      {
        "matcher": "idle_prompt",
        "hooks": [
          {
            "type": "command",
            "command": "$HOME/.claude/notify.sh 'Waiting for your input'"
          }
        ]
      }
    ]
  }
}
```

If you do not care about different notification types, an empty matcher `""` is enough.

One detail worth remembering: Claude snapshots hooks at startup. If changes do not seem to apply, restart the session. Also check macOS notification permissions if nothing shows up.

---

## Codex CLI: completion notifications

For `Codex CLI`, the mechanism is not `hooks`. It is `notify`. Official docs:

- [Advanced Configuration](https://developers.openai.com/codex/config-advanced)
- [Configuration Reference](https://developers.openai.com/codex/config-reference)

As of `2026-03-10`, Codex documents external `notify` for supported events like `agent-turn-complete`. So in practice:

- completion notifications: yes
- Claude-style permission notifications through the same external script: no

Approval reminders in Codex are a separate `tui.notifications` problem.
### Codex notify script

File: `$HOME/.codex/notify.sh`

```bash
#!/bin/bash
set -euo pipefail

PAYLOAD="${1:-}"
[ -n "$PAYLOAD" ] || exit 0

python3 - "$PAYLOAD" <<'PY'
import json
import pathlib
import sqlite3
import subprocess
import sys
import zlib
from datetime import datetime, timezone

CODEX_HOME = pathlib.Path.home() / '.codex'


def log_skip(reason: str, payload: dict, **extra: object) -> None:
    log_path = CODEX_HOME / 'notify-filter.log'
    data = {
        'ts': datetime.now(timezone.utc).isoformat(),
        'reason': reason,
        'client': payload.get('client'),
        'thread-id': payload.get('thread-id'),
        'cwd': payload.get('cwd'),
    }
    data.update(extra)
    with log_path.open('a', encoding='utf-8') as fh:
        fh.write(json.dumps(data, ensure_ascii=True) + '\n')


def get_thread_originator(thread_id: str) -> tuple[str, str]:
    db_path = CODEX_HOME / 'state_5.sqlite'
    if not db_path.exists():
        return '', ''
    try:
        with sqlite3.connect(db_path) as conn:
            cur = conn.cursor()
            cur.execute('select rollout_path, source from threads where id = ?', (thread_id,))
            row = cur.fetchone()
    except Exception:
        return '', ''
    if not row:
        return '', ''
    rollout_path, source = row
    if not rollout_path:
        return '', source or ''
    try:
        first_line = pathlib.Path(rollout_path).read_text(encoding='utf-8', errors='ignore').splitlines()[0]
        payload = json.loads(first_line).get('payload', {})
    except Exception:
        return '', source or ''
    return (payload.get('originator') or '').strip(), source or ''


try:
    payload = json.loads(sys.argv[1])
except Exception:
    raise SystemExit(0)

if payload.get('type') != 'agent-turn-complete':
    raise SystemExit(0)

client = (payload.get('client') or '').strip().lower()
if client and ('app' in client or client == 'appserver'):
    log_skip('skip-app-client', payload)
    raise SystemExit(0)

thread_id = (payload.get('thread-id') or '').strip()
if thread_id:
    originator, source = get_thread_originator(thread_id)
    if originator == 'Codex Desktop':
        log_skip('skip-desktop-originator', payload, originator=originator, source=source)
        raise SystemExit(0)

cwd = payload.get('cwd') or ''
subtitle = pathlib.Path(cwd).name if cwd else 'Task completed'
message = (payload.get('last-assistant-message') or 'Task completed').replace('\n', ' ').strip()
if not message:
    message = 'Task completed'

if cwd:
    group = 'codex-cli:' + format(zlib.crc32(cwd.encode('utf-8')) & 0xFFFFFFFF, '08x')
else:
    group = 'codex-cli:' + (payload.get('thread-id') or 'default')

subprocess.run(
    [
        str(pathlib.Path.home() / '.local' / 'bin' / 'mac-notify.sh'),
        'Codex CLI',
        message[:180],
        subtitle,
        group,
    ],
    check=False,
)
PY
```

```bash
chmod +x "$HOME/.codex/notify.sh"
```

### Codex config

File: `$HOME/.codex/config.toml`

```toml
notify = ["/Users/you/.codex/notify.sh"]
```

Use any absolute path you want. I keep the script under `~/.codex/`.

---

## If you use Ghostty, disable terminal-native desktop notifications

I hit one more annoying edge case in `Ghostty`: duplicate notifications. What happened was:

- my script sent a notification through `terminal-notifier`
- `Ghostty` also surfaced a terminal-native desktop notification

That produced two macOS notifications for one event. On my machine, the clean fix was to keep `terminal-notifier` as the only notification channel and disable Ghostty’s terminal-native desktop notifications:

File: `~/Library/Application Support/com.mitchellh.ghostty/config`

```plaintext
desktop-notifications = false
```

Why I prefer this setup:

- `terminal-notifier` gives me `-activate`, so click-to-focus still works
- `terminal-notifier` gives me `-group`, so notifications stay scoped per project
- both `Claude Code` and `Codex CLI` behave the same way

Ghostty’s config docs describe `desktop-notifications` as the switch that lets terminal apps show desktop notifications via escape sequences such as `OSC 9` and `OSC 777`. Turning it off avoids the extra notification layer.

---

## If you also use Codex App

This is the part that bit me. At first I assumed filtering by the `client` field would be enough. It was not.
On my machine, some sessions started from `Codex App` looked like this in local session metadata:

```json
{
  "originator": "Codex Desktop",
  "source": "vscode"
}
```

That creates a duplicate-notification problem:

- Codex App shows its own notification
- the local CLI `notify` script can still fire
- I get duplicate notifications for the same task

So the script does two things:

1. fast path: skip obvious app-like `client` values
2. fallback: read `thread-id` from the `notify` payload, query `~/.codex/state_5.sqlite`, load the first `session_meta` line, and skip if `originator == "Codex Desktop"`

That is why the script above checks local thread metadata instead of trusting only `client`. I also log skipped events to:

```text
~/.codex/notify-filter.log
```

That makes debugging much easier if Codex changes its session metadata format later.

> This part is based on observed local behavior, not on a stable public contract from the docs. If OpenAI changes how Codex App identifies local sessions in future versions, the filter may need a small update.

---

## References

- [OpenAI Codex Advanced Configuration](https://developers.openai.com/codex/config-advanced)
- [OpenAI Codex Configuration Reference](https://developers.openai.com/codex/config-reference)
- [Anthropic Claude Code Hooks Reference](https://code.claude.com/docs/en/hooks)
- [Anthropic Claude Code Hooks Guide](https://code.claude.com/docs/en/hooks-guide)
- [Anthropic Claude Code Terminal Configuration](https://code.claude.com/docs/en/terminal-config)
- [terminal-notifier](https://github.com/julienXX/terminal-notifier)

Dating App Sucks Pt.2

2026-03-03 08:00:00

Ok here we go again. I think I've finally figured out the scariest thing about dating apps. They do actually turn finding love into a fucking job search.

> Every date feels like a business meeting or something, no sparks, pure cringe.

Think about it: we fill out our "resumes" with our best photos and wittiest bios. We list our "desired positions" in the filters. We swipe through "candidates" hoping to get a "decent offer". The whole thing is an HR pipeline with better lighting.

But love is the exact opposite of a job search. A job search follows logic. Love? Personally, I think there is no logic in love. Love is a bias, a fucking tyranny. The bias is that you only want one specific person to do the things literally anyone could do. The tyranny is that you pour all your emotions, irrationally, recklessly, entirely onto another human being.

And dating apps have always given me this weird feeling: love obtained through this process feels so bland it's almost offensive. If I were a planet, this whole approach would be like some engineer calculated the perfect speed, angle, and mass, then launched another planet at precisely the right time so we'd form a nice, stable binary star system. How romantic. How efficient, how abso-fucking-lutely dead inside.

What I want is a rogue planet hurtling toward me at full speed out of nowhere in the middle of the void. The moment we touch, atoms from two entirely separate worlds are forced into lattices they were never meant to share. Molecular bonds snap, shatter, and reform into something unrecognizable. The pressure breeds temperatures that fuse nuclei into heavy, unnamed elements that no periodic table has ever seen, existing for a few picoseconds before decaying into something else entirely. Oceans of molten rock erupt outward, entire crusts peeled off like skin, shockwaves rippling through mantles at speeds no device could ever measure. What used to be two worlds is now a single, blinding wound in space.
Some debris escapes into strange new orbits. The rest? Fuses together so tightly that nothing, not time, not entropy, can pull it apart, until our one last atom is annihilated with the heat death of the universe.

I'm not saying dating apps are pure evil, you could still meet someone real on there, the odds exist. But what's truly terrifying about these things is that they teach you how to NOT invest. Everyone on there wants low-risk love. A guaranteed return with minimal downside. But since when has that ever been how love works?

I've seen people around me become professional swipers. Always chatting, always got girls around them. And then what? This one's family background isn't great. That one's not pretty enough. Another one said something weird at dinner that gave them the "ick". Next. Next. Next. Bro, stop cos-ing a fucking conveyor belt.

Being overly rational in love is a slow way to lose everything. The second anything feels slightly off, they're gone. No friction allowed. But no friction means no sparks either. They end up like the guy in Socrates' wheat field parable, walking through the field, always convinced a bigger stalk is just ahead, waiting, but never actually picking one. And the field does end. It very much does end.

Uninstalled.

The Cursor Moment in Music Production

2026-01-08 08:00:00

I've been thinking about this for a while. Cursor didn't change programming because it could write code. It changed programming because it **made real work faster while keeping every line editable**. That's the key.

So when does music production get its Cursor moment?

---

## My imagination of this AI DAW

Not magic. **Delegation with control.** And the difficulty scales fast.

**Level 1:** "Generate a 4-bar piano MIDI with emotion."

Already harder than it sounds. Pitch, velocity, note length, micro-timing, articulation, envelopes. Emotion isn't metadata, it's embedded in low-level decisions. Like writing a small utility function. Simple scope, high quality bar.

**Level 2:** "Generate a 16-bar violin MIDI that matches the drum."

Everything from Level 1, plus context. Groove awareness, phrasing, rhythmic interaction. The model has to listen. Like adding a feature that integrates with an existing module.

**Level 3:** "Generate a sequence using Serum, make bars 8-16 flow, sidechain from track 2, match the vibe."

This is the inflection point. Now the AI needs full DAW access, third-party plugin knowledge, routing logic, arrangement continuity, aesthetic coherence. Plugins become libraries. You need something like documentation context for tools, not just parameters. Multi-system engineering, not generation.

**Level 4:** "Generate a clean vocal track based on the whole song."

Lyrics, melody, phrasing, emotion, refinement at the word and timing level. Like adding a major feature to a large codebase. One-shot attempts will fail. But with a human-in-the-loop, reviewing, steering, refining, this becomes feasible. The AI drafts, the human produces.

**Level 5:** "Give me fire."

This must fail. Just like "make me Facebook" in coding, the spec is undefined. Taste *is* the task. Neither humans nor AI can guarantee this.

---

## Who could even make this

Building a brand-new DAW? Dead end. Producers are deeply locked into their tools.
Switching DAWs isn't like switching editors; it means relearning muscle memory, mental models, creative habits. For many producers, it's practically impossible. So any Cursor moment has to either sit on top of existing DAWs or deeply integrate with them.

**Splice** has an interesting edge. Not DAW engineering, but data. Cross-DAW usage, massive libraries, user behavior at scale. Its natural position is as an intelligence layer, something like an LLM for music production that other tools call into.

**Apple and Logic Pro** already ship features that hint at an agentic future. Session players that suggest MIDI, react to reference audio, generate parts from scratch. Apple has vertical integration: hardware, OS, DAW. It can ship something real. But it's also closed. A Cursor-like ecosystem thrives on extensibility, not just polished features.

**Ableton Live** is interesting for a specific reason: its project files are XML-based. That means sessions are structured, serializable, writable. In principle, an AI can already read and modify a Live project the way a coding agent reads and edits source files. The blocker isn't the format. It's the model. The IDE substrate exists. What's missing is a music-production-focused base model that understands intent, taste, and workflow, not just structure.

---

## Why this hasn't happened yet

Three hard blockers.

**No clear "text of music."** Code has text. Serializable, diffable, composable. That's why LLMs worked so well, so fast. Music doesn't have a single equivalent. MIDI is editable but incomplete. Audio is complete but opaque. Without a clear fundamental representation, everything above it becomes fragile.

**No production-native base model.** Cursor didn't invent intelligence. It orchestrated a strong base model. There's no equivalent yet for music production, one that understands MIDI, audio, arrangement, plugins, mixing, and taste as a unified domain. Current models generate outputs, they don't reason inside workflows.
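The "readable, writable Live project" point about Ableton is easy to demonstrate at the plumbing level: `.als` project files are gzip-compressed XML, so the stdlib can open one. A hedged Python sketch, using an invented toy document that is far simpler than a real Live set's schema:

```python
import gzip
import xml.etree.ElementTree as ET

def load_als(path: str) -> ET.ElementTree:
    # Ableton .als files are gzip-compressed XML: decompress, then parse.
    with gzip.open(path, "rt", encoding="utf-8") as fh:
        return ET.parse(fh)

# Invented stand-in for a real project file (the actual schema is much larger).
with gzip.open("toy.als", "wt", encoding="utf-8") as fh:
    fh.write('<Ableton><LiveSet><Tempo Value="128"/></LiveSet></Ableton>')

tree = load_als("toy.als")
tempo = tree.getroot().find("LiveSet/Tempo").get("Value")
print(tempo)  # 128
```

The hard part, as argued above, isn't this plumbing; it's a model that knows what is worth writing back.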
**Locked ecosystems.** There's no VS Code-level DAW that is open and dominant. DAWs are closed, fragmented, deeply personal. That pushes AI to the plugin layer, where integration is safer and optional. That's why we see "AI inside plugins" everywhere, and almost nowhere at the DAW core.

---

## Where we are now

Here's something I made in 30 minutes, mostly Splice samples:

[Piece made in 30 min]()

![Ableton Live project screenshot]()

This is probably the fastest way to get something production-ready with a traditional workflow. But let's be honest, the vibe factor is low. It's assembled, not created.

What's not AI here? Pretty much everything that matters. Sidechain, dialed in by hand. Reverb, tweaked until it sat right. Arrangement, mixing, mastering, all me.

And what kind of AI do we have now in music production?

Splice. It uses AI to find sounds faster and makes matching tones easier. Real gains, but still operating inside the traditional production phase.

Plugins. Pitch correction, noise reduction, vocal tuning, loudness matching. These are genuinely useful. They save time. But they don't change how music is made, they just speed up tasks inside the same old workflow.

Then there are the full-track generators. Suno, Udio, you name it. They can spit out complete songs, sometimes with lyrics, and honestly the results are surprisingly not bad. For background music, promotional videos, that kind of stuff? They work. Fast, cheap, good enough. But at this point they skip something critical: **production**.

### The paradox

Prompt → final audio. No MIDI, no arrangement control, no micro-timing, no note-level editing. You get output, not a workspace. It's closer to collage than composition.

Yes, I know Suno has its own studio app. The thing is, if Suno really wants to enter serious production, it needs fine-grained control. How much sidechain pump fits my taste? How hard should the compressor hit before the vocal sounds perfect? What's the right master loudness?
Should I apply true peak limiting? But the moment you add those controls, you're building a DAW. And those controls still require professional knowledge to use well. A beginner and a seasoned producer using the same AI tool will produce vastly different results. Just like a junior dev and a senior dev using Cursor. The tool accelerates. It doesn't replace judgment.

### The taste

In code, taste is often invisible to users. A shitty function and a beautifully designed one can produce the same result: it works. Architecture and elegance mostly matter to developers.

Music doesn't work like that. Humans are extremely sensitive to sound. Timing, tone, balance, texture, these are immediately perceptible. Choose a bad string library? The listener knows instantly. There's no abstraction layer that hides bad taste.

And there's no "ship now, fix later" model. Software can be patched. Music can't. Once it's released, it's frozen. The first version is the version.

### The moving target

There's something even deeper. Code is functional, music is cultural.

Assembly from 1950 still runs correctly today. Music from 1950? Technically fine, culturally dated. Code has a stable target: correctness. Music's target drifts with time, with generations, with vibes no one can fully articulate.

AI learning code is learning "what works." AI learning music is learning "what felt good to people in the past." But taste keeps moving. Training data is always yesterday. The ground truth itself is in motion.

That's why skipping the production phase works for low-stakes content, but fails for anything serious.

---

## So when?

Not when AI makes better songs. Not when generation gets faster.

It happens when AI can **work inside music production**, not around it. When it accelerates real workflows, preserves taste, and allows refinement down to the smallest unit.

I've been wanting to build this myself.
A VST that generates context-aware MIDI, something that listens to what's already in the session and proposes what comes next.

A few years ago, the blocker was obvious: I don't write C++, I don't know JUCE. Now? The blocker has shifted. I could probably vibe-code my way through the plugin architecture. But training a model that actually understands musical context? That's where it gets hard. Really hard.

2025

2025-12-29 08:00:00

Sup, it's December 30th, 2025, 4:37 AM, and I just started to write this wrap-up blog. This year felt different: things moved forward, but they also got heavier.

## The tension

I joined Flowith when it was a 6-person team. Now it's around 40. That kind of growth changes how you work. I went from writing code to mostly keeping things from breaking.

Code itself got weird. Coding agents, automations, black boxes everywhere. More production errors, but they get fixed faster. Development speed is up, and so is responsibility. I care less about elegant code and more about whether the system actually survives. The trade-off works, I guess, for now.

My WeChat contacts doubled. More conversations, more group chats, more weak ties. I'm not feeling more social, just more connected in a noisier way. This leaked into everything else.

## The scatter

After the [thing](https://jw1.dev/breakup) in May, I definitely got more time and money to waste. Traveled a lot. Yunnan, Thailand, Macau, Thailand again, plus countless shorter trips by car and train. Took a lot of pictures, thought maybe I'd put some in the post, but you guys must have seen them, so nah.

Picked up billiards 🎱 after Yunnan, went from complete (somehow) beginner to clearing tables pretty fast; nice to know I can still get good at new things. Built a gaming PC recently and I barely have time to use it. Bought my first watch after years of not wearing one. Not an Apple one, surprise: it's a Casio.

Back home, the old house got torn down. Reconstruction started. Biggest expense my family took on this year, but it feels right. Before the Chinese New Year, the place that raised me will be there again, just renewed.

Gotta say, time and money, nicely wasted!

## The broken

Mind's definitely broken. Haven't really found my footing since that [thing](https://jw1.dev/breakup) happened. Living alone means I can do whatever the fuck I want. Stay up all night? Zero, people, care.
I know, of course I want someone new. Been swiping on dating apps for months, but you guys know [how that went](https://jw1.dev/dating-app-sucks). Finding the right person is hard. You pick wrong? Lifetime regret. You don't pick? Parents nag. Sometimes I really envy my parents' or grandparents' generation; love was simple, almost pure.

Body's broken too. My body's been keeping track, whether I like it or not. Weight's the same (good news?), everything else changed. Neck pain, lower back pain, more frequent now. Wrinkles at my eyes. Hairline maybe retreating. Chronic rhinitis and pharyngitis getting harder to ignore. I should go to the hospital. I don't want to. Some things are easier to postpone than to face.

## The view

The future doesn't excite me like it used to. It feels conditional now. We're in a time where everything can change overnight. We want things to happen, until they don't benefit us. Then we hope nothing changes at all. Maybe that's just growing up.

## ...

Shit man, 4 AM brain definitely got the mood 👀.