2025-12-04 16:33:21
PostgreSQL 18 was released in September 2025.
https://www.postgresql.org/about/news/postgresql-18-released-3142/
Some folks are excited about the native support for UUID v7, which is indeed awesome — but personally, I think the introduction of async I/O is the real game changer.
In this article, as a humble member of the GIS community, I’m going to benchmark PostGIS to see how much of a performance improvement we can actually expect from async I/O in PostgreSQL 18.
To reduce variability, I picked a cloud environment. I often use AWS, Supabase, or Neon, but at the time of writing, PostgreSQL 18 is available only on Amazon RDS — so AWS wins this time. I kept the default configuration:
io_method is set to worker; io_uring isn't supported there yet.
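Before benchmarking, it's worth confirming which I/O settings are actually in effect on each instance. Here is a minimal sketch (not part of the original setup) that assumes a drizzle db instance already connected to the target RDS endpoint:

// Minimal sketch: confirm the effective asynchronous I/O settings on an instance.
// Assumes a drizzle `db` instance is already configured for the target database.
import { sql } from "drizzle-orm";

const ioSettings = await db.execute(sql`
  SELECT name, setting
  FROM pg_settings
  WHERE name IN ('io_method', 'io_workers')
`);
console.log(ioSettings); // expect io_method = 'worker' here, since io_uring isn't available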
We’ll prepare two RDS instances — one with async enabled and one without — execute the exact same query, and compare the execution plans.
Load Overture Maps Buildings dataset (Hokkaido region, ~3M polygons)
Implement a Web API that serves MVT tiles from this table
Issue tile requests and observe the behavior
This is not meant to be a super-strict benchmark — just trying to capture the overall trend (yes, that’s an excuse)
ogr2ogr -f PostgreSQL postgresql://username:password@pg17.<suppressed>.ap-northeast-1.rds.amazonaws.com:5432/postgres buildings.parquet -progress
Create MVT tiles using ST_AsMVT:
// Using drizzle, so written in JavaScript syntax
const _sql = sql`
WITH bounds AS (
SELECT ST_TileEnvelope(${z}::int, ${x}::int, ${y}::int) AS geom
),
mvtgeom AS (
SELECT
ST_AsMVTGeom(
ST_Transform(buildings.geometry, 3857),
bounds.geom,
4096,
256,
true
) AS geom,
buildings.ogc_fid
FROM public.buildings, bounds
WHERE
buildings.geometry && ST_Transform(bounds.geom, 4326)
AND ST_Intersects(buildings.geometry, ST_Transform(bounds.geom, 4326))
)
SELECT ST_AsMVT(mvtgeom.*, 'layer', 4096, 'geom') FROM mvtgeom;
`;
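To compare the two instances, the same statement can be wrapped in EXPLAIN (ANALYZE, BUFFERS). A rough sketch of how the plan text can be captured with drizzle follows; the explainPlan helper is illustrative, not the author's actual code, and it assumes the node-postgres driver:

// Illustrative helper: run a query under EXPLAIN (ANALYZE, BUFFERS) and return
// the plan as an array of lines, like the outputs shown below.
// Assumes a drizzle `db` instance on the node-postgres driver, where each row
// comes back as { "QUERY PLAN": "<plan line>" }.
import { sql, type SQL } from "drizzle-orm";

async function explainPlan(query: SQL): Promise<string[]> {
  const result = await db.execute(sql`EXPLAIN (ANALYZE, BUFFERS) ${query}`);
  return result.rows.map((row) => String(row["QUERY PLAN"]));
}

// const pg18Async = await explainPlan(_sql);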
Then I reviewed the execution plans. Below are results from the same query executed on PostgreSQL 18 with sync vs async.
{
pg18Sync: [
'Aggregate (cost=2850.42..2850.43 rows=1 width=32) (actual time=2908.535..2908.536 rows=1.00 loops=1)',
' Buffers: shared hit=1913',
' -> Index Scan using bq_rg_geometry_geom_idx on bq_rg f (cost=0.41..1557.29 rows=94 width=146) (actual time=0.159..528.852 rows=20564.00 loops=1)',
" Index Cond: ((geometry && '0103000020E610000001000000050000000000000000C161404F90BC55CB4645400000000000C161408C938850D18845400000000080D761408C938850D18845400000000080D761404F90BC55CB4645400000000000C161404F90BC55CB464540'::geometry) AND (geometry && '0103000020E610000001000000050000000000000000C161404F90BC55CB4645400000000000C161408C938850D18845400000000080D761408C938850D18845400000000080D761404F90BC55CB4645400000000000C161404F90BC55CB464540'::geometry))",
" Filter: st_intersects(geometry, '0103000020E610000001000000050000000000000000C161404F90BC55CB4645400000000000C161408C938850D18845400000000080D761408C938850D18845400000000080D761404F90BC55CB4645400000000000C161404F90BC55CB464540'::geometry)",
' Rows Removed by Filter: 3',
' Index Searches: 1',
' Buffers: shared hit=1907',
'Planning:',
' Buffers: shared hit=284',
'Planning Time: 57.249 ms',
'Execution Time: 2908.974 ms'
],
pg18Async: [
'Aggregate (cost=4212.57..4212.58 rows=1 width=32) (actual time=382.600..382.601 rows=1.00 loops=1)',
' Buffers: shared hit=1940',
' -> Index Scan using bq_rg_geometry_geom_idx on bq_rg f (cost=0.41..2300.69 rows=139 width=146) (actual time=0.166..39.303 rows=20564.00 loops=1)',
" Index Cond: ((geometry && '0103000020E610000001000000050000000000000000C161404F90BC55CB4645400000000000C161408C938850D18845400000000080D761408C938850D18845400000000080D761404F90BC55CB4645400000000000C161404F90BC55CB464540'::geometry) AND (geometry && '0103000020E610000001000000050000000000000000C161404F90BC55CB4645400000000000C161408C938850D18845400000000080D761408C938850D18845400000000080D761404F90BC55CB4645400000000000C161404F90BC55CB464540'::geometry))",
" Filter: st_intersects(geometry, '0103000020E610000001000000050000000000000000C161404F90BC55CB4645400000000000C161408C938850D18845400000000080D761408C938850D18845400000000080D761404F90BC55CB4645400000000000C161404F90BC55CB464540'::geometry)",
' Rows Removed by Filter: 3',
' Index Searches: 1',
' Buffers: shared hit=1934',
'Planning:',
' Buffers: shared hit=285',
'Planning Time: 14.621 ms',
'Execution Time: 383.089 ms'
]
}
{
pg18Sync: [
'Aggregate (cost=35.31..35.32 rows=1 width=32) (actual time=0.017..0.018 rows=1.00 loops=1)',
' Buffers: shared hit=1',
' -> Index Scan using bq_rg_geometry_geom_idx on bq_rg f (cost=0.41..20.93 rows=1 width=146) (actual time=0.015..0.015 rows=0.00 loops=1)',
" Index Cond: ((geometry && '0103000020E610000001000000050000000000000000946140D8DD5F3F4ACA4540000000000094614053F777FC350B46400000000080AA614053F777FC350B46400000000080AA6140D8DD5F3F4ACA45400000000000946140D8DD5F3F4ACA4540'::geometry) AND (geometry && '0103000020E610000001000000050000000000000000946140D8DD5F3F4ACA4540000000000094614053F777FC350B46400000000080AA614053F777FC350B46400000000080AA6140D8DD5F3F4ACA45400000000000946140D8DD5F3F4ACA4540'::geometry))",
" Filter: st_intersects(geometry, '0103000020E610000001000000050000000000000000946140D8DD5F3F4ACA4540000000000094614053F777FC350B46400000000080AA614053F777FC350B46400000000080AA6140D8DD5F3F4ACA45400000000000946140D8DD5F3F4ACA4540'::geometry)",
' Index Searches: 1',
' Buffers: shared hit=1',
'Planning Time: 0.228 ms',
'Execution Time: 0.044 ms'
],
pg18Async: [
'Aggregate (cost=35.31..35.32 rows=1 width=32) (actual time=0.015..0.016 rows=1.00 loops=1)',
' Buffers: shared hit=1',
' -> Index Scan using bq_rg_geometry_geom_idx on bq_rg f (cost=0.41..20.93 rows=1 width=146) (actual time=0.013..0.013 rows=0.00 loops=1)',
" Index Cond: ((geometry && '0103000020E610000001000000050000000000000000946140D8DD5F3F4ACA4540000000000094614053F777FC350B46400000000080AA614053F777FC350B46400000000080AA6140D8DD5F3F4ACA45400000000000946140D8DD5F3F4ACA4540'::geometry) AND (geometry && '0103000020E610000001000000050000000000000000946140D8DD5F3F4ACA4540000000000094614053F777FC350B46400000000080AA614053F777FC350B46400000000080AA6140D8DD5F3F4ACA45400000000000946140D8DD5F3F4ACA4540'::geometry))",
" Filter: st_intersects(geometry, '0103000020E610000001000000050000000000000000946140D8DD5F3F4ACA4540000000000094614053F777FC350B46400000000080AA614053F777FC350B46400000000080AA6140D8DD5F3F4ACA45400000000000946140D8DD5F3F4ACA4540'::geometry)",
' Index Searches: 1',
' Buffers: shared hit=1',
'Planning Time: 0.238 ms',
'Execution Time: 0.042 ms'
]
}
{
pg18Sync: [
'Aggregate (cost=216.93..216.94 rows=1 width=32) (actual time=424.191..424.192 rows=1.00 loops=1)',
' Buffers: shared hit=341',
' -> Index Scan using bq_rg_geometry_geom_idx on bq_rg f (cost=0.41..120.05 rows=7 width=146) (actual time=0.133..6.610 rows=2663.00 loops=1)',
" Index Cond: ((geometry && '0103000020E610000001000000050000000000000060BB6140D5C3C9336FA145400000000060BB6140BB7A166C9FA945400000000030BE6140BB7A166C9FA945400000000030BE6140D5C3C9336FA145400000000060BB6140D5C3C9336FA14540'::geometry) AND (geometry && '0103000020E610000001000000050000000000000060BB6140D5C3C9336FA145400000000060BB6140BB7A166C9FA945400000000030BE6140BB7A166C9FA945400000000030BE6140D5C3C9336FA145400000000060BB6140D5C3C9336FA14540'::geometry))",
" Filter: st_intersects(geometry, '0103000020E610000001000000050000000000000060BB6140D5C3C9336FA145400000000060BB6140BB7A166C9FA945400000000030BE6140BB7A166C9FA945400000000030BE6140D5C3C9336FA145400000000060BB6140D5C3C9336FA14540'::geometry)",
' Rows Removed by Filter: 2',
' Index Searches: 1',
' Buffers: shared hit=335',
'Planning:',
' Buffers: shared hit=284',
'Planning Time: 34.613 ms',
'Execution Time: 424.417 ms'
],
pg18Async: [
'Aggregate (cost=186.66..186.67 rows=1 width=32) (actual time=105.514..105.515 rows=1.00 loops=1)',
' Buffers: shared hit=325',
' -> Index Scan using bq_rg_geometry_geom_idx on bq_rg f (cost=0.41..103.53 rows=6 width=146) (actual time=0.141..6.406 rows=2663.00 loops=1)',
" Index Cond: ((geometry && '0103000020E610000001000000050000000000000060BB6140D5C3C9336FA145400000000060BB6140BB7A166C9FA945400000000030BE6140BB7A166C9FA945400000000030BE6140D5C3C9336FA145400000000060BB6140D5C3C9336FA14540'::geometry) AND (geometry && '0103000020E610000001000000050000000000000060BB6140D5C3C9336FA145400000000060BB6140BB7A166C9FA945400000000030BE6140BB7A166C9FA945400000000030BE6140D5C3C9336FA145400000000060BB6140D5C3C9336FA14540'::geometry))",
" Filter: st_intersects(geometry, '0103000020E610000001000000050000000000000060BB6140D5C3C9336FA145400000000060BB6140BB7A166C9FA945400000000030BE6140BB7A166C9FA945400000000030BE6140D5C3C9336FA145400000000060BB6140D5C3C9336FA14540'::geometry)",
' Rows Removed by Filter: 2',
' Index Searches: 1',
' Buffers: shared hit=319',
'Planning:',
' Buffers: shared hit=285',
'Planning Time: 14.389 ms',
'Execution Time: 105.727 ms'
]
}
{
pg18Sync: [
'Aggregate (cost=35.31..35.32 rows=1 width=32) (actual time=380.362..380.363 rows=1.00 loops=1)',
' Buffers: shared hit=257',
' -> Index Scan using bq_rg_geometry_geom_idx on bq_rg f (cost=0.41..20.93 rows=1 width=146) (actual time=0.134..15.859 rows=2315.00 loops=1)',
" Index Cond: ((geometry && '0103000020E6100000010000000500000000000000C0B56140D5C3C9336FA1454000000000C0B56140BB7A166C9FA945400000000090B86140BB7A166C9FA945400000000090B86140D5C3C9336FA1454000000000C0B56140D5C3C9336FA14540'::geometry) AND (geometry && '0103000020E6100000010000000500000000000000C0B56140D5C3C9336FA1454000000000C0B56140BB7A166C9FA945400000000090B86140BB7A166C9FA945400000000090B86140D5C3C9336FA1454000000000C0B56140D5C3C9336FA14540'::geometry))",
" Filter: st_intersects(geometry, '0103000020E6100000010000000500000000000000C0B56140D5C3C9336FA1454000000000C0B56140BB7A166C9FA945400000000090B86140BB7A166C9FA945400000000090B86140D5C3C9336FA1454000000000C0B56140D5C3C9336FA14540'::geometry)",
' Rows Removed by Filter: 1',
' Index Searches: 1',
' Buffers: shared hit=251',
'Planning:',
' Buffers: shared hit=284',
'Planning Time: 52.362 ms',
'Execution Time: 380.598 ms'
],
pg18Async: [
'Aggregate (cost=35.31..35.32 rows=1 width=32) (actual time=130.899..130.900 rows=1.00 loops=1)',
' Buffers: shared hit=254',
' -> Index Scan using bq_rg_geometry_geom_idx on bq_rg f (cost=0.41..20.93 rows=1 width=146) (actual time=0.125..5.462 rows=2315.00 loops=1)',
" Index Cond: ((geometry && '0103000020E6100000010000000500000000000000C0B56140D5C3C9336FA1454000000000C0B56140BB7A166C9FA945400000000090B86140BB7A166C9FA945400000000090B86140D5C3C9336FA1454000000000C0B56140D5C3C9336FA14540'::geometry) AND (geometry && '0103000020E6100000010000000500000000000000C0B56140D5C3C9336FA1454000000000C0B56140BB7A166C9FA945400000000090B86140BB7A166C9FA945400000000090B86140D5C3C9336FA1454000000000C0B56140D5C3C9336FA14540'::geometry))",
" Filter: st_intersects(geometry, '0103000020E6100000010000000500000000000000C0B56140D5C3C9336FA1454000000000C0B56140BB7A166C9FA945400000000090B86140BB7A166C9FA945400000000090B86140D5C3C9336FA1454000000000C0B56140D5C3C9336FA14540'::geometry)",
' Rows Removed by Filter: 1',
' Index Searches: 1',
' Buffers: shared hit=248',
'Planning:',
' Buffers: shared hit=285',
'Planning Time: 23.726 ms',
'Execution Time: 131.086 ms'
]
}
{
pg18Sync: [
'Aggregate (cost=95.85..95.86 rows=1 width=32) (actual time=431.336..431.337 rows=1.00 loops=1)',
' Buffers: shared hit=336',
' -> Index Scan using bq_rg_geometry_geom_idx on bq_rg f (cost=0.41..53.97 rows=3 width=146) (actual time=0.130..7.223 rows=2919.00 loops=1)',
" Index Cond: ((geometry && '0103000020E610000001000000050000000000000090B86140D5C3C9336FA145400000000090B86140BB7A166C9FA945400000000060BB6140BB7A166C9FA945400000000060BB6140D5C3C9336FA145400000000090B86140D5C3C9336FA14540'::geometry) AND (geometry && '0103000020E610000001000000050000000000000090B86140D5C3C9336FA145400000000090B86140BB7A166C9FA945400000000060BB6140BB7A166C9FA945400000000060BB6140D5C3C9336FA145400000000090B86140D5C3C9336FA14540'::geometry))",
" Filter: st_intersects(geometry, '0103000020E610000001000000050000000000000090B86140D5C3C9336FA145400000000090B86140BB7A166C9FA945400000000060BB6140BB7A166C9FA945400000000060BB6140D5C3C9336FA145400000000090B86140D5C3C9336FA14540'::geometry)",
' Rows Removed by Filter: 4',
' Index Searches: 1',
' Buffers: shared hit=330',
'Planning:',
' Buffers: shared hit=284',
'Planning Time: 53.371 ms',
'Execution Time: 431.572 ms'
],
pg18Async: [
'Aggregate (cost=186.66..186.67 rows=1 width=32) (actual time=164.077..164.078 rows=1.00 loops=1)',
' Buffers: shared hit=355',
' -> Index Scan using bq_rg_geometry_geom_idx on bq_rg f (cost=0.41..103.53 rows=6 width=146) (actual time=0.129..11.601 rows=2919.00 loops=1)',
" Index Cond: ((geometry && '0103000020E610000001000000050000000000000090B86140D5C3C9336FA145400000000090B86140BB7A166C9FA945400000000060BB6140BB7A166C9FA945400000000060BB6140D5C3C9336FA145400000000090B86140D5C3C9336FA14540'::geometry) AND (geometry && '0103000020E610000001000000050000000000000090B86140D5C3C9336FA145400000000090B86140BB7A166C9FA945400000000060BB6140BB7A166C9FA945400000000060BB6140D5C3C9336FA145400000000090B86140D5C3C9336FA14540'::geometry))",
" Filter: st_intersects(geometry, '0103000020E610000001000000050000000000000090B86140D5C3C9336FA145400000000090B86140BB7A166C9FA945400000000060BB6140BB7A166C9FA945400000000060BB6140D5C3C9336FA145400000000090B86140D5C3C9336FA14540'::geometry)",
' Rows Removed by Filter: 4',
' Index Searches: 1',
' Buffers: shared hit=349',
'Planning:',
' Buffers: shared hit=285',
'Planning Time: 23.486 ms',
'Execution Time: 164.316 ms'
]
}
{
pg18Sync: [
'Aggregate (cost=35.31..35.32 rows=1 width=32) (actual time=93.973..93.974 rows=1.00 loops=1)',
' Buffers: shared hit=108',
' -> Index Scan using bq_rg_geometry_geom_idx on bq_rg f (cost=0.41..20.93 rows=1 width=146) (actual time=0.044..32.581 rows=1110.00 loops=1)',
" Index Cond: ((geometry && '0103000020E610000001000000050000000000000060BB61407873EE25089145400000000060BB614093170BC73C9945400000000030BE614093170BC73C9945400000000030BE61407873EE25089145400000000060BB61407873EE2508914540'::geometry) AND (geometry && '0103000020E610000001000000050000000000000060BB61407873EE25089145400000000060BB614093170BC73C9945400000000030BE614093170BC73C9945400000000030BE61407873EE25089145400000000060BB61407873EE2508914540'::geometry))",
" Filter: st_intersects(geometry, '0103000020E610000001000000050000000000000060BB61407873EE25089145400000000060BB614093170BC73C9945400000000030BE614093170BC73C9945400000000030BE61407873EE25089145400000000060BB61407873EE2508914540'::geometry)",
' Index Searches: 1',
' Buffers: shared hit=108',
'Planning Time: 0.247 ms',
'Execution Time: 94.014 ms'
],
pg18Async: [
'Aggregate (cost=35.31..35.32 rows=1 width=32) (actual time=81.956..81.956 rows=1.00 loops=1)',
' Buffers: shared hit=109',
' -> Index Scan using bq_rg_geometry_geom_idx on bq_rg f (cost=0.41..20.93 rows=1 width=146) (actual time=0.048..12.595 rows=1110.00 loops=1)',
" Index Cond: ((geometry && '0103000020E610000001000000050000000000000060BB61407873EE25089145400000000060BB614093170BC73C9945400000000030BE614093170BC73C9945400000000030BE61407873EE25089145400000000060BB61407873EE2508914540'::geometry) AND (geometry && '0103000020E610000001000000050000000000000060BB61407873EE25089145400000000060BB614093170BC73C9945400000000030BE614093170BC73C9945400000000030BE61407873EE25089145400000000060BB61407873EE2508914540'::geometry))",
" Filter: st_intersects(geometry, '0103000020E610000001000000050000000000000060BB61407873EE25089145400000000060BB614093170BC73C9945400000000030BE614093170BC73C9945400000000030BE61407873EE25089145400000000060BB61407873EE2508914540'::geometry)",
' Index Searches: 1',
' Buffers: shared hit=109',
'Planning Time: 0.223 ms',
'Execution Time: 81.995 ms'
]
}
Honestly, I'm not sure exactly how async affects performance here. However, even when the execution plans look nearly identical, the async version consistently shows shorter execution times. The gap was large enough that I was surprised and wondered whether I had made a mistake somewhere in the benchmark, but for now I'll conclude that async does provide some benefit!
If any experts out there can extract deeper insight from these results, please share it in the comments.
Subjectively, async seems to make a noticeable difference when the query hits a relatively large number of records. Since that’s a common pattern in PostGIS workloads, async I/O could end up being a meaningful performance boost in practice.
2025-12-04 16:32:13
If you went to a coding bootcamp or read a software engineering book in the last 10 years, you were taught a golden rule: Separation of Concerns (SoC).
They told you to keep your structure, your styles, and your logic in separate files — never mix them.
For years, we dutifully split our code into Button.js, Button.css, and ButtonController.js. We felt clean. We felt organized.
But lately, I’ve noticed a shift. The best developers I know are doing the exact opposite. They are merging styles into markup (Tailwind), putting database queries inside UI components (RSC), and writing logic right on the HTML element (htmx).
Are they writing bad code? No. They have just realized that Separation of Concerns is often a lie we tell ourselves to feel organized, when in reality, we are just creating Separation of Files.
Imagine you are cleaning your bedroom (your codebase).
Shallow Knowledge says: "Organization means putting similar items together."
It looks clean. All the "Left Socks" are strictly separated from the "Shoes."
But now, try to get dressed (build a feature). You have to run to one drawer, then another drawer, then the attic, just to put something on your feet.
This is what traditional "Separation of Concerns" does to your code:
// ❌ The "Clean" Way (Separation of Files)

// 1. structure.html
<button id="submit-btn" class="btn-primary">Submit</button>

// 2. styles.css
.btn-primary { background: blue; color: white; }

// 3. controller.js
document.getElementById('submit-btn').addEventListener('click', () => {
  submitForm(); // 4. ...which lives in yet another file, api.js
});
To understand what this button does, you have to open four different files. You have separated the technologies, but you have scattered the feature.
Carson Gross (creator of htmx) coined a term that is redefining modern architecture: Locality of Behavior.
"The behavior of a unit of code should be as obvious as possible by looking only at that unit of code."
Deep Knowledge understands that true maintainability isn't about separating JS from CSS; it's about separating Feature A from Feature B.
If I want to delete the "Submit Button" feature, I shouldn't have to hunt down a dangling CSS class in one file and an orphaned event listener in another. I should be able to delete one block and be done.
This is why tools like Tailwind CSS and React won. They embrace Locality of Behavior.
Look at this "messy" React component:
// ✅ The "Locality of Behavior" Way
export function SubmitButton() {
// Logic is right here
const handleSubmit = async () => {
await db.users.create({ ... });
};
return (
<button
// Styles are right here (Tailwind)
className="bg-blue-500 text-white py-2 px-4 rounded"
// Trigger is right here
onClick={handleSubmit}
>
Submit
</button>
);
}
Shallow Developers look at this and scream: "You're mixing database logic, styles, and markup! It's chaos!"
Senior Developers look at this and sigh in relief: "Thank god. Everything I need to know about the Submit Button is in one place."
Stop thinking about your code as a Grocery Store (all fruit in aisle 1, all meat in aisle 5).
Start thinking about your code as a Toolkit.
If you are fixing a sink, you want a "Plumbing Kit" that has a wrench, tape, and a washer together. You don't want to walk to the "Wrench Room" and then the "Tape Room."
The industry is swinging back. We realized that separating files by "file extension" (.js, .css, .html) was a mistake.
The future of software engineering (whether it's React Server Components, htmx, or Vue) is about Colocation.
Don't be afraid to put your CSS classes in your HTML. Don't be afraid to put your SQL query near your button.
If it changes together, it should live together.
2025-12-04 16:32:02
Ever notice how you stopped asking "why"?
Not because you got the answers. Because asking became friction.
You're staring at your terminal. The command works. Ship it. Next ticket. Why does it work? Who cares, it works.
You've got 47 dependencies for a page with three buttons. Somewhere in your gut, a voice whispers: "this feels... wrong?"
But you've trained that voice to shut the fuck up. Because questioning is slow. Questioning doesn't ship. Questioning isn't 10x.
That npm install you run without reading? That framework you chose because the AI suggested it? That architecture you can't explain to the junior dev who just asked "why did we build it this way?"
You're not stupid. You're not lazy.
You're just further along a path than you realized.
Let me show you how we got here...
Not "digital beings." Not "the new generation."
Bob.
Bob is that guy in your standup. Bob merged that PR yesterday. Bob sits three Slack channels away from you. Bob might be looking at this article right now.
Bob has two variants, and you're about to figure out which one you are.
This Bob has scars. He remembers the jQuery wars. He's fought CSS specificity battles at 4 AM. He's read "Eloquent JavaScript" cover to cover (okay, he skimmed chapters 8-12, but he GETS it).
Smart-Lazy Bob uses AI like a power tool. He knows what he wants to build. He just doesn't want to type the boilerplate. AI is his nail gun when he already knows how to swing a hammer.
The equation works: 50 units of knowledge × AI = 500 units of output
When AI goes down? Smart-Lazy Bob grumbles, cracks his knuckles, and keeps shipping. Slower, sure. But he's not helpless.
This Bob learned to code in 2024. His first "hello world" was prompted into existence. He's never read a programming book. Why would he? AI explains everything!
Fool-Eager Bob ships FAST. Ten apps this month. His GitHub is a beautiful green garden of contributions. His portfolio looks STACKED.
But here's the math nobody told him:
0 units of knowledge × AI = 0 units of knowledge (disguised as 10 apps)
When AI goes down? Fool-Eager Bob doesn't work that day. He literally can't. It's like asking someone who's only ever been a passenger to suddenly drive the F1 car.
Let's talk about those 10 apps Fool-Eager Bob shipped.
They EXIST. They're REAL. They're deployed on Vercel. The URLs work. Hell, people might even be using them.
But here's the question nobody asks:
What did Bob actually LEARN from building them?
Strip away the AI. Lock Claude in a box. Now ask Bob to build app #11.
Can he?
0 skill + AI = 10 apps = I learned something ✅
It feels true. It LOOKS true. His portfolio proves it, right?
The "addition" is a myth. AI doesn't ADD capabilities to you. It MULTIPLIES what's already there.
The Interview:
"Can you whiteboard this algorithm?"
Fool-Eager Bob freezes. His mind reaches for prompt syntax that isn't there. "Can I use my laptop?" he asks.
"No, just the whiteboard."
He writes something. It's wrong. He can FEEL it's wrong from the interviewer's face. But he doesn't know WHY it's wrong. He doesn't have the foundation to even debug his own thinking.
Smart-Lazy Bob might be rusty, might need a minute, but his brain still WORKS. The neural pathways are there, just need dusting off.
The Code Review:
"Why did you structure it this way?"
Fool-Eager Bob: "Um... it seemed like the right approach?"
Translation: "AI said to."
Smart-Lazy Bob: "Because we're optimizing for read performance over writes here, and this structure gives us O(1) lookups. The tradeoff is slightly more complex inserts, but based on our usage patterns, that's fine."
One Bob understands tradeoffs. The other just has code that exists.
The Junior Dev Question:
"Hey, can you help me understand why this isn't working?"
Fool-Eager Bob: sweating "Uh... did you try asking Claude?"
Smart-Lazy Bob: "Let me see... okay, you're mutating state inside a closure. Here's what's happening and why..."
One Bob can teach. The other can only redirect to the same AI crutch that made him helpless in the first place.
If the only skill you're sharpening is "prompting better" - you're not climbing a ladder.
You're chasing your own shadow in circles.
Think about what Fool-Eager Bob is actually getting better at: writing clearer prompts, spotting when the output looks off, iterating until the AI finally gets it right.
These are real skills! But they're skills for using a tool, not for doing the craft.
It's like getting REALLY good at asking a chef to cook for you. You know exactly how to describe what you want. You can tell when the dish is off. You can request modifications like a pro.
But you still can't cook.
And when the chef goes on vacation? You starve.
Not today. Maybe not this quarter. But it's coming like compound interest on debt he forgot he owed.
The market will figure it out.
Because eventually, production breaks while he's the only one online, and someone asks him to explain what went wrong. And he just... can't. The Slack channel waits. The customers wait. And he's typing increasingly desperate prompts into an AI that's timing out.
Or a junior dev asks for mentorship. And Fool-Eager Bob realizes with creeping horror that he can't mentor anyone because he never learned it himself.
If your core skill is "prompting better" - you're not a developer.
You're a really good customer service rep for an AI.
And when that AI has downtime? So do you.
Zero × Anything = Zero
The digital Anunnaki are powerful. They can 10x your output. They can make you LOOK incredibly productive.
But they can only multiply what's there.
If what's there is zero?
Then zero is what survives.
Those 10 apps you vibe-coded? They're artifacts. Monuments. Proof that AI works.
But they're not proof that you work.
And evolution - whether biological or career-based - doesn't care about your artifacts.
It cares about what survives when the gifts from the gods disappear.
Smart-Lazy Bob survives. Slower, grumpier, but alive.
Fool-Eager Bob?
He was never really there to begin with.
Just a shadow. Chasing itself. Multiplied by infinity.
Still zero.
2025-12-04 16:26:45
I had heard that AI agents can work as co-workers, doing the same type of work as humans in companies. One common myth is that AI agents will simply displace jobs. The reality is more nuanced: the real challenge lies in developing reliable AI agents that do not hallucinate and can adapt to different social settings while building trust with their human co-workers. These are the core objectives taught during this 5-day AI agents intensive course. I was very excited to learn about AI agents and their applications, especially since one could explore ideas in journalism, research, prediction, or analysis and actually get a platform to bring them to reality. Deep down, I wanted to build an AI agent to automate some personal tasks. This course came at exactly the right moment.
On the first day, we started with building simple agents and multi-agent architectures. I learned about different Google ADK modules and their functions: agent (model, instruction, output key), runner, parallel agent, sequential agent, and loop agent. Each pattern revealed a different way to structure agent behavior, from linear execution to parallel processing.
What made this course truly transformative was listening to various experts in the field of AI. Their thoughts and analysis across different topics provided real-world perspective that lectures alone cannot offer. The summarization sessions and quizzes at the end of each live session made remembering the concepts feel natural and engaging rather than forced.
On the second day, we explored tools and their usage. Tools come in different forms: function tools, built-in tools like Google Search and code execution, agent tools, and, importantly, MCP tools. I realized that these are not just extensions of a simple LLM-based agent; they are what transforms an LLM into a specialist. These tools help the LLM take in the right context and make informed decisions, moving beyond what training data alone can provide.
The third day focused on knowledge management: how to store knowledge, retrieve it, and use it effectively. This is what's called context engineering. Two important concepts emerged: session-based memory and persistent memory. Session-based memory can be wiped once the session ends, so production systems need database services like SQLite or vector databases for persistence. Beyond storage, I learned about context rot, the degradation that happens when context windows become polluted. Context compaction and context chunking turned out to be the key techniques for keeping agents from drifting and becoming unreliable over time.
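As an illustration of what context compaction can look like in practice, here is a generic sketch (not the course's ADK code): once the conversation history exceeds a budget, the oldest turns are folded into a single summary message so the recent turns stay intact. The summarize callback stands in for any LLM call that condenses text.

// Generic sketch of context compaction (not ADK-specific).
type Turn = { role: "user" | "assistant" | "summary"; content: string };

async function compact(
  history: Turn[],
  maxTurns: number,
  summarize: (text: string) => Promise<string> // assumed: any LLM summarization call
): Promise<Turn[]> {
  if (history.length <= maxTurns) return history;

  const old = history.slice(0, history.length - maxTurns);
  const recent = history.slice(history.length - maxTurns);

  // Condense the oldest turns into one synthetic message.
  const summary = await summarize(
    old.map((t) => `${t.role}: ${t.content}`).join("\n")
  );

  // Keep one compact summary turn plus the untouched recent turns.
  return [{ role: "summary", content: summary }, ...recent];
}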
On the fourth day, we tackled observability and evaluation, which are essential for production systems. We learned to log every trace: tool calling (input, output) and agent calling (input, output). But logging alone is not enough. We also evaluated outputs at each stage, checking whether they were truthful, reliable, free from hallucination, and efficient. This systematic approach transformed debugging from guesswork into measurable validation. Finally, we explored the Agent to Agent protocol for building remote agents and connecting multiple agents into a cohesive system.
By the fifth day, everything came together. We had production-ready knowledge paired with hands-on experience through the capstone project: a public policy agent. I employed a multi-agent architecture with specialized roles: analysis, critique, lobbyist, and final synthesis agents. Tools like Tavily and Google Search provided real-time context, while logged traces ensured full observability throughout the system. The result wasn't just a working project; it demonstrated the principles of building reliable, production-grade AI systems.
This course fundamentally changed how I think about development. It transformed me from someone building NLP-related projects into an AI agent expert. More importantly, it shifted my perspective from just building a project to building a system. That distinction matters deeply: it's the difference between feature-focused work and architecture-focused thinking.
2025-12-04 16:25:34
Shipping icons as PNG or JPG used to be fine — until we needed:
SVG solves all of this:
So if you want your icons used everywhere — mobile apps, dashboards, design tools, open-source projects — SVG is the right format.
Grab a collection of icons you want to open-source.
Ideal characteristics:
But don’t worry — even low-res icons can be fixed later.
Place them in a folder, e.g.:
/my-icons/
Head over to aivector.ai
Upload an icon, wait ~5 seconds, download the SVG.
Repeat for your whole set.
Why this tool?
Pro tip:
Icons with clear edges and strong contrast convert best.
AI-generated SVGs are rarely production-ready.
Typical problems:
Recommended tools:
Example svgo config:
{
"multipass": true,
"floatPrecision": 2,
"plugins": [
"removeDimensions",
"removeDoctype",
"removeComments",
"removeMetadata",
"convertPathData"
]
}
Run optimization:
svgo *.svg --config=svgo.config.json
Results:
Good naming matters for developers who will use your library.
Recommended naming style:
action-add.svg
arrow-left.svg
file-open.svg
user-edit.svg

Rules that help:
Avoid vague, shape-only names (e.g., triangle.svg)
Organize folders:
icons/
actions/
arrows/
files/
users/
This helps scalability later.
Check rendering in:
Simple HTML preview script:
<!DOCTYPE html>
<html>
<body style="display:flex;flex-wrap:wrap">
<img src="arrow-left.svg" width="32" />
<img src="arrow-left.svg" width="64" />
<img src="arrow-left.svg" width="128" />
</body>
</html>
Look for:
Fix before shipping.
Structure:
README.md
LICENSE
icons/
package.json (optional)
Default license for icons:
Example README snippet:
# My Open Source SVG Icons
Clean, minimal, open-source icons for web and mobile apps.
MIT licensed. Free for commercial use.
> npm install my-icons? maybe someday 😉
Best for React / Vue icon libraries.
Example folder:
dist/
src/
package.json
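For the npm route, the package entry point is usually just a barrel file that re-exports every icon component. A minimal sketch with illustrative file and component names:

// src/index.ts — barrel file so consumers can `import { IconHome } from "my-icons"`.
// The icon names below are illustrative.
export { IconHome } from "./icons/IconHome";
export { IconArrowLeft } from "./icons/IconArrowLeft";
export { IconUserEdit } from "./icons/IconUserEdit";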
Useful for designers.
Places to share:
Also, make sure you include:
Your goal: Make adoption friction-less.
AI vectorization is great, but:
Manual polishing is still required.
This workflow works best for:
Not great for:
Always check licensing.
Example component:
export function IconHome(props) {
  // viewBox lets the icon scale cleanly at any rendered size;
  // stroke="currentColor" makes it inherit the surrounding text color.
  return (
    <svg
      viewBox="0 0 24 24"
      width={props.size || 24}
      height={props.size || 24}
      fill="none"
      stroke="currentColor"
    >
      <path d="..." />
    </svg>
  );
}
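Usage then looks like this (a small sketch; the import path and surrounding markup are illustrative):

// The size prop defaults to 24, and the stroke follows the surrounding text color.
import { IconHome } from "./IconHome"; // illustrative path

export function Toolbar() {
  return (
    <nav style={{ color: "#374151" }}>
      <IconHome />
      <IconHome size={32} />
    </nav>
  );
}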
Bundle with:
Now you have a modern DX library.
🎁 Final Thoughts
Building an open-source icon library used to mean:
Manual tracing
Expensive software
Hours of work
With tools like AIVector, it becomes:
drag-and-drop → cleanup → release
And the best part?
You empower other developers & designers with reusable assets.
Open-source doesn’t have to be complicated — sometimes it’s just sharing useful pixels with the world.
And, most importantly:
Publish early, improve later.
2025-12-04 16:21:02
REST (Representational State Transfer) was introduced by Roy Fielding in his doctoral dissertation in 2000, building on the foundational work of Tim Berners-Lee, who created the World Wide Web. The web was built on fundamental concepts like URIs, HTTP, URLs, HTML, web servers, and WYSIWYG editors that made content creation accessible to everyone.
R - Representational
S - State
T - Transfer
https://google.com/blog/post?q=something#header
Components:
https or http - the protocol used
google.com - the domain (can include subdomains)
/blog/post - hierarchical path showing resource relationships
?q=something - query parameters for filters or search criteria
#header - fragment that navigates to a specific section of the resource

Definition: An idempotent operation produces the same result no matter how many times it's executed.
GET /api/articles
GET /api/articles/123
GET /api/articles?author=john&status=published
POST /api/articles
POST /api/articles/123/publish
POST /api/users/456/send-welcome-email
POST /api/orders/789/calculate-shipping
Why POST is Open-Ended: When you need to perform an action that doesn't map to standard CRUD operations, POST is your go-to method. It's intentionally flexible for custom operations.
PUT /api/articles/123
PATCH /api/articles/123
DELETE /api/articles/123
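To make the idempotency distinction concrete, here is a small sketch using fetch against the endpoints above (the request bodies are illustrative):

// Idempotent: replaying the same PUT leaves the server in the same final state.
await fetch("/api/articles/123", {
  method: "PUT",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ title: "Understanding REST APIs", status: "published" }),
});
// Sending the identical request again changes nothing further.

// Not idempotent: every POST creates another resource.
await fetch("/api/articles", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ title: "Understanding REST APIs" }),
});
// Sending it again creates a second, duplicate article.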
Use Plural Nouns
✅ /api/articles
✅ /api/users
✅ /api/comments
❌ /api/article
❌ /api/user
❌ /api/comment
Why Plural?
Handle Whitespace in URLs
Input: "Harry Potter"
URL: /api/books/harry-potter
✅ Use hyphens (kebab-case)
❌ Avoid underscores or spaces
GET /api/articles/123/comments
GET /api/articles/123/comments/456
POST /api/articles/123/comments
DELETE /api/users/789/preferences/notifications
GET /api/articles?status=published&author=john&sort=date&order=desc
GET /api/users?role=admin&page=2&limit=20
/api/v1/articles
/api/v2/articles
Or via headers:
Accept: application/vnd.api.v1+json
GET /api/articles/999
Response: 404 Not Found
Why? The specific resource doesn't exist - this is an error state.
GET /api/articles?author=unknown
Response: 200 OK
Body: []
Why? The request was successful - an empty result set is valid data, not an error.
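In handler code, the distinction is straightforward to encode. A sketch using Express-style routing (the framework choice and the findArticle* helpers are illustrative, not from the article):

// A missing single resource is an error (404); an empty collection is a
// successful response with an empty array (200).
import express from "express";

const app = express();

app.get("/api/articles/:id", async (req, res) => {
  const article = await findArticleById(req.params.id); // assumed data-access helper
  if (!article) {
    return res
      .status(404)
      .json({ error: { code: "NOT_FOUND", message: "Article not found" } });
  }
  res.status(200).json(article);
});

app.get("/api/articles", async (req, res) => {
  const articles = await findArticles({ author: req.query.author }); // assumed helper
  res.status(200).json(articles); // [] is a perfectly valid body here
});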
Before writing a single line of API code, design the user interface or understand the client needs.
Questions to Ask:
Example Flow:
[Homepage]
↓
[Article List] → Filter by category, author, date
↓
[Article Detail] → View comments
↓
[Comment Section] → Add/edit/delete comments
| UI Component | HTTP Method | Endpoint | Purpose |
|---|---|---|---|
| Article List | GET | /api/articles | Fetch all articles |
| Article Filters | GET | /api/articles?category=tech | Filtered list |
| Article Detail | GET | /api/articles/123 | Single article |
| Create Article | POST | /api/articles | New article |
| Edit Article | PUT/PATCH | /api/articles/123 | Update article |
| Delete Article | DELETE | /api/articles/123 | Remove article |
| Publish Article | POST | /api/articles/123/publish | Custom action |
| Article Comments | GET | /api/articles/123/comments | Nested resource |
| Add Comment | POST | /api/articles/123/comments | Create comment |
{
"id": 123,
"title": "Understanding REST APIs",
"slug": "understanding-rest-apis",
"author": {
"id": 456,
"name": "John Doe",
"avatar": "https://..."
},
"content": "...",
"status": "published",
"category": "technology",
"tags": ["api", "rest", "web-development"],
"created_at": "2025-01-15T10:30:00Z",
"updated_at": "2025-01-20T14:45:00Z",
"comments_count": 42,
"links": {
"self": "/api/articles/123",
"comments": "/api/articles/123/comments",
"author": "/api/users/456"
}
}
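On a TypeScript client, that payload maps naturally onto a shared type (field names are taken from the JSON above; the type itself is an addition for illustration):

// Client-side type mirroring the article resource shown above.
interface ArticleResponse {
  id: number;
  title: string;
  slug: string;
  author: { id: number; name: string; avatar: string };
  content: string;
  status: string; // e.g. "published"
  category: string;
  tags: string[];
  created_at: string; // ISO 8601
  updated_at: string; // ISO 8601
  comments_count: number;
  links: { self: string; comments: string; author: string };
}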
| Status Code | Meaning | Use Case |
|---|---|---|
| 200 OK | Success | GET, PATCH successful |
| 201 Created | Resource created | POST successful |
| 204 No Content | Success, no body | DELETE successful |
| 400 Bad Request | Invalid input | Validation errors |
| 401 Unauthorized | Not authenticated | Missing/invalid token |
| 403 Forbidden | Not authorized | Insufficient permissions |
| 404 Not Found | Resource missing | Single resource not found |
| 409 Conflict | Resource conflict | Duplicate resource |
| 422 Unprocessable | Validation failed | Business logic errors |
| 500 Server Error | Server problem | Unexpected errors |
{
"error": {
"code": "VALIDATION_ERROR",
"message": "Invalid input data",
"details": [
{
"field": "email",
"message": "Email is required"
},
{
"field": "password",
"message": "Password must be at least 8 characters"
}
],
"timestamp": "2025-01-20T14:45:00Z",
"path": "/api/users"
}
}
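The error envelope can be typed the same way so every endpoint fails consistently (this mirrors the JSON above; the naming is an assumption):

// Shared error envelope matching the example above.
interface ApiErrorResponse {
  error: {
    code: string; // e.g. "VALIDATION_ERROR"
    message: string;
    details?: { field: string; message: string }[];
    timestamp: string; // ISO 8601
    path: string;
  };
}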
# Articles
GET /api/v1/articles # List all articles
GET /api/v1/articles?status=published # Filter articles
GET /api/v1/articles/understanding-rest # Get single article
POST /api/v1/articles # Create article
PUT /api/v1/articles/123 # Update full article
PATCH /api/v1/articles/123 # Partial update
DELETE /api/v1/articles/123 # Delete article
# Custom Actions
POST /api/v1/articles/123/publish # Publish article
POST /api/v1/articles/123/unpublish # Unpublish article
POST /api/v1/articles/123/duplicate # Duplicate article
# Nested Resources
GET /api/v1/articles/123/comments # Get article comments
POST /api/v1/articles/123/comments # Add comment
GET /api/v1/articles/123/comments/456 # Get single comment
PATCH /api/v1/articles/123/comments/456 # Update comment
DELETE /api/v1/articles/123/comments/456 # Delete comment
# Authors
GET /api/v1/authors # List authors
GET /api/v1/authors/789 # Get author profile
GET /api/v1/authors/789/articles # Author's articles
# Categories & Tags
GET /api/v1/categories # List categories
GET /api/v1/categories/tech/articles # Articles in category
GET /api/v1/tags # List tags
GET /api/v1/tags/api/articles # Articles with tag
Great REST API design starts with understanding user needs through wireframes and UI flows. By following these principles—proper resource naming, appropriate HTTP methods, clear response codes, and the critical distinction between empty results and missing resources—you create APIs that are intuitive, maintainable, and developer-friendly.
Remember: Design for your users first, then build the API that serves their needs. The technical excellence of your REST API should be invisible to end users, manifesting only as a seamless, fast, and reliable experience.
Happy API building! 🚀