2026-02-03 19:05:45
The title of my closing keynote at FOSDEM on February 1, 2026.
As the last talk of the conference, at 17:00 on the Sunday, it started when lots of people had already left, and presumably many of the remaining attendees were quite tired and ready to call it a day.
Still, the 1500 seats in Janson were all occupied, and a group of people outside who wanted to get in even had to be refused entry.
Thanks to the awesome FOSDEM video team, the recording was made available this quickly after the presentation.
You can also get the video off FOSDEM servers.
2026-02-03 00:12:24
In January 2025 I received the European Open Source Achievement Award. The physical manifestation of that prize was a trophy made of translucent acrylic (or something similar). The blog post linked above has a short video where I show it off.
In the year that has passed since, we have established an organization, the European Open Source Academy, to handle the awards going forward, and we have arranged the creation of actual medals for the awardees.
That was the medal we gave the award winners last week at the award ceremony where I handed Greg his prize.
I was however not prepared for what came next: as a direct consequence, I was handed a medal this year as well, in recognition of the award I got last year, because now there is a medal. A retroactive medal, if you wish. It felt almost like getting the award again. An honor.



The medal is made of a shiny metal, roughly 50 mm in diameter. In the middle of it is a modern version (with details inspired by PCB looks) of the Yggdrasil tree from old Norse mythology – the “World Tree”. A source of life, a sacred meeting place for gods.
In a circle around the tree are twelve stars, to visualize the EU and European connection.
On the backside, the year and the name are engraved above an EU flag, and the same circle of twelve stars is used there as a margin too, like on the front side.
The medal has a blue and white ribbon, to enable it to be draped over the head and hung from the neck.
The box is a sturdy thing with a dark blue, velvet-like covering, with European Open Source Academy printed on it next to the academy’s logo. The same motif is also on the inside of the top part of the box.
I do feel overwhelmed and I acknowledge that I have received many medals by now. I still want to document them and show them in detail to you, dear reader. To show appreciation; not to boast.
2026-01-30 19:32:51
I had the honor and pleasure to hand over this prize to its first real laureate during the award gala on Thursday evening in Brussels, Belgium.
This annual award ceremony is one of the primary missions of the European Open Source Academy, of which I have been the president since last year.
As an academy, we hand out awards and recognition to multiple excellent individuals who help make Europe the home of excellent Open Source. Fellow esteemed academy members joined me at this joyful event to perform these delightful duties.
As I stood on the stage, after a brief video about Greg was shown, I introduced him as this year’s worthy laureate. I have included those words below. Congratulations again Greg. We are lucky to have you.
There are tens of millions of open source projects in the world, and there are millions of open source maintainers. Many more would count themselves as at least occasional open source developers. These are the quiet builders of Europe’s digital world.
When we work on open source projects, we may spend most of our waking hours deep down in the weeds of code, build systems, discussing solutions, or tearing our hair out because we can’t figure out why something happens the way it does, as we would prefer it didn’t.
Open source projects can work a little like worlds on their own. You live there, you work there, you debate with the other humans who similarly spend their time on that project. You may not notice, think, or even care much about other projects that similarly have a set of dedicated people involved. And that is fine.
Working deep in the trenches this way makes you focus on your world and maybe remain unaware and oblivious to champions in other projects. The heroes who make things work in areas that need to work for our lives to operate as smoothly as they, quite frankly, usually do.
Greg Kroah-Hartman, however, our laureate of the Prize for Excellence in Open Source 2026, is a person whose work does get noticed across projects.
Our recognition of Greg honors his leading work on the Linux kernel and in the Linux community, particularly through his work on the stable branch of Linux. Greg serves as the stable kernel maintainer for Linux, a role of extraordinary importance to the entire computing world. While others push the boundaries of what Linux can do, Greg ensures that what already exists continues to work reliably. He issues weekly updates containing critical bug fixes and security patches, maintaining multiple long-term support versions simultaneously. This is work that directly protects billions of devices worldwide.
It’s impossible to overstate the importance of the work Greg has done on Linux. In software, innovation grabs headlines, but stability saves lives and livelihoods. Every Android phone, every web server, every critical system running Linux depends on Greg’s meticulous work. He ensures that when hospitals, banks, governments, and individuals rely on Linux, it doesn’t fail them. His work represents the highest form of service: unglamorous, relentless, and essential.
Without maintainers like Greg, the digital infrastructure of our world would crumble. He is, quite literally, one of the people keeping the digital infrastructure we all depend on running.
As fellow open source maintainers, Greg and I have worked together in the open source security context. Through my interactions with him and people who know him, I learned a few things:
An American by origin, Greg now calls Europe his home, having lived in the Netherlands for many years. While on this side of the pond, he has taken on an important leadership role in safeguarding and advocating for the interests of the open source community. This is most evident through his work on the Cyber Resilience Act, through which he has educated and interacted with countless open source contributors and advocates whose work is affected by this legislation.
We — if I may be so bold — the Open Source community in Europe — and yes, the whole world, in fact — appreciate your work and your excellence. Thank you, Greg. Please come on stage and collect your award.


2026-01-28 17:44:51
We are doing another curl + distro online meeting this spring in what now has become an established annual tradition. A two-hour discussion, meeting, workshop for curl developers and curl distro maintainers.
2026 curl distro meeting details
The objective for these meetings is simply to make curl better in distros. To make distros do curl better. To improve curl in each and every way we can think of, together.
A part of this process is to get to see the names and faces of the people involved and to grease the machine to improve cross-distro collaboration on curl related topics.
Anyone who feels this is a subject they care about is welcome to join. We aim for the widest possible definition of distro and we don’t attempt to define the term.
The 2026 version of this meeting is planned to take place in the early evening European time, morning US west coast time, with the hope that this covers a large enough number of curl-interested people.
The plan is to do this on March 26, and all the details, planning and discussion items are kept on the dedicated wiki page for the event.
Please add your own discussion topics that you want to know or talk about, and if you feel inclined, add yourself as an intended participant. Feel free to help make this invite reach the proper people.
See you on March 26!
2026-01-27 07:01:39
We introduced curl’s -J option, also known as --remote-header-name, back in February 2010. A decent number of years ago.
The option is used in combination with -O (--remote-name) when downloading data from an HTTP(S) server and instructs curl to use the filename from the incoming Content-Disposition: header when saving the content, instead of the filename part of the URL passed on the command line. That header would later be explained further in RFC 6266.
The idea is that for some URLs the server can provide a more suitable target filename than what the URL contains from the beginning. Like when you do a command similar to:
curl -O -J https://example.com/download?id=6347d
Without -J, the content would be saved in a target output file called ‘download’ – since curl strips off the query part.
With -J, curl parses the server’s response header that contains a better filename; in the example below, fun.jpg.
Content-Disposition: attachment; filename="fun.jpg"
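For illustration, here is a rough manual equivalent of what -O -J does for you, written as a small bash sketch. The URL is the same made-up one as above, and a real Content-Disposition parser has to handle far more cases (quoting variants, missing headers, and so on), so treat this as a sketch only:
url='https://example.com/download?id=6347d'
# ask for the response headers and pull out a quoted filename="..." value;
# note that some servers answer HEAD requests differently than GET
name=$(curl -sI "$url" | grep -i '^content-disposition:' | sed -n 's/.*filename="\([^"]*\)".*/\1/p')
# fall back to the URL-derived name if the header was missing
curl -s -o "${name:-download}" "$url"
This is, roughly, the convenience that the single -J flag buys you.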
The approach mentioned above works pretty well, but has several limitations. One of them is the obvious one: if the site, instead of providing a Content-Disposition header, only redirects the client to a new URL to download from, curl does not pick up the new name but keeps using the one derived from the originally provided URL.
This is not what most users want and not what they expect. As a consequence, we have had this potential improvement mentioned in the TODO file for many years. Until today.
We have now merged a change that makes curl with -J pick up the filename from Location: headers, and use that filename if no Content-Disposition header is provided.
This means that if you now rerun a similar command line as mentioned above, but this one is allowed to follow redirects:
curl -L -O -J https://example.com/download?id=6347d
And that site redirects curl to the actual download URL for the tarball you want to download:
HTTP/1.1 301 Moved Permanently
Location: https://example.org/release.tar.gz
… curl now saves the contents of that transfer in a local file called release.tar.gz.
If there is both a redirect and a Content-Disposition header, the latter takes precedence.
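If you want to see which of these headers a given URL actually sends before you let -J pick the name, you can ask for the headers only and follow the redirect chain. The URL is again just illustrative, and keep in mind that some servers answer HEAD requests differently than GET:
curl -sIL 'https://example.com/download?id=6347d' | grep -i -e '^location:' -e '^content-disposition:'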
Since this gets the filename from the server’s response, you give up control of the name to someone else. This can of course potentially mess things up for you. curl ignores all provided directory names and only uses the filename part.
If you want to save the download in a dedicated directory other than the current one, use --output-dir.
As an additional precaution, using -J implies that curl avoids clobbering (overwriting) any existing file already present with the same filename, unless you also use --clobber.
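As a combined, hedged example of the two options above, with the same made-up URL as before:
# follow redirects, let the server name the file, save it under ~/Downloads,
# and explicitly allow overwriting a file that already carries that name
curl -L -O -J --output-dir ~/Downloads --clobber 'https://example.com/download?id=6347d'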
Since the selected final name used for storing the data is selected based on contents of a header passed from the server, using this option in a scripting scenario introduces the challenge: what filename did curl actually use?
A user can easily extract this information with curl’s -w option. Like this:
curl -w '%{filename_effective}' -O -J -L https://example.com/download?id=6347d
This command line outputs the used filename to stdout.
Tweak the command line further to instead direct that name to stderr or to a specific file etc. Whatever you think works.
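In a script you would typically capture that name in a variable instead. A small sketch, again with a made-up URL:
# -O writes the download to a file, so the -w output is all that reaches stdout
filename=$(curl -sS -L -O -J -w '%{filename_effective}' 'https://example.com/download?id=6347d')
echo "curl saved the download as: $filename"
The -sS combination silences the progress meter but still shows errors, which tends to be what you want in scripts.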
The content-disposition RFC mentioned above details a way to provide a filename encoded as UTF-8 using something like the below, which includes a U+20AC Euro sign:
Content-Disposition: attachment; filename*=UTF-8''%e2%82%ac%20rates
curl still does not support this filename* style of providing names. This limitation remains because curl cannot, with certainty, convert such a provided name into a local filename using the provided characters.
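The raw percent-decoding itself is not the hard part. A bash-specific sketch decoding the example value above could look like this, assuming a UTF-8 locale; the difficulty lies in deciding whether the resulting bytes are safe and representable as a local filename:
encoded='%e2%82%ac%20rates'
with_escapes=${encoded//\%/\\x}   # becomes \xe2\x82\xac\x20rates
printf '%b\n' "$with_escapes"     # prints: € rates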
Room for future improvement!
This -J improvement ships in curl 8.19.0, coming in March 2026.
2026-01-26 15:24:41
tldr: an attempt to reduce the terrible reporting.
There is no longer a curl bug-bounty program. It officially stops on January 31, 2026.
After having had a few half-baked previous takes, in April 2019 we kicked off the first real curl bug-bounty with the help of Hackerone, and while it stumbled a bit at first it has been quite successful I think.
We attracted skilled researchers who reported plenty of actual vulnerabilities for which we paid fine monetary rewards. We have certainly made curl better as a direct result of this: 87 confirmed vulnerabilities and over 100,000 USD paid as rewards to researchers. I’m quite happy and proud of this accomplishment.
I would like to especially highlight the awesome Internet Bug Bounty project, which has paid the bounties for us for many years. We could not have done this without them. Also of course Hackerone, who has graciously hosted us and been our partner through these years.
Thanks!
Looking back, I think we can say that the downfall of the bug-bounty program started slowly in the second half of 2024 but accelerated badly in 2025.
We saw an explosion in AI slop reports combined with a lower quality even in the reports that were not obvious slop – presumably because they too were actually misled by AI but with that fact just hidden better.
Maybe the first five years made it possible for researchers to find and report the low-hanging fruit. In previous years we had a rate of somewhere north of 15% of the submissions ending up as confirmed vulnerabilities. Starting in 2025, the confirmed rate plummeted to below 5%. Not even one in twenty was real.
The never-ending slop submissions take a serious mental toll to manage and sometimes also a long time to debunk. Time and energy that is completely wasted while also hampering our will to live.
I have also started to get the feeling that a lot of the security reporters submit reports with a bad faith attitude. These “helpers” try too hard to twist whatever they find into something horribly bad and a critical vulnerability, but they rarely actively contribute to actually improve curl. They can go to extreme efforts to argue and insist on their specific current finding, but not to write a fix or work with the team on improving curl long-term etc. I don’t think we need more of that.
It is these three bad trends combined that make us take this step: the mind-numbing AI slop, humans doing worse than ever, and the apparent will to poke holes rather than to help.
In an attempt to do something about the sorry state of curl security reports, this is what we do:
We believe that we can maintain and continue to evolve curl security in spite of this change. Maybe even improve thanks to this, as hopefully this step helps prevent more people from pouring sand into the machine. Ideally we reduce the amount of wasted time and effort.
I believe our best and most valued security reporters will still tell us when they find security vulnerabilities.
If you suspect a security problem in curl going forward, we advise you to head over to GitHub and submit your report there.
Alternatively, you can send an email with the full report to security @ curl.se.
In both cases, the report is received and handled privately by the curl security team. But with no monetary reward offered.
Hackerone was good to us and they have graciously allowed us to run our program on their platform for free for many years. We thank them for that service.
As we now drop the rewards, we feel that also moving away from Hackerone as the platform for vulnerability reporting makes a clean cut and sends a clearer message to everyone involved. It makes the change more visible.
It is probably going to be harder for us to publicly disclose every incoming security report in the same way we have done it on Hackerone for the last year. We need to work out something to make sure that we can keep doing it, even if imperfectly, because I believe in the goodness of such transparency.
Let me emphasize that this change does not impact our presence and mode of operation with the curl repository and its hosting on GitHub. We hear about projects having problems with low-quality AI slop submissions on GitHub as well, in the form of issues and pull-requests, but for curl we have not (yet) seen this – and frankly I don’t think switching to a GitHub alternative saves us from that.
We seem to be affected by these sloppy security reports to a higher degree than the average Open Source project.
With the help of Hackerone, we got numbers on how the curl bug-bounty has compared with other programs over the last year. It turns out curl’s program has seen more volume and noise than other public open source bug bounty programs in the same cohort. Over the past four quarters, curl’s inbound report volume has risen sharply, while other bounty-paying open source programs in the cohort, such as Ruby, Node, and Rails, have not seen a meaningful increase and have remained mostly flat or declined slightly. In the chart, the pink line represents curl’s report volume, and the gray line reflects the broader cohort.

We suspect the idea of getting money for it is a big part of the explanation. It brings in real reports, but it also makes it too easy to be annoying, with little to no penalty for the reporter. The reputation system and available program settings were not sufficient for us to prevent sand from getting into the machine.
The exact reason why we suffer more of this abuse than others remains a subject for further speculation and research.
There is a non-zero risk that our guesses are wrong and that the volume and frequency of security reports will keep up even after these changes go into effect.
If that happens, we will deal with it then and take further appropriate steps. I prefer not to overdo things or overplan already now for something that ideally does not happen.
People keep suggesting that one way to deal with the report tsunami is to charge security researchers a small amount of money for the privilege of submitting a vulnerability report to us. A curl reporters security club with an entrance fee.
I think that is a less good solution than just dropping the bounty. Some of the reasons include:
Maybe we need to do this later anyway, but we stay away from it for now.
We have seen other projects and repositories experience similar AI-induced problems for pull requests, but this has not been a problem for the curl project. I believe that for PRs we have much better ways to sort out the weeds automatically, since we have tools, tests and scanners to verify such contributions. We don’t need to waste any human time on pull requests until the quality is good enough to get green check-marks from 200 CI jobs.
I will do a talk at FOSDEM 2026 titled Open Source Security in spite of AI that of course will touch on this subject.
We never say never. This is now and we might have reasons to reconsider and make a different decision in the future. If we do, we will let you know. These changes are applied now with the hope that they will have a positive effect for the project and its maintainers. If that turns out to not be the outcome, we will of course continue and apply further changes later.
I created the pull request for updating the bug-bounty information for curl on January 14, almost two weeks before we merged it, and various media picked up the news and published articles long before I posted this blog post.
Also discussed (indirectly) on Hacker News.