<?xml version="1.0"?>
<rss version="2.0">

<channel>
	<title>Planet Debian</title>
	<link>https://planet.debian.org/</link>
	<language>en</language>
	<description>Planet Debian - https://planet.debian.org/</description>


<item>
	<title>Jonathan Dowland: debian swirl font glyph</title>
	<guid>https://jmtd.net/log/debian_glyph/</guid>
	<link>https://jmtd.net/log/debian_glyph/</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/jmtd.png&quot; width=&quot;65&quot; height=&quot;85&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;When I wrote about &lt;a href=&quot;https://jmtd.net/log/redhat_prompt/&quot;&gt;the redhat logo in a shell prompt&lt;/a&gt;,
a commenter said it would be nice to achieve something similar for Debian, and
suggested &quot;🍥&quot; (U+1F365 FISH CAKE WITH SWIRL DESIGN) which, in some renderings,
looks to have a red swirl on top. This is not bad, but I thought we could do
better.&lt;/p&gt;

&lt;p&gt;On Apple systems, the character &quot;&quot; (&lt;code&gt;U+F8FF&lt;/code&gt;) displays as the corporate
Apple logo. That particular code point lies in Unicode&#39;s Private Use Area: systems are free
to use it for something private and internal, but other systems won&#39;t use it
for the same thing. So if an Apple user tries to send a document with that
character in it to someone else, they won&#39;t see the Apple unless they are also
viewing it on an Apple computer. (&lt;a href=&quot;https://www.evertype.com/standards/csur/conscript-table.html0&quot;&gt;Some folks use it for Klingon&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://jmtd.net/log/DebianSwirlFont.zip&quot;&gt;Here&#39;s a font that maps the Debian swirl to the same code point&lt;/a&gt;.
It&#39;s covered by the &lt;a href=&quot;https://www.debian.org/logos/&quot;&gt;Debian logo license terms&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://www.nerdfonts.com/&quot;&gt;Nerd Font&lt;/a&gt; maps the Debian swirl logo to codepoints &lt;code&gt;e77d&lt;/code&gt;, &lt;code&gt;f306&lt;/code&gt;, &lt;code&gt;ebc5&lt;/code&gt; and
&lt;code&gt;f08da&lt;/code&gt; (all of which are also in the Private Use Area). I&#39;ve gone ahead and mapped
it to all those points but the last one (simply because I couldn&#39;t find it in FontForge).&lt;/p&gt;

&lt;p&gt;Note that, unless your recipients have this font, or the Nerd Font, or similar
set up, they aren&#39;t going to see the swirl. But enjoy it for private use. Getting
your system to actually &lt;em&gt;use&lt;/em&gt; the font is, I&#39;m afraid, left as an exercise for the
reader (but feel free to leave comments).&lt;/p&gt;

&lt;p&gt;Thanks to mirabilos for chatting to me about this back in 2019. It&#39;s taken me
that long to get this blog post out of draft!&lt;/p&gt; </description> 
	<pubDate>Fri, 13 Mar 2026 22:11:03 +0000</pubDate>

</item> 
<item>
	<title>Dirk Eddelbuettel: RcppCNPy 0.2.15 on CRAN: Maintenance</title>
	<guid>http://dirk.eddelbuettel.com/blog/2026/03/13#rcppcnpy_0.2.15</guid>
	<link>http://dirk.eddelbuettel.com/blog/2026/03/13#rcppcnpy_0.2.15</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/dirk.png&quot; width=&quot;65&quot; height=&quot;90&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;Another maintenance release of the &lt;a href=&quot;https://dirk.eddelbuettel.com/code/rcpp.cnpy.html&quot;&gt;RcppCNPy&lt;/a&gt;
package arrived on &lt;a href=&quot;https://cran.r-project.org&quot;&gt;CRAN&lt;/a&gt; today,
and has already been built as an &lt;a href=&quot;https://eddelbuettel.github.io/r2u/&quot;&gt;r2u&lt;/a&gt; binary. &lt;a href=&quot;https://dirk.eddelbuettel.com/code/rcpp.cnpy.html&quot;&gt;RcppCNPy&lt;/a&gt;
provides R with read and write access to &lt;a href=&quot;https://www.numpy.org/&quot;&gt;NumPy&lt;/a&gt; files thanks to the &lt;a href=&quot;https://github.com/rogersce/cnpy&quot;&gt;cnpy&lt;/a&gt; library by Carl Rogers
along with &lt;a href=&quot;https://www.rcpp.org&quot;&gt;Rcpp&lt;/a&gt; for the glue to &lt;a href=&quot;https://www.r-project.org&quot;&gt;R&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The changes are minor and similar to those in other recent releases. We aid &lt;a href=&quot;https://www.rcpp.org&quot;&gt;Rcpp&lt;/a&gt; in the transition away from calling
&lt;code&gt;Rf_error()&lt;/code&gt; by relying on &lt;code&gt;Rcpp::stop()&lt;/code&gt;, which
behaves better with respect to stack unwinding when errors or exceptions are
encountered. So once again there are no user-facing changes. Full details are
below.&lt;/p&gt;
&lt;blockquote&gt;
&lt;h4 id=&quot;changes-in-version-0.2.15-2026-03-13&quot;&gt;Changes in version 0.2.15
(2026-03-13)&lt;/h4&gt;
&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Replaced Rf_error with Rcpp::stop in three files&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Maintenance updates to continuous integration&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;a href=&quot;https://dirk.eddelbuettel.com/cranberries/&quot;&gt;CRANberries&lt;/a&gt;
also provides a diffstat report &lt;a href=&quot;https://dirk.eddelbuettel.com/cranberries/2026/03/13#RcppCNPy_0.2.15&quot;&gt;for
the latest release&lt;/a&gt;. As always, feedback is welcome and the best
place to start a discussion may be the &lt;a href=&quot;https://github.com/eddelbuettel/rcppcnpy/issues&quot;&gt;GitHub issue
tickets&lt;/a&gt; page.&lt;/p&gt;
&lt;p&gt;If you like this or other open-source work I do, you can now &lt;a href=&quot;https://github.com/sponsors/eddelbuettel&quot;&gt;sponsor me at
GitHub&lt;/a&gt;.&lt;/p&gt;
&lt;p style=&quot;font-size: 80%; font-style: italic;&quot;&gt;
This post by &lt;a href=&quot;https://dirk.eddelbuettel.com&quot;&gt;Dirk
Eddelbuettel&lt;/a&gt; originated on his &lt;a href=&quot;https://dirk.eddelbuettel.com/blog/&quot;&gt;Thinking inside the box&lt;/a&gt;
blog. Please report excessive re-aggregation in third-party for-profit
settings.
&lt;/p&gt; </description> 
	<pubDate>Fri, 13 Mar 2026 19:10:00 +0000</pubDate>

</item> 
<item>
	<title>Sven Hoexter: container image with ECH enabled curl</title>
	<guid>http://sven.stormbind.net/blog/posts/misc_ech_enabled_curl_oci_image/</guid>
	<link>http://sven.stormbind.net/blog/posts/misc_ech_enabled_curl_oci_image/</link>
     <description>  &lt;p&gt;As an opportunity to rewire my brain from &quot;docker&quot; to &quot;podman&quot; and &quot;buildah&quot;,
I started to create an image build with an ECH-enabled curl at
&lt;a href=&quot;https://gitlab.com/hoexter-experiments/ech&quot;&gt;https://gitlab.com/hoexter-experiments/ech&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Not sure if it helps anyone, but the setup should look like this:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;git clone https://gitlab.com/hoexter-experiments/ech
cd ech
buildah build --layers -f Dockerfile -t echtest
podman run -ti echtest /usr/local/bin/curl \
  --ech true --doh-url https://one.one.one.one/dns-query \
  https://crypto.cloudflare.com/cdn-cgi/trace.cgi
fl=48f121
h=crypto.cloudflare.com
ip=2.205.251.187
ts=1773410985.168
visit_scheme=https
uag=curl/8.19.0
colo=DUS
sliver=none
http=http/2
loc=DE
tls=TLSv1.3
sni=encrypted
warp=off
gateway=off
rbi=off
kex=X25519
&lt;/code&gt;&lt;/pre&gt;
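&lt;p&gt;Rather than eyeballing the trace output, one could also grep for the &lt;code&gt;sni=encrypted&lt;/code&gt; line to confirm ECH actually engaged (a sketch; the &lt;code&gt;podman&lt;/code&gt; invocation mirrors the one above and is shown as a comment):&lt;/p&gt;

```shell
# The Cloudflare trace endpoint prints one key=value pair per line and
# reports "sni=encrypted" when the ClientHello SNI was encrypted.
# Against the image built above, the trace would come from:
#   podman run --rm echtest /usr/local/bin/curl -s --ech true \
#     --doh-url https://one.one.one.one/dns-query \
#     https://crypto.cloudflare.com/cdn-cgi/trace.cgi
# The check itself, fed a sample trace here for illustration:
printf 'tls=TLSv1.3\nsni=encrypted\nwarp=off\n' | grep -q '^sni=encrypted$'
echo "ECH check exit status: $?"
```

&lt;p&gt;An exit status of 0 from the &lt;code&gt;grep&lt;/code&gt; means the SNI was encrypted.&lt;/p&gt;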

&lt;p&gt;It also builds nginx and you can use that for a local test within
the image. More details in the
&lt;a href=&quot;https://gitlab.com/hoexter-experiments/ech/-/blob/main/README.md&quot;&gt;README&lt;/a&gt;.&lt;/p&gt; </description> 
	<pubDate>Fri, 13 Mar 2026 14:16:02 +0000</pubDate>

</item> 
<item>
	<title>Hellen Chemtai: One week later after the Outreachy internship: Managing Work-Life Balance</title>
	<guid>http://hellenchemtai.wordpress.com/?p=81</guid>
	<link>https://hellenchemtai.wordpress.com/2026/03/13/one-week-later-after-the-outreachy-internship-managing-work-life-balance/</link>
     <description>  &lt;p class=&quot;wp-block-paragraph&quot;&gt;Hello world. I have been doing a lot since my internship with Outreachy ended. We are still working on some tasks:&lt;/p&gt;



&lt;ol class=&quot;wp-block-list&quot;&gt;
&lt;li&gt;I am working on running locales for my native language in live images.&lt;/li&gt;



&lt;li&gt;I am also working on points to add to talk proposals for a Debian conference.&lt;/li&gt;
&lt;/ol&gt;



&lt;p class=&quot;wp-block-paragraph&quot;&gt;As I am constantly moving around, I ran into problems when changing networks: I had to connect my virtual machine to different networks, and the change would not be reflected inside the machine. From the terminal, I edited the virtual machine’s XML settings:&lt;/p&gt;


&lt;div class=&quot;wp-block-code&quot;&gt;
&lt;pre&gt;&lt;code&gt;su -
# enter the root password

virsh edit &amp;lt;machine_name&amp;gt;  # it&#39;s openqa for me

# Look for the interface within &amp;lt;devices&amp;gt; and replace this:

&amp;lt;interface type=&#39;network&#39;&amp;gt;
        &amp;lt;source network=&#39;default&#39;/&amp;gt;
        # some other code in here
&amp;lt;/interface&amp;gt;

# ...with just this, then restart your machine:

&amp;lt;interface type=&#39;user&#39;&amp;gt;
    &amp;lt;model type=&#39;virtio&#39;/&amp;gt;
&amp;lt;/interface&amp;gt;&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;


&lt;p class=&quot;wp-block-paragraph&quot;&gt;Hopefully the above will help someone out there. I am still working on a lot of tasks for the conference; so much to do and so little time. I am hoping I won’t burn out during this period. I won’t be posting many updates until the conference. Have a nice time!&lt;/p&gt;



 </description> 
	<pubDate>Fri, 13 Mar 2026 08:10:26 +0000</pubDate>

</item> 
<item>
	<title>Reproducible Builds (diffoscope): diffoscope 314 released</title>
	<guid>https://diffoscope.org/news/diffoscope-314-released/</guid>
	<link>https://diffoscope.org/news/diffoscope-314-released/</link>
     <description>  &lt;p&gt;The diffoscope maintainers are pleased to announce the release of diffoscope
version &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;314&lt;/code&gt;. This version includes the following changes:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;[ Chris Lamb ]
* Don&#39;t run &quot;test_code_is_black_clean&quot; test in autopkgtests.
  (Closes: #1130402)

[ Michael R. Crusoe ]
* Reformat using Black 26.1.0. (Closes: #1130073)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;You can find out more by &lt;a href=&quot;https://diffoscope.org&quot;&gt;visiting the project homepage&lt;/a&gt;.&lt;/p&gt; </description> 
	<pubDate>Fri, 13 Mar 2026 00:00:00 +0000</pubDate>

</item> 
<item>
	<title>Reproducible Builds: Reproducible Builds in February 2026</title>
	<guid>https://reproducible-builds.org/reports/2026-02/</guid>
	<link>https://reproducible-builds.org/reports/2026-02/</link>
     <description>  &lt;p class=&quot;lead&quot;&gt;&lt;strong&gt;Welcome to the February 2026 report from the &lt;a href=&quot;https://reproducible-builds.org&quot;&gt;Reproducible Builds&lt;/a&gt; project!&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://reproducible-builds.org/&quot;&gt;&lt;img alt=&quot;&quot; src=&quot;https://reproducible-builds.org/images/reports/2026-02/reproducible-builds.png#right&quot; /&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;These reports outline what we’ve been up to over the past month, highlighting items of news from elsewhere in the increasingly important area of software supply-chain security. As ever, if you are interested in contributing to the Reproducible Builds project, please see the &lt;a href=&quot;https://reproducible-builds.org/contribute/&quot;&gt;&lt;em&gt;Contribute&lt;/em&gt;&lt;/a&gt; page on our website.&lt;/p&gt;

&lt;ol&gt;
  &lt;li&gt;&lt;a href=&quot;https://reproducible-builds.org/blog/index.rss#reproducedebiannet&quot;&gt;&lt;em&gt;reproduce.debian.net&lt;/em&gt;&lt;/a&gt;&lt;/li&gt;
  &lt;li&gt;&lt;a href=&quot;https://reproducible-builds.org/blog/index.rss#tool-development&quot;&gt;Tool development&lt;/a&gt;&lt;/li&gt;
  &lt;li&gt;&lt;a href=&quot;https://reproducible-builds.org/blog/index.rss#distribution-work&quot;&gt;Distribution work&lt;/a&gt;&lt;/li&gt;
  &lt;li&gt;&lt;a href=&quot;https://reproducible-builds.org/blog/index.rss#miscellaneous-news&quot;&gt;Miscellaneous news&lt;/a&gt;&lt;/li&gt;
  &lt;li&gt;&lt;a href=&quot;https://reproducible-builds.org/blog/index.rss#upstream-patches&quot;&gt;Upstream patches&lt;/a&gt;&lt;/li&gt;
  &lt;li&gt;&lt;a href=&quot;https://reproducible-builds.org/blog/index.rss#documentation-updates&quot;&gt;Documentation updates&lt;/a&gt;&lt;/li&gt;
  &lt;li&gt;&lt;a href=&quot;https://reproducible-builds.org/blog/index.rss#four-new-academic-papers&quot;&gt;Four new academic papers&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;hr /&gt;

&lt;h3 id=&quot;reproducedebiannet&quot;&gt;&lt;a href=&quot;https://reproduce.debian.net/&quot;&gt;&lt;em&gt;reproduce.debian.net&lt;/em&gt;&lt;/a&gt;&lt;/h3&gt;

&lt;p&gt;&lt;a href=&quot;https://reproduce.debian.net&quot;&gt;&lt;img alt=&quot;&quot; src=&quot;https://reproducible-builds.org/images/reports/2026-02/reproduce.debian.net.png#right&quot; /&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The last year has seen the introduction, development and deployment of &lt;a href=&quot;https://reproduce.debian.net&quot;&gt;&lt;em&gt;reproduce.debian.net&lt;/em&gt;&lt;/a&gt;. In technical terms, this is an instance of &lt;a href=&quot;https://github.com/kpcyrd/rebuilderd&quot;&gt;&lt;em&gt;rebuilderd&lt;/em&gt;&lt;/a&gt;, our server designed to monitor the official package repositories of Linux distributions and attempt to reproduce the observed results there.&lt;/p&gt;

&lt;p&gt;This month, Holger Levsen added suite-based navigation (e.g. Debian &lt;em&gt;trixie&lt;/em&gt; vs &lt;em&gt;forky&lt;/em&gt;) to the service, in addition to the already-existing architecture-based navigation. This can be observed on, for instance, the &lt;a href=&quot;https://reproduce.debian.net/trixie-backports.html&quot;&gt;Debian &lt;em&gt;trixie-backports&lt;/em&gt;&lt;/a&gt; or &lt;a href=&quot;https://reproduce.debian.net/trixie-security.html&quot;&gt;&lt;em&gt;trixie-security&lt;/em&gt;&lt;/a&gt; pages.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;&lt;/p&gt;

&lt;h3 id=&quot;tool-development&quot;&gt;Tool development&lt;/h3&gt;

&lt;p&gt;&lt;a href=&quot;https://diffoscope.org/&quot;&gt;&lt;img alt=&quot;&quot; src=&quot;https://reproducible-builds.org/images/reports/2026-02/diffoscope.png#right&quot; /&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://diffoscope.org&quot;&gt;&lt;strong&gt;diffoscope&lt;/strong&gt;&lt;/a&gt; is our in-depth and content-aware diff utility that can locate and diagnose reproducibility issues. This month, Chris Lamb made a number of changes, including preparing and uploading versions &lt;a href=&quot;https://tracker.debian.org/news/1713576/accepted-diffoscope-312-source-into-unstable/&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;312&lt;/code&gt;&lt;/a&gt; and &lt;a href=&quot;https://tracker.debian.org/news/1719459/accepted-diffoscope-313-source-into-unstable/&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;313&lt;/code&gt;&lt;/a&gt; to Debian.&lt;/p&gt;

&lt;p&gt;In particular, Chris updated the post-release deployment pipeline to ensure that the pipeline does not fail if the automatic deployment to &lt;a href=&quot;https://pypi.org/&quot;&gt;PyPI&lt;/a&gt; fails [&lt;a href=&quot;https://salsa.debian.org/reproducible-builds/diffoscope/commit/3beea8cb&quot;&gt;…&lt;/a&gt;]. In addition, Vagrant Cascadian updated an external reference for the &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;7z&lt;/code&gt; tool for &lt;a href=&quot;https://guix.gnu.org/&quot;&gt;GNU Guix&lt;/a&gt; [&lt;a href=&quot;https://salsa.debian.org/reproducible-builds/diffoscope/commit/a826a008&quot;&gt;…&lt;/a&gt;]. He also updated &lt;em&gt;diffoscope&lt;/em&gt; in GNU Guix to versions &lt;a href=&quot;https://codeberg.org/guix/guix/commit/27255149743362496eebdccca94d5df680ef7fdd&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;312&lt;/code&gt;&lt;/a&gt; and &lt;a href=&quot;https://codeberg.org/guix/guix/commit/43f71df9ceb7f10db7d1d16a2adb46da4adc1a3f&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;313&lt;/code&gt;&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;&lt;/p&gt;

&lt;h3 id=&quot;distribution-work&quot;&gt;Distribution work&lt;/h3&gt;

&lt;p&gt;&lt;a href=&quot;https://debian.org/&quot;&gt;&lt;img alt=&quot;&quot; src=&quot;https://reproducible-builds.org/images/reports/2026-02/debian.png#right&quot; /&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In Debian this month:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;
    &lt;p&gt;26 reviews of Debian packages were added, 5 were updated and 19 were removed this month, adding to &lt;a href=&quot;https://tests.reproducible-builds.org/debian/index_issues.html&quot;&gt;our extensive knowledge about identified issues&lt;/a&gt;.&lt;/p&gt;
  &lt;/li&gt;
  &lt;li&gt;
    &lt;p&gt;A new &lt;a href=&quot;https://tracker.debian.org/pkg/debsbom&quot;&gt;&lt;em&gt;debsbom&lt;/em&gt;&lt;/a&gt; package was uploaded to &lt;em&gt;unstable&lt;/em&gt;. According to the package description, this package “generates SBOMs (Software Bill of Materials) for distributions based on Debian in the two standard formats, SPDX and CycloneDX. The generated SBOM includes all installed binary packages and also contains Debian Source packages.”&lt;/p&gt;
  &lt;/li&gt;
  &lt;li&gt;
    &lt;p&gt;In addition, a &lt;a href=&quot;https://tracker.debian.org/pkg/sbom-toolkit&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;sbom-toolkit&lt;/code&gt;&lt;/a&gt; package was uploaded, which “provides a collection of scripts for generating SBOMs”. This is the tooling used in &lt;a href=&quot;https://www.apertis.org/architecture/platform/software_bill_of_materials/&quot;&gt;Apertis to generate the Licenses SBOM and the Build Dependency SBOM&lt;/a&gt;. It also includes &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;dh-setup-copyright&lt;/code&gt;, a &lt;a href=&quot;https://wiki.debian.org/Debhelper&quot;&gt;Debhelper&lt;/a&gt; addon that generates SBOMs from &lt;a href=&quot;https://en.wikipedia.org/wiki/DWARF&quot;&gt;DWARF debug information&lt;/a&gt; by running &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;dwarf2sources&lt;/code&gt; on every ELF binary in the package and saving the output.&lt;/p&gt;
  &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href=&quot;https://www.opensuse.org/&quot;&gt;&lt;img alt=&quot;&quot; src=&quot;https://reproducible-builds.org/images/reports/2026-02/opensuse.png#right&quot; /&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Lastly, Bernhard M. Wiedemann posted another &lt;a href=&quot;https://www.opensuse.org/&quot;&gt;&lt;strong&gt;openSUSE&lt;/strong&gt;&lt;/a&gt; &lt;a href=&quot;https://lists.opensuse.org/archives/list/factory@lists.opensuse.org/thread/QH2ULPPQD5U54TEK5OMWLUEFWSGMLIS5/&quot;&gt;monthly update&lt;/a&gt; for their work there.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;&lt;/p&gt;

&lt;h3 id=&quot;miscellaneous-news&quot;&gt;Miscellaneous news&lt;/h3&gt;

&lt;ul&gt;
  &lt;li&gt;
    &lt;p&gt;&lt;a href=&quot;https://notes.8pit.net/&quot;&gt;Sören Tempel (&lt;em&gt;nmeum&lt;/em&gt;)&lt;/a&gt; wrote up their insightful notes on &lt;a href=&quot;https://notes.8pit.net/notes/iqfs.html&quot;&gt;&lt;em&gt;Debugging Reproducibility Issues in Rust Software&lt;/em&gt;&lt;/a&gt; after nondeterministic issues were &lt;a href=&quot;https://codeberg.org/guix/guix/pulls/4551#issuecomment-10997750&quot;&gt;found and investigated for &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;pimsync&lt;/code&gt; in the GNU Guix review process&lt;/a&gt;.&lt;/p&gt;
  &lt;/li&gt;
  &lt;li&gt;
    &lt;p&gt;Jeremy Bicha reported a bug in &lt;a href=&quot;https://apps.gnome.org/en-GB/Clocks/&quot;&gt;GNOME Clocks&lt;/a&gt; after they noticed that &lt;a href=&quot;https://gitlab.gnome.org/GNOME/gnome-clocks/-/issues/436&quot;&gt;version &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;50.beta&lt;/code&gt; regressed in reproducibility compared to &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;49.0&lt;/code&gt;&lt;/a&gt;. Specifically, “the new generated &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;.oga&lt;/code&gt; files differ in their &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;Serial No.&lt;/code&gt; and &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;Checksum&lt;/code&gt; [fields]”. However, &lt;a href=&quot;https://gitlab.gnome.org/GNOME/gnome-clocks/-/commit/dbeb4fa3502a1ab8e05069e24319a9f276f2b4e1&quot;&gt;Jeremy ended up fixing the issue&lt;/a&gt; by replacing &lt;a href=&quot;https://www.ffmpeg.org/&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;ffmpeg&lt;/code&gt;&lt;/a&gt; with &lt;a href=&quot;https://www.rarewares.org/ogg-oggenc.php&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;oggenc&lt;/code&gt;&lt;/a&gt;.&lt;/p&gt;
  &lt;/li&gt;
  &lt;li&gt;
    &lt;p&gt;&lt;em&gt;kpcyrd&lt;/em&gt; &lt;a href=&quot;https://lists.reproducible-builds.org/pipermail/rb-general/2026-February/004022.html&quot;&gt;shared some information&lt;/a&gt; from the &lt;a href=&quot;https://lists.archlinux.org/archives/list/arch-dev-public@lists.archlinux.org/&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;archlinux-dev-public&lt;/code&gt;&lt;/a&gt; mailing list on our &lt;a href=&quot;https://lists.reproducible-builds.org/listinfo/rb-general/&quot;&gt;mailing list&lt;/a&gt; this month after a discussion at &lt;a href=&quot;https://reproducible-builds.org/events/vienna2025/&quot;&gt;our latest Summit meeting&lt;/a&gt; on the topic of &lt;a href=&quot;https://llvm.org/docs/LinkTimeOptimization.html&quot;&gt;Link-Time Optimisation&lt;/a&gt; (LTO) — specifically on the reasons &lt;a href=&quot;https://lists.archlinux.org/archives/list/arch-dev-public@lists.archlinux.org/message/BSAAFYOJ3KTYZXACIQ26RP5II4JULLS4/&quot;&gt;why LTO often needs to be disabled&lt;/a&gt; in relation to &lt;a href=&quot;https://archlinux.org&quot;&gt;Arch Linux&lt;/a&gt;’s approach to binary hardening.&lt;/p&gt;
  &lt;/li&gt;
  &lt;li&gt;
    &lt;p&gt;Janneke Nieuwenhuizen &lt;a href=&quot;https://lists.reproducible-builds.org/pipermail/rb-general/2026-February/004037.html&quot;&gt;posed a question&lt;/a&gt; to our list about whether there might be situations where using the UNIX epoch itself (i.e. &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;0&lt;/code&gt;) may materially differ from using &lt;a href=&quot;https://reproducible-builds.org/docs/source-date-epoch/&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;SOURCE_DATE_EPOCH&lt;/code&gt;&lt;/a&gt; when a situation demands the use of a fixed timestamp.&lt;/p&gt;
  &lt;/li&gt;
  &lt;li&gt;
    &lt;p&gt;Laurent Huberdeau &lt;a href=&quot;https://lists.reproducible-builds.org/pipermail/rb-general/2026-February/004031.html&quot;&gt;announced that they had recently finished their master’s thesis&lt;/a&gt; “arguing for the use of &lt;a href=&quot;https://umontreal.scholaris.ca/items/2f44323a-9f4f-482a-98be-542d8ee5b9fb&quot;&gt;POSIX shell for diverse double-compilation and reproducible builds”&lt;/a&gt;. Laurent also presents &lt;a href=&quot;https://github.com/udem-dlteam/pnut&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;pnut&lt;/code&gt;&lt;/a&gt;, a C compiler capable of bootstrapping itself and &lt;a href=&quot;https://en.wikipedia.org/wiki/Tiny_C_Compiler&quot;&gt;TCC&lt;/a&gt; from “any &lt;a href=&quot;https://en.wikipedia.org/wiki/Unix_shell#Bourne_shell&quot;&gt;POSIX-compliant shell&lt;/a&gt; and human-readable source files.”&lt;/p&gt;
  &lt;/li&gt;
&lt;/ul&gt;
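&lt;p&gt;As a brief illustration of the &lt;code&gt;SOURCE_DATE_EPOCH&lt;/code&gt; convention discussed in the thread above, here is a minimal sketch of how a build script might consume the variable, falling back to a fixed value when it is unset (the fallback timestamp is an arbitrary example, and the &lt;code&gt;date -d&lt;/code&gt; flag assumes GNU date):&lt;/p&gt;

```shell
# Honour SOURCE_DATE_EPOCH when the build environment exports it;
# otherwise fall back to a fixed timestamp so repeated builds embed
# the same date either way (the fallback value is illustrative).
: "${SOURCE_DATE_EPOCH:=1704067200}"   # 2024-01-01 00:00:00 UTC
export SOURCE_DATE_EPOCH
# Render it where a tool would otherwise embed the current time:
date -u -d "@$SOURCE_DATE_EPOCH" '+%Y-%m-%dT%H:%M:%SZ'
```

&lt;p&gt;Using the variable rather than the epoch constant &lt;code&gt;0&lt;/code&gt; keeps the embedded date meaningful (it tracks the source release) while remaining deterministic.&lt;/p&gt;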

&lt;p&gt;&lt;br /&gt;&lt;/p&gt;

&lt;h3 id=&quot;upstream-patches&quot;&gt;Upstream patches&lt;/h3&gt;

&lt;p&gt;The Reproducible Builds project detects, dissects and attempts to fix as many currently-unreproducible packages as possible. We endeavour to send all of our patches upstream where appropriate. This month, we wrote a large number of such patches, including:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;
    &lt;p&gt;Bernhard M. Wiedemann:&lt;/p&gt;

    &lt;ul&gt;
      &lt;li&gt;&lt;a href=&quot;https://github.com/aio-libs/aiohttp/pull/12088&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;aiohttp&lt;/code&gt;&lt;/a&gt;&lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://github.com/lima-vm/lima/pull/4561&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;lima&lt;/code&gt;&lt;/a&gt;&lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://github.com/mesonbuild/meson/pull/15529&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;meson&lt;/code&gt;&lt;/a&gt;&lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://build.opensuse.org/request/show/1335456&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;obs-studio&lt;/code&gt;&lt;/a&gt;&lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://build.opensuse.org/request/show/1331443&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;paracon&lt;/code&gt;&lt;/a&gt;&lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://github.com/KimiNewt/pyshark/issues/747&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;pyshark&lt;/code&gt;&lt;/a&gt;&lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://build.opensuse.org/request/show/1331718&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;python-flake8-comprehensions&lt;/code&gt;&lt;/a&gt;&lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://github.com/MariaDB/server/pull/4667&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;server&lt;/code&gt;&lt;/a&gt;&lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://github.com/vlang/v/issues/26664&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;vlang&lt;/code&gt;&lt;/a&gt;&lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
  &lt;li&gt;
    &lt;p&gt;Gioele Barabucci:&lt;/p&gt;

    &lt;ul&gt;
      &lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1127641&quot;&gt;#1127641&lt;/a&gt; filed against &lt;a href=&quot;https://tracker.debian.org/pkg/bitsnpicas&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;bitsnpicas&lt;/code&gt;&lt;/a&gt;.&lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1127643&quot;&gt;#1127643&lt;/a&gt; filed against &lt;a href=&quot;https://tracker.debian.org/pkg/fonts-topaz-unicode&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;fonts-topaz-unicode&lt;/code&gt;&lt;/a&gt;.&lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1128901&quot;&gt;#1128901&lt;/a&gt; filed against &lt;a href=&quot;https://tracker.debian.org/pkg/bitsnpicas&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;bitsnpicas&lt;/code&gt;&lt;/a&gt;.&lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;br /&gt;&lt;/p&gt;

&lt;h3 id=&quot;documentation-updates&quot;&gt;Documentation updates&lt;/h3&gt;

&lt;p&gt;&lt;a href=&quot;https://reproducible-builds.org/&quot;&gt;&lt;img alt=&quot;&quot; src=&quot;https://reproducible-builds.org/images/reports/2026-02/website.png#right&quot; /&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once again, there were a number of improvements made to our website this month including:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;
    &lt;p&gt;Aman Sharma added a Java reproducible builds paper to the &lt;a href=&quot;https://reproducible-builds.org/docs/publications/&quot;&gt;&lt;em&gt;Academic publications&lt;/em&gt;&lt;/a&gt; page. [&lt;a href=&quot;https://salsa.debian.org/reproducible-builds/reproducible-website/commit/a43a33b3&quot;&gt;…&lt;/a&gt;]&lt;/p&gt;
  &lt;/li&gt;
  &lt;li&gt;
    &lt;p&gt;Chris Lamb added a reference to the &lt;a href=&quot;https://github.com/freedomofpress/repro-build&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;repro-build&lt;/code&gt;&lt;/a&gt; tool to the &lt;a href=&quot;https://reproducible-builds.org/tools/&quot;&gt;&lt;em&gt;Tools&lt;/em&gt;&lt;/a&gt; page. [&lt;a href=&quot;https://salsa.debian.org/reproducible-builds/reproducible-website/commit/c3ae179f&quot;&gt;…&lt;/a&gt;]&lt;/p&gt;
  &lt;/li&gt;
  &lt;li&gt;
    &lt;p&gt;Michiel Hendriks corrected an issue on the &lt;a href=&quot;https://reproducible-builds.org/docs/jvm/&quot;&gt;&lt;em&gt;JVM&lt;/em&gt;&lt;/a&gt; page in relation to &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;.properties&lt;/code&gt; files. [&lt;a href=&quot;https://salsa.debian.org/reproducible-builds/reproducible-website/commit/c77b3931&quot;&gt;…&lt;/a&gt;]&lt;/p&gt;
  &lt;/li&gt;
  &lt;li&gt;
    &lt;p&gt;&lt;em&gt;kpcyrd&lt;/em&gt; added &lt;a href=&quot;https://docs.brew.sh/Reproducible-Builds&quot;&gt;Homebrew&lt;/a&gt; to the &lt;a href=&quot;https://reproducible-builds.org/docs/projects/&quot;&gt;&lt;em&gt;Who is involved&lt;/em&gt;&lt;/a&gt; page. [&lt;a href=&quot;https://salsa.debian.org/reproducible-builds/reproducible-website/commit/db7a2a97&quot;&gt;…&lt;/a&gt;][&lt;a href=&quot;https://salsa.debian.org/reproducible-builds/reproducible-website/commit/86eae61a&quot;&gt;…&lt;/a&gt;]&lt;/p&gt;
  &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;br /&gt;&lt;/p&gt;

&lt;h3 id=&quot;four-new-academic-papers&quot;&gt;Four new academic papers&lt;/h3&gt;

&lt;p&gt;&lt;a href=&quot;https://arxiv.org/abs/2601.20662&quot;&gt;&lt;img alt=&quot;&quot; src=&quot;https://reproducible-builds.org/images/reports/2026-02/2601.20662.png#right&quot; /&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Julien Malka and Arnout Engelen published a paper titled &lt;a href=&quot;https://arxiv.org/abs/2601.20662&quot;&gt;&lt;em&gt;Lila: Decentralized Build Reproducibility Monitoring for the Functional Package Management Model&lt;/em&gt;&lt;/a&gt;:&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;[While] recent studies have shown that high reproducibility rates are achievable at scale — demonstrated by the Nix ecosystem achieving over 90% reproducibility on more than 80,000 packages — the problem of effective reproducibility monitoring remains largely unsolved. In this work, &lt;strong&gt;we address the reproducibility monitoring challenge by introducing &lt;em&gt;Lila&lt;/em&gt;, a decentralized system for reproducibility assessment tailored to the functional package management model.&lt;/strong&gt; Lila enables distributed reporting of build results and aggregation into a reproducibility database […].&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;A &lt;a href=&quot;https://arxiv.org/pdf/2601.20662&quot;&gt;PDF&lt;/a&gt; of their paper is available online.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://arxiv.org/abs/2602.11887&quot;&gt;&lt;img alt=&quot;&quot; src=&quot;https://reproducible-builds.org/images/reports/2026-02/2602.11887.png#right&quot; /&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Javier Ron and Martin Monperrus of &lt;a href=&quot;https://www.kth.se/en&quot;&gt;KTH Royal Institute of Technology&lt;/a&gt;, Sweden, also published a paper, titled &lt;a href=&quot;https://arxiv.org/abs/2602.11887&quot;&gt;&lt;em&gt;Verifiable Provenance of Software Artifacts with Zero-Knowledge Compilation&lt;/em&gt;&lt;/a&gt;:&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;Verifying that a compiled binary originates from its claimed source code is a fundamental security requirement, called source code provenance. Achieving verifiable source code provenance in practice remains challenging. The most popular technique, called reproducible builds, requires difficult matching and reexecution of build toolchains and environments. &lt;strong&gt;We propose a novel approach to verifiable provenance based on compiling software with zero-knowledge virtual machines (zkVMs).&lt;/strong&gt; By executing a compiler within a zkVM, our system produces both the compiled output and a cryptographic proof attesting that the compilation was performed on the claimed source code with the claimed compiler. […]&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;A &lt;a href=&quot;https://arxiv.org/pdf/2602.11887&quot;&gt;PDF&lt;/a&gt; of the paper is available online.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://arxiv.org/abs/2602.17678&quot;&gt;&lt;img alt=&quot;&quot; src=&quot;https://reproducible-builds.org/images/reports/2026-02/2602.17678.png#right&quot; /&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Oreofe Solarin of the &lt;a href=&quot;https://engineering.case.edu/computer-and-data-sciences&quot;&gt;Department of Computer and Data Sciences&lt;/a&gt;, &lt;a href=&quot;https://case.edu/&quot;&gt;Case Western Reserve University&lt;/a&gt;, Cleveland, Ohio, USA, published &lt;a href=&quot;https://arxiv.org/abs/2602.17678&quot;&gt;&lt;em&gt;It’s Not Just Timestamps: A Study on Docker Reproducibility&lt;/em&gt;&lt;/a&gt;:&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;Reproducible container builds promise a simple integrity check for software supply chains: rebuild an image from its Dockerfile and compare hashes. &lt;strong&gt;We built a Docker measurement pipeline and apply it to a stratified sample of 2,000 GitHub repositories that contained a Dockerfile. We found that only 56% produce any buildable image, and just 2.7% of those are bitwise reproducible without any infrastructure configurations.&lt;/strong&gt; After modifying infrastructure configurations, we raise bitwise reproducibility by 18.6%, but 78.7% of buildable Dockerfiles remain non-reproducible.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;A &lt;a href=&quot;https://arxiv.org/pdf/2602.17678&quot;&gt;PDF&lt;/a&gt; of Oreofe’s paper is available online.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://arxiv.org/abs/2602.19383&quot;&gt;&lt;img alt=&quot;&quot; src=&quot;https://reproducible-builds.org/images/reports/2026-02/2602.19383.png#right&quot; /&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Lastly, Jens Dietrich and Behnaz Hassanshahi published &lt;a href=&quot;https://arxiv.org/abs/2602.19383&quot;&gt;&lt;em&gt;On the Variability of Source Code in Maven Package Rebuilds&lt;/em&gt;&lt;/a&gt;:&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;[In] this paper we test the assumption that the same source code is being used [by] alternative builds. To study this, we compare the sources released with packages on Maven Central, with the sources associated with independently built packages from Google’s &lt;a href=&quot;https://cloud.google.com/security/products/assured-open-source-software&quot;&gt;Assured Open Source&lt;/a&gt; and Oracle’s Build-from-Source projects. […]&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;A &lt;a href=&quot;https://arxiv.org/pdf/2602.19383&quot;&gt;PDF&lt;/a&gt; of their paper is available online.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;Finally, if you are interested in contributing to the Reproducible Builds project, please visit our &lt;a href=&quot;https://reproducible-builds.org/contribute/&quot;&gt;&lt;em&gt;Contribute&lt;/em&gt;&lt;/a&gt; page on our website. You can also get in touch with us via:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;
    &lt;p&gt;IRC: &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;#reproducible-builds&lt;/code&gt; on &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;irc.oftc.net&lt;/code&gt;.&lt;/p&gt;
  &lt;/li&gt;
  &lt;li&gt;
    &lt;p&gt;Mastodon: &lt;a href=&quot;https://fosstodon.org/@reproducible_builds&quot;&gt;@reproducible_builds@fosstodon.org&lt;/a&gt;&lt;/p&gt;
  &lt;/li&gt;
  &lt;li&gt;
    &lt;p&gt;Mailing list: &lt;a href=&quot;https://lists.reproducible-builds.org/listinfo/rb-general&quot;&gt;&lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;rb-general@lists.reproducible-builds.org&lt;/code&gt;&lt;/a&gt;&lt;/p&gt;
  &lt;/li&gt;
&lt;/ul&gt; </description> 
	<pubDate>Thu, 12 Mar 2026 19:08:40 +0000</pubDate>

</item> 
<item>
	<title>Dirk Eddelbuettel: RcppBDT 0.2.8 on CRAN: Maintenance</title>
	<guid>http://dirk.eddelbuettel.com/blog/2026/03/12#rcppbdt_0.2.8</guid>
	<link>http://dirk.eddelbuettel.com/blog/2026/03/12#rcppbdt_0.2.8</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/dirk.png&quot; width=&quot;65&quot; height=&quot;90&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;Another minor maintenance release for the &lt;a href=&quot;https://dirk.eddelbuettel.com/code/rcpp.bdt.html&quot;&gt;RcppBDT&lt;/a&gt;
package is now on &lt;a href=&quot;https://cran.r-project.org&quot;&gt;CRAN&lt;/a&gt;, and has
been built as a binary for &lt;a href=&quot;https://eddelbuettel.github.io/r2u&quot;&gt;r2u&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The &lt;a href=&quot;https://dirk.eddelbuettel.com/code/rcpp.bdt.html&quot;&gt;RcppBDT&lt;/a&gt;
package is an early adopter of &lt;a href=&quot;https://www.rcpp.org&quot;&gt;Rcpp&lt;/a&gt;
and was one of the first packages utilizing &lt;a href=&quot;https://www.boost.org&quot;&gt;Boost&lt;/a&gt; and its &lt;a href=&quot;https://www.boost.org/doc/libs/release/doc/html/date_time.html&quot;&gt;Date_Time&lt;/a&gt;
library. The now more widely-used package &lt;a href=&quot;https://dirk.eddelbuettel.com/code/anytime.html&quot;&gt;anytime&lt;/a&gt; is a
direct descendant of &lt;a href=&quot;https://dirk.eddelbuettel.com/code/rcpp.bdt.html&quot;&gt;RcppBDT&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;This release is again primarily maintenance. We aid &lt;a href=&quot;https://www.rcpp.org&quot;&gt;Rcpp&lt;/a&gt; in the transition away from calling
&lt;code&gt;Rf_error()&lt;/code&gt; by relying on &lt;code&gt;Rcpp::stop()&lt;/code&gt;, which
has better behaviour when unwinding after errors or exceptions are
encountered. There are no feature or interface changes.&lt;/p&gt;
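&lt;p&gt;As a rough illustration (the function is hypothetical, not taken from the package), the kind of change involved looks like this:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;#include &amp;lt;Rcpp.h&amp;gt;
#include &amp;lt;cmath&amp;gt;

// [[Rcpp::export]]
double checkedLog(double x) {
    if (x &amp;lt;= 0)
        Rcpp::stop(&quot;x must be positive&quot;);  // was: Rf_error(&quot;x must be positive&quot;)
    return std::log(x);
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;&lt;code&gt;Rcpp::stop()&lt;/code&gt; throws a C++ exception that Rcpp converts into an R error at the function boundary, so C++ destructors run during unwinding rather than being skipped by the &lt;code&gt;longjmp&lt;/code&gt; that &lt;code&gt;Rf_error()&lt;/code&gt; performs.&lt;/p&gt;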
&lt;p&gt;The NEWS entry follows:&lt;/p&gt;
&lt;blockquote&gt;
&lt;h4 id=&quot;changes-in-version-0.2.8-2026-03-12&quot;&gt;Changes in version 0.2.8
(2026-03-12)&lt;/h4&gt;
&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Replaced Rf_error with Rcpp::stop in three files&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Maintenance updates to continuous integration&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/blockquote&gt;
&lt;p&gt;Courtesy of my &lt;a href=&quot;https://dirk.eddelbuettel.com/cranberries&quot;&gt;CRANberries&lt;/a&gt;, there
is also a diffstat report for &lt;a href=&quot;https://dirk.eddelbuettel.com/cranberries/2026/03/12#RcppBDT_0.2.8&quot;&gt;this
release&lt;/a&gt;. For questions, suggestions, or issues please use the &lt;a href=&quot;https://github.com/eddelbuettel/rcppbdt/issues&quot;&gt;issue tracker&lt;/a&gt;
at the &lt;a href=&quot;https://github.com/eddelbuettel/rcppbdt&quot;&gt;GitHub
repo&lt;/a&gt;.&lt;/p&gt;
&lt;p style=&quot;font-size: 80%; font-style: italic;&quot;&gt;
This post by &lt;a href=&quot;https://dirk.eddelbuettel.com&quot;&gt;Dirk
Eddelbuettel&lt;/a&gt; originated on his &lt;a href=&quot;https://dirk.eddelbuettel.com/blog&quot;&gt;Thinking inside the box&lt;/a&gt;
blog. If you like this or other open-source work I do, you can now &lt;a href=&quot;https://github.com/sponsors/eddelbuettel&quot;&gt;sponsor me at
GitHub&lt;/a&gt;.
&lt;/p&gt;&lt;p&gt;&lt;/p&gt; </description> 
	<pubDate>Thu, 12 Mar 2026 18:03:00 +0000</pubDate>

</item> 
<item>
	<title>Mike Gabriel: Debian Lomiri Tablets 2025-2027 - Project Report (Q4/2025)</title>
	<guid>https://sunweavers.net/152 at https://sunweavers.net/blog</guid>
	<link>https://sunweavers.net/blog/node/152</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/sunweaver.png&quot; width=&quot;82&quot; height=&quot;82&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;On 25th Oct 2025, I announced via my personal blog and on Mastodon that Fre(i)e Software GmbH was hiring. The hiring process was a mix of asking developers I know and waiting for new people to apply.&lt;/p&gt;

&lt;p&gt;In early to mid November 2025, we started with 13 developers (all part-time) working on various topics around Lomiri (upstream and downstream).

Note that the achievements below don&#39;t document the overall activity in the Lomiri project, only the part that our team at Fre(i)e Software GmbH contributed.&lt;/p&gt;

&lt;h3&gt;Organizational Achievements&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Set up a management board for the Qt6 migration in Lomiri [1]&lt;/li&gt;
&lt;li&gt;Set up a management board for salsa2ubports package syncing [2]&lt;/li&gt;
&lt;li&gt;Bootstrap Qt 6.8 in UBports APT repository&lt;/li&gt;
&lt;li&gt;Bootstrap Qt 6.8 in Lomiri PPA&lt;/li&gt;
&lt;li&gt;Fix Salsa CI for all Lomiri-related Debian packages&lt;/li&gt;
&lt;li&gt;Facilitate a contributor&#39;s project around XDG Desktop Portal support
for Lomiri.&lt;/li&gt;
&lt;li&gt;Plan how to bring DeltaTouch and DeltaChat core to Debian&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;Maintenance Development&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Replace libofono-qt by libqofono in telepathy-ofono&lt;/li&gt;
&lt;li&gt;Rework unit tests in telepathy-ofono utilizing ofono-phonesim&lt;/li&gt;
&lt;li&gt;Obsolete the no-longer-used u1db-qt&lt;/li&gt;
&lt;li&gt;Fix wrong bin:pkg names regarding snapd-glib&#39;s QML module&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;Qt6 Porting&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;qmake -&amp;gt; CMake porting (if needed) and Qt6 porting of shared libraries and QML modules
consumed by Lomiri shell and Lomiri apps:

&lt;ul&gt;
&lt;li&gt;biometryd&lt;/li&gt;
&lt;li&gt;libqofono&lt;/li&gt;
&lt;li&gt;libqofonoext&lt;/li&gt;
&lt;li&gt;libqtdbusmock&lt;/li&gt;
&lt;li&gt;lomiri-account-polld&lt;/li&gt;
&lt;li&gt;lomiri-action-api&lt;/li&gt;
&lt;li&gt;lomiri-api&lt;/li&gt;
&lt;li&gt;lomiri-download-manager&lt;/li&gt;
&lt;li&gt;lomiri-location-service&lt;/li&gt;
&lt;li&gt;lomiri-online-accounts&lt;/li&gt;
&lt;li&gt;lomiri-push-qml&lt;/li&gt;
&lt;li&gt;lomiri-push-service&lt;/li&gt;
&lt;li&gt;maliit-framework&lt;/li&gt;
&lt;li&gt;mediascanner2&lt;/li&gt;
&lt;li&gt;qtlomiri-appmenutheme&lt;/li&gt;
&lt;li&gt;qtpim (started, work in progress)&lt;/li&gt;
&lt;li&gt;qwebdavlib&lt;/li&gt;
&lt;li&gt;signond (flaws spotted in Debian&#39;s porting of signond to Qt6)&lt;/li&gt;
&lt;/ul&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;Feature Development&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Continuing with Morph Browser Qt6 / LUITK

&lt;ul&gt;
&lt;li&gt;Build, run and fix LUITK unit tests for Qt6&lt;/li&gt;
&lt;li&gt;Various bug fixes and improvements for Morph Qt6&lt;/li&gt;
&lt;/ul&gt;&lt;/li&gt;
&lt;li&gt;Add mbim modem support to ofono upstream&lt;/li&gt;
&lt;li&gt;Improve ofono support in Network Manager&lt;/li&gt;
&lt;li&gt;Improve mbim modem support in lomiri-indicator-network&lt;/li&gt;
&lt;li&gt;Package kazv (convergent Matrix client) and dependencies for Debian&lt;/li&gt;
&lt;li&gt;Provide Lomiri images for Mobian&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;Research&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Research on a FUSE-based caching WebDAV client
for lomiri-cloudsync-app.&lt;/li&gt;
&lt;li&gt;Research on an alternative ORM to replace QDjango in libusermetrics&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;[1] &lt;a href=&quot;https://gitlab.com/groups/ubports/development/-/boards/9895029?label_name%5B%5D=Topic%3A%20Qt%206&quot;&gt;https://gitlab.com/groups/ubports/development/-/boards/9895029?label_name%5B%5D=Topic%3A%20Qt%206  &lt;/a&gt;&lt;br /&gt;
[2] &lt;a href=&quot;https://gitlab.com/groups/ubports/development/-/boards/10037876?label_name[]=Topic%3A%20salsa2ubports%20DEB%20syncing&quot;&gt;https://gitlab.com/groups/ubports/development/-/boards/10037876?label_name[]=Topic%3A%20salsa2ubports%20DEB%20syncing&lt;/a&gt;&lt;/p&gt; </description> 
	<pubDate>Thu, 12 Mar 2026 08:59:05 +0000</pubDate>

</item> 
<item>
	<title>Sven Hoexter: RFC 9849 - Encrypted Client Hello</title>
	<guid>http://sven.stormbind.net/blog/posts/misc_rfc9849_ech/</guid>
	<link>http://sven.stormbind.net/blog/posts/misc_rfc9849_ech/</link>
     <description>  &lt;p&gt;Now that ECH is standardized, I started to look into it to understand what&#39;s coming.
While it is generally desirable not to leak the SNI information, I&#39;m not sure it will
ever make it to the masses of (web)servers outside of the big CDNs.&lt;/p&gt;

&lt;p&gt;Besides the extension of the TLS protocol to have an inner and an outer ClientHello,
you also need (frequent) updates to your HTTPS/SVCB DNS records. The idea is to
rotate the key quickly; the OpenSSL API documentation talks about hourly rotation.
This means you have to have encrypted DNS in place (I guess these days DNS-over-HTTPS
is the most common case), and you need to be able to distribute the private key
to all involved hosts and update the DNS records in time.
In addition to that you can also use a &quot;shared mode&quot; where you handle the outer
ClientHello (the one using the public key from DNS) centrally and the inner
ClientHello on your backend servers. I&#39;m not yet sure if that makes it easier or
even harder to get right.&lt;/p&gt;
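&lt;p&gt;As a rough sketch of the client side (assuming a curl build with ECH support; the exact options are described in curl&#39;s ECH documentation), the client fetches the ECHConfig itself via DoH, which shows why encrypted DNS is part of the loop:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# retrieve the HTTPS record (carrying the ECHConfig) via DoH,
# then require ECH for the TLS connection itself
$ curl --doh-url https://one.one.one.one/dns-query \
       --ech true https://cloudflare-ech.com/
&lt;/code&gt;&lt;/pre&gt;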

&lt;p&gt;That all makes sense, and is feasible for setups like those at Cloudflare, where the
common case is that they provide the NS servers for your domain and terminate your
HTTPS connections. But for the average webserver setup I guess we will not see a
huge adoption rate. Or maybe we will soon see something like a Caddy webserver on
steroids, which integrates a DNS server for DoH with not only automatic certificate
renewal built in, but also automatic ECHConfig updates.&lt;/p&gt;

&lt;p&gt;If you want to read up yourself here are my starting points:&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://www.rfc-editor.org/rfc/rfc9849.html&quot;&gt;RFC 9849 TLS Encrypted Client Hello&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://www.rfc-editor.org/rfc/rfc9848&quot;&gt;RFC 9848 Bootstrapping TLS Encrypted ClientHello with DNS Service Bindings&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://www.rfc-editor.org/rfc/rfc9934.html&quot;&gt;RFC 9934 Privacy-Enhanced Mail (PEM) File Format for Encrypted ClientHello (ECH)&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://github.com/openssl/openssl/blob/openssl-4.0/doc/designs/ech-api.md&quot;&gt;OpenSSL 4.0 ECH APIs&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://github.com/curl/curl/blob/master/docs/ECH.md&quot;&gt;curl ECH Support&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://blog.nginx.org/blog/encrypted-client-hello-comes-to-nginx&quot;&gt;nginx ECH Support&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://blog.cloudflare.com/encrypted-client-hello/&quot;&gt;Cloudflare Good-bye ESNI, hello ECH!&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you&#39;re looking for a test endpoint, I see one hosted by Cloudflare:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;$ dig +short IN HTTPS cloudflare-ech.com
1 . alpn=&quot;h3,h2&quot; ipv4hint=104.18.10.118,104.18.11.118 ech=AEX+DQBBFQAgACDBFqmr34YRf/8Ymf+N5ZJCtNkLm3qnjylCCLZc8rUZcwAEAAEAAQASY2xvdWRmbGFyZS1lY2guY29tAAA= ipv6hint=2606:4700::6812:a76,2606:4700::6812:b76
&lt;/code&gt;&lt;/pre&gt; </description> 
	<pubDate>Wed, 11 Mar 2026 15:42:53 +0000</pubDate>

</item> 
<item>
	<title>Dirk Eddelbuettel: RcppDE 0.1.9 on CRAN: Maintenance</title>
	<guid>http://dirk.eddelbuettel.com/blog/2026/03/11#rcppde_0.1.9</guid>
	<link>http://dirk.eddelbuettel.com/blog/2026/03/11#rcppde_0.1.9</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/dirk.png&quot; width=&quot;65&quot; height=&quot;90&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;Another maintenance release of our &lt;a href=&quot;https://github.com/eddelbuettel/rcppde&quot;&gt;RcppDE&lt;/a&gt; package arrived
at &lt;a href=&quot;https://cran.r-project.org&quot;&gt;CRAN&lt;/a&gt;, and has been built for
&lt;a href=&quot;https://eddelbuettel.github.io/r2u&quot;&gt;r2u&lt;/a&gt;. RcppDE is a “port”
of &lt;a href=&quot;https://cloud.r-project.org/package=DEoptim&quot;&gt;DEoptim&lt;/a&gt;, a
package for derivative-free optimisation using differential evolution,
from plain C to C++. By using &lt;a href=&quot;https://github.com/RcppCore/RcppArmadillo&quot;&gt;RcppArmadillo&lt;/a&gt; the
code became a lot shorter and more legible. Our other main contribution
is to leverage some of the excellence we get for free from using &lt;a href=&quot;https://www.rcpp.org&quot;&gt;Rcpp&lt;/a&gt;, in particular the ability to
optimise user-supplied &lt;em&gt;compiled&lt;/em&gt; objective functions which can
make things a lot faster than repeatedly evaluating interpreted
objective functions as &lt;a href=&quot;https://cloud.r-project.org/package=DEoptim&quot;&gt;DEoptim&lt;/a&gt; does (and
which, in fairness, most other optimisers do too). The gains can be
quite substantial.&lt;/p&gt;
&lt;p&gt;This release is again maintenance. We aid &lt;a href=&quot;https://www.rcpp.org&quot;&gt;Rcpp&lt;/a&gt; in the transition away from calling
&lt;code&gt;Rf_error()&lt;/code&gt; by relying on &lt;code&gt;Rcpp::stop()&lt;/code&gt;, which
has better behaviour when unwinding after errors or exceptions are
encountered. We also overhauled the references in the vignette, added an
Armadillo version getter and made the regular updates to continuous
integration.&lt;/p&gt;
&lt;p&gt;Courtesy of my &lt;a href=&quot;https://dirk.eddelbuettel.com/cranberries/&quot;&gt;CRANberries&lt;/a&gt;, there
is also a &lt;a href=&quot;https://dirk.eddelbuettel.com/cranberries/2026/03/11/#RcppDE_0.1.9&quot;&gt;diffstat
report&lt;/a&gt;. More detailed information is on the &lt;a href=&quot;https://dirk.eddelbuettel.com/code/rcpp.de.html&quot;&gt;RcppDE page&lt;/a&gt;,
or the &lt;a href=&quot;https://github.com/eddelbuettel/rcppde&quot;&gt;repository&lt;/a&gt;.&lt;/p&gt;
&lt;p style=&quot;font-size: 80%; font-style: italic;&quot;&gt;
This post by &lt;a href=&quot;https://dirk.eddelbuettel.com&quot;&gt;Dirk
Eddelbuettel&lt;/a&gt; originated on his &lt;a href=&quot;https://dirk.eddelbuettel.com/blog/&quot;&gt;Thinking inside the box&lt;/a&gt;
blog. If you like this or other open-source work I do, you can &lt;a href=&quot;https://github.com/sponsors/eddelbuettel&quot;&gt;sponsor me at
GitHub&lt;/a&gt;.
&lt;/p&gt;&lt;p&gt;&lt;/p&gt; </description> 
	<pubDate>Wed, 11 Mar 2026 14:24:00 +0000</pubDate>

</item> 
<item>
	<title>Bits from Debian: Infomaniak Platinum Sponsor of DebConf26</title>
	<guid>tag:bits.debian.org,2026-03-11:/2026/03/infomaniak-platinum-debconf26.html</guid>
	<link>https://bits.debian.org/2026/03/infomaniak-platinum-debconf26.html</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/dwn.png&quot; width=&quot;77&quot; height=&quot;85&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;&lt;a href=&quot;https://infomaniak.com&quot;&gt;&lt;img alt=&quot;infomaniak-logo&quot; src=&quot;https://bits.debian.org/images/infomaniak-ethical-cloud.png&quot; /&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;We are pleased to announce that &lt;strong&gt;&lt;a href=&quot;https://infomaniak.com&quot;&gt;Infomaniak&lt;/a&gt;&lt;/strong&gt; has
committed to sponsor &lt;a href=&quot;https://debconf26.debconf.org/&quot;&gt;DebConf26&lt;/a&gt; as a &lt;strong&gt;Platinum
Sponsor&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;Infomaniak is an independent, employee-owned Swiss technology company that
designs, develops, and operates its own cloud infrastructure and digital
services entirely in Switzerland. With over 300 employees — more than 70%
engineers and developers — the company reinvests all profits into R&amp;amp;D. Its
public cloud is built on OpenStack, with managed Kubernetes, Database as a
Service, object storage, and sovereign AI services accessible via
OpenAI-compatible APIs, all running on its own Swiss infrastructure. Infomaniak also
develops a sovereign collaborative suite — messaging, email, storage, online
office tools, videoconferencing, and a built-in AI assistant — developed in-house
as a privacy-respecting alternative to proprietary platforms. Open
source is central to how Infomaniak operates. Its latest data center (D4) runs
on 100% renewable energy and uses no traditional cooling: all the heat
generated by its servers is captured and fed into Geneva&#39;s district heating
network, supplying up to 6,000 homes in winter and hot water year-round. The
entire project has been documented and open-sourced at &lt;a href=&quot;https://d4project.org/&quot;&gt;d4project.org&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;With this commitment as Platinum Sponsor, Infomaniak is contributing to the
annual Debian Developers&#39; conference, directly supporting the progress of
Debian and Free Software. Infomaniak helps strengthen the community
that collaborates on Debian projects from all around the world throughout
the year.&lt;/p&gt;
&lt;p&gt;Thank you very much, Infomaniak, for your support of DebConf26!&lt;/p&gt;
&lt;h2&gt;Become a sponsor too!&lt;/h2&gt;
&lt;p&gt;&lt;a href=&quot;https://debconf26.debconf.org/&quot;&gt;DebConf26&lt;/a&gt; will take place &lt;strong&gt;from 20th to 25th
July 2026 in Santa Fe, Argentina,&lt;/strong&gt; and will be preceded by DebCamp, from 13th
to 19th July 2026.&lt;/p&gt;
&lt;p&gt;DebConf26 is accepting sponsors! Interested companies and organizations may
contact the DebConf team through
&lt;a href=&quot;mailto:sponsors@debconf.org&quot;&gt;sponsors@debconf.org&lt;/a&gt;, and visit the DebConf26
website at
&lt;a href=&quot;https://debconf26.debconf.org/sponsors/become-a-sponsor/&quot;&gt;https://debconf26.debconf.org/sponsors/become-a-sponsor/&lt;/a&gt;.&lt;/p&gt; </description> 
	<pubDate>Wed, 11 Mar 2026 00:12:00 +0000</pubDate>

</item> 
<item>
	<title>Freexian Collaborators: Debian Contributions: Opening DebConf 26 Registration, Debian CI improvements and more! (by Anupa Ann Joseph)</title>
	<guid>https://www.freexian.com/blog/debian-contributions-02-2026/</guid>
	<link>https://www.freexian.com/blog/debian-contributions-02-2026/</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/freexian.png&quot; width=&quot;215&quot; height=&quot;101&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;h1 id=&quot;debian-contributions-2026-02&quot;&gt;Debian Contributions: 2026-02&lt;/h1&gt;
&lt;p&gt;&lt;a href=&quot;https://www.freexian.com/about/debian-contributions/&quot;&gt;Contributing to Debian&lt;/a&gt;
is part of &lt;a href=&quot;https://www.freexian.com/about/&quot;&gt;Freexian’s mission&lt;/a&gt;. This article
covers the latest achievements of Freexian and their collaborators. All of this
is made possible by organizations subscribing to our
&lt;a href=&quot;https://www.freexian.com/lts/&quot;&gt;Long Term Support contracts&lt;/a&gt; and
&lt;a href=&quot;https://www.freexian.com/services/&quot;&gt;consulting services&lt;/a&gt;.&lt;/p&gt;
&lt;h2 id=&quot;debconf-26-registration-by-stefano-rivera-antonio-terceiro-and-santiago-ruano-rincón&quot;&gt;DebConf 26 Registration, by Stefano Rivera, Antonio Terceiro, and Santiago Ruano Rincón&lt;/h2&gt;
&lt;p&gt;&lt;a href=&quot;https://debconf26.debconf.org/&quot;&gt;DebConf 26&lt;/a&gt;, to be held in Santa Fe, Argentina,
in July, has &lt;a href=&quot;https://debconf26.debconf.org/news/2026-02-13-dc26-registration-cfp-open/&quot;&gt;opened for registration and event proposals&lt;/a&gt;.
Stefano, Antonio, and Santiago all contributed to making this happen.&lt;/p&gt;
&lt;p&gt;As always, some changes needed to be made to the registration system. Bigger
changes were planned, but we ran out of time to implement them for DebConf 26.
All three of us have hosted local DebConf events in the past and
have been advising the DebConf 26 local team.&lt;/p&gt;
&lt;h2 id=&quot;debian-ci-improvements-by-antonio-terceiro&quot;&gt;Debian CI improvements, by Antonio Terceiro&lt;/h2&gt;
&lt;p&gt;&lt;a href=&quot;https://ci.debian.net/&quot;&gt;Debian CI&lt;/a&gt; is the platform responsible for automated
testing of packages from the Debian archive, and its results are used by the
Debian Release team automation as Quality Assurance to control the migration of
packages from Debian unstable into testing, the base for the next Debian release.
Antonio &lt;a href=&quot;https://salsa.debian.org/ci-team/debci/-/merge_requests/305&quot;&gt;started developing an incus backend&lt;/a&gt;,
and that prompted &lt;a href=&quot;https://salsa.debian.org/ci-team/debci/-/merge_requests/303&quot;&gt;two&lt;/a&gt;
&lt;a href=&quot;https://salsa.debian.org/ci-team/debci/-/merge_requests/304&quot;&gt;rounds&lt;/a&gt; of
improvements to the platform, including but not limited to allowing users to
select a job execution backend (lxc, qemu) during job submission, reducing
the part of testbed image creation that requires superuser privileges, and
other refactorings and bug fixes. The platform API was also improved to
&lt;a href=&quot;https://salsa.debian.org/ci-team/debci/-/merge_requests/306&quot;&gt;reduce disruption when reporting results&lt;/a&gt;
to the Release Team automation after service downtimes. Last, but not least, the
platform now has &lt;a href=&quot;https://salsa.debian.org/ci-team/debci/-/merge_requests/307&quot;&gt;support for testing packages against variants of autopkgtest&lt;/a&gt;,
which will allow the Debian CI team to test new versions of autopkgtest before
making releases to avoid widespread regressions.&lt;/p&gt;
&lt;h2 id=&quot;miscellaneous-contributions&quot;&gt;Miscellaneous contributions&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Carles improved &lt;a href=&quot;https://salsa.debian.org/carlespina/po-debconf-manager&quot;&gt;po-debconf-manager&lt;/a&gt;
as users requested features and found bugs. Improvements include: adding packages
from “unstable” instead of just &lt;a href=&quot;https://salsa.debian.org&quot;&gt;salsa.debian.org&lt;/a&gt;,
upgrading and merging templates of upgraded packages, finishing the typing
annotations, improving package deletion, supporting multi-line texts, and adding
&lt;code&gt;--debug&lt;/code&gt; to see the “subprocess.run” commands.&lt;/li&gt;
&lt;li&gt;Carles, using po-debconf-manager, reviewed 7 Catalan translations and sent
bug reports or MRs for 11 packages. Also reviewed the translations of
&lt;code&gt;fortunes-debian-hints&lt;/code&gt; and submitted possible changes in the hints.&lt;/li&gt;
&lt;li&gt;Carles submitted MRs for reportbug (&lt;code&gt;reportbug --ui gtk&lt;/code&gt;
&lt;a href=&quot;https://salsa.debian.org/reportbug-team/reportbug/-/merge_requests/104&quot;&gt;detecting the wrong dependencies&lt;/a&gt;),
devscripts (delete &lt;a href=&quot;https://salsa.debian.org/debian/devscripts/-/merge_requests/626&quot;&gt;unused code from debrebuild&lt;/a&gt;
and &lt;a href=&quot;https://salsa.debian.org/debian/devscripts/-/merge_requests/629&quot;&gt;add recommended dependency&lt;/a&gt;),
&lt;code&gt;wcurl&lt;/code&gt; (&lt;a href=&quot;https://github.com/curl/wcurl/pull/87&quot;&gt;format &lt;code&gt;--help&lt;/code&gt;&lt;/a&gt; for 80 columns).
Carles submitted a &lt;a href=&quot;https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1127908&quot;&gt;bug report for apt&lt;/a&gt;
not showing the long descriptions of packages.&lt;/li&gt;
&lt;li&gt;Carles resumed the effort of checking relations (e.g. Recommends / Suggests)
between Debian packages. A new &lt;a href=&quot;https://salsa.debian.org/carlespina/check-relations&quot;&gt;codebase&lt;/a&gt;
(still in its early stages) was started, taking a new approach to detect,
report and track broken relations.&lt;/li&gt;
&lt;li&gt;Emilio drove several transitions, most notably the haskell transition and the
&lt;code&gt;glibc&lt;/code&gt;/&lt;code&gt;gcc-15&lt;/code&gt;/&lt;code&gt;zlib&lt;/code&gt; transition for the s390 31-bit removal. This last one
included reviewing and requeueing lots of autopkgtests due to britney losing a
lot of results.&lt;/li&gt;
&lt;li&gt;Emilio reviewed and uploaded &lt;code&gt;poppler&lt;/code&gt; updates to experimental for a new transition.&lt;/li&gt;
&lt;li&gt;Emilio reviewed, merged and deployed some performance improvements proposed
for the security-tracker.&lt;/li&gt;
&lt;li&gt;Stefano prepared routine updates for &lt;code&gt;pycparser&lt;/code&gt;, &lt;code&gt;python-confuse&lt;/code&gt;,
&lt;code&gt;python-cffi&lt;/code&gt;, &lt;code&gt;python-mitogen&lt;/code&gt;, &lt;code&gt;python-pip&lt;/code&gt;, &lt;code&gt;wheel&lt;/code&gt;, &lt;code&gt;platformdirs&lt;/code&gt;,
&lt;code&gt;python-authlib&lt;/code&gt;, and &lt;code&gt;python-virtualenv&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Stefano updated Python 3.13 and 3.14 to the latest point releases, including
security updates, and did some preliminary work for Python 3.15.&lt;/li&gt;
&lt;li&gt;Stefano reviewed changes to &lt;code&gt;dh-python&lt;/code&gt; and merged MRs.&lt;/li&gt;
&lt;li&gt;Stefano did some debian.social sysadmin work, bridging additional IRC channels
to Matrix.&lt;/li&gt;
&lt;li&gt;Stefano and Antonio, as DebConf Committee Members, reviewed the DebConf 27
bids and took part in selecting &lt;a href=&quot;https://wiki.debian.org/DebConf/27/Bids/Japan&quot;&gt;the Japanese bid&lt;/a&gt;
to host DebConf 27.&lt;/li&gt;
&lt;li&gt;Helmut sent patches for 29 cross build failures.&lt;/li&gt;
&lt;li&gt;Helmut continued to maintain rebootstrap addressing issues relating to
specific architectures (such as &lt;code&gt;musl-linux&lt;/code&gt;-&lt;code&gt;any&lt;/code&gt;, &lt;code&gt;hurd-any&lt;/code&gt; or &lt;code&gt;s390x&lt;/code&gt;)
or specific packages (such as &lt;code&gt;binutils&lt;/code&gt;, &lt;code&gt;brotli&lt;/code&gt; or &lt;code&gt;fontconfig&lt;/code&gt;).&lt;/li&gt;
&lt;li&gt;Helmut worked on diagnosing bugs such as &lt;code&gt;rocblas&lt;/code&gt; &lt;a href=&quot;https://bugs.debian.org/1126608&quot;&gt;#1126608&lt;/a&gt;,
&lt;code&gt;python-memray&lt;/code&gt; &lt;a href=&quot;https://bugs.debian.org/1126944&quot;&gt;#1126944&lt;/a&gt;
&lt;a href=&quot;https://github.com/bloomberg/memray/issues/863#issuecomment-3974098020&quot;&gt;upstream&lt;/a&gt;
and &lt;code&gt;greetd&lt;/code&gt; &lt;a href=&quot;https://bugs.debian.org/1129070&quot;&gt;#1129070&lt;/a&gt; with varying success.&lt;/li&gt;
&lt;li&gt;Antonio provided support for multiple MiniDebConfs whose websites run
wafer + wafer-debconf (the same stack as DebConf itself).&lt;/li&gt;
&lt;li&gt;Antonio &lt;a href=&quot;https://salsa.debian.org/salsa/salsa-webhook/-/commit/4834a201d263cb99006e6d25c3f7af1014eeb256&quot;&gt;fixed the salsa tagpending webhook&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Antonio &lt;a href=&quot;https://github.com/mizzy/specinfra/pull/763&quot;&gt;sent specinfra upstream a patch&lt;/a&gt;
to fix detection of Debian systems in some situations.&lt;/li&gt;
&lt;li&gt;Santiago reviewed some Merge Requests for the Salsa CI pipeline, including
&lt;a href=&quot;https://salsa.debian.org/salsa-ci-team/pipeline/-/merge_requests/703&quot;&gt;!703&lt;/a&gt;
and &lt;a href=&quot;https://salsa.debian.org/salsa-ci-team/pipeline/-/merge_requests/704&quot;&gt;!704&lt;/a&gt;,
that aim to improve how the &lt;code&gt;build source&lt;/code&gt; job is handled by Salsa CI. Thanks a
lot to Jochen for his work on this.&lt;/li&gt;
&lt;li&gt;In collaboration with Emmanuel Arias, Santiago proposed a couple of projects
for the Google Summer of Code (GSoC) 2026 round. Santiago has been reviewing
applications and giving feedback to candidates.&lt;/li&gt;
&lt;li&gt;Thorsten uploaded new upstream versions of &lt;code&gt;ipp-usb&lt;/code&gt;, &lt;code&gt;brlaser&lt;/code&gt; and &lt;code&gt;gutenprint&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Raphaël updated &lt;a href=&quot;https://tracker.debian.org/news/1719747/accepted-publican-432-7-source-into-unstable/&quot;&gt;publican&lt;/a&gt;
to fix an old bug that became release critical and that happened only when
building with the nocheck profile. Publican is a build dependency of the Debian
Administrator’s Handbook and, with that fix, the package is back in testing.&lt;/li&gt;
&lt;li&gt;Raphaël implemented a &lt;a href=&quot;https://salsa.debian.org/freexian-team/debusine/-/merge_requests/2755&quot;&gt;small feature&lt;/a&gt;
in Debusine that makes it possible to refer to a collection in a parent
workspace even if a collection with the same name is present in the current
workspace.&lt;/li&gt;
&lt;li&gt;Lucas updated the status of Ruby packages affecting the Ruby 3.4
transition after a bunch of updates made by team members. He will follow up on
this next month.&lt;/li&gt;
&lt;li&gt;Lucas joined the Debian orga team for GSoC this year and tried to reach out
to potential mentors.&lt;/li&gt;
&lt;li&gt;Lucas did some content work for MiniDebConf Campinas - Brazil.&lt;/li&gt;
&lt;li&gt;Colin published minor security updates to “bookworm” and “trixie” for
&lt;a href=&quot;https://bugs.debian.org/1117529&quot;&gt;CVE-2025-61984&lt;/a&gt; and &lt;a href=&quot;https://bugs.debian.org/1117530&quot;&gt;CVE-2025-61985&lt;/a&gt;
in &lt;code&gt;OpenSSH&lt;/code&gt;, both of which allowed code execution via &lt;code&gt;ProxyCommand&lt;/code&gt; in some
cases.  The “trixie” update also included a fix for
&lt;a href=&quot;https://bugs.debian.org/1080350&quot;&gt;mishandling of PerSourceMaxStartups&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Colin spotted and fixed a typo in the bug tracking system’s spam-handling rules,
which in combination with a &lt;a href=&quot;https://bugs.debian.org/1126848&quot;&gt;devscripts regression&lt;/a&gt;
caused &lt;code&gt;bts forwarded&lt;/code&gt; commands to be discarded.&lt;/li&gt;
&lt;li&gt;Colin ported 12 more Python packages away from using the deprecated (and now
removed upstream) &lt;code&gt;pkg_resources&lt;/code&gt; module.&lt;/li&gt;
&lt;li&gt;Anupa is co-organizing &lt;a href=&quot;https://kanpur2026.mini.debconf.org/&quot;&gt;MiniDebConf Kanpur&lt;/a&gt;
with the Debian India team. Anupa was responsible for preparing the schedule,
publishing it on the website, and coordinating with the fiscal host, in addition
to attending meetings.&lt;/li&gt;
&lt;li&gt;Anupa attended the Debian Publicity team online sprint, which was a
skill-sharing session.&lt;/li&gt;
&lt;/ul&gt; </description> 
	<pubDate>Tue, 10 Mar 2026 00:00:00 +0000</pubDate>

</item> 
<item>
	<title>Isoken Ibizugbe: Starting Out in Outreachy</title>
	<guid>http://isokenibizugbe.wordpress.com/?p=26</guid>
	<link>https://isokenibizugbe.wordpress.com/2026/03/09/starting-in-outreachy/</link>
     <description>  &lt;p class=&quot;wp-block-paragraph&quot;&gt;So you want to join Outreachy but you don’t understand it, you’re scared, or you don’t know what open source is about.&lt;/p&gt;



&lt;h2 class=&quot;wp-block-heading has-medium-font-size&quot;&gt;What is FOSS anyway? &lt;/h2&gt;



&lt;p class=&quot;wp-block-paragraph&quot;&gt;Free and Open Source Software (FOSS) refers to software that anyone can use, modify, and share freely. Think of it as a community garden; instead of one company owning the “food,” people from all over the world contribute, improve, and maintain it so everyone can benefit for free. You can read more &lt;a href=&quot;https://isokenibizugbe.wordpress.com/2026/01/05/thinking-about-my-audience/#:~:text=To%20the%20Aspiring%20Contributors&quot;&gt;here&lt;/a&gt; on what it means to contribute to open source.&lt;/p&gt;



&lt;p class=&quot;wp-block-paragraph&quot;&gt;Outreachy provides paid internships to anyone from any background who faces underrepresentation, systemic bias, or discrimination in the technical industry where they live. Their goal is to increase diversity in open source. Read their &lt;a href=&quot;https://www.outreachy.org/&quot;&gt;website&lt;/a&gt; for more. I spent a good amount of time reading all the guides listed, including the applicant guide and the &lt;a href=&quot;https://www.outreachy.org/apply/&quot;&gt;how-to-apply&lt;/a&gt; guide. &lt;/p&gt;



&lt;h2 class=&quot;wp-block-heading has-medium-font-size&quot;&gt;The “Secret” to Applying (Spoiler: It’s not a secret) &lt;/h2&gt;



&lt;p class=&quot;wp-block-paragraph&quot;&gt;I know newcomers are scared or unsure and would prefer answers from previous participants, but the Outreachy website is actually a goldmine: almost every question you have is already answered there if you look closely. I used to hate reading documentation, but I’ve learned to love it. Documentation is the “Source of Truth.”&lt;/p&gt;



&lt;ul class=&quot;wp-block-list&quot;&gt;
&lt;li&gt;My Advice: Read every single guide on their site. The applicant guide is your roadmap. Embracing documentation now will make you a much better contributor later.&lt;/li&gt;
&lt;/ul&gt;



&lt;h2 class=&quot;wp-block-heading has-medium-font-size&quot;&gt;The AI Trap: Be Yourself&lt;/h2&gt;



&lt;p class=&quot;wp-block-paragraph&quot;&gt;Now for the part most newcomers ask about: the initial essay. I know it’s tempting to use AI, but I really encourage you to skip it for this. Your own story is much more powerful than a generated one. Outreachy and its mentoring organizations value your unique story, and they are strongly against fabricated or AI-exaggerated essays.&lt;/p&gt;



&lt;p class=&quot;wp-block-paragraph&quot;&gt;For example, when I contributed to Debian using openQA, the information wasn’t well established on the web. When I tried to use AI, it suggested imaginary ideas. The project maintainers had a particular style of contributing, so I had to follow the instructions carefully, observe the codebase, and read the provided documentation. With that information, I always wrote a solution first before consulting AI, and mine was always better. AI can only be as intelligent as the context you give it; if it doesn’t have your answer, it will reach for the most similar-looking solution (that is, hallucinate). We do not want to increase the burden on reviewers: their time is important because they are volunteers, too. This is crucial when you qualify for the contribution phase.&lt;/p&gt;



&lt;h2 class=&quot;wp-block-heading has-medium-font-size&quot;&gt;The Application Process&lt;/h2&gt;



&lt;p class=&quot;wp-block-paragraph&quot;&gt;There are two main stages:&lt;/p&gt;



&lt;ul class=&quot;wp-block-list&quot;&gt;
&lt;li&gt;The initial application: Here you fill in basic details, time availability, and essay questions (you can find these on the Outreachy website).&lt;/li&gt;



&lt;li&gt;The contribution phase: This is where you show you have the skills to work on the projects. Every project will list the skills needed and the level of proficiency.&lt;/li&gt;
&lt;/ul&gt;



&lt;h3 class=&quot;wp-block-heading has-small-font-size&quot;&gt;When you qualify for the contribution phase:&lt;/h3&gt;



&lt;ul class=&quot;wp-block-list&quot;&gt;
&lt;li&gt;A lot of people will try to create buzz or even panic; you just have to focus. Once you’ve gotten the hang of the project, remember to help others along the way.&lt;/li&gt;



&lt;li&gt;You can start contributions with spelling corrections, move to medium tasks (do multiple of these), then a hard task if possible. You don’t need to be a guru on day one.&lt;/li&gt;



&lt;li&gt;It’s all about community building. Do your part to help others understand the project too; this is also a form of contribution.&lt;/li&gt;



&lt;li&gt;Lastly, every project mentor has a way of evaluating candidates. My summary is: be confident, demonstrate your skills, and learn where you are lacking. Start small and work your way up, you don’t have to prove yourself as a guru.&lt;/li&gt;
&lt;/ul&gt;



&lt;h3 class=&quot;wp-block-heading&quot;&gt;Tips&lt;/h3&gt;



&lt;ul class=&quot;wp-block-list&quot;&gt;
&lt;li&gt;Watch this: This step-by-step &lt;a href=&quot;https://youtu.be/lRegecT11k0?si=Q8wM_9PROinxjGCt&quot;&gt;video&lt;/a&gt; is a great walkthrough of the initial application process.&lt;/li&gt;



&lt;li&gt;Sign up for the email list to get updates: &lt;a href=&quot;https://lists.outreachy.org/cgi-bin/mailman/listinfo/announce&quot;&gt;https://lists.outreachy.org/cgi-bin/mailman/listinfo/announce&lt;/a&gt;&lt;/li&gt;



&lt;li&gt;Be fast: Complete your initial application in the first 3 days, as there are a lot of applicants.&lt;/li&gt;



&lt;li&gt;Back it up: In your essay about systemic bias, include some statistics to back it up.&lt;/li&gt;



&lt;li&gt;Learn Git: Even if you don’t have programming skills, contributions are pushed to GitHub or GitLab. Practice some commands and contribute to a “first open issue” to understand the flow: &lt;a href=&quot;https://github.com/firstcontributions/first-contributions&quot;&gt;https://github.com/firstcontributions/first-contributions&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;



&lt;p class=&quot;wp-block-paragraph&quot;&gt;The most important tip? Apply anyway. Even if you feel underqualified, the process itself is a massive learning experience.&lt;/p&gt; </description> 
	<pubDate>Mon, 09 Mar 2026 21:10:25 +0000</pubDate>

</item> 
<item>
	<title>Dirk Eddelbuettel: nanotime 0.3.13 on CRAN: Maintenance</title>
	<guid>http://dirk.eddelbuettel.com/blog/2026/03/09#nanotime_0.3.13</guid>
	<link>http://dirk.eddelbuettel.com/blog/2026/03/09#nanotime_0.3.13</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/dirk.png&quot; width=&quot;65&quot; height=&quot;90&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;Another minor update 0.3.13 for our &lt;a href=&quot;https://dirk.eddelbuettel.com/code/nanotime.html&quot;&gt;nanotime&lt;/a&gt;
package is now on &lt;a href=&quot;https://cran.r-project.org&quot;&gt;CRAN&lt;/a&gt;, and has
been uploaded to &lt;a href=&quot;https://www.debian.org&quot;&gt;Debian&lt;/a&gt; and
compiled for &lt;a href=&quot;https://eddelbuettel.github.io/r2u&quot;&gt;r2u&lt;/a&gt;. &lt;a href=&quot;https://dirk.eddelbuettel.com/code/nanotime.html&quot;&gt;nanotime&lt;/a&gt;
relies on the &lt;a href=&quot;https://dirk.eddelbuettel.com/code/rcpp.cctz.html&quot;&gt;RcppCCTZ&lt;/a&gt;
package (as well as the &lt;a href=&quot;https://dirk.eddelbuettel.com/code/rcpp.date.html&quot;&gt;RcppDate&lt;/a&gt;
package for additional C++ operations) and offers efficient high(er)
resolution time parsing and formatting up to nanosecond resolution,
using the &lt;a href=&quot;https://cran.r-project.org/package=bit64&quot;&gt;bit64&lt;/a&gt;
package for the actual &lt;code&gt;integer64&lt;/code&gt; arithmetic. Initially
implemented using the S3 system, it has benefitted greatly from a
rigorous refactoring by &lt;a href=&quot;https://github.com/lsilvest&quot;&gt;Leonardo&lt;/a&gt; who not only rejigged
&lt;code&gt;nanotime&lt;/code&gt; internals in S4 but also added new S4 types for
&lt;em&gt;periods&lt;/em&gt;, &lt;em&gt;intervals&lt;/em&gt; and &lt;em&gt;durations&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;This release, the first in eleven months, rounds out a few internal
corners and helps &lt;a href=&quot;https://www.rcpp.org&quot;&gt;Rcpp&lt;/a&gt; with the
transition away from &lt;code&gt;Rf_error&lt;/code&gt; to only using
&lt;code&gt;Rcpp::stop&lt;/code&gt; which deals more gracefully with error
conditions and unwinding. We also updated how the vignette is made and its
references, updated the continuous integration as one does, altered how
the documentation site is built, gladly took a PR from &lt;a href=&quot;https://github.com/MichaelChirico&quot;&gt;Michael&lt;/a&gt; polishing another
small aspect, and tweaked how the compilation standard is set.&lt;/p&gt;
&lt;p&gt;The NEWS snippet below has the fuller details.&lt;/p&gt;
&lt;blockquote&gt;
&lt;h4 id=&quot;changes-in-version-0.3.13-2026-03-08&quot;&gt;Changes in version 0.3.13
(2026-03-08)&lt;/h4&gt;
&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The &lt;code&gt;methods&lt;/code&gt; package is now a Depends as WRE
recommends (Michael Chirico in &lt;a href=&quot;https://github.com/eddelbuettel/nanotime/pull/141&quot;&gt;#141&lt;/a&gt; based
on a suggestion by Dirk in &lt;a href=&quot;https://github.com/eddelbuettel/nanotime/issues/140&quot;&gt;#140&lt;/a&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The mkdocs-material documentation site is now generated via &lt;span class=&quot;pkg&quot;&gt;altdoc&lt;/span&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Continuous Integration scripts have been updated&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Replace &lt;code&gt;Rf_error&lt;/code&gt; with &lt;code&gt;Rcpp::stop&lt;/code&gt;, turn
remaining one into &lt;code&gt;(Rf_error)&lt;/code&gt; (Dirk in &lt;a href=&quot;https://github.com/eddelbuettel/nanotime/pull/143&quot;&gt;#143&lt;/a&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Vignette now uses the &lt;code&gt;Rcpp::asis&lt;/code&gt; builder for
pre-made pdfs (Dirk in &lt;a href=&quot;https://github.com/eddelbuettel/nanotime/pull/146&quot;&gt;#146&lt;/a&gt; fixing
&lt;a href=&quot;https://github.com/eddelbuettel/nanotime/issues/144&quot;&gt;#144&lt;/a&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The C++ compilation standard is explicitly set to C++17 if an R
version older than 4.3.0 is used (Dirk in &lt;a href=&quot;https://github.com/eddelbuettel/nanotime/pull/148&quot;&gt;#148&lt;/a&gt; fixing
&lt;a href=&quot;https://github.com/eddelbuettel/nanotime/issues/147&quot;&gt;#147&lt;/a&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The vignette references have been updated&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/blockquote&gt;
&lt;p&gt;Thanks to my &lt;a href=&quot;https://dirk.eddelbuettel.com/cranberries/&quot;&gt;CRANberries&lt;/a&gt;, there
is a diffstat report for &lt;a href=&quot;https://dirk.eddelbuettel.com/cranberries/2026/03/11#nanotime_0.3.13&quot;&gt;this
release&lt;/a&gt;. More details and examples are at the &lt;a href=&quot;https://dirk.eddelbuettel.com/code/nanotime.html&quot;&gt;nanotime
page&lt;/a&gt;; code, issue tickets etc at the &lt;a href=&quot;https://github.com/eddelbuettel/nanotime&quot;&gt;GitHub repository&lt;/a&gt; –
and all documentation is provided at the &lt;a href=&quot;https://eddelbuettel.github.io/nanotime/&quot;&gt;nanotime documentation
site&lt;/a&gt;.&lt;/p&gt;
&lt;p style=&quot;font-size: 80%; font-style: italic;&quot;&gt;
This post by &lt;a href=&quot;https://dirk.eddelbuettel.com&quot;&gt;Dirk
Eddelbuettel&lt;/a&gt; originated on his &lt;a href=&quot;https://dirk.eddelbuettel.com/blog/&quot;&gt;Thinking inside the box&lt;/a&gt;
blog. If you like this or other open-source work I do, you can now &lt;a href=&quot;https://github.com/sponsors/eddelbuettel&quot;&gt;sponsor me at
GitHub&lt;/a&gt;.
&lt;/p&gt;&lt;p&gt;&lt;/p&gt; </description> 
	<pubDate>Mon, 09 Mar 2026 16:38:00 +0000</pubDate>

</item> 
<item>
	<title>Colin Watson: Free software activity in February 2026</title>
	<guid>tag:www.chiark.greenend.org.uk,2026-03-09:/~cjwatson/blog/activity-2026-02.html</guid>
	<link>https://www.chiark.greenend.org.uk/~cjwatson/blog/activity-2026-02.html</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/cjwatson.png&quot; width=&quot;70&quot; height=&quot;82&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;My Debian contributions this month were all &lt;a href=&quot;https://www.freexian.com/about/debian-contributions/&quot;&gt;sponsored&lt;/a&gt; by Freexian.&lt;/p&gt;
&lt;p&gt;You can also support my work directly via &lt;a href=&quot;https://liberapay.com/cjwatson&quot;&gt;Liberapay&lt;/a&gt; or &lt;a href=&quot;https://github.com/sponsors/cjwatson&quot;&gt;GitHub Sponsors&lt;/a&gt;.&lt;/p&gt;
&lt;h2&gt;OpenSSH&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1122765&quot;&gt;openssh: Please remove/replace usage of dh_movetousr&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;I released bookworm and trixie fixes for &lt;a href=&quot;https://bugs.debian.org/1117529&quot;&gt;&lt;span class=&quot;caps&quot;&gt;CVE&lt;/span&gt;-2025-61984&lt;/a&gt; and &lt;a href=&quot;https://bugs.debian.org/1117530&quot;&gt;&lt;span class=&quot;caps&quot;&gt;CVE&lt;/span&gt;-2025-61985&lt;/a&gt;, both allowing code execution via &lt;code&gt;ProxyCommand&lt;/code&gt; in some cases.  The trixie update also included a fix for &lt;a href=&quot;https://bugs.debian.org/1080350&quot;&gt;openssh-server: refuses further connections after having handled PerSourceMaxStartups connections&lt;/a&gt;.&lt;/p&gt;
&lt;h2&gt;bugs.debian.org administration&lt;/h2&gt;
&lt;p&gt;Gioele Barabucci &lt;a href=&quot;https://bugs.debian.org/1126848&quot;&gt;reported&lt;/a&gt; that some messages to the bug tracking system generated by the &lt;code&gt;bts&lt;/code&gt; command were being discarded.  While the regression here was on the client side, I found and fixed a typo in our SpamAssassin configuration that was failing to apply a bonus specifically to &lt;code&gt;forwarded&lt;/code&gt; commands, mitigating the problem.&lt;/p&gt;
&lt;h2&gt;Python packaging&lt;/h2&gt;
&lt;p&gt;New upstream versions:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;aiosmtplib&lt;/li&gt;
&lt;li&gt;bitstruct&lt;/li&gt;
&lt;li&gt;diff-cover&lt;/li&gt;
&lt;li&gt;django-q&lt;/li&gt;
&lt;li&gt;isort&lt;/li&gt;
&lt;li&gt;multipart&lt;/li&gt;
&lt;li&gt;poetry (&lt;a href=&quot;https://bugs.debian.org/1127062&quot;&gt;adding support for Dulwich &amp;gt;= 0.25&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;poetry-core&lt;/li&gt;
&lt;li&gt;pydantic-settings&lt;/li&gt;
&lt;li&gt;python-build&lt;/li&gt;
&lt;li&gt;python-certifi&lt;/li&gt;
&lt;li&gt;python-datamodel-code-generator&lt;/li&gt;
&lt;li&gt;python-flatdict&lt;/li&gt;
&lt;li&gt;python-holidays&lt;/li&gt;
&lt;li&gt;python-maggma&lt;/li&gt;
&lt;li&gt;python-pytokens&lt;/li&gt;
&lt;li&gt;python-scruffy&lt;/li&gt;
&lt;li&gt;python-urllib3 (fixing &lt;a href=&quot;https://bugs.debian.org/1122029&quot;&gt;&lt;span class=&quot;caps&quot;&gt;CVE&lt;/span&gt;-2025-66471&lt;/a&gt; and a &lt;a href=&quot;https://bugs.debian.org/1122743&quot;&gt;chunked decoding bug&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;responses&lt;/li&gt;
&lt;li&gt;yarsync&lt;/li&gt;
&lt;li&gt;zope.component&lt;/li&gt;
&lt;li&gt;zope.deferredimport&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Porting away from the &lt;a href=&quot;https://github.com/pypa/setuptools/issues/3085&quot;&gt;deprecated&lt;/a&gt; (and now removed from upstream setuptools) &lt;code&gt;pkg_resources&lt;/code&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1083413&quot;&gt;genshi&lt;/a&gt; (&lt;a href=&quot;https://github.com/edgewall/genshi/pull/97&quot;&gt;contributed upstream&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1083414&quot;&gt;germinate&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1083483&quot;&gt;mopidy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1083509&quot;&gt;nose2&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1083553&quot;&gt;pokrok&lt;/a&gt; (&lt;a href=&quot;https://github.com/jdidion/pokrok/pull/5&quot;&gt;contributed upstream&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1125854&quot;&gt;pylama&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1125850&quot;&gt;python-flask-seeder&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1125819&quot;&gt;python-maggma&lt;/a&gt; (&lt;a href=&quot;https://github.com/materialsproject/maggma/pull/1067&quot;&gt;contributed upstream&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1125851&quot;&gt;python-pybadges&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1083705&quot;&gt;python-scruffy&lt;/a&gt; (&lt;a href=&quot;https://github.com/snare/scruffy/pull/33&quot;&gt;contributed upstream&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1125840&quot;&gt;thumbor&lt;/a&gt; (&lt;a href=&quot;https://github.com/thumbor/thumbor/pull/1789&quot;&gt;contributed upstream&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1084009&quot;&gt;zope.deprecation&lt;/a&gt; (&lt;a href=&quot;https://github.com/zopefoundation/zope.deprecation/pull/27&quot;&gt;contributed upstream&lt;/a&gt; a while ago, but there hasn’t been an upstream release yet)&lt;/li&gt;
&lt;/ul&gt;
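&lt;p&gt;As a sketch of what these ports typically look like (illustrative only; the helper name and fallback behavior here are my own, not taken from any of the packages above), the usual move is from &lt;code&gt;pkg_resources&lt;/code&gt; to the standard-library &lt;code&gt;importlib.metadata&lt;/code&gt;:&lt;/p&gt;

```python
# Illustrative pkg_resources -> importlib.metadata port (hypothetical helper;
# each real package above needed its own variant of this change).
from importlib.metadata import PackageNotFoundError, version


def package_version(name: str, fallback: str = "unknown") -> str:
    """Replaces pkg_resources.get_distribution(name).version."""
    try:
        return version(name)
    except PackageNotFoundError:
        # pkg_resources raised DistributionNotFound in this case.
        return fallback
```

&lt;p&gt;Entry-point lookups migrate similarly, from &lt;code&gt;pkg_resources.iter_entry_points(group)&lt;/code&gt; to &lt;code&gt;importlib.metadata.entry_points().select(group=group)&lt;/code&gt; on Python 3.10 and later.&lt;/p&gt;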
&lt;p&gt;Other build/test failures:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1129112&quot;&gt;flask-dance: &lt;span class=&quot;caps&quot;&gt;FTBFS&lt;/span&gt;: No module named ‘pkg_resources’&lt;/a&gt; (actually fixed by adding a missing dependency to python3-sphinxcontrib.seqdiag)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1127663&quot;&gt;paramiko: autopkgtest regression on i386&lt;/a&gt; (&lt;a href=&quot;https://github.com/paramiko/paramiko/pull/2577&quot;&gt;contributed upstream&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1128776&quot;&gt;poetry: autopkgtest regression on i386&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1123190&quot;&gt;python-argh&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1127525&quot;&gt;python-django-celery-beat: &lt;span class=&quot;caps&quot;&gt;FTBFS&lt;/span&gt;: &lt;span class=&quot;caps&quot;&gt;FAILED&lt;/span&gt; t/unit/test_models.py::HumanReadableTestCase::test_long_name&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1126157&quot;&gt;python-maturin: rust-itertools update&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1127528&quot;&gt;python-msrest: &lt;span class=&quot;caps&quot;&gt;FTBFS&lt;/span&gt;: &lt;span class=&quot;caps&quot;&gt;FAILED&lt;/span&gt; tests/asynctests/test_async_client.py::TestServiceClient::test_client_send&lt;/a&gt; (&lt;a href=&quot;https://github.com/Azure/msrest-for-python/pull/268&quot;&gt;contributed upstream&lt;/a&gt;, though not very successfully)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1123326&quot;&gt;python-typing-inspect&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Other bugs:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1129211&quot;&gt;python-datamodel-code-generator: Depends: python3-isort (&amp;lt; 8) but 8.0.0-1 is to be installed&lt;/a&gt; (&lt;a href=&quot;https://github.com/koxudaxi/datamodel-code-generator/pull/3011&quot;&gt;contributed upstream&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1125025&quot;&gt;python-typeguard: Mark python3-typeguard Multi-Arch: foreign&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1124771&quot;&gt;wheel: Mark python3-wheel Multi-Arch: foreign&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1123664&quot;&gt;zope.deferredimport: Please make the build reproducible&lt;/a&gt; (&lt;a href=&quot;https://github.com/zopefoundation/zope.deferredimport/pull/19&quot;&gt;contributed upstream&lt;/a&gt;, with a &lt;a href=&quot;https://github.com/zopefoundation/zope.deferredimport/pull/22&quot;&gt;follow-up fix&lt;/a&gt;)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;I added a &lt;a href=&quot;https://salsa.debian.org/python-team/tools/dh-python/-/merge_requests/83&quot;&gt;manual page symlink&lt;/a&gt; to make the documentation for &lt;code&gt;Testsuite: autopkgtest-pkg-pybuild&lt;/code&gt; easier to find.&lt;/p&gt;
&lt;p&gt;I backported python-pytest-unmagic and a more recent version of pytest-django to trixie.&lt;/p&gt;
&lt;h2&gt;Rust packaging&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1124553&quot;&gt;librust-pyo3-ffi-dev: Cannot be installed for foreign architectures&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;I also packaged rust-garde and rust-garde-derive, which are part of the pile of work needed to get the ruff packaging back in shape (which is a project I haven’t decided if I’m going to take on for real, but I thought I’d at least chip away at a bit of it).&lt;/p&gt;
&lt;h2&gt;Other bits and pieces&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1126132&quot;&gt;arch-test: Remove build dependency on binutils-mips64el-linux-gnuabi64&lt;/a&gt; (&lt;span class=&quot;caps&quot;&gt;NMU&lt;/span&gt;)&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Code reviews&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://salsa.debian.org/pkg-debconf/debconf/-/merge_requests/23&quot;&gt;debconf: Add &lt;span class=&quot;caps&quot;&gt;BMP&lt;/span&gt; version of debian-logo&lt;/a&gt; (merged and uploaded)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://salsa.debian.org/ssh-team/openssh/-/merge_requests/20&quot;&gt;openssh: Reorder pam_selinux(7) usage&lt;/a&gt; (merged and uploaded)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://salsa.debian.org/ssh-team/openssh/-/merge_requests/37&quot;&gt;openssh-client: use sysusers.d, drop superflous dependencies&lt;/a&gt; (merged and uploaded)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://salsa.debian.org/ssh-team/openssh/-/merge_requests/38&quot;&gt;openssh: Stop deleting system user on remove/purge&lt;/a&gt; (merged and uploaded)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1128399&quot;&gt;openssh: Do not link against libcrypt on &lt;span class=&quot;caps&quot;&gt;GNU&lt;/span&gt;/Hurd&lt;/a&gt; (merged and uploaded)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://salsa.debian.org/installer-team/partman-prep/-/merge_requests/2&quot;&gt;partman-prep: Align PReP descriptions with other partition types&lt;/a&gt; (merged)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/1124754&quot;&gt;python-better-exceptions&lt;/a&gt; (sponsored upload for Seyed Mohamad Amin Modaresi)&lt;/li&gt;
&lt;/ul&gt; </description> 
	<pubDate>Mon, 09 Mar 2026 12:22:42 +0000</pubDate>

</item> 
<item>
	<title>Sven Hoexter: Latest pflogsumm from unstable on trixie</title>
	<guid>http://sven.stormbind.net/blog/posts/deb_pflogsumm_latest_on_trixie/</guid>
	<link>http://sven.stormbind.net/blog/posts/deb_pflogsumm_latest_on_trixie/</link>
     <description>  &lt;p&gt;If you want the latest &lt;a href=&quot;https://packages.debian.org/source/unstable/pflogsumm&quot;&gt;pflogsumm&lt;/a&gt;
release from unstable on your Debian trixie/stable mailserver,
you have to rely on pinning (hint for the future: starting with apt 3.1 there is
a new &lt;code&gt;Include&lt;/code&gt; and &lt;code&gt;Exclude&lt;/code&gt; option for your
&lt;a href=&quot;https://manpages.debian.org/unstable/apt/sources.list.5.en.html#THE_DEB_AND_DEB-SRC_TYPES:_OPTIONS&quot;&gt;sources.list&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;For trixie you have to use, e.g.:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;$ cat /etc/apt/sources.list.d/unstable.sources
Types: deb
URIs: http://deb.debian.org/debian
Suites: unstable 
Components: main
# This will work with apt 3.1 or later:
#Include: pflogsumm
Signed-By: /usr/share/keyrings/debian-archive-keyring.pgp

$ cat /etc/apt/preferences.d/pflogsumm-unstable.pref 
Package: pflogsumm
Pin: release a=unstable
Pin-Priority: 950

Package: *
Pin: release a=unstable
Pin-Priority: 50
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;This should result in:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;$ apt-cache policy pflogsumm
pflogsumm:
  Installed: (none)
  Candidate: 1.1.14-1
  Version table:
     1.1.14-1 950
        50 http://deb.debian.org/debian unstable/main amd64 Packages
     1.1.5-8 500
       500 http://deb.debian.org/debian trixie/main amd64 Packages
&lt;/code&gt;&lt;/pre&gt;
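&lt;p&gt;For comparison, once apt 3.1 lands, the &lt;code&gt;Include&lt;/code&gt; option should make the separate preferences file unnecessary, since apt would then only consider pflogsumm from that source. An untested sketch:&lt;/p&gt;

```
Types: deb
URIs: http://deb.debian.org/debian
Suites: unstable
Components: main
Include: pflogsumm
Signed-By: /usr/share/keyrings/debian-archive-keyring.pgp
```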

&lt;h3&gt;Why would you want to do that?&lt;/h3&gt;

&lt;p&gt;Besides some new features and improvements in the newer releases, the
pflogsumm version in stable has an issue with parsing the timestamps
generated by postfix itself when you write to a file via
&lt;a href=&quot;https://www.postfix.org/MAILLOG_README.html&quot;&gt;maillog_file&lt;/a&gt;. Since the
Debian default setup uses logging to stdout and writing out to &lt;code&gt;/var/log/mail.log&lt;/code&gt;
via rsyslog, I never invested time to fix that case. But since Jim picked up
pflogsumm development in 2025, that was fixed in pflogsumm 1.1.6.
The bug is &lt;a href=&quot;https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1129958&quot;&gt;#1129958&lt;/a&gt;,
originally reported in
&lt;a href=&quot;https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1068425&quot;&gt;#1068425&lt;/a&gt;.
Since it&#39;s an arch:all package you can just pick it from unstable. I don&#39;t think
it&#39;s a good candidate for backports, and just fetching the fixed version from
unstable is a reasonable compromise for those who run into that issue.&lt;/p&gt; </description> 
	<pubDate>Mon, 09 Mar 2026 09:09:44 +0000</pubDate>

</item> 
<item>
	<title>Gunnar Wolf: As Answers Get Cheaper, Questions Grow Dearer</title>
	<guid>https://gwolf.org/2026/03/as-answers-get-cheaper-questions-grow-dearer.html</guid>
	<link>https://gwolf.org/2026/03/as-answers-get-cheaper-questions-grow-dearer.html</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/gwolf.png&quot; width=&quot;69&quot; height=&quot;83&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;blockquote&gt;
		 
This post is an &lt;em&gt;unpublished&lt;/em&gt; review
for &lt;em&gt;&lt;a href=&quot;https://cacm.acm.org/opinion/as-answers-get-cheaper-questions-grow-dearer/&quot;&gt;As Answers Get Cheaper, Questions Grow Dearer&lt;/a&gt;&lt;/em&gt;
&lt;/blockquote&gt;

&lt;p&gt;This opinion article tackles the much-discussed issues of Large Language
Models (LLMs) both endangering jobs and improving productivity.&lt;/p&gt;

&lt;p&gt;The authors begin by making a comparison, likening the current
understanding of the effects LLMs are currently having upon
knowledge-intensive work to that of artists in the early 19th century, when
photography was first invented: they explain that photography didn’t make
painting obsolete, but undeniably changed it in a fundamental way. Realism
was no longer the goal of painters, as they could no longer compete on equal
terms with photography. Painters then began experimenting with the subjective
experiences of color and light: Impressionism did not limit itself to copying
reality, but added elements of human feeling to its creations.&lt;/p&gt;

&lt;p&gt;The authors argue that LLMs make getting answers terribly cheap — not
necessarily correct, but immediate and plausible. In order for the use of
LLMs to be advantageous to users, a good working knowledge of the domain in
which LLMs are queried is key. They cite LLMs increasing productivity by
14% on average at call centers, where questions have unambiguous answers and
the knowledge domain is limited, but causing losses of close to 10% to
inexperienced entrepreneurs following their advice in an environment where
understanding of the situation and critical judgment are key. The problem,
thus, becomes that LLMs are optimized to generate &lt;em&gt;plausible&lt;/em&gt; answers. If
the user is not a domain expert, “plausibility becomes a stand-in for
truth”. They identify that, with this in mind, good questions become
strategic: Questions that continue a line of inquiry, that expand the
user’s field of awareness, that reveal where we must keep looking. They
liken this to Clayton Christensen’s 2010 text on consulting¹: A
consultant’s value is not in having all the answers, but in teaching
clients how to think.&lt;/p&gt;

&lt;p&gt;LLMs are already game-changing for society, and will likely become more
so as they improve. The authors argue that for much of the 20th
century, an individual’s success was measured by domain mastery, but contend
that the defining factor is no longer knowledge accumulation:
it is the ability to formulate the right questions. Of course, the authors
acknowledge (it’s even the literal title of one of the article’s sections)
that good questions need strong theoretical foundations. Knowing a specific
domain enables users to imagine what should happen when following a specific
lead, anticipate second-order effects, and evaluate whether plausible
answers are meaningful or misleading.&lt;/p&gt;

&lt;p&gt;Shortly after I read the article I am reviewing, I came across a data point
that quite validates its claims: a short, informally published paper on
combinatorics and graph theory titled “Claude’s Cycles”², written by Donald
Knuth (one of the most respected Computer Science professors and
researchers, and author of the very well known “The Art of Computer
Programming” series of books). Knuth’s text, and particularly its
“postscripts”, perfectly illustrates what the article under review
conveys: LLMs can help a skillful researcher “connect the dots” across very
varied fields of knowledge, perform tiring and burdensome calculations, even
try mixing together some ideas that will fail, and some that will succeed. But only
when guided by a true expert in the field, asking the right, insightful and
informed questions, will the answers prove to be of value; in this case, of
immense value. Knuth writes of a particular piece of the solution, “I would
have found this solution myself if I’d taken time to look carefully at all
760 of the generalizable solutions for m=3”, but having an LLM perform all
the legwork was surely a better use of his time.&lt;/p&gt;

&lt;p&gt;¹ Christensen, C.M. &lt;a href=&quot;https://www.truevaluemetrics.org/DBpdfs/Ideas/Christensen/How-will-you-measure-your-life.pdf&quot;&gt;How Will You Measure Your
Life?&lt;/a&gt;
Harvard Business Review Press (2017).&lt;/p&gt;

&lt;p&gt;² Knuth, D. Claude’s Cycles. &lt;a href=&quot;https://cs.stanford.edu/~knuth/papers/claude-cycles.pdf&quot;&gt;https://cs.stanford.edu/~knuth/papers/claude-cycles.pdf&lt;/a&gt;&lt;/p&gt; </description> 
	<pubDate>Sun, 08 Mar 2026 18:51:12 +0000</pubDate>

</item> 
<item>
	<title>Dirk Eddelbuettel: RProtoBuf 0.4.26 on CRAN: More Maintenance</title>
	<guid>http://dirk.eddelbuettel.com/blog/2026/03/07#rprotobuf_0.4.26</guid>
	<link>http://dirk.eddelbuettel.com/blog/2026/03/07#rprotobuf_0.4.26</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/dirk.png&quot; width=&quot;65&quot; height=&quot;90&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;A new maintenance release 0.4.26 of &lt;a href=&quot;https://dirk.eddelbuettel.com/code/rprotobuf.html&quot;&gt;RProtoBuf&lt;/a&gt;
arrived on &lt;a href=&quot;https://cran.r-project.org&quot;&gt;CRAN&lt;/a&gt; today. &lt;a href=&quot;https://dirk.eddelbuettel.com/code/rprotobuf.html&quot;&gt;RProtoBuf&lt;/a&gt;
provides &lt;a href=&quot;https://www.r-project.org&quot;&gt;R&lt;/a&gt; with bindings for the
&lt;a href=&quot;https://github.com/google/protobuf&quot;&gt;Google Protocol Buffers
(“ProtoBuf”)&lt;/a&gt; data encoding and serialization library used and
released by Google, and deployed very widely in numerous projects as a
language and operating-system agnostic protocol. The new release is also
already available as a binary via &lt;a href=&quot;https://eddelbuettel.github.io/r2u&quot;&gt;r2u&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;This release brings an update to aid in an ongoing &lt;a href=&quot;https://www.rcpp.org&quot;&gt;Rcpp&lt;/a&gt; transition from
&lt;code&gt;Rf_error&lt;/code&gt; to &lt;code&gt;Rcpp::stop&lt;/code&gt;, and includes a few
more minor cleanups including one contributed by &lt;a href=&quot;https://github.com/michaelchirico&quot;&gt;Michael&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The following section from the NEWS.Rd file has full details.&lt;/p&gt;
&lt;blockquote&gt;
&lt;h4 id=&quot;changes-in-rprotobuf-version-0.4.26-2026-03-06&quot;&gt;Changes in
RProtoBuf version 0.4.26 (2026-03-06)&lt;/h4&gt;
&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Minor cleanup in DESCRIPTION depends and imports&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Remove obsolete check for &lt;code&gt;utils::.DollarNames&lt;/code&gt;
(Michael Chirico in &lt;a href=&quot;https://github.com/eddelbuettel/rprotobuf/pull/111&quot;&gt;#111&lt;/a&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Replace &lt;code&gt;Rf_error&lt;/code&gt; with &lt;code&gt;Rcpp::stop&lt;/code&gt;, turn
remaining one into &lt;code&gt;(Rf_error)&lt;/code&gt; (Dirk in &lt;a href=&quot;https://github.com/eddelbuettel/rprotobuf/pull/112&quot;&gt;#112&lt;/a&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Update &lt;code&gt;configure&lt;/code&gt; test to check for ProtoBuf 3.3.0
or later&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/blockquote&gt;
&lt;p&gt;Thanks to my &lt;a href=&quot;https://dirk.eddelbuettel.com/cranberries/&quot;&gt;CRANberries&lt;/a&gt;, there
is a diff to the &lt;a href=&quot;https://dirk.eddelbuettel.com/cranberries/2026/03/07#RProtoBuf_0.4.26&quot;&gt;previous
release&lt;/a&gt;. The &lt;a href=&quot;https://dirk.eddelbuettel.com/code/rprotobuf.html&quot;&gt;RProtoBuf&lt;/a&gt;
page has copies of the &lt;a href=&quot;https://dirk.eddelbuettel.com/code/rprotobuf/RProtoBuf-intro.pdf&quot;&gt;(older)
package vignette&lt;/a&gt;, the &lt;a href=&quot;https://dirk.eddelbuettel.com/code/rprotobuf/RProtoBuf-quickref.pdf&quot;&gt;‘quick’
overview vignette&lt;/a&gt;, and the &lt;a href=&quot;https://cloud.r-project.org/web/packages/RProtoBuf/vignettes/RProtoBuf-paper.pdf&quot;&gt;pre-print
of our JSS paper&lt;/a&gt;. Questions, comments etc should go to the &lt;a href=&quot;https://github.com/eddelbuettel/rprotobuf/issues&quot;&gt;GitHub issue
tracker&lt;/a&gt; off the &lt;a href=&quot;https://github.com/eddelbuettel/rprotobuf&quot;&gt;GitHub repo&lt;/a&gt;.&lt;/p&gt;
&lt;p style=&quot;font-size: 80%; font-style: italic;&quot;&gt;
This post by &lt;a href=&quot;https://dirk.eddelbuettel.com&quot;&gt;Dirk
Eddelbuettel&lt;/a&gt; originated on his &lt;a href=&quot;https://dirk.eddelbuettel.com/blog/&quot;&gt;Thinking inside the box&lt;/a&gt;
blog. If you like this or other open-source work I do, you can &lt;a href=&quot;https://github.com/sponsors/eddelbuettel&quot;&gt;sponsor me at
GitHub&lt;/a&gt;.
&lt;/p&gt;&lt;p&gt;&lt;/p&gt; </description> 
	<pubDate>Sat, 07 Mar 2026 12:25:00 +0000</pubDate>

</item> 
<item>
	<title>Steinar H. Gunderson: A286874(14) = 28</title>
	<guid>http://blog.sesse.net/blog/tech/2026-03-07-10-54_a286874_14_28.html</guid>
	<link>http://blog.sesse.net/blog/tech/2026-03-07-10-54_a286874_14_28.html</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/sesse.png&quot; width=&quot;74&quot; height=&quot;85&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;There&#39;s a logic puzzle that goes like this: A king has a thousand bottles of
wine, where he knows that one is poisoned. He also has ten disposable
servants that could taste the wine, but for whatever reason (the usual
explanation is that the poison is slow-working and the feast is nearing),
they can only take one sip each, possibly mixed from multiple bottles.
How can he identify the bad bottle?&lt;/p&gt;

&lt;p&gt;The solution is well-known and not difficult; you give each bottle a number
0..999 and write it out in binary, and use the one-bits to assign wines to
servants. (So there&#39;s one servant that drinks a mix of all the odd-numbered
wines, and that tells you if the poisoned bottle&#39;s number is odd or even.
Another servant drinks a mix of bottles 2, 3, 6, 7, 10, 11, etc., and that
tells you the second-lowest bit. And so on.) This works because ten servants
allow you to test 2^10 = 1024 bottles.&lt;/p&gt;
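&lt;p&gt;The binary scheme can be sketched in a few lines of Python (an illustrative sketch, not code from the post; bottle number 618 is an arbitrary example):&lt;/p&gt;

```python
def assign_sips(n_bottles=1000, n_servants=10):
    # Servant s sips from every bottle whose number has binary digit s set.
    return [{b for b in range(n_bottles) if (b // 2**s) % 2}
            for s in range(n_servants)]

def identify(deaths):
    # deaths[s] is True if servant s dies; the digits reassemble the number.
    return sum(2**s for s, died in enumerate(deaths) if died)

sips = assign_sips()
poisoned = 618                                   # arbitrary example bottle
assert identify([poisoned in drank for drank in sips]) == poisoned
assert identify([False] * 10) == 0               # nobody dies: bottle 0
```

&lt;p&gt;The death vector is simply the bottle number read out digit by digit, which is why ten servants cover 2^10 = 1024 bottles.&lt;/p&gt;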

&lt;p&gt;It is also easy to extend this to “&lt;em&gt;at most&lt;/em&gt; one bottle is poisoned”;
give the wines numbers from 1..1000 instead, follow the same pattern,
and if no servant dies, you know the answer is zero. (This allows you to
test at most 1023 bottles.)&lt;/p&gt;

&lt;p&gt;Now, let&#39;s tweak the puzzle: What if there&#39;s zero, one or &lt;em&gt;two&lt;/em&gt; poisoned
bottles? How many bottles can the king test with his ten servants?
(If you&#39;re looking for a more real-world application of this, replace
“poisoned bottles” with “COVID tests” and maybe it starts to sound less
arbitrary.) Of course, the king can easily test ten bottles by having
each servant taste exactly one bottle, but it turns out you can
get to 13 by being a bit more clever, for instance:&lt;/p&gt;

&lt;pre&gt;   0123456789 ← Servant number

 0 0000000111
 1 0000011001
 2 0000101010
 3 0000110100
 4 0001001100
 5 0010010010
 6 0011000001
 7 0100100001
 8 0101000010
 9 0110000100
10 1001010000
11 1010100000
12 1100001000

 ↑ Bottle number
&lt;/pre&gt;

&lt;p&gt;It can be shown (simply by brute force) that no row here is a subset
of another row, so if e.g. the “servant death” vector is 0110101110
(servants 1, 2, 4, 6, 7 and 8 die), the only way this could happen is if
bottles 2 and 9 are poisoned (and no others). Of course, the solution is
nonunique, since you could permute the numbering of servants or wines
and it would still work. But if you don&#39;t allow that kind of permutation,
there are only five different solutions for 10 servants and 13 wines.&lt;/p&gt;
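&lt;p&gt;This decoding property can be checked mechanically (an illustrative Python sketch, not code from the post, using the table above): every death pattern reachable with zero, one or two poisoned bottles must be distinct.&lt;/p&gt;

```python
from itertools import combinations

# Rows from the table above, as sets of servants who sip each bottle
# (string position = servant number).
table = """0000000111 0000011001 0000101010 0000110100 0001001100
0010010010 0011000001 0100100001 0101000010 0110000100
1001010000 1010100000 1100001000""".split()
rows = [frozenset(s for s, bit in enumerate(r) if bit == "1") for r in table]

outcome = {frozenset(): set()}              # nobody dies: no poisoned bottle
for i, r in enumerate(rows):
    outcome[r] = {i}                        # exactly one poisoned bottle
for i, j in combinations(range(13), 2):
    union = rows[i].union(rows[j])          # servants who die if both are bad
    assert union not in outcome, "ambiguous death pattern"
    outcome[union] = {i, j}                 # two poisoned bottles

# The post's worked example: servants 1, 2, 4, 6, 7 and 8 die.
assert outcome[frozenset({1, 2, 4, 6, 7, 8})] == {2, 9}
```

&lt;p&gt;All 1 + 13 + 78 = 92 outcomes are distinct, so the king can always tell which (if any) bottles were poisoned.&lt;/p&gt;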

&lt;p&gt;The maximum number of possible wines to test is recorded in
&lt;a href=&quot;https://oeis.org/A286874&quot;&gt;OEIS A286874&lt;/a&gt;, and the number of different
solutions in &lt;a href=&quot;https://oeis.org/A303977&quot;&gt;A303977&lt;/a&gt;. So for A286874,
a(10) = 13 and for A303977, a(10) = 5.&lt;/p&gt;

&lt;p&gt;We&#39;d like to know these values for higher arguments, in particular for
A286874 (A303977 is a bit more of a curiosity, and also a convenient place
to write down all the solutions). I&#39;ve written before about how we
can create fairly &lt;em&gt;good&lt;/em&gt; solutions using error-correcting codes
(there are also other possible constructions), but finding &lt;em&gt;optimal&lt;/em&gt; ones
turns out to be hard. The only way we know of is some form of brute force.
(I used a SAT solver to confirm a(10) and a(11), but it seemed to get
entirely stuck on a(12).)&lt;/p&gt;

&lt;p&gt;I&#39;ve &lt;em&gt;also&lt;/em&gt; written about my brute-force search of a(12) and a(13),
so I&#39;m not going to repeat that, but it turned out that with a bunch
of extra optimizations and 210 calendar days of near-continuous
calculation, I could confirm that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A286874 a(14) = 28&lt;/li&gt;
&lt;li&gt;A303977 a(14) = 788 (!!)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The latter result is very surprising to me, so it was an interesting
find. I would have assumed that with this many solutions, we&#39;d find
a(14) = 29.&lt;/p&gt;

&lt;p&gt;I don&#39;t have enough CPU power to test a(15) or a(16) (do contact me
if you have a couple thousand cores to lend out for some months or more),
but I&#39;m going to do a search in a given subset of the search space (5-uniform
solutions), which is much faster; it won&#39;t allow us to fix more elements of
either of the sequences, but it&#39;s possible that we&#39;ll find some new records
and thus lower bounds for A286874. Like I already posted, we know that
a(15) &amp;gt;= 42. (Someone should also probably go find some bounds for
a(17), a(18), etc.—when the sequence was written, the posted known bounds
were far ahead of the sequence itself, but my verification has caught up
and my approach is not as good at creating solutions heuristically
out of thin air.)&lt;/p&gt; </description> 
	<pubDate>Sat, 07 Mar 2026 09:54:00 +0000</pubDate>

</item> 
<item>
	<title>Thorsten Alteholz: My Debian Activities in February 2026</title>
	<guid>http://blog.alteholz.eu/?p=2789</guid>
	<link>http://blog.alteholz.eu/2026/03/my-debian-activities-in-february-2026/</link>
     <description>  &lt;h3&gt;&lt;strong&gt;Debian LTS/ELTS&lt;/strong&gt;&lt;/h3&gt;



&lt;p&gt;This was my hundred-and-fortieth month in which I did some work for the Debian LTS initiative, started by Raphael Hertzog at Freexian.
&lt;/p&gt;
&lt;p&gt;
During my allocated time I uploaded or worked on:  
&lt;/p&gt;



&lt;ul&gt;&lt;li&gt;[&lt;a href=&quot;https://lists.debian.org/debian-lts-announce/2026/02/msg00013.html&quot;&gt;DLA 4474-1&lt;/a&gt;] rlottie security update to fix three CVEs related to boundary checks.
&lt;/li&gt;&lt;li&gt;[&lt;a href=&quot;https://lists.debian.org/debian-lts-announce/2026/02/msg00015.html&quot;&gt;DLA 4477-1&lt;/a&gt;] munge security update to fix one CVE related to a buffer overflow.
&lt;/li&gt;&lt;li&gt;[&lt;a href=&quot;https://lists.debian.org/debian-lts-announce/2026/02/msg00022.html&quot;&gt;DLA 4483-1&lt;/a&gt;] gimp security update to fix four CVEs related to arbitrary code execution.
&lt;/li&gt;&lt;li&gt;[&lt;a href=&quot;https://lists.debian.org/debian-lts-announce/2026/02/msg00026.html&quot;&gt;DLA 4487-1&lt;/a&gt;] gegl security update to fix two CVEs related to heap-based buffer overflow.&lt;/li&gt;&lt;li&gt;[&lt;a href=&quot;https://lists.debian.org/debian-lts-announce/2026/02/msg00028.html&quot;&gt;DLA 4489-1&lt;/a&gt;] libvpx security update to fix one CVE related to a buffer overflow.
&lt;/li&gt;&lt;li&gt;[ELA-1649-1] gimp security update to fix three CVEs in Buster and Stretch related to arbitrary code execution.&lt;/li&gt;&lt;li&gt;[ELA-1650-1] gegl security update to fix two CVEs in Buster and Stretch related to heap-based buffer overflow.&lt;/li&gt;&lt;/ul&gt;



&lt;p&gt;
Some CVEs could be marked as &lt;i&gt;not-affected&lt;/i&gt; for one or all LTS/ELTS-releases.
I also worked on package &lt;i&gt;evolution-data-server&lt;/i&gt; and attended the monthly LTS/ELTS meeting.&lt;/p&gt;



&lt;h3&gt;&lt;strong&gt;Debian Printing&lt;/strong&gt;&lt;/h3&gt;



&lt;p&gt;This month I uploaded new upstream versions of:&lt;/p&gt;



&lt;ul&gt;&lt;li&gt;… &lt;a href=&quot;https://tracker.debian.org/ipp-usb&quot;&gt;ipp-usb&lt;/a&gt; to unstable.&lt;/li&gt;&lt;li&gt;… &lt;a href=&quot;https://tracker.debian.org/brlaser&quot;&gt;brlaser&lt;/a&gt; to unstable.&lt;/li&gt;&lt;li&gt;… &lt;a href=&quot;https://tracker.debian.org/gutenprint&quot;&gt;gutenprint&lt;/a&gt; to unstable.&lt;/li&gt;&lt;/ul&gt;



&lt;p&gt;&lt;strong&gt;This work is generously funded by &lt;a href=&quot;https://www.freexian.com&quot;&gt;Freexian&lt;/a&gt;!&lt;/strong&gt;&lt;/p&gt;



&lt;h3&gt;&lt;strong&gt;Debian Lomiri&lt;/strong&gt;&lt;/h3&gt;



&lt;p&gt;This month I continued to work on unifying packaging on Debian and Ubuntu. This makes it easier to work on those packages independently of the platform used.&lt;/p&gt;



&lt;p&gt;&lt;strong&gt;This work is generously funded by &lt;a href=&quot;https://freiesoftware.gmbh/&quot;&gt;Fre(i)e Software GmbH&lt;/a&gt;!&lt;/strong&gt;&lt;/p&gt;



&lt;h3&gt;&lt;strong&gt;Debian Astro&lt;/strong&gt;&lt;/h3&gt;



&lt;p&gt;This month I uploaded a new upstream version or a bugfix version of:&lt;/p&gt;



&lt;ul&gt;&lt;li&gt;… &lt;a href=&quot;https://tracker.debian.org/c-munipack&quot;&gt;c-munipack&lt;/a&gt; to unstable. This package now contains a version without GTK support. Upstream is working on a port to GTK3 but seems to need some more time to finish this.&lt;/li&gt;&lt;li&gt;… &lt;a href=&quot;https://tracker.debian.org/libasi&quot;&gt;libasi&lt;/a&gt; to unstable.&lt;/li&gt;&lt;li&gt;… &lt;a href=&quot;https://tracker.debian.org/libdfu-ahp&quot;&gt;libdfu-ahp&lt;/a&gt; to unstable.&lt;/li&gt;&lt;li&gt;… &lt;a href=&quot;https://tracker.debian.org/libfishcamp&quot;&gt;libfishcamp&lt;/a&gt; to unstable.&lt;/li&gt;&lt;li&gt;… &lt;a href=&quot;https://tracker.debian.org/libinovasdk&quot;&gt;libinovasdk&lt;/a&gt; to unstable.&lt;/li&gt;&lt;li&gt;… &lt;a href=&quot;https://tracker.debian.org/libmicam&quot;&gt;libmicam&lt;/a&gt; to unstable.&lt;/li&gt;&lt;li&gt;… &lt;a href=&quot;https://tracker.debian.org/siril&quot;&gt;siril&lt;/a&gt; to unstable (sponsored upload).&lt;/li&gt;&lt;/ul&gt;



&lt;h3&gt;&lt;strong&gt;Debian IoT&lt;/strong&gt;&lt;/h3&gt;



&lt;p&gt;This month I uploaded a new upstream version or a bugfix version of:&lt;/p&gt;



&lt;ul&gt;&lt;li&gt;… &lt;a href=&quot;https://tracker.debian.org/pyicloud&quot;&gt;pyicloud&lt;/a&gt; to unstable.&lt;/li&gt;&lt;/ul&gt;



&lt;p&gt;Unfortunately development of &lt;i&gt;openoverlayrouter&lt;/i&gt; finally stopped, so I had to remove this package from the archive.&lt;/p&gt;



&lt;h3&gt;&lt;strong&gt;Debian Mobcom&lt;/strong&gt;&lt;/h3&gt;



&lt;p&gt;This month I uploaded a new upstream version or a bugfix version of:&lt;/p&gt;



&lt;ul&gt;&lt;li&gt;… &lt;a href=&quot;https://tracker.debian.org/libsmpp34&quot;&gt;libsmpp34&lt;/a&gt; to unstable.&lt;/li&gt;&lt;/ul&gt;



&lt;h3&gt;&lt;strong&gt;misc&lt;/strong&gt;&lt;/h3&gt;



&lt;p&gt;This month I uploaded a new upstream version or a bugfix version of:&lt;/p&gt;



&lt;ul&gt;&lt;li&gt;… &lt;a href=&quot;https://tracker.debian.org/nuspell&quot;&gt;nuspell&lt;/a&gt; to unstable.&lt;/li&gt;&lt;/ul&gt;



&lt;p&gt;I also sponsored the upload of some Matomo dependencies. Thanks a lot to William for preparing the packages.&lt;/p&gt; </description> 
	<pubDate>Fri, 06 Mar 2026 18:27:54 +0000</pubDate>

</item> 
<item>
	<title>Russell Coker: Links March 2026</title>
	<guid>https://etbe.coker.com.au/?p=5978</guid>
	<link>https://etbe.coker.com.au/2026/03/06/links-march-2026/</link>
     <description>  &lt;p&gt;&lt;a href=&quot;https://krebsonsecurity.com/2026/01/kimwolf-botnet-lurking-in-corporate-govt-networks/&quot;&gt;Krebs has an interesting article about the Kimwolf botnet which uses residential proxy relay services [1]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://pluralistic.net/2026/01/06/1000x-liability/#graceful-failure-modes&quot;&gt;Cory Doctorow wrote an insightful blog post about code being a liability not an asset [2]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://aigarius.com/blog/2026/01/07/sedan-experience/&quot;&gt;Aigars Mahinovs wrote an interesting review of the BMW i4 M50 xDrive and the BMW i5 eDrive40 which seem like very impressive vehicles [3]&lt;/a&gt;. I was wondering what BMW would do now that all the features they had in the 90s have been copied by cheaper brands, but they have managed to do new and exciting things.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://arstechnica.com/space/2026/01/us-spy-satellite-agency-declassifies-high-flying-cold-war-listening-post/&quot;&gt;Arstechnica has an interesting article about the recently declassified JUMPSEAT surveillance satellites that ran from 1971 to 1987 [4]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://pluralistic.net/2026/01/30/zucksauce/#gandersauce&quot;&gt;Cory Doctorow wrote an interesting blog post about OgApp which briefly allowed viewing Instagram without ads and the issues of US corporations misusing EU copyright law [5]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.zdnet.com/article/linux-kernel-maintainers-new-way-of-authenticating-developers-and-code/&quot;&gt;ZDNet has an interesting article about new planned developments for the web of trust for Linux kernel coders (and others) [6]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.greenleft.org.au/2026/1447/world/india-300-million-take-streets-historic-national-strike&quot;&gt;Last month India had a 300 million person strike; we need more large scale strikes against governments that support predatory corporations [7]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.techdirt.com/2025/07/17/fascism-for-first-time-founders/&quot;&gt;Techdirt has an insightful article on the ways fascism is bad for innovation and a market based economy [8]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://en.wikipedia.org/wiki/Scsh&quot;&gt;The Acknowledgements section from the Scheme Shell (scsh) reference is epic [9]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.vice.com/en/article/scientists-explain-why-doing-your-own-research-leads-to-buying-conspiracies/&quot;&gt;Vice has an insightful article on research about “do your own research” and how simple Google searches tend to reinforce conspiracy theories [10]&lt;/a&gt;. A problem with Google is that it’s most effective if you already know the answer.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.issendai.com/psychology/estrangement/why-estranged-parents-forums.html&quot;&gt;Issendai has an interesting and insightful series of blog posts about estranged parents forums, which seem a lot like incel forums in the way they promote abuse [11]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.caitlinjohnst.one/p/the-empire-has-accidentally-caused&quot;&gt;Caitlin Johnstone wrote an interesting article about how “the empire” caused the rebirth of a real counterculture by their attempts to coerce support for Israeli atrocities [12]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://radleybalko.substack.com/p/the-courage-to-be-decent&quot;&gt;Radley Balko wrote an interesting article about “the courage to be decent” concerning the Trump regime’s attempts to scare lawyers into cooperating with them [13]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://serpapi.com/blog/every-google-udm-in-the-world/&quot;&gt;Terry Tan wrote a useful resource on the API for Google search; this could be good for shell scripts and for 3rd party programs that launch a search [14]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://theproof.com/eating-oysters-and-mussels-as-a-vegan/&quot;&gt;The Proof has an interesting article about eating oysters and mussels as a vegan [15]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://allthingslinguistic.com/post/31689604490/what-is-yodas-syntax-in-other-languages&quot;&gt;All Things Linguistic has an interesting and amusing post about Yoda’s syntax in non-English languages [16]&lt;/a&gt;.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;[1]&lt;a href=&quot;https://krebsonsecurity.com/2026/01/kimwolf-botnet-lurking-in-corporate-govt-networks/&quot;&gt; https://tinyurl.com/2ypyzh5w&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[2]&lt;a href=&quot;https://pluralistic.net/2026/01/06/1000x-liability/#graceful-failure-modes&quot;&gt; https://tinyurl.com/2b9kyl5x&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[3]&lt;a href=&quot;https://aigarius.com/blog/2026/01/07/sedan-experience/&quot;&gt; https://aigarius.com/blog/2026/01/07/sedan-experience/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[4]&lt;a href=&quot;https://arstechnica.com/space/2026/01/us-spy-satellite-agency-declassifies-high-flying-cold-war-listening-post/&quot;&gt; https://tinyurl.com/23ekabmj&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[5]&lt;a href=&quot;https://pluralistic.net/2026/01/30/zucksauce/#gandersauce&quot;&gt; https://pluralistic.net/2026/01/30/zucksauce/#gandersauce&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[6]&lt;a href=&quot;https://www.zdnet.com/article/linux-kernel-maintainers-new-way-of-authenticating-developers-and-code/&quot;&gt; https://tinyurl.com/29j6zzyc&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[7]&lt;a href=&quot;https://www.greenleft.org.au/2026/1447/world/india-300-million-take-streets-historic-national-strike&quot;&gt; https://tinyurl.com/2xvfmslu&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[8]&lt;a href=&quot;https://www.techdirt.com/2025/07/17/fascism-for-first-time-founders/&quot;&gt; https://tinyurl.com/2b7m8pwa&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[9]&lt;a href=&quot;https://en.wikipedia.org/wiki/Scsh&quot;&gt; https://en.wikipedia.org/wiki/Scsh&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[10]&lt;a href=&quot;https://www.vice.com/en/article/scientists-explain-why-doing-your-own-research-leads-to-buying-conspiracies/&quot;&gt; https://tinyurl.com/2aajkoyv&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[11]&lt;a href=&quot;https://www.issendai.com/psychology/estrangement/why-estranged-parents-forums.html&quot;&gt; https://tinyurl.com/ywd3kqel&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[12]&lt;a href=&quot;https://www.caitlinjohnst.one/p/the-empire-has-accidentally-caused&quot;&gt; https://tinyurl.com/2cqep7cj&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[13]&lt;a href=&quot;https://radleybalko.substack.com/p/the-courage-to-be-decent&quot;&gt; https://radleybalko.substack.com/p/the-courage-to-be-decent&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[14]&lt;a href=&quot;https://serpapi.com/blog/every-google-udm-in-the-world/&quot;&gt; https://serpapi.com/blog/every-google-udm-in-the-world/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[15]&lt;a href=&quot;https://theproof.com/eating-oysters-and-mussels-as-a-vegan/&quot;&gt; https://theproof.com/eating-oysters-and-mussels-as-a-vegan/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[16]&lt;a href=&quot;https://allthingslinguistic.com/post/31689604490/what-is-yodas-syntax-in-other-languages&quot;&gt; https://tinyurl.com/229soykv&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
</description> 
	<pubDate>Fri, 06 Mar 2026 12:23:14 +0000</pubDate>

</item> 
<item>
	<title>Antoine Beaupré: Wallabako retirement and Readeck adoption</title>
	<guid>https://anarc.at/blog/2026-03-05-wallabako-retirement/</guid>
	<link>https://anarc.at/blog/2026-03-05-wallabako-retirement/</link>
     <description>  &lt;p&gt;Today I have made the tough decision of retiring the &lt;a href=&quot;https://gitlab.com/anarcat/wallabako/&quot;&gt;Wallabako&lt;/a&gt;
project. I have rolled out a final (and trivial) 1.8.0 release which
fixes the uninstall procedure and rolls out a bunch of dependency
updates.&lt;/p&gt;

&lt;h1 id=&quot;why&quot;&gt;Why?&lt;/h1&gt;

&lt;p&gt;The main reason why I&#39;m retiring Wallabako is that I have completely
stopped using it. It&#39;s not the first time: for a while, I wasn&#39;t
reading Wallabag articles on my Kobo anymore. But I had started
working on it again &lt;a href=&quot;https://anarc.at/blog/2022-05-06-wallabako-1.4.0-released/&quot;&gt;about four years ago&lt;/a&gt;. Wallabako itself is
about to turn 10 years old.&lt;/p&gt;

&lt;p&gt;This time, I stopped using Wallabako because there&#39;s simply something
better out there. I have switched away from &lt;a href=&quot;https://wallabag.org/&quot;&gt;Wallabag&lt;/a&gt; to
&lt;a href=&quot;https://readeck.org/&quot;&gt;Readeck&lt;/a&gt;!&lt;/p&gt;

&lt;p&gt;And I&#39;m also tired of maintaining &quot;modern&quot; software. Most of the
recent commits on Wallabako are from &lt;a href=&quot;https://gitlab.com/renovate-bot-anarcat&quot;&gt;renovate-bot&lt;/a&gt;. This feels futile
and pointless. I guess it &lt;em&gt;must&lt;/em&gt; be done at some point, but it also
feels like we went wrong somewhere. Maybe &lt;a href=&quot;https://filippo.io/&quot;&gt;Filippo Valsorda&lt;/a&gt; is
right and one should &lt;a href=&quot;https://words.filippo.io/dependabot/&quot;&gt;turn dependabot off&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I did consider porting Wallabako to Readeck for a while, but there&#39;s a
&lt;a href=&quot;https://github.com/iceyear/readeck.koplugin&quot;&gt;perfectly fine Koreader plugin&lt;/a&gt; that I&#39;ve been pretty happy to
use. I was worried it would be slow (because the Wallabag plugin &lt;em&gt;is&lt;/em&gt;
slow), but it turns out that Readeck is fast enough that this doesn&#39;t
matter.&lt;/p&gt;

&lt;h1 id=&quot;moving-from-wallabag-to-readeck&quot;&gt;Moving from Wallabag to Readeck&lt;/h1&gt;

&lt;p&gt;Readeck is pretty fantastic: it&#39;s fast, it&#39;s lightweight, everything
Just Works. All sorts of concerns I had with Wallabag are just gone:
&lt;a href=&quot;https://github.com/wallabag/wallabag/issues/2800&quot;&gt;questionable authentication&lt;/a&gt;, &lt;a href=&quot;https://github.com/wallabag/wallabag/issues/2859&quot;&gt;questionable API&lt;/a&gt;, &lt;a href=&quot;https://github.com/wallabag/wallabag/issues/6532&quot;&gt;weird
bugs&lt;/a&gt;, mostly gone. I am still looking for &lt;a href=&quot;https://github.com/wallabag/wallabag/issues/1197&quot;&gt;multiple tags
filtering&lt;/a&gt; but I have a much better feeling about Readeck than
Wallabag: it&#39;s written in Golang and under active development.&lt;/p&gt;

&lt;p&gt;In any case, I don&#39;t want to throw shade at the Wallabag folks
either. They did &lt;a href=&quot;https://github.com/wallabag/wallabag/issues?q=involves%3Aanarcat&quot;&gt;solve most of the issues I raised with them&lt;/a&gt; and
even accepted &lt;a href=&quot;https://github.com/wallabag/wallabag/pull/7849&quot;&gt;my pull request&lt;/a&gt;. They have helped me collect
thousands of articles for a long time! It&#39;s just time to move on.&lt;/p&gt;

&lt;p&gt;The migration from Wallabag was impressively simple. The importer is
well-tuned, fast, and just works. I wrote about the import in &lt;a href=&quot;https://codeberg.org/readeck/readeck/issues/1119&quot;&gt;this
issue&lt;/a&gt;, but it took about 20 minutes to import essentially all
articles, and another 5 hours to refresh all the contents.&lt;/p&gt;

&lt;p&gt;There are minor issues with Readeck which I have filed (after asking!):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://codeberg.org/gollyhatch/eckard/issues/19&quot;&gt;add justified view for articles&lt;/a&gt; (Android app)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://codeberg.org/gollyhatch/eckard/issues/20&quot;&gt;more metadata in article display&lt;/a&gt; (Android app)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://codeberg.org/readeck/readeck/issues/1126&quot;&gt;show the number of articles in the label browser&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://codeberg.org/readeck/readeck/issues/73#issuecomment-11025251&quot;&gt;ignore duplicates&lt;/a&gt; (Readeck will happily add duplicates, whereas
Wallabag at least &lt;em&gt;tries&lt;/em&gt; to deduplicate articles -- but often fails)&lt;/li&gt;
&lt;/ul&gt;


&lt;p&gt;But overall I&#39;m happy and impressed with the result.&lt;/p&gt;

&lt;p&gt;I&#39;m also both happy and sad at letting go of my first (and only,
so far) Golang project. I loved writing in Go: it&#39;s a clean language,
fast to learn, and a beauty to write parallel code in (at the cost of
a rather obscure runtime).&lt;/p&gt;

&lt;p&gt;It would have been &lt;em&gt;much&lt;/em&gt; harder to write this in Python, but my
experience in Golang helped me think about how to write more parallel
code in Python, which is kind of cool.&lt;/p&gt;

&lt;p&gt;The &lt;a href=&quot;https://gitlab.com/anarcat/wallabako/&quot;&gt;GitLab project&lt;/a&gt; will remain publicly accessible, but archived,
for the foreseeable future. If you&#39;re interested in taking over
stewardship for this project, &lt;a href=&quot;https://anarc.at/contact/&quot;&gt;contact me&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Thanks Wallabag folks, it was a great ride!&lt;/p&gt; </description> 
	<pubDate>Fri, 06 Mar 2026 03:05:31 +0000</pubDate>

</item> 
<item>
	<title>Ian Jackson: Adopting tag2upload and modernising your Debian packaging</title>
	<guid>tag:dreamwidth.org,2009-05-21:377446:20851</guid>
	<link>https://diziet.dreamwidth.org/20851.html</link>
     <description>  &lt;div style=&quot;background-color: #fff; color: #000;&quot;&gt;
&lt;h1&gt;&lt;a name=&quot;introduction&quot;&gt;Introduction&lt;/a&gt;&lt;/h1&gt;
&lt;p&gt;&lt;a href=&quot;https://wiki.debian.org/tag2upload&quot;&gt;tag2upload&lt;/a&gt; allows authorised Debian contributors to upload to Debian simply by pushing a signed git tag to Debian’s gitlab instance, Salsa.
&lt;/p&gt;&lt;p&gt;We have recently &lt;a href=&quot;https://lists.debian.org/debian-devel-announce/2026/02/msg00002.html&quot;&gt;announced&lt;/a&gt; that tag2upload is, in our opinion, now very stable, and ready for general use by all Debian uploaders.
&lt;/p&gt;&lt;p&gt;tag2upload, as part of &lt;a href=&quot;https://diziet.dreamwidth.org/20436.html&quot;&gt;Debian’s git transition programme&lt;/a&gt;, is very flexible - it needs to support a large variety of maintainer practices. And it’s relatively unopinionated, wherever that’s possible. But, during the open beta, various contributors emailed us asking for Debian packaging git workflow advice and recommendations.
&lt;/p&gt;&lt;p&gt;This post is an attempt to give some more opinionated answers, and guide you through modernising your workflow.
&lt;/p&gt;&lt;p&gt;(This article is aimed squarely at Debian contributors. Much of it will make little sense to Debian outsiders.)
&lt;/p&gt;&lt;ul&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#why&quot;&gt;Why&lt;/a&gt;&lt;ul&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#ease-of-development&quot;&gt;Ease of development&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#dont-fear-a-learning-burden-instead-start-forgetting-all-that-nonsense&quot;&gt;Don’t fear a learning burden; instead, start forgetting all that nonsense&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#properly-publishing-the-source-code&quot;&gt;Properly publishing the source code&lt;/a&gt;
&lt;/li&gt;&lt;/ul&gt;

&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#adopting-tag2upload---the-minimal-change&quot;&gt;Adopting tag2upload - the minimal change&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#overhauling-your-workflow-using-advanced-git-first-tooling&quot;&gt;Overhauling your workflow, using advanced git-first tooling&lt;/a&gt;&lt;ul&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#assumptions&quot;&gt;Assumptions&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#topics-and-tooling&quot;&gt;Topics and tooling&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#choosing-the-git-branch-format&quot;&gt;Choosing the git branch format&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#determine-upstream-git-and-stop-using-upstream-tarballs&quot;&gt;Determine upstream git and stop using upstream tarballs&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#convert-the-git-branch&quot;&gt;Convert the git branch&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#change-the-source-format&quot;&gt;Change the source format&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#sort-out-the-documentation-and-metadata&quot;&gt;Sort out the documentation and metadata&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#configure-salsa-merge-requests&quot;&gt;Configure Salsa Merge Requests&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#set-up-salsa-ci-and-use-it-to-block-merges-of-bad-changes&quot;&gt;Set up Salsa CI, and use it to block merges of bad changes&lt;/a&gt;
&lt;/li&gt;&lt;/ul&gt;

&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#day-to-day-work&quot;&gt;Day-to-day work&lt;/a&gt;&lt;ul&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#making-changes-to-the-package&quot;&gt;Making changes to the package&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#test-build&quot;&gt;Test build&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#uploading-to-debian&quot;&gt;Uploading to Debian&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#uploading-a-new-package-to-debian&quot;&gt;Uploading a NEW package to Debian&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#new-upstream-version&quot;&gt;New upstream version&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#sponsorship&quot;&gt;Sponsorship&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#incorporating-an-nmu&quot;&gt;Incorporating an NMU&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#dfsg-filtering-handling-non-free-files&quot;&gt;DFSG filtering (handling non-free files)&lt;/a&gt;
&lt;/li&gt;&lt;/ul&gt;

&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#common-issues&quot;&gt;Common issues&lt;/a&gt;
&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#further-reading&quot;&gt;Further reading&lt;/a&gt;
&lt;/li&gt;&lt;/ul&gt;
&lt;a name=&quot;cutid1&quot;&gt;&lt;/a&gt;
&lt;h1&gt;&lt;a name=&quot;why&quot;&gt;Why&lt;/a&gt;&lt;/h1&gt;
&lt;h2&gt;&lt;a name=&quot;ease-of-development&quot;&gt;Ease of development&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;git offers a far superior development experience to patches and tarballs. Moving tasks from a tarballs-and-patches representation to a normal, git-first representation makes everything simpler.
&lt;/p&gt;&lt;p&gt;dgit and tag2upload automatically do many things that have to be done manually, or with separate commands, in dput-based upload workflows.
&lt;/p&gt;&lt;p&gt;They will also save you from a variety of common mistakes. For example, with tag2upload or dgit you cannot accidentally overwrite an NMU. These many safety catches mean that our software sometimes complains about things, or needs confirmation, when more primitive tooling just goes ahead. We think this is the right tradeoff: it’s part of the great care we take to avoid our software making messes. Software that has your back is very liberating for the user.
&lt;/p&gt;&lt;p&gt;tag2upload makes it possible to upload with very small amounts of data transfer, which is great in slow or unreliable network environments. The other week I did a git-debpush over mobile data while on a train in Switzerland; it completed in seconds.
&lt;/p&gt;&lt;p&gt;See the &lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#day-to-day-work&quot;&gt;Day-to-day work section&lt;/a&gt; below to see how simple your life could be.
&lt;/p&gt;&lt;h2&gt;&lt;a name=&quot;dont-fear-a-learning-burden-instead-start-forgetting-all-that-nonsense&quot;&gt;Don’t fear a learning burden; instead, start forgetting all that nonsense&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Most Debian contributors have spent months or years learning how to work with Debian’s tooling. You may reasonably fear that our software is yet more bizarre, janky, and mistake-prone stuff to learn.
&lt;/p&gt;&lt;p&gt;We promise (and our users tell us) that’s not how it is. We have spent a lot of effort on providing a good user experience. Our new git-first tooling, especially dgit and tag2upload, is much simpler to use than source-package-based tooling, despite being more capable.
&lt;/p&gt;&lt;p&gt;The idiosyncrasies and bugs of source packages, and of the legacy archive, have been relentlessly worked around and papered over by our thousands of lines of thoroughly-tested defensive code. You too can forget all those confusing details, like our users have! After using our systems for a while you won’t look back.
&lt;/p&gt;&lt;p&gt;And, you shouldn’t fear trying it out. dgit and tag2upload are unlikely to make a mess. If something is wrong (or even doubtful), they will typically detect it, and stop. This does mean that starting to use tag2upload or dgit can involve resolving anomalies that previous tooling ignored, or passing additional options to reassure the system about your intentions. So admittedly it &lt;em&gt;isn’t&lt;/em&gt; always trivial to get your first push to succeed.
&lt;/p&gt;&lt;h2&gt;&lt;a name=&quot;properly-publishing-the-source-code&quot;&gt;Properly publishing the source code&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;One of Debian’s foundational principles is that we publish the source code.
&lt;/p&gt;&lt;p&gt;Nowadays, the vast majority of us, and of our upstreams, are using git. We are doing this because git makes our life so much easier.
&lt;/p&gt;&lt;p&gt;But, without tag2upload or dgit, we aren’t &lt;em&gt;properly&lt;/em&gt; publishing our work! Yes, we typically put our git branch on Salsa, and point &lt;code&gt;Vcs-Git&lt;/code&gt; at it. However:
&lt;/p&gt;&lt;ul&gt;&lt;li&gt;The format of git branches on Salsa is not standardised. They might be patches-unapplied, patches-applied, bare &lt;code&gt;debian/&lt;/code&gt;, or &lt;a href=&quot;https://wiki.debian.org/GitPackagingSurvey&quot;&gt;something even stranger&lt;/a&gt;.
&lt;/li&gt;&lt;li&gt;There is no guarantee that the DEP-14 &lt;code&gt;debian/1.2.3-7&lt;/code&gt; tag on Salsa corresponds precisely to what was actually uploaded. dput-based tooling (such as &lt;code&gt;gbp buildpackage&lt;/code&gt;) doesn’t cross-check the .dsc against git.
&lt;/li&gt;&lt;li&gt;There is no guarantee that the presence of a DEP-14 tag even means that that version of the package is in the archive.
&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;This means that the git repositories on Salsa cannot be used by anyone who needs things that are &lt;em&gt;systematic&lt;/em&gt; and &lt;em&gt;always correct&lt;/em&gt;. They are OK for expert humans, but they are awkward (even &lt;a href=&quot;https://diziet.dreamwidth.org/9556.html&quot;&gt;hazardous&lt;/a&gt;) for Debian novices, and you cannot use them in automation. The real test is: could you use &lt;code&gt;Vcs-Git&lt;/code&gt; and Salsa to build a Debian derivative? You could not.
&lt;/p&gt;&lt;p&gt;tag2upload and dgit &lt;em&gt;do&lt;/em&gt; solve this problem. When you upload, they:
&lt;/p&gt;&lt;ol type=&quot;1&quot;&gt;&lt;li&gt;Make a canonical-form (patches-applied) derivative of your git branch;
&lt;/li&gt;&lt;li&gt;Ensure that there is a well-defined correspondence between the git tree and the source package;
&lt;/li&gt;&lt;li&gt;Publish both the DEP-14 tag and a canonical-form &lt;code&gt;archive/debian/1.2.3-7&lt;/code&gt; tag to a single central git repository, &lt;a href=&quot;https://browse.dgit.debian.org&quot;&gt;&lt;code&gt;*.dgit.debian.org&lt;/code&gt;&lt;/a&gt;;
&lt;/li&gt;&lt;li&gt;Record the git information in the &lt;code&gt;Dgit&lt;/code&gt; field in &lt;code&gt;.dsc&lt;/code&gt; so that clients can tell (using the &lt;a href=&quot;https://ftp-team.pages.debian.net/dak/docs/generated/dakweb.html#module-dakweb&quot;&gt;ftpmaster API&lt;/a&gt;) that this was a git-based upload, what the corresponding git objects are, and where to find them.
&lt;/li&gt;&lt;/ol&gt;
&lt;p&gt;This dependably conveys your git history to users and downstreams, in a standard, systematic and discoverable way. tag2upload and dgit are the only systems which achieve this.
&lt;/p&gt;&lt;p&gt;(The client is &lt;code&gt;dgit clone&lt;/code&gt;, as advertised in e.g. &lt;a href=&quot;https://manpages.debian.org/testing/dgit/dgit-user.7.en.html&quot;&gt;dgit-user(7)&lt;/a&gt;. For dput-based uploads, it falls back to importing the source package.)
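&lt;/p&gt;&lt;p&gt;So, for example, anyone can fetch exactly what was uploaded (a sketch; &lt;code&gt;hello&lt;/code&gt; stands in for any source package name):
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;dgit clone hello   # the exact uploaded source, with git history where available
cd hello&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;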
&lt;/p&gt;&lt;h1&gt;&lt;a name=&quot;adopting-tag2upload---the-minimal-change&quot;&gt;Adopting tag2upload - the minimal change&lt;/a&gt;&lt;/h1&gt;
&lt;p&gt;tag2upload is a substantial incremental improvement to many existing workflows. git-debpush is a drop-in replacement for building, signing, and uploading the source package.
&lt;/p&gt;&lt;p&gt;So, you can just adopt it &lt;em&gt;without&lt;/em&gt; completely overhauling your packaging practices. You and your co-maintainers can even mix-and-match tag2upload, dgit, and traditional approaches, for the same package.
&lt;/p&gt;&lt;p&gt;Start with &lt;a href=&quot;https://wiki.debian.org/tag2upload&quot;&gt;the wiki page&lt;/a&gt; and &lt;a href=&quot;https://manpages.debian.org/testing/git-debpush/git-debpush.1.en.html&quot;&gt;git-debpush(1)&lt;/a&gt; (ideally from forky aka testing).
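&lt;/p&gt;&lt;p&gt;Once your branch and tags are in order, an upload can be as simple as this (a sketch; depending on your branch format you may need options such as &lt;code&gt;--gbp&lt;/code&gt; or &lt;code&gt;--baredebian&lt;/code&gt; - see the manpage):
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git debpush   # makes the signed tag and pushes it; the service does the rest&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;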
&lt;/p&gt;&lt;p&gt;&lt;strong&gt;You &lt;em&gt;don’t&lt;/em&gt; need to do any of the other things recommended in this article.&lt;/strong&gt;
&lt;/p&gt;&lt;h1&gt;&lt;a name=&quot;overhauling-your-workflow-using-advanced-git-first-tooling&quot;&gt;Overhauling your workflow, using advanced git-first tooling&lt;/a&gt;&lt;/h1&gt;
&lt;p&gt;The rest of this article is a guide to adopting the best and most advanced git-based tooling for Debian packaging.
&lt;/p&gt;&lt;h2&gt;&lt;a name=&quot;assumptions&quot;&gt;Assumptions&lt;/a&gt;&lt;/h2&gt;
&lt;ul&gt;&lt;li&gt;&lt;p&gt;Your current approach uses the “patches-unapplied” git branch format used with &lt;code&gt;gbp pq&lt;/code&gt; and/or &lt;code&gt;quilt&lt;/code&gt;, and often used with &lt;code&gt;git-buildpackage&lt;/code&gt;. You previously used &lt;code&gt;gbp import-orig&lt;/code&gt;.

&lt;/p&gt;&lt;/li&gt;&lt;li&gt;&lt;p&gt;You are fluent with git, and know how to use Merge Requests on gitlab (Salsa). You have your &lt;code&gt;origin&lt;/code&gt; remote set to Salsa.

&lt;/p&gt;&lt;/li&gt;&lt;li&gt;&lt;p&gt;Your main Debian branch name on Salsa is &lt;code&gt;master&lt;/code&gt;. Personally I &lt;a href=&quot;https://datatracker.ietf.org/doc/statement-iab-statement-on-inclusive-language-in-iab-stream-documents/&quot;&gt;think we should use &lt;code&gt;main&lt;/code&gt;&lt;/a&gt; but changing your main branch name is outside the scope of this article.

&lt;/p&gt;&lt;/li&gt;&lt;li&gt;&lt;p&gt;You have enough familiarity with Debian packaging, including concepts like source and binary packages, and NEW review.

&lt;/p&gt;&lt;/li&gt;&lt;li&gt;&lt;p&gt;Your co-maintainers are also adopting the new approach.

&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;tag2upload and dgit (and git-debrebase) are flexible tools and can help with many other scenarios too, and you can often mix-and-match different approaches. But, explaining every possibility would make this post far too confusing.
&lt;/p&gt;&lt;h2&gt;&lt;a name=&quot;topics-and-tooling&quot;&gt;Topics and tooling&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;This article will guide you in adopting:
&lt;/p&gt;&lt;ul&gt;&lt;li&gt;tag2upload
&lt;/li&gt;&lt;li&gt;Patches-applied git branch for your packaging
&lt;/li&gt;&lt;li&gt;Either plain git merge or git-debrebase
&lt;/li&gt;&lt;li&gt;dgit when a with-binaries upload is needed (NEW)
&lt;/li&gt;&lt;li&gt;git-based sponsorship
&lt;/li&gt;&lt;li&gt;Salsa (gitlab), including Debian Salsa CI
&lt;/li&gt;&lt;/ul&gt;
&lt;h2&gt;&lt;a name=&quot;choosing-the-git-branch-format&quot;&gt;Choosing the git branch format&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;In Debian we need to be able to modify the upstream-provided source code. Those modifications are the &lt;strong&gt;Debian delta&lt;/strong&gt;. We need to somehow represent it in git.
&lt;/p&gt;&lt;p&gt;We recommend storing the delta &lt;em&gt;as git commits to those upstream files&lt;/em&gt;, by picking one of the following two approaches.
&lt;/p&gt;&lt;blockquote style=&quot;background-color: #eee; color: #222; font-style: italic;&quot;&gt;
&lt;h6 style=&quot;margin-bottom: 0;&quot;&gt;rationale&lt;/h6&gt;

&lt;p&gt;Much traditional Debian tooling like &lt;code&gt;quilt&lt;/code&gt; and &lt;code&gt;gbp pq&lt;/code&gt; uses the “patches-unapplied” branch format, which stores the delta as patch files in &lt;code&gt;debian/patches/&lt;/code&gt;, in a git tree full of unmodified upstream files. This is clumsy to work with, and can even be an &lt;a href=&quot;https://diziet.dreamwidth.org/9556.html&quot;&gt;alarming beartrap&lt;/a&gt; for Debian outsiders.
&lt;/p&gt;&lt;/blockquote&gt;

&lt;div style=&quot;background-color: #ddf; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git merge&lt;/h5&gt;

&lt;p&gt;&lt;strong&gt;Option 1: simply use git, directly, including git merge.&lt;/strong&gt;
&lt;/p&gt;&lt;p&gt;Just make changes directly to upstream files on your Debian branch, when necessary. Use plain &lt;code&gt;git merge&lt;/code&gt; when merging from upstream.
&lt;/p&gt;&lt;p&gt;This is appropriate if your package has no or very few upstream changes. It is a good approach if the Debian maintainers and upstream maintainers work very closely, so that any needed changes for Debian are upstreamed quickly, and any desired behavioural differences can be arranged by configuration controlled from within &lt;code&gt;debian/&lt;/code&gt;.
&lt;/p&gt;&lt;p&gt;This is the approach documented more fully in our workflow tutorial &lt;a href=&quot;https://manpages.debian.org/testing/dgit/dgit-maint-merge.7.en.html&quot;&gt;dgit-maint-merge(7)&lt;/a&gt;.
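&lt;/p&gt;&lt;p&gt;With this option, taking a new upstream release is just ordinary git. A sketch (assuming an &lt;code&gt;upstream&lt;/code&gt; remote and a &lt;code&gt;v1.2.4&lt;/code&gt; tag):
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git fetch upstream
git merge v1.2.4
dch -v 1.2.4-1 &quot;New upstream release.&quot;
git commit debian/changelog -m &quot;New upstream release&quot;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;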
&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;/div&gt;
&lt;div style=&quot;background-color: #ffa; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git-debrebase&lt;/h5&gt;

&lt;p&gt;&lt;strong&gt;Option 2: Adopt git-debrebase.&lt;/strong&gt;
&lt;/p&gt;&lt;p&gt;git-debrebase helps maintain your delta as a linear series of commits (very like a “topic branch” in git terminology). The delta can be reorganised, edited, and rebased. git-debrebase is designed to help you carry a significant and complicated delta series.
&lt;/p&gt;&lt;p&gt;The older versions of the Debian delta are preserved in the history. git-debrebase creates extra merges to make a fast-forwarding history out of the successive versions of the delta queue branch.
&lt;/p&gt;&lt;p&gt;This is the approach documented more fully in our workflow tutorial &lt;a href=&quot;https://manpages.debian.org/testing/dgit/dgit-maint-debrebase.7.en.html&quot;&gt;dgit-maint-debrebase(7)&lt;/a&gt;.
&lt;/p&gt;&lt;p&gt;Examples of complex packages using this approach include &lt;a href=&quot;https://salsa.debian.org/xen-team/debian-xen&quot;&gt;src:xen&lt;/a&gt; and &lt;a href=&quot;https://salsa.debian.org/common-lisp-team/sbcl&quot;&gt;src:sbcl&lt;/a&gt;.
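&lt;/p&gt;&lt;p&gt;Day to day, editing the delta queue looks like this (a sketch; dgit-maint-debrebase(7) has the full story):
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git debrebase -i          # edit the delta series, much like git rebase -i
git debrebase conclude    # re-stitch the branch, ready for review or upload&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;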
&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;/div&gt;
&lt;h5&gt;&lt;/h5&gt;

&lt;h2&gt;&lt;a name=&quot;determine-upstream-git-and-stop-using-upstream-tarballs&quot;&gt;Determine upstream git and stop using upstream tarballs&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;We recommend using upstream git, only and directly. You should ignore upstream tarballs completely.
&lt;/p&gt;&lt;blockquote style=&quot;background-color: #eee; color: #222; font-style: italic;&quot;&gt;
&lt;h6 style=&quot;margin-bottom: 0;&quot;&gt;rationale&lt;/h6&gt;

&lt;p&gt;Many maintainers have been importing upstream tarballs into git, for example by using &lt;code&gt;gbp import-orig&lt;/code&gt;. But in reality the upstream tarball is an intermediate build product, not (just) source code. Using tarballs rather than git exposes us to additional supply chain attacks; indeed, the key activation part of the xz backdoor attack was hidden only in the tarball!
&lt;/p&gt;&lt;p&gt;git offers better traceability than so-called “pristine” upstream tarballs. (The word “pristine” is even a &lt;a href=&quot;https://joeyh.name/blog/entry/upstream_git_repositories/&quot;&gt;joke&lt;/a&gt; by the author of pristine-tar!)
&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;/blockquote&gt;

&lt;p&gt;First, establish which upstream git tag corresponds to the version currently in Debian. For the sake of readability, I’m going to pretend that upstream version is &lt;code&gt;1.2.3&lt;/code&gt;, and that upstream tagged it &lt;code&gt;v1.2.3&lt;/code&gt;.
&lt;/p&gt;&lt;p&gt;Edit &lt;code&gt;debian/watch&lt;/code&gt; to contain something like this:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;version=4
opts=&quot;mode=git&quot; https://codeberg.org/team/package refs/tags/v(\d\S*)&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;You may need to adjust the regexp, depending on your upstream’s tag name convention. If &lt;code&gt;debian/watch&lt;/code&gt; had a &lt;code&gt;files-excluded&lt;/code&gt;, you’ll need to make a &lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#dfsg-filtering-handling-non-free-files&quot;&gt;filtered version of upstream git&lt;/a&gt;.
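&lt;/p&gt;&lt;p&gt;You can check that the new watch file sees upstream’s tags, without downloading anything:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;uscan --no-download --verbose&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;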
&lt;/p&gt;&lt;div style=&quot;background-color: #ffa; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git-debrebase&lt;/h5&gt;

&lt;p&gt;From now on we’ll generate our own .orig tarballs directly from git.
&lt;/p&gt;&lt;blockquote style=&quot;background-color: #ee9; color: #222; font-style: italic;&quot;&gt;
&lt;h6 style=&quot;margin-bottom: 0;&quot;&gt;rationale&lt;/h6&gt;

&lt;p&gt;We need &lt;em&gt;some&lt;/em&gt; “upstream tarball” for the &lt;code&gt;3.0 (quilt)&lt;/code&gt; source format to work with. It needs to correspond to the git commit we’re using as our upstream. We &lt;em&gt;don’t&lt;/em&gt; need or want to use a tarball from upstream for this. The &lt;code&gt;.orig&lt;/code&gt; is just needed so a nice legacy Debian source package (&lt;code&gt;.dsc&lt;/code&gt;) can be generated.
&lt;/p&gt;&lt;/blockquote&gt;

&lt;p&gt;Probably, the current &lt;code&gt;.orig&lt;/code&gt; in the Debian archive is an upstream tarball, which may be different to the output of git-archive and possibly even have different contents to what’s in git. The legacy archive has trouble with differing &lt;code&gt;.orig&lt;/code&gt;s for the “same upstream version”.
&lt;/p&gt;&lt;p&gt;So we must — until the next upstream release — change our idea of the upstream version number. We’re going to add &lt;code&gt;+git&lt;/code&gt; to Debian’s idea of the upstream version. Manually make a tag with that name:
&lt;/p&gt;&lt;pre class=&quot;shell-script&quot;&gt;&lt;code&gt;git tag -m &quot;Compatibility tag for orig transition&quot; v1.2.3+git v1.2.3~0
git push origin v1.2.3+git&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;If you are doing the packaging overhaul at the same time as a new upstream version, you can skip this part.
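&lt;/p&gt;&lt;p&gt;If you need a matching &lt;code&gt;.orig&lt;/code&gt; locally (say, for a test build), it can be made straight from git; one way, using devscripts’ git-deborig, is:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git deborig   # makes ../&lt;src&gt;_1.2.3+git.orig.tar.xz from the upstream tag&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;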
&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;/div&gt;
&lt;h5&gt;&lt;/h5&gt;

&lt;h2&gt;&lt;a name=&quot;convert-the-git-branch&quot;&gt;Convert the git branch&lt;/a&gt;&lt;/h2&gt;
&lt;div style=&quot;background-color: #ddf; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git merge&lt;/h5&gt;

&lt;p&gt;Prepare a new branch on top of upstream git, containing what we want:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git branch -f old-master         # make a note of the old git representation
git reset --hard v1.2.3          # go back to the real upstream git tag
git checkout old-master -- debian  # take debian/* from old-master
git commit -m &quot;Re-import Debian packaging on top of upstream git&quot;
git merge --allow-unrelated-histories -s ours -m &quot;Make fast forward from tarball-based history&quot; old-master
git branch -d old-master         # it&#39;s incorporated in our history now&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;&lt;strong&gt;If there are any patches, manually apply them&lt;/strong&gt; to your &lt;code&gt;master&lt;/code&gt; branch with &lt;code&gt;git am&lt;/code&gt;, and delete the patch files (&lt;code&gt;git rm -r debian/patches&lt;/code&gt;, and commit). (If you’ve chosen this workflow, there should be hardly any patches.)
&lt;/p&gt;&lt;blockquote style=&quot;background-color: #cce; color: #222; font-style: italic;&quot;&gt;
&lt;h6 style=&quot;margin-bottom: 0;&quot;&gt;rationale&lt;/h6&gt;

&lt;p&gt;These are some pretty nasty git runes, indeed. They’re needed because we want to restart our Debian packaging on top of a possibly quite different notion of what the upstream is.
&lt;/p&gt;&lt;/blockquote&gt;

&lt;p&gt;&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;/div&gt;
&lt;div style=&quot;background-color: #ffa; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git-debrebase&lt;/h5&gt;

&lt;p&gt;Convert the branch to git-debrebase format and rebase onto the upstream git:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git-debrebase -fdiverged convert-from-gbp upstream/1.2.3
git-debrebase -fdiverged -fupstream-not-ff new-upstream 1.2.3+git&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;If you had patches which patched generated files which are present only in the upstream tarball, and not in upstream git, you will encounter rebase conflicts. You can drop hunks editing those files, since those files are no longer going to be part of your view of the upstream source code at all.
&lt;/p&gt;&lt;blockquote style=&quot;background-color: #ee9; color: #222; font-style: italic;&quot;&gt;
&lt;h6 style=&quot;margin-bottom: 0;&quot;&gt;rationale&lt;/h6&gt;

&lt;p&gt;The force option &lt;code&gt;-fupstream-not-ff&lt;/code&gt; will be needed this one time because your existing Debian packaging history is (probably) not based directly on the upstream history. &lt;code&gt;-fdiverged&lt;/code&gt; may be needed because git-debrebase might spot that your branch is not based on dgit-ish git history.
&lt;/p&gt;&lt;/blockquote&gt;

&lt;p&gt;&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;/div&gt;
&lt;h5&gt;&lt;/h5&gt;

&lt;p&gt;Manually make your history fast forward from the git import of your previous upload.
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;dgit fetch
git show dgit/dgit/sid:debian/changelog
# check that you have the same version number
git merge -s ours --allow-unrelated-histories -m &#39;Declare fast forward from pre-git-based history&#39; dgit/dgit/sid&lt;/code&gt;&lt;/pre&gt;&lt;h2&gt;&lt;a name=&quot;change-the-source-format&quot;&gt;Change the source format&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Delete any existing &lt;code&gt;debian/source/options&lt;/code&gt; and/or &lt;code&gt;debian/source/local-options&lt;/code&gt;.
&lt;/p&gt;&lt;div style=&quot;background-color: #ddf; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git merge&lt;/h5&gt;

&lt;p&gt;Change &lt;code&gt;debian/source/format&lt;/code&gt; to &lt;code&gt;1.0&lt;/code&gt;. Add &lt;code&gt;debian/source/options&lt;/code&gt; containing &lt;code&gt;-sn&lt;/code&gt;.
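&lt;/p&gt;&lt;p&gt;In shell terms, something like this (printf rather than echo, so &lt;code&gt;-sn&lt;/code&gt; isn’t eaten as an option):
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;printf &#39;%s\n&#39; 1.0 &gt;debian/source/format
printf &#39;%s\n&#39; -sn &gt;debian/source/options
git add debian/source &amp;&amp; git commit -m &quot;Switch to 1.0 native source format&quot;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;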
&lt;/p&gt;&lt;blockquote style=&quot;background-color: #cce; color: #222; font-style: italic;&quot;&gt;
&lt;h6 style=&quot;margin-bottom: 0;&quot;&gt;rationale&lt;/h6&gt;

&lt;p&gt;We are using the “1.0 native” source format. This is the simplest possible source format - just a tarball. We would prefer “3.0 (native)”, which has some advantages, but dpkg-source between 2013 (wheezy) and 2025 (trixie) inclusive &lt;a href=&quot;https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=737634#107&quot;&gt;unjustifiably rejects&lt;/a&gt; this configuration.
&lt;/p&gt;&lt;p&gt;You may receive bug reports from over-zealous folks complaining about the use of the 1.0 source format. You should close such reports, with a reference to this article and to &lt;a href=&quot;https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1106402&quot;&gt;#1106402&lt;/a&gt;.
&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;/blockquote&gt;

&lt;p&gt;&lt;/p&gt;&lt;/div&gt;
&lt;div style=&quot;background-color: #ffa; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git-debrebase&lt;/h5&gt;

&lt;p&gt;Ensure that &lt;code&gt;debian/source/format&lt;/code&gt; contains &lt;code&gt;3.0 (quilt)&lt;/code&gt;.
&lt;/p&gt;&lt;/div&gt;
&lt;h5&gt;&lt;/h5&gt;

&lt;p&gt;Now you are ready to do a &lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#test-build&quot;&gt;local test build&lt;/a&gt;.
&lt;/p&gt;&lt;h2&gt;&lt;a name=&quot;sort-out-the-documentation-and-metadata&quot;&gt;Sort out the documentation and metadata&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Edit &lt;code&gt;README.source&lt;/code&gt; to at least mention dgit-maint-merge(7) or dgit-maint-debrebase(7), and to tell people not to try to edit or create anything in &lt;code&gt;debian/patches/&lt;/code&gt;. Consider saying that uploads should be done via dgit or tag2upload.
&lt;/p&gt;&lt;p&gt;Check that your &lt;code&gt;Vcs-Git&lt;/code&gt; is correct in &lt;code&gt;debian/control&lt;/code&gt;. Consider deleting or pruning &lt;code&gt;debian/gbp.conf&lt;/code&gt;, since it isn’t used by dgit, tag2upload, or git-debrebase.
&lt;/p&gt;&lt;div style=&quot;background-color: #ddf; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git merge&lt;/h5&gt;

&lt;p&gt;Add a note to &lt;code&gt;debian/changelog&lt;/code&gt; about the git packaging change.
&lt;/p&gt;&lt;/div&gt;
&lt;div style=&quot;background-color: #ffa; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git-debrebase&lt;/h5&gt;

&lt;p&gt;&lt;code&gt;git-debrebase new-upstream&lt;/code&gt; will have added a “new upstream version” stanza to &lt;code&gt;debian/changelog&lt;/code&gt;. Edit that so that it instead describes the packaging change. (Don’t remove the &lt;code&gt;+git&lt;/code&gt; from the upstream version number there!)
&lt;/p&gt;&lt;/div&gt;
&lt;h5&gt;&lt;/h5&gt;

&lt;h2&gt;&lt;a name=&quot;configure-salsa-merge-requests&quot;&gt;Configure Salsa Merge Requests&lt;/a&gt;&lt;/h2&gt;
&lt;div style=&quot;background-color: #ffa; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git-debrebase&lt;/h5&gt;

&lt;p&gt;In “Settings” / “Merge requests”, change “Squash commits when merging” to “Do not allow”.
&lt;/p&gt;&lt;blockquote style=&quot;background-color: #ee9; color: #222; font-style: italic;&quot;&gt;
&lt;h6 style=&quot;margin-bottom: 0;&quot;&gt;rationale&lt;/h6&gt;

&lt;p&gt;Squashing could destroy your carefully-curated delta queue. It would also disrupt git-debrebase’s git branch structure.
&lt;/p&gt;&lt;/blockquote&gt;

&lt;p&gt;&lt;/p&gt;&lt;/div&gt;
&lt;h5&gt;&lt;/h5&gt;

&lt;h2&gt;&lt;a name=&quot;set-up-salsa-ci-and-use-it-to-block-merges-of-bad-changes&quot;&gt;Set up Salsa CI, and use it to block merges of bad changes&lt;/a&gt;&lt;/h2&gt;
&lt;h3&gt;&lt;a name=&quot;caveat---the-tradeoff&quot;&gt;Caveat - the tradeoff&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;gitlab is a giant pile of enterprise crap. It &lt;a href=&quot;https://gitlab.com/gitlab-org/gitlab/-/issues/429516&quot;&gt;is&lt;/a&gt; &lt;a href=&quot;https://gitlab.com/gitlab-org/gitlab/-/issues/472646&quot;&gt;full&lt;/a&gt; &lt;a href=&quot;https://gitlab.com/gitlab-org/gitlab/-/issues/581752&quot;&gt;of&lt;/a&gt; &lt;a href=&quot;https://gitlab.com/gitlab-org/gitlab/-/issues/581897&quot;&gt;startling&lt;/a&gt; &lt;a href=&quot;https://gitlab.com/gitlab-org/gitlab/-/issues/217231&quot;&gt;bugs&lt;/a&gt;, many of which reveal a fundamentally broken design. It is only barely Free Software in practice for Debian (in the sense that we are very reluctant to try to modify it). The constant-churn development approach and open-core business model are &lt;a href=&quot;https://mako.cc/writing/hill-free_tools.html&quot;&gt;serious problems&lt;/a&gt;. It’s very slow (and resource-intensive). It can be depressingly unreliable. That Salsa works as well as it does is a testament to the dedication of the Debian Salsa team (and those who support them, including DSA).
&lt;/p&gt;&lt;p&gt;However, I have found that despite these problems, Salsa CI is well worth the trouble. Yes, there are frustrating days when work is blocked because gitlab CI is broken and/or one has to keep mashing “Retry”. But, the upside is no longer having to remember to run tests, track which of my multiple dev branches tests have passed on, and so on. Automatic tests on Merge Requests are a great way of reducing maintainer review burden for external contributions, and helping uphold quality norms within a team. They’re a great boon for the lazy solo programmer.
&lt;/p&gt;&lt;p&gt;The bottom line is that I absolutely love it when the computer thoroughly checks my work. This is tremendously freeing, precisely at the point when one most needs it — deep in the code. If the price is to occasionally be blocked by a confused (or broken) computer, so be it.
&lt;/p&gt;&lt;h3&gt;&lt;a name=&quot;setup-procedure&quot;&gt;Setup procedure&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;Create &lt;code&gt;debian/salsa-ci.yml&lt;/code&gt; containing
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;include:
  - https://salsa.debian.org/salsa-ci-team/pipeline/raw/master/recipes/debian.yml&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;In your Salsa repository, under “Settings” / “CI/CD”, expand “General Pipelines” and set “CI/CD configuration file” to &lt;code&gt;debian/salsa-ci.yml&lt;/code&gt;.
&lt;/p&gt;&lt;blockquote style=&quot;background-color: #eee; color: #222; font-style: italic;&quot;&gt;
&lt;h6 style=&quot;margin-bottom: 0;&quot;&gt;rationale&lt;/h6&gt;

&lt;p&gt;Your project may have an upstream CI config in &lt;code&gt;.gitlab-ci.yml&lt;/code&gt;. But you probably want to run the Debian Salsa CI jobs.
&lt;/p&gt;&lt;p&gt;You can add various extra configuration to &lt;code&gt;debian/salsa-ci.yml&lt;/code&gt; to customise it. Consult the &lt;a href=&quot;https://salsa.debian.org/salsa-ci-team/pipeline/-/blob/master/README.md?ref_type=heads&quot;&gt;Salsa CI docs&lt;/a&gt;.
&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;/blockquote&gt;

&lt;div style=&quot;background-color: #ffa; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git-debrebase&lt;/h5&gt;

&lt;p&gt;Add to &lt;code&gt;debian/salsa-ci.yml&lt;/code&gt;:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;.git-debrebase-prepare: &amp;amp;git-debrebase-prepare
  # install the tools we&#39;ll need
  - apt-get update
  - apt-get --yes install git-debrebase git-debpush
  # git-debrebase needs git user setup
  - git config user.email &quot;salsa-ci@invalid.invalid&quot;
  - git config user.name &quot;salsa-ci&quot;
  # run git-debrebase make-patches
  # https://salsa.debian.org/salsa-ci-team/pipeline/-/issues/371
  - git-debrebase --force
  - git-debrebase --noop-ok make-patches
  # make an orig tarball using the upstream tag, not a gbp upstream/ tag
  # https://salsa.debian.org/salsa-ci-team/pipeline/-/issues/541
  - git-deborig

.build-definition: &amp;amp;build-definition
  extends: .build-definition-common
  before_script: *git-debrebase-prepare

build source:
  extends: .build-source-only
  before_script: *git-debrebase-prepare

variables:
  # disable shallow cloning of git repository. This is needed for git-debrebase
  GIT_DEPTH: 0&lt;/code&gt;&lt;/pre&gt;&lt;blockquote style=&quot;background-color: #ee9; color: #222; font-style: italic;&quot;&gt;
&lt;h6 style=&quot;margin-bottom: 0;&quot;&gt;rationale&lt;/h6&gt;

&lt;p&gt;Unfortunately the Salsa CI pipeline currently lacks proper support for git-debrebase (&lt;a href=&quot;https://salsa.debian.org/salsa-ci-team/pipeline/-/issues/371&quot;&gt;salsa-ci#371&lt;/a&gt;) and has trouble directly using upstream git for orig tarballs (&lt;a href=&quot;https://salsa.debian.org/salsa-ci-team/pipeline/-/issues/541&quot;&gt;salsa-ci#541&lt;/a&gt;).
&lt;/p&gt;&lt;p&gt;These runes were based on those &lt;a href=&quot;https://salsa.debian.org/xen-team/debian-xen/-/blob/master/debian/salsa-ci.yml?ref_type=heads&quot;&gt;in the Xen package&lt;/a&gt;. You should subscribe to the tickets #371 and #541 so that you can replace the clone-and-hack when proper support is merged.
&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;/blockquote&gt;

&lt;p&gt;&lt;/p&gt;&lt;/div&gt;
&lt;h5&gt;&lt;/h5&gt;

&lt;p&gt;Push this to salsa and make the CI pass.
&lt;/p&gt;&lt;p&gt;If you configured the pipeline filename after your last push, you will need to explicitly start the first CI run. That’s in “Pipelines”: press “New pipeline” in the top right. The defaults will very probably be correct.
&lt;/p&gt;&lt;h3&gt;&lt;a name=&quot;block-untested-pushes-preventing-regressions&quot;&gt;Block untested pushes, preventing regressions&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;In your project on Salsa, go into “Settings” / “Repository”. In the section “Branch rules”, use “Add branch rule”. Select the branch &lt;code&gt;master&lt;/code&gt;. Set “Allowed to merge” to “Maintainers”. Set “Allowed to push and merge” to “No one”. Leave “Allow force push” disabled.
&lt;/p&gt;&lt;p&gt;This means that the only way to land &lt;em&gt;anything&lt;/em&gt; on your mainline is via a Merge Request. When you make a Merge Request, gitlab will offer “Set to auto-merge”. Use that.
&lt;/p&gt;&lt;p&gt;gitlab won’t normally merge an MR unless CI passes, although you can override this on a per-MR basis if you need to.
&lt;/p&gt;&lt;p&gt;(Sometimes, immediately after creating a merge request in gitlab, you will see a plain “Merge” button. &lt;a href=&quot;https://gitlab.com/gitlab-org/gitlab/-/issues/429516&quot;&gt;This is a bug.&lt;/a&gt; Don’t press that. Reload the page so that “Set to auto-merge” appears.)
&lt;/p&gt;&lt;h3&gt;&lt;a name=&quot;autopkgtests&quot;&gt;autopkgtests&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;Ideally, your package would have meaningful autopkgtests (DEP-8 tests). This makes Salsa CI more useful for you, and also helps detect and defend you against regressions in your dependencies.
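&lt;/p&gt;&lt;p&gt;As a minimal sketch (the test name and the &lt;code&gt;mypackage&lt;/code&gt; command are placeholders for whatever your package actually ships):
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;# debian/tests/control
Tests: smoke
Depends: @

# debian/tests/smoke (executable)
#!/bin/sh
set -e
mypackage --version&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;&lt;code&gt;Depends: @&lt;/code&gt; means the test runs against all the binary packages built from your source.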
&lt;/p&gt;&lt;p&gt;The &lt;a href=&quot;https://ci.debian.net/doc/&quot;&gt;Debian CI docs&lt;/a&gt; are a good starting point. In-depth discussion of writing autopkgtests is beyond the scope of this article.
&lt;/p&gt;&lt;h1&gt;&lt;a name=&quot;day-to-day-work&quot;&gt;Day-to-day work&lt;/a&gt;&lt;/h1&gt;
&lt;p&gt;With this capable tooling, most tasks are much easier.
&lt;/p&gt;&lt;h2&gt;&lt;a name=&quot;making-changes-to-the-package&quot;&gt;Making changes to the package&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Make all changes via a Salsa Merge Request. So start by making a branch that will become the MR branch.
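&lt;/p&gt;&lt;p&gt;For example (the branch name here is just illustrative):
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git checkout master
git pull --ff-only
git checkout -b fix-build-failure&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;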
&lt;/p&gt;&lt;p&gt;On your MR branch you can freely edit every file. This includes upstream files, and files in &lt;code&gt;debian/&lt;/code&gt;.
&lt;/p&gt;&lt;p&gt;For example, you can:
&lt;/p&gt;&lt;ul&gt;&lt;li&gt;Make changes with your editor and commit them.
&lt;/li&gt;&lt;li&gt;&lt;code&gt;git cherry-pick&lt;/code&gt; an upstream commit.
&lt;/li&gt;&lt;li&gt;&lt;code&gt;git am&lt;/code&gt; a patch from a mailing list or from the Debian Bug System.
&lt;/li&gt;&lt;li&gt;&lt;code&gt;git revert&lt;/code&gt; an earlier commit, even an upstream one.
&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;When you have a working state of things, tidy up your git branch:
&lt;/p&gt;&lt;div style=&quot;background-color: #ddf; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git merge&lt;/h5&gt;

&lt;p&gt;Use &lt;code&gt;git-rebase&lt;/code&gt; to squash/edit/combine/reorder commits.
&lt;/p&gt;&lt;/div&gt;
&lt;div style=&quot;background-color: #ffa; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git-debrebase&lt;/h5&gt;

&lt;p&gt;Use &lt;code&gt;git-debrebase -i&lt;/code&gt; to squash/edit/combine/reorder commits. When you are happy, run &lt;code&gt;git-debrebase conclude&lt;/code&gt;.
&lt;/p&gt;&lt;p&gt;&lt;strong&gt;Do not edit debian/patches/&lt;/strong&gt;. With git-debrebase, this is purely an output. Edit the upstream files directly instead. To reorganise/maintain the patch queue, use &lt;code&gt;git-debrebase -i&lt;/code&gt; to edit the actual commits.
&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;/div&gt;
&lt;h5&gt;&lt;/h5&gt;

&lt;p&gt;Push the MR branch (topic branch) to Salsa and make a Merge Request.
&lt;/p&gt;&lt;p&gt;Set the MR to “auto-merge when all checks pass”. (Or, depending on your team policy, you could ask for an MR Review of course.)
&lt;/p&gt;&lt;p&gt;If CI fails, fix up the MR branch, squash/tidy it again, force push the MR branch, and once again set it to auto-merge.
&lt;/p&gt;&lt;h2&gt;&lt;a name=&quot;test-build&quot;&gt;Test build&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;An informal test build can be done like this:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;apt-get build-dep .
dpkg-buildpackage -uc -b&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Ideally this will leave &lt;code&gt;git status&lt;/code&gt; clean, with no modified or un-ignored untracked files. If it shows untracked files, add them to &lt;code&gt;.gitignore&lt;/code&gt; or &lt;code&gt;debian/.gitignore&lt;/code&gt; as applicable.
&lt;/p&gt;&lt;p&gt;If it dirties the tree, consider trying to make it stop doing that. The easiest way is probably to build out-of-tree, if supported upstream. If this is too difficult, you can leave the messy build arrangements as they are, but you’ll need to be disciplined about always committing, using git clean and git reset, and so on.
&lt;/p&gt;&lt;p&gt;For formal binaries builds, including for testing, use &lt;code&gt;dgit sbuild&lt;/code&gt; as &lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#uploading-a-new-package-to-debian&quot;&gt;described below for uploading to NEW&lt;/a&gt;.
&lt;/p&gt;&lt;h2&gt;&lt;a name=&quot;uploading-to-debian&quot;&gt;Uploading to Debian&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Start an MR branch for the administrative changes for the release.
&lt;/p&gt;&lt;p&gt;Document all the changes you’re going to release, in the &lt;code&gt;debian/changelog&lt;/code&gt;.
&lt;/p&gt;&lt;div style=&quot;background-color: #ddf; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git merge&lt;/h5&gt;

&lt;p&gt;gbp dch can help write the changelog for you:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;dgit fetch sid
gbp dch --ignore-branch --since=dgit/dgit/sid --git-log=^upstream/main&lt;/code&gt;&lt;/pre&gt;&lt;blockquote style=&quot;background-color: #cce; color: #222; font-style: italic;&quot;&gt;
&lt;h6 style=&quot;margin-bottom: 0;&quot;&gt;rationale&lt;/h6&gt;

&lt;p&gt;&lt;code&gt;--ignore-branch&lt;/code&gt; is needed because gbp dch wrongly thinks you ought to be running this on &lt;code&gt;master&lt;/code&gt;, but of course you’re running it on your MR branch.
&lt;/p&gt;&lt;p&gt;The &lt;code&gt;--git-log=^upstream/main&lt;/code&gt; excludes all upstream commits from the listing used to generate the changelog. (I’m assuming you have an &lt;code&gt;upstream&lt;/code&gt; remote and that you’re basing your work on their &lt;code&gt;main&lt;/code&gt; branch.) If there was a new upstream version, you’ll usually want to write a single line about that, and perhaps summarise anything really important.
&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;/blockquote&gt;

&lt;p&gt;(For the first upload after switching to using tag2upload or dgit you need &lt;code&gt;--since=debian/1.2.3-1&lt;/code&gt;, where &lt;code&gt;1.2.3-1&lt;/code&gt; is your previous DEP-14 tag, because &lt;code&gt;dgit/dgit/sid&lt;/code&gt; will be a dsc import, not your actual history.)
&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;/div&gt;
&lt;h5&gt;&lt;/h5&gt;

&lt;p&gt;Change &lt;code&gt;UNRELEASED&lt;/code&gt; to the target suite, and finalise the changelog. (Note that &lt;code&gt;dch&lt;/code&gt; will insist that you at least save the file in your editor.)
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;dch -r
git commit -m &#39;Finalise for upload&#39; debian/changelog&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Make an MR of these administrative changes, and merge it. (Either set it to auto-merge and wait for CI, or if you’re in a hurry double-check that it really is just a changelog update so that you can be confident about telling Salsa to “Merge unverified changes”.)
&lt;/p&gt;&lt;p&gt;Now you can perform the actual upload:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git checkout master
git pull --ff-only # bring the gitlab-made MR merge commit into your local tree&lt;/code&gt;&lt;/pre&gt;&lt;div style=&quot;background-color: #ddf; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git merge&lt;/h5&gt;

&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git-debpush&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;div style=&quot;background-color: #ffa; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git-debrebase&lt;/h5&gt;

&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git-debpush --quilt=linear&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;&lt;code&gt;--quilt=linear&lt;/code&gt; is needed only the first time, but it is very important that first time, to tell the system the correct git branch layout.
&lt;/p&gt;&lt;/div&gt;
&lt;h5&gt;&lt;/h5&gt;

&lt;h2&gt;&lt;a name=&quot;uploading-a-new-package-to-debian&quot;&gt;Uploading a NEW package to Debian&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;If your package is NEW (completely new source, or has new binary packages) you can’t do a source-only upload. You have to build the source and binary packages locally, and upload those build artifacts.
&lt;/p&gt;&lt;p&gt;Happily, given the same git branch you’d tag for tag2upload, and assuming you have sbuild installed and a suitable chroot, &lt;code&gt;dgit&lt;/code&gt; can help take care of the build and upload for you:
&lt;/p&gt;&lt;p&gt;Prepare the changelog update and merge it, as above. Then:
&lt;/p&gt;&lt;div style=&quot;background-color: #ffa; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git-debrebase&lt;/h5&gt;

&lt;p&gt;Create the orig tarball and launder the git-debrebase branch:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git-deborig
git-debrebase quick&lt;/code&gt;&lt;/pre&gt;&lt;blockquote style=&quot;background-color: #ee9; color: #222; font-style: italic;&quot;&gt;
&lt;h6 style=&quot;margin-bottom: 0;&quot;&gt;rationale&lt;/h6&gt;

&lt;p&gt;Source package format 3.0 (quilt), which is what I’m recommending here for use with git-debrebase, needs an orig tarball; it would also be needed for 1.0-with-diff.
&lt;/p&gt;&lt;/blockquote&gt;

&lt;p&gt;&lt;/p&gt;&lt;/div&gt;
&lt;h5&gt;&lt;/h5&gt;

&lt;p&gt;Build the source and binary packages, locally:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;dgit sbuild
dgit push-built&lt;/code&gt;&lt;/pre&gt;&lt;blockquote style=&quot;background-color: #eee; color: #222; font-style: italic;&quot;&gt;
&lt;h6 style=&quot;margin-bottom: 0;&quot;&gt;rationale&lt;/h6&gt;

&lt;p&gt;You don’t &lt;em&gt;have to&lt;/em&gt; use &lt;code&gt;dgit sbuild&lt;/code&gt;, but it is usually convenient to do so, because unlike sbuild, dgit understands git. Also it works around a &lt;a href=&quot;https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=908747&quot;&gt;gitignore-related defect&lt;/a&gt; in dpkg-source.
&lt;/p&gt;&lt;/blockquote&gt;

&lt;h2&gt;&lt;a name=&quot;new-upstream-version&quot;&gt;New upstream version&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Find the new upstream version number and corresponding tag. (Let’s suppose it’s &lt;code&gt;1.2.4&lt;/code&gt;.) Check the provenance:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git verify-tag v1.2.4&lt;/code&gt;&lt;/pre&gt;&lt;blockquote style=&quot;background-color: #eee; color: #222; font-style: italic;&quot;&gt;
&lt;h6 style=&quot;margin-bottom: 0;&quot;&gt;rationale&lt;/h6&gt;

&lt;p&gt;Not all upstreams sign their git tags, sadly. Sometimes encouraging them to do so can help. You may need to use some other method(s) to check that you have the right git commit for the release.
&lt;/p&gt;&lt;/blockquote&gt;

&lt;div style=&quot;background-color: #ddf; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git merge&lt;/h5&gt;

&lt;p&gt;Simply merge the new upstream version and update the changelog:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git merge v1.2.4
dch -v1.2.4-1 &#39;New upstream release.&#39;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;&lt;/p&gt;&lt;/div&gt;
&lt;div style=&quot;background-color: #ffa; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git-debrebase&lt;/h5&gt;

&lt;p&gt;Rebase your delta queue onto the new upstream version:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git debrebase new-upstream 1.2.4&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;&lt;/p&gt;&lt;/div&gt;
&lt;h5&gt;&lt;/h5&gt;

&lt;p&gt;If there are conflicts between your Debian delta for 1.2.3, and the upstream changes in 1.2.4, this is when you need to resolve them, as part of &lt;code&gt;git merge&lt;/code&gt; or &lt;code&gt;git (deb)rebase&lt;/code&gt;.
&lt;/p&gt;&lt;p&gt;After you’ve completed the merge, test your package and make any further needed changes. When you have it working in a local branch, make a Merge Request, as above.
&lt;/p&gt;&lt;h2&gt;&lt;a name=&quot;sponsorship&quot;&gt;Sponsorship&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;git-based sponsorship is super easy! The sponsee can maintain their git branch on Salsa, and do all normal maintenance via gitlab operations.
&lt;/p&gt;&lt;p&gt;When the time comes to upload, the sponsee notifies the sponsor that it’s time. The sponsor fetches and checks out the git branch from Salsa, does their checks, as they judge appropriate, and when satisfied runs &lt;code&gt;git-debpush&lt;/code&gt;.
&lt;/p&gt;&lt;p&gt;As part of the sponsor’s checks, they might want to see all changes since the last upload to Debian:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;dgit fetch sid
git diff dgit/dgit/sid..HEAD&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Or to see the Debian delta of the proposed upload:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git verify-tag v1.2.3
git diff v1.2.3..HEAD &#39;:!debian&#39;&lt;/code&gt;&lt;/pre&gt;&lt;div style=&quot;background-color: #ffa; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git-debrebase&lt;/h5&gt;

&lt;p&gt;Or to show all the delta as a series of commits:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git log -p v1.2.3..HEAD &#39;:!debian&#39;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Don’t look at &lt;code&gt;debian/patches/&lt;/code&gt;. It can be absent or out of date.
&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;/div&gt;
&lt;h5&gt;&lt;/h5&gt;

&lt;h2&gt;&lt;a name=&quot;incorporating-an-nmu&quot;&gt;Incorporating an NMU&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Fetch the NMU into your local git, and see what it contains:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;dgit fetch sid
git diff master...dgit/dgit/sid&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;If the NMUer &lt;a href=&quot;https://manpages.debian.org/testing/dgit/dgit-nmu-simple.7.en.html&quot;&gt;used dgit&lt;/a&gt;, then &lt;code&gt;git log dgit/dgit/sid&lt;/code&gt; will show you the commits they made.
&lt;/p&gt;&lt;p&gt;Normally the best thing to do is to simply merge the NMU, and then do any reverts or rework in followup commits:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git merge dgit/dgit/sid&lt;/code&gt;&lt;/pre&gt;&lt;div style=&quot;background-color: #ffa; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git-debrebase&lt;/h5&gt;

&lt;p&gt;You should &lt;code&gt;git-debrebase quick&lt;/code&gt; at this stage, to check that the merge went OK and the package still has a lineariseable delta queue.
&lt;/p&gt;&lt;/div&gt;
&lt;h5&gt;&lt;/h5&gt;

&lt;p&gt;Then make any followup changes that seem appropriate. Supposing your previous maintainer upload was &lt;code&gt;1.2.3-7&lt;/code&gt;, you can go back and see the NMU diff again with:
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git diff debian/1.2.3-7...dgit/dgit/sid&lt;/code&gt;&lt;/pre&gt;&lt;div style=&quot;background-color: #ffa; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git-debrebase&lt;/h5&gt;

&lt;p&gt;The actual changes made to upstream files will always show up as diff hunks to those files. diff commands will often also show you changes to &lt;code&gt;debian/patches/&lt;/code&gt;. Normally it’s best to filter them out with &lt;code&gt;git diff ... &#39;:!debian/patches&#39;&lt;/code&gt;.
&lt;/p&gt;&lt;p&gt;If you’d prefer to read the changes to the delta queue as an interdiff (diff of diffs), you can do something like
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git checkout debian/1.2.3-7
git-debrebase --force make-patches
git diff HEAD...dgit/dgit/sid -- :debian/patches&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;to diff against a version with &lt;code&gt;debian/patches/&lt;/code&gt; up to date. (The NMU, in &lt;code&gt;dgit/dgit/sid&lt;/code&gt;, will necessarily have the patches already up to date.)
&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;/div&gt;
&lt;h5&gt;&lt;/h5&gt;

&lt;h2&gt;&lt;a name=&quot;dfsg-filtering-handling-non-free-files&quot;&gt;DFSG filtering (handling non-free files)&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Some upstreams ship non-free files of one kind or another. Often these are just in the tarballs, in which case basing your work on upstream git avoids the problem. But if the files are in upstream’s git trees, you need to filter them out.
&lt;/p&gt;&lt;p&gt;&lt;strong&gt;This advice is not for (legally or otherwise) dangerous files&lt;/strong&gt;. If your package contains files that may be illegal, or hazardous, you need much more serious measures. In this case, even pushing the upstream git history to any Debian service, including Salsa, must be avoided. If you suspect this situation you should seek advice, privately and as soon as possible, from dgit-owner@d.o and/or the DFSG team. Thankfully, legally dangerous files are very rare in upstream git repositories, for obvious reasons.
&lt;/p&gt;&lt;p&gt;Our approach is to make a filtered git branch, based on the upstream history, with the troublesome files removed. We then treat that as the upstream for all of the rest of our work.
&lt;/p&gt;&lt;blockquote style=&quot;background-color: #eee; color: #222; font-style: italic;&quot;&gt;
&lt;h6 style=&quot;margin-bottom: 0;&quot;&gt;rationale&lt;/h6&gt;

&lt;p&gt;Yes, this will end up including the non-free files in the git history, on official Debian servers. That’s OK. What’s forbidden is non-free material in the Debianised git tree, or in the source packages.
&lt;/p&gt;&lt;/blockquote&gt;

&lt;h3&gt;&lt;a name=&quot;initial-filtering&quot;&gt;Initial filtering&lt;/a&gt;&lt;/h3&gt;
&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git checkout -b upstream-dfsg v1.2.3
git rm nonfree.exe
git commit -m &quot;upstream version 1.2.3 DFSG-cleaned&quot;
git tag -s -m &quot;upstream version 1.2.3 DFSG-cleaned&quot; v1.2.3+ds1
git push origin upstream-dfsg&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;And now, use &lt;code&gt;1.2.3+ds1&lt;/code&gt;, and the filtered branch &lt;code&gt;upstream-dfsg&lt;/code&gt;, as the upstream version, instead of &lt;code&gt;1.2.3&lt;/code&gt; and &lt;code&gt;upstream/main&lt;/code&gt;. Follow the steps for &lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#convert-the-git-branch&quot;&gt;Convert the git branch&lt;/a&gt; or &lt;a href=&quot;https://diziet.dreamwidth.org/data/atom#new-upstream-version&quot;&gt;New upstream version&lt;/a&gt;, as applicable, adding &lt;code&gt;+ds1&lt;/code&gt; into &lt;code&gt;debian/changelog&lt;/code&gt;.
&lt;/p&gt;&lt;p&gt;If you missed something and need to filter out more non-free files, re-use the same &lt;code&gt;upstream-dfsg&lt;/code&gt; branch and bump the &lt;code&gt;ds&lt;/code&gt; version, e.g. &lt;code&gt;v1.2.3+ds2&lt;/code&gt;.
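&lt;/p&gt;&lt;p&gt;For example, a second filtering round might look like this (mirroring the initial filtering above; the filename is illustrative):
&lt;/p&gt;&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git checkout upstream-dfsg
git rm another-nonfree.dat
git commit -m &quot;upstream version 1.2.3 DFSG-cleaned, take 2&quot;
git tag -s -m &quot;upstream version 1.2.3 DFSG-cleaned, take 2&quot; v1.2.3+ds2
git push origin upstream-dfsg&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;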
&lt;/p&gt;&lt;h3&gt;&lt;a name=&quot;subsequent-upstream-releases&quot;&gt;Subsequent upstream releases&lt;/a&gt;&lt;/h3&gt;
&lt;pre style=&quot;margin-left: 1em;&quot;&gt;&lt;code&gt;git checkout upstream-dfsg
git merge v1.2.4
git rm additional-nonfree.exe # if any
git commit -m &quot;upstream version 1.2.4 DFSG-cleaned&quot;
git tag -s -m &quot;upstream version 1.2.4 DFSG-cleaned&quot; v1.2.4+ds1
git push origin upstream-dfsg&lt;/code&gt;&lt;/pre&gt;&lt;h3&gt;&lt;a name=&quot;removing-files-by-pattern&quot;&gt;Removing files by pattern&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;If the files you need to remove keep changing, you could automate things with a small shell script &lt;code&gt;debian/rm-nonfree&lt;/code&gt; containing appropriate &lt;code&gt;git rm&lt;/code&gt; commands. If you use &lt;code&gt;git rm -f&lt;/code&gt; it will succeed even if the &lt;code&gt;git merge&lt;/code&gt; from real upstream has conflicts due to changes to non-free files.
&lt;/p&gt;&lt;blockquote style=&quot;background-color: #eee; color: #222; font-style: italic;&quot;&gt;
&lt;h6 style=&quot;margin-bottom: 0;&quot;&gt;rationale&lt;/h6&gt;

&lt;p&gt;Ideally &lt;code&gt;uscan&lt;/code&gt;, which has a way of representing DFSG filtering patterns in &lt;code&gt;debian/watch&lt;/code&gt;, would be able to do this, but sadly the relevant functionality is entangled with uscan’s tarball generation.
&lt;/p&gt;&lt;/blockquote&gt;

&lt;h1&gt;&lt;a name=&quot;common-issues&quot;&gt;Common issues&lt;/a&gt;&lt;/h1&gt;
&lt;ul&gt;&lt;li&gt;&lt;p&gt;&lt;strong&gt;Tarball contents&lt;/strong&gt;: If you are switching from upstream tarballs to upstream git, you may find that the git tree is significantly different.
&lt;/p&gt;&lt;p&gt;It may be missing files that your current build system relies on. If so, you definitely want to be using git, not the tarball. Those extra files in the tarball are intermediate built products, but in Debian we should be building from the real source! Fixing this may involve some work, though.

&lt;/p&gt;&lt;/li&gt;&lt;li&gt;&lt;p&gt;&lt;strong&gt;gitattributes&lt;/strong&gt;:
&lt;/p&gt;&lt;p&gt;For &lt;a href=&quot;https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1079434#20&quot;&gt;Reasons&lt;/a&gt; the dgit and tag2upload systems disregard and disable the use of &lt;code&gt;.gitattributes&lt;/code&gt; to modify files as they are checked out.
&lt;/p&gt;&lt;p&gt;Normally this doesn’t cause a problem so long as any orig tarballs are generated the same way (as they will be by tag2upload or &lt;code&gt;git-deborig&lt;/code&gt;). But if the package or build system relies on them, you may need to institute some workarounds, or replicate the effect of the gitattributes as commits in git.

&lt;/p&gt;&lt;/li&gt;&lt;li&gt;&lt;p&gt;&lt;strong&gt;git submodules&lt;/strong&gt;: &lt;a href=&quot;https://diziet.dreamwidth.org/14666.html&quot;&gt;git submodules are terrible&lt;/a&gt; and should never ever be used. But not everyone has got the message, so your upstream may be using them.
&lt;/p&gt;&lt;p&gt;If you’re lucky, the code in the submodule isn’t used in which case you can &lt;code&gt;git rm&lt;/code&gt; the submodule.

&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;
&lt;h1&gt;&lt;a name=&quot;further-reading&quot;&gt;Further reading&lt;/a&gt;&lt;/h1&gt;
&lt;p&gt;I’ve tried to cover the most common situations. But software is complicated and there are many exceptions that this article can’t cover without becoming much harder to read.
&lt;/p&gt;&lt;p&gt;You may want to look at:
&lt;/p&gt;&lt;ul&gt;&lt;li&gt;&lt;p&gt;&lt;strong&gt;dgit workflow manpages&lt;/strong&gt;: As part of the git transition project, we have written workflow manpages, which are more comprehensive than this article. They’re centered around use of dgit, but also discuss tag2upload where applicable.
&lt;/p&gt;&lt;p&gt;These cover a much wider range of possibilities, including (for example) choosing different source package formats, how to handle upstreams that publish only tarballs, etc. They are correspondingly much less opinionated.
&lt;/p&gt;&lt;p&gt;Look in &lt;a href=&quot;https://manpages.debian.org/testing/dgit/dgit-maint-merge.7.en.html&quot;&gt;dgit-maint-merge(7)&lt;/a&gt; and &lt;a href=&quot;https://manpages.debian.org/testing/dgit/dgit-maint-debrebase.7.en.html&quot;&gt;dgit-maint-debrebase(7)&lt;/a&gt;. There is also &lt;a href=&quot;https://manpages.debian.org/testing/dgit/dgit-maint-gbp.7.en.html&quot;&gt;dgit-maint-gbp(7)&lt;/a&gt; for those who want to keep using &lt;code&gt;gbp pq&lt;/code&gt; and/or &lt;code&gt;quilt&lt;/code&gt; with a patches-unapplied branch.

&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;
&lt;ul&gt;&lt;li&gt;&lt;p&gt;&lt;strong&gt;NMUs&lt;/strong&gt; are very easy with dgit. (tag2upload is usually less suitable than dgit, for an NMU.)
&lt;/p&gt;&lt;p&gt;You can work with any package, in git, in a completely uniform way, regardless of maintainer git workflow. See &lt;a href=&quot;https://manpages.debian.org/testing/dgit/dgit-nmu-simple.7.en.html&quot;&gt;dgit-nmu-simple(7)&lt;/a&gt;.

&lt;/p&gt;&lt;/li&gt;&lt;li&gt;&lt;p&gt;&lt;strong&gt;Native packages&lt;/strong&gt; (meaning packages maintained wholly within Debian) are much simpler. See &lt;a href=&quot;https://manpages.debian.org/testing/dgit/dgit-maint-native.7.en.html&quot;&gt;dgit-maint-native(7)&lt;/a&gt;.

&lt;/p&gt;&lt;/li&gt;&lt;li&gt;&lt;p&gt;&lt;strong&gt;tag2upload documentation&lt;/strong&gt;: The &lt;a href=&quot;https://wiki.debian.org/tag2upload&quot;&gt;tag2upload wiki page&lt;/a&gt; is a good starting point. There’s the &lt;a href=&quot;https://manpages.debian.org/testing/git-debpush/git-debpush.1.en.html&quot;&gt;git-debpush(1)&lt;/a&gt; manpage of course.

&lt;/p&gt;&lt;/li&gt;&lt;li&gt;&lt;p&gt;&lt;strong&gt;dgit reference documentation&lt;/strong&gt;:
&lt;/p&gt;&lt;p&gt;There is a comprehensive command-line manual in &lt;a href=&quot;https://manpages.debian.org/testing/dgit/dgit.1.en.html&quot;&gt;dgit(1)&lt;/a&gt;. Description of the dgit data model and Principles of Operation is in &lt;a href=&quot;https://manpages.debian.org/testing/dgit/dgit.7.en.html&quot;&gt;dgit(7)&lt;/a&gt;, including coverage of out-of-course situations.
&lt;/p&gt;&lt;p&gt;dgit is a complex and powerful program, so this reference material can be overwhelming. We recommend starting with a guide like this one, or the dgit-…(7) workflow tutorials.

&lt;/p&gt;&lt;/li&gt;&lt;li&gt;&lt;p&gt;&lt;strong&gt;Design and implementation documentation for tag2upload&lt;/strong&gt; is &lt;a href=&quot;https://wiki.debian.org/tag2upload#Signatures_and_traceability&quot;&gt;linked to from the wiki&lt;/a&gt;.

&lt;/p&gt;&lt;/li&gt;&lt;li&gt;&lt;p&gt;&lt;a href=&quot;https://diziet.dreamwidth.org/20436.html&quot;&gt;&lt;strong&gt;Debian’s git transition&lt;/strong&gt;&lt;/a&gt; blog post from December.
&lt;/p&gt;&lt;p&gt;tag2upload and dgit are part of the git transition project, and aim to support a very wide variety of git workflows. tag2upload and dgit work well with existing git tooling, including git-buildpackage-based approaches.
&lt;/p&gt;&lt;p&gt;git-debrebase is conceptually separate from, and functionally independent of, tag2upload and dgit. It’s a git workflow and delta management tool, competing with &lt;code&gt;gbp pq&lt;/code&gt;, manual use of &lt;code&gt;quilt&lt;/code&gt;, &lt;code&gt;git-dpm&lt;/code&gt; and so on.

&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;
&lt;div style=&quot;background-color: #ffa; color: #000;&quot;&gt;
&lt;h5 style=&quot;margin-bottom: 0;&quot;&gt;git-debrebase&lt;/h5&gt;

&lt;ul&gt;&lt;li&gt;&lt;p&gt;&lt;strong&gt;git-debrebase reference documentation&lt;/strong&gt;:
&lt;/p&gt;&lt;p&gt;Of course there’s a comprehensive command-line manual in &lt;a href=&quot;https://manpages.debian.org/testing/git-debrebase/git-debrebase.1.en.html&quot;&gt;git-debrebase(1)&lt;/a&gt;.
&lt;/p&gt;&lt;p&gt;git-debrebase is quick and easy to use, but it has a complex data model and sophisticated algorithms. This is documented in &lt;a href=&quot;https://manpages.debian.org/testing/git-debrebase/git-debrebase.5.en.html&quot;&gt;git-debrebase(5)&lt;/a&gt;.

&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;
&lt;/div&gt;
&lt;h5&gt;&lt;/h5&gt;


&lt;/div&gt;
&lt;hr /&gt;
&lt;address&gt;
Edited 2026-03-05 18:48 UTC to add a missing &lt;code&gt;--noop-ok&lt;/code&gt; to the Salsa CI runes.  Thanks to Charlemagne Lasse for &lt;a href=&quot;https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1129577&quot;&gt;the report&lt;/a&gt;.  Apologies if this causes Planet Debian to re-post this article as if it were new.
&lt;/address&gt;&lt;br /&gt;&lt;br /&gt;&lt;img alt=&quot;comment count unavailable&quot; height=&quot;12&quot; src=&quot;https://www.dreamwidth.org/tools/commentcount?user=diziet&amp;amp;ditemid=20851&quot; style=&quot;vertical-align: middle;&quot; width=&quot;30&quot; /&gt; comments </description> 
	<pubDate>Thu, 05 Mar 2026 18:47:55 +0000</pubDate>

</item> 
<item>
	<title>Vincent Bernat: Automatic Prometheus metrics discovery with Docker labels</title>
	<guid>http://www.luffy.cx/en/blog/2026-prometheus-metrics-discovery-docker-labels.html</guid>
	<link>https://vincent.bernat.ch/en/blog/2026-prometheus-metrics-discovery-docker-labels</link>
     <description>  &lt;p&gt;&lt;a href=&quot;https://vincent.bernat.ch/en/blog/2025-akvorado-2.0&quot; title=&quot;Akvorado release 2.0&quot;&gt;Akvorado&lt;/a&gt;, a network flow collector, relies on &lt;a href=&quot;https://traefik.io/traefik&quot; title=&quot;Traefik: modern HTTP reverse proxy and load-balancer&quot;&gt;Traefik&lt;/a&gt;, a reverse HTTP
proxy, to expose HTTP endpoints for its &lt;a href=&quot;https://docs.docker.com/compose/&quot; title=&quot;Docker Compose documentation&quot;&gt;Docker Compose&lt;/a&gt; services. &lt;a href=&quot;https://docs.docker.com/engine/manage-resources/labels/&quot; title=&quot;Docker object labels&quot;&gt;Docker
labels&lt;/a&gt; attached to each service define the routing rules. Traefik picks them
up automatically when a container starts. Instead of maintaining a static
configuration file to collect &lt;a href=&quot;https://prometheus.io/docs/concepts/data_model/&quot; title=&quot;Prometheus Data Model&quot;&gt;Prometheus metrics&lt;/a&gt;, we apply the same approach
with &lt;a href=&quot;https://grafana.com/docs/alloy/latest/&quot; title=&quot;Grafana Alloy documentation&quot;&gt;Grafana Alloy&lt;/a&gt;.&lt;/p&gt;
&lt;div class=&quot;toc&quot;&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://vincent.bernat.ch#traefik-docker&quot;&gt;Traefik &amp;amp; Docker&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://vincent.bernat.ch#metrics-discovery-with-alloy&quot;&gt;Metrics discovery with Alloy&lt;/a&gt;&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://vincent.bernat.ch#discovering-docker-containers&quot;&gt;Discovering Docker containers&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://vincent.bernat.ch#relabeling-targets&quot;&gt;Relabeling targets&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://vincent.bernat.ch#scraping-and-forwarding&quot;&gt;Scraping and forwarding&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://vincent.bernat.ch#built-in-exporters&quot;&gt;Built-in exporters&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;h1 id=&quot;traefik-docker&quot;&gt;Traefik &amp;amp; Docker&lt;/h1&gt;
&lt;p&gt;Traefik &lt;a href=&quot;https://doc.traefik.io/traefik/reference/install-configuration/providers/docker/&quot; title=&quot;Traefik: Docker provider&quot;&gt;listens for events on the Docker socket&lt;/a&gt;. Each service advertises its
configuration through labels. For example, here is the Loki service in Akvorado:&lt;/p&gt;
&lt;div class=&quot;language-yaml codehilite&quot;&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;span class=&quot;nt&quot;&gt;services&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;:&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;nt&quot;&gt;loki&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;:&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;    &lt;/span&gt;&lt;span class=&quot;c1&quot;&gt;# …&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;    &lt;/span&gt;&lt;span class=&quot;nt&quot;&gt;expose&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;:&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;      &lt;/span&gt;&lt;span class=&quot;p p-Indicator&quot;&gt;-&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;l l-Scalar l-Scalar-Plain&quot;&gt;3100/tcp&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;    &lt;/span&gt;&lt;span class=&quot;nt&quot;&gt;labels&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;:&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;      &lt;/span&gt;&lt;span class=&quot;p p-Indicator&quot;&gt;-&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;l l-Scalar l-Scalar-Plain&quot;&gt;traefik.enable=true&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;      &lt;/span&gt;&lt;span class=&quot;p p-Indicator&quot;&gt;-&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;l l-Scalar l-Scalar-Plain&quot;&gt;traefik.http.routers.loki.rule=PathPrefix(`/loki`)&lt;/span&gt;
&lt;/pre&gt;&lt;/div&gt;


&lt;p&gt;Once the container is healthy, Traefik creates a router forwarding requests
matching &lt;code&gt;/loki&lt;/code&gt; to its first exposed port. Colocating Traefik configuration
with the service definition is attractive. How do we achieve the same for
Prometheus metrics?&lt;/p&gt;
&lt;h1 id=&quot;metrics-discovery-with-alloy&quot;&gt;Metrics discovery with Alloy&lt;/h1&gt;
&lt;p&gt;&lt;a href=&quot;https://grafana.com/docs/alloy/latest/&quot; title=&quot;Grafana Alloy documentation&quot;&gt;Grafana Alloy&lt;/a&gt;, a metrics collector that scrapes Prometheus endpoints,
includes a &lt;a href=&quot;https://grafana.com/docs/alloy/latest/reference/components/discovery/discovery.docker/&quot; title=&quot;Alloy: discovery.docker&quot;&gt;&lt;code&gt;discovery.docker&lt;/code&gt;&lt;/a&gt; component. Just like Traefik,
it connects to the Docker socket.&lt;sup id=&quot;fnref-socket&quot;&gt;&lt;a class=&quot;footnote-ref&quot; href=&quot;https://vincent.bernat.ch#fn-socket&quot;&gt;1&lt;/a&gt;&lt;/sup&gt; With a few relabeling rules, we teach
it to use Docker labels to locate and scrape metrics.&lt;/p&gt;
&lt;p&gt;We define three labels on each service:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;code&gt;metrics.enable&lt;/code&gt; set to &lt;code&gt;true&lt;/code&gt; enables metrics collection,&lt;/li&gt;
&lt;li&gt;&lt;code&gt;metrics.port&lt;/code&gt; specifies the port exposing the Prometheus endpoint, and&lt;/li&gt;
&lt;li&gt;&lt;code&gt;metrics.path&lt;/code&gt; specifies the path to the metrics endpoint.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;If a service exposes more than one port, &lt;code&gt;metrics.port&lt;/code&gt; is mandatory. Otherwise,
it defaults to the only exposed port. The default value for &lt;code&gt;metrics.path&lt;/code&gt; is
&lt;code&gt;/metrics&lt;/code&gt;. The Loki service from earlier becomes:&lt;/p&gt;
&lt;div class=&quot;language-yaml codehilite&quot;&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;span class=&quot;nt&quot;&gt;services&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;:&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;nt&quot;&gt;loki&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;:&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;    &lt;/span&gt;&lt;span class=&quot;c1&quot;&gt;# …&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;    &lt;/span&gt;&lt;span class=&quot;nt&quot;&gt;expose&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;:&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;      &lt;/span&gt;&lt;span class=&quot;p p-Indicator&quot;&gt;-&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;l l-Scalar l-Scalar-Plain&quot;&gt;3100/tcp&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;    &lt;/span&gt;&lt;span class=&quot;nt&quot;&gt;labels&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;:&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;      &lt;/span&gt;&lt;span class=&quot;p p-Indicator&quot;&gt;-&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;l l-Scalar l-Scalar-Plain&quot;&gt;traefik.enable=true&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;      &lt;/span&gt;&lt;span class=&quot;p p-Indicator&quot;&gt;-&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;l l-Scalar l-Scalar-Plain&quot;&gt;traefik.http.routers.loki.rule=PathPrefix(`/loki`)&lt;/span&gt;
&lt;span class=&quot;hll&quot;&gt;&lt;span class=&quot;w&quot;&gt;      &lt;/span&gt;&lt;span class=&quot;p p-Indicator&quot;&gt;-&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;l l-Scalar l-Scalar-Plain&quot;&gt;metrics.enable=true&lt;/span&gt;
&lt;/span&gt;&lt;span class=&quot;hll&quot;&gt;&lt;span class=&quot;w&quot;&gt;      &lt;/span&gt;&lt;span class=&quot;p p-Indicator&quot;&gt;-&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;l l-Scalar l-Scalar-Plain&quot;&gt;metrics.path=/loki/metrics&lt;/span&gt;
&lt;/span&gt;&lt;/pre&gt;&lt;/div&gt;
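
&lt;p&gt;As an illustration of the multi-port case (this service is hypothetical, not part of Akvorado), a container exposing more than one port must name the metrics port explicitly:&lt;/p&gt;
&lt;div class=&quot;language-yaml codehilite&quot;&gt;&lt;pre&gt;services:
  example:                  # hypothetical two-port service
    expose:
      - 8080/tcp            # application port
      - 9090/tcp            # Prometheus endpoint
    labels:
      - metrics.enable=true
      - metrics.port=9090   # mandatory: two ports are exposed
&lt;/pre&gt;&lt;/div&gt;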


&lt;p&gt;Alloy’s configuration is split into four parts:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;discover&lt;/strong&gt; containers through the Docker socket,&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;filter and relabel&lt;/strong&gt; targets using Docker labels,&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;scrape&lt;/strong&gt; the matching endpoints, and&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;forward&lt;/strong&gt; the metrics to Prometheus.&lt;/li&gt;
&lt;/ol&gt;
&lt;h2 id=&quot;discovering-docker-containers&quot;&gt;Discovering Docker containers&lt;/h2&gt;
&lt;p&gt;The first building block discovers running containers:&lt;/p&gt;
&lt;div class=&quot;language-terraform codehilite&quot;&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;span class=&quot;nv&quot;&gt;discovery.docker&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;docker&quot;&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;host&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;             &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;unix:///var/run/docker.sock&quot;&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;refresh_interval&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;30s&quot;&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;nb&quot;&gt;filter&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;    &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;name&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;   &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;label&quot;&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;    &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;values&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;[&lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;com.docker.compose.project=akvorado&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;]&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;/pre&gt;&lt;/div&gt;


&lt;p&gt;This connects to the Docker socket and lists containers every 30
seconds.&lt;sup id=&quot;fnref-events&quot;&gt;&lt;a class=&quot;footnote-ref&quot; href=&quot;https://vincent.bernat.ch#fn-events&quot;&gt;2&lt;/a&gt;&lt;/sup&gt; The &lt;code&gt;filter&lt;/code&gt; block restricts discovery to containers belonging
to the &lt;code&gt;akvorado&lt;/code&gt; project, avoiding interference with unrelated containers on
the same host. For each discovered container, Alloy produces a target with
labels such as &lt;code&gt;__meta_docker_container_label_metrics_port&lt;/code&gt; for the
&lt;code&gt;metrics.port&lt;/code&gt; Docker label.&lt;/p&gt;
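&lt;p&gt;For instance, a container carrying the Loki labels above would yield a target whose meta labels look roughly like this (the address and network name are made up for illustration):&lt;/p&gt;
&lt;div class=&quot;codehilite&quot;&gt;&lt;pre&gt;__address__                                  = &quot;172.18.0.5:3100&quot;
__meta_docker_container_label_metrics_enable = &quot;true&quot;
__meta_docker_container_label_metrics_path   = &quot;/loki/metrics&quot;
__meta_docker_network_name                   = &quot;akvorado_default&quot;
&lt;/pre&gt;&lt;/div&gt;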
&lt;h2 id=&quot;relabeling-targets&quot;&gt;Relabeling targets&lt;/h2&gt;
&lt;p&gt;The relabeling step filters and transforms raw targets from Docker discovery
into scrape targets. The first stage keeps only targets with &lt;code&gt;metrics.enable&lt;/code&gt;
set to &lt;code&gt;true&lt;/code&gt;:&lt;/p&gt;
&lt;div class=&quot;language-terraform codehilite&quot;&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;span class=&quot;nv&quot;&gt;discovery.relabel&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;prometheus&quot;&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;targets&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;nv&quot;&gt;discovery.docker.docker.targets&lt;/span&gt;

&lt;span class=&quot;c1&quot;&gt;  // Keep only targets with metrics.enable=true&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;nb&quot;&gt;rule&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;    &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;source_labels&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;[&lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;__meta_docker_container_label_metrics_enable&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;]&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;    &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;regex&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;         &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;err&quot;&gt;`&lt;/span&gt;&lt;span class=&quot;no&quot;&gt;true&lt;/span&gt;&lt;span class=&quot;err&quot;&gt;`&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;    &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;action&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;        &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;keep&quot;&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;

&lt;span class=&quot;c1&quot;&gt;  // …&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;/pre&gt;&lt;/div&gt;


&lt;p&gt;The second stage overrides the discovered port when the service defines
&lt;code&gt;metrics.port&lt;/code&gt;:&lt;/p&gt;
&lt;div class=&quot;language-terraform codehilite&quot;&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;span class=&quot;c1&quot;&gt;// When metrics.port is set, override __address__.&lt;/span&gt;
&lt;span class=&quot;nb&quot;&gt;rule&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;source_labels&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;[&lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;__address__&quot;, &quot;__meta_docker_container_label_metrics_port&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;]&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;regex&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;         &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;err&quot;&gt;`&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(.&lt;/span&gt;&lt;span class=&quot;err&quot;&gt;+&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;):&lt;/span&gt;&lt;span class=&quot;err&quot;&gt;\d+;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(.&lt;/span&gt;&lt;span class=&quot;err&quot;&gt;+&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;)&lt;/span&gt;&lt;span class=&quot;err&quot;&gt;`&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;target_label&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;__address__&quot;&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;replacement&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;   &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;$1:$2&quot;&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;/pre&gt;&lt;/div&gt;


&lt;p&gt;Next, we handle containers in &lt;code&gt;host&lt;/code&gt; network mode. When
&lt;code&gt;__meta_docker_network_name&lt;/code&gt; equals &lt;code&gt;host&lt;/code&gt;, Alloy rewrites the address to
&lt;code&gt;host.docker.internal&lt;/code&gt; instead of &lt;code&gt;localhost&lt;/code&gt;:&lt;sup id=&quot;fnref-hostdocker&quot;&gt;&lt;a class=&quot;footnote-ref&quot; href=&quot;https://vincent.bernat.ch#fn-hostdocker&quot;&gt;3&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
&lt;div class=&quot;language-terraform codehilite&quot;&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;span class=&quot;c1&quot;&gt;// When host networking, override __address__ to host.docker.internal.&lt;/span&gt;
&lt;span class=&quot;nb&quot;&gt;rule&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;source_labels&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;[&lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;__meta_docker_container_label_metrics_port&quot;, &quot;__meta_docker_network_name&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;]&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;regex&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;         &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;err&quot;&gt;`&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(.&lt;/span&gt;&lt;span class=&quot;err&quot;&gt;+&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;)&lt;/span&gt;&lt;span class=&quot;err&quot;&gt;;host`&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;target_label&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;__address__&quot;&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;replacement&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;   &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;host.docker.internal:$1&quot;&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;/pre&gt;&lt;/div&gt;


&lt;p&gt;The next stage derives the job name from the service name, stripping any
numbered suffix. The instance label is the address without the port:&lt;/p&gt;
&lt;div class=&quot;language-terraform codehilite&quot;&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;span class=&quot;nb&quot;&gt;rule&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;source_labels&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;[&lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;__meta_docker_container_label_com_docker_compose_service&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;]&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;regex&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;         &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;err&quot;&gt;`&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(.&lt;/span&gt;&lt;span class=&quot;err&quot;&gt;+?&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;)(?:&lt;/span&gt;&lt;span class=&quot;err&quot;&gt;-\d+&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;)?&lt;/span&gt;&lt;span class=&quot;err&quot;&gt;`&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;target_label&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;job&quot;&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;span class=&quot;nb&quot;&gt;rule&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;source_labels&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;[&lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;__address__&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;]&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;regex&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;         &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;err&quot;&gt;`&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(.&lt;/span&gt;&lt;span class=&quot;err&quot;&gt;+&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;):&lt;/span&gt;&lt;span class=&quot;err&quot;&gt;\d+`&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;target_label&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;instance&quot;&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;/pre&gt;&lt;/div&gt;


&lt;p&gt;If a container defines &lt;code&gt;metrics.path&lt;/code&gt;, Alloy uses it. Otherwise, it defaults to
&lt;code&gt;/metrics&lt;/code&gt;:&lt;/p&gt;
&lt;div class=&quot;language-terraform codehilite&quot;&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;span class=&quot;nb&quot;&gt;rule&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;source_labels&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;[&lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;__meta_docker_container_label_metrics_path&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;]&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;regex&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;         &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;err&quot;&gt;`&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;(.&lt;/span&gt;&lt;span class=&quot;err&quot;&gt;+&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;)&lt;/span&gt;&lt;span class=&quot;err&quot;&gt;`&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;target_label&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;__metrics_path__&quot;&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;span class=&quot;nb&quot;&gt;rule&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;source_labels&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;[&lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;__metrics_path__&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;]&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;regex&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;         &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;&quot;&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;target_label&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;__metrics_path__&quot;&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;replacement&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;   &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;/metrics&quot;&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;/pre&gt;&lt;/div&gt;


&lt;h2 id=&quot;scraping-and-forwarding&quot;&gt;Scraping and forwarding&lt;/h2&gt;
&lt;p&gt;With the targets properly relabeled, scraping and forwarding are
straightforward:&lt;/p&gt;
&lt;div class=&quot;language-terraform codehilite&quot;&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;span class=&quot;nv&quot;&gt;prometheus.scrape&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;docker&quot;&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;targets&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;         &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;nv&quot;&gt;discovery.relabel.prometheus.output&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;forward_to&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;      &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;[&lt;/span&gt;&lt;span class=&quot;nv&quot;&gt;prometheus.remote_write.default.receiver&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;]&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;scrape_interval&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;30s&quot;&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;

&lt;span class=&quot;nv&quot;&gt;prometheus.remote_write&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;default&quot;&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;nb&quot;&gt;endpoint&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;    &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;url&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;http://prometheus:9090/api/v1/write&quot;&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;/pre&gt;&lt;/div&gt;


&lt;p&gt;&lt;code&gt;prometheus.scrape&lt;/code&gt; periodically fetches metrics from the discovered targets.
&lt;code&gt;prometheus.remote_write&lt;/code&gt; sends them to Prometheus.&lt;/p&gt;
&lt;h1 id=&quot;built-in-exporters&quot;&gt;Built-in exporters&lt;/h1&gt;
&lt;p&gt;Some services do not expose a Prometheus endpoint. Redis and Kafka are common
examples. Alloy ships built-in &lt;a href=&quot;https://grafana.com/docs/alloy/latest/reference/components/prometheus/&quot; title=&quot;Alloy: Prometheus components&quot;&gt;Prometheus exporters&lt;/a&gt; that
query these services and expose metrics on their behalf.&lt;/p&gt;
&lt;div class=&quot;language-terraform codehilite&quot;&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;span class=&quot;nv&quot;&gt;prometheus.exporter.redis&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;docker&quot;&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;redis_addr&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;redis:6379&quot;&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;span class=&quot;nv&quot;&gt;discovery.relabel&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;redis&quot;&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;targets&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;nv&quot;&gt;prometheus.exporter.redis.docker.targets&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;nb&quot;&gt;rule&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;    &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;target_label&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;job&quot;&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;    &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;replacement&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;redis&quot;&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;span class=&quot;nv&quot;&gt;prometheus.scrape&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;redis&quot;&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;targets&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;         &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;nv&quot;&gt;discovery.relabel.redis.output&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;forward_to&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;      &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;[&lt;/span&gt;&lt;span class=&quot;nv&quot;&gt;prometheus.remote_write.default.receiver&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;]&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;scrape_interval&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;30s&quot;&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;/pre&gt;&lt;/div&gt;


&lt;p&gt;The same pattern applies to Kafka:&lt;/p&gt;
&lt;div class=&quot;language-terraform codehilite&quot;&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;span class=&quot;nv&quot;&gt;prometheus.exporter.kafka&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;docker&quot;&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;kafka_uris&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;[&lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;kafka:9092&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;]&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;span class=&quot;nv&quot;&gt;discovery.relabel&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;kafka&quot;&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;targets&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;nv&quot;&gt;prometheus.exporter.kafka.docker.targets&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;nb&quot;&gt;rule&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;    &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;target_label&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;job&quot;&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;    &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;replacement&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;kafka&quot;&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;span class=&quot;nv&quot;&gt;prometheus.scrape&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;kafka&quot;&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;targets&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;         &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;nv&quot;&gt;discovery.relabel.kafka.output&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;forward_to&lt;/span&gt;&lt;span class=&quot;w&quot;&gt;      &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;[&lt;/span&gt;&lt;span class=&quot;nv&quot;&gt;prometheus.remote_write.default.receiver&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;]&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;na&quot;&gt;scrape_interval&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;30s&quot;&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;/pre&gt;&lt;/div&gt;


&lt;p&gt;Each exporter is a separate component with its own relabeling and scrape
configuration. We set the &lt;code&gt;job&lt;/code&gt; label explicitly since no Docker metadata can
provide it.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;With this setup, adding metrics to a new service with a Prometheus endpoint
requires only a few labels in &lt;code&gt;docker-compose.yml&lt;/code&gt;, just like adding a Traefik
route. Alloy picks it up automatically. You can apply the same pattern with
another discovery method, like &lt;a href=&quot;https://grafana.com/docs/alloy/latest/reference/components/discovery/discovery.kubernetes/&quot; title=&quot;Alloy: discovery.kubernetes&quot;&gt;&lt;code&gt;discovery.kubernetes&lt;/code&gt;&lt;/a&gt;,
&lt;a href=&quot;https://grafana.com/docs/alloy/latest/reference/components/discovery/discovery.scaleway/&quot; title=&quot;Alloy: discovery.scaleway&quot;&gt;&lt;code&gt;discovery.scaleway&lt;/code&gt;&lt;/a&gt;, or &lt;a href=&quot;https://grafana.com/docs/alloy/latest/reference/components/discovery/discovery.http/&quot; title=&quot;Alloy: discovery.http&quot;&gt;&lt;code&gt;discovery.http&lt;/code&gt;&lt;/a&gt;. 🩺&lt;/p&gt;
&lt;div class=&quot;footnote&quot;&gt;
&lt;hr /&gt;
&lt;ol&gt;
&lt;li id=&quot;fn-socket&quot;&gt;
&lt;p&gt;Both Traefik and Alloy require access to the Docker socket, which
grants root-level access to the host. A &lt;a href=&quot;https://github.com/Tecnativa/docker-socket-proxy&quot; title=&quot;Docker Socket Proxy: security-enhanced proxy for Docker socket&quot;&gt;Docker socket proxy&lt;/a&gt; mitigates
this by exposing only the read-only API endpoints needed for discovery. &lt;a class=&quot;footnote-backref&quot; href=&quot;https://vincent.bernat.ch#fnref-socket&quot; title=&quot;Jump back to footnote 1 in the text&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id=&quot;fn-events&quot;&gt;
&lt;p&gt;Unlike Traefik, which watches for events, Grafana Alloy polls the
container list at regular intervals—a behavior inherited from Prometheus. &lt;a class=&quot;footnote-backref&quot; href=&quot;https://vincent.bernat.ch#fnref-events&quot; title=&quot;Jump back to footnote 2 in the text&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id=&quot;fn-hostdocker&quot;&gt;
&lt;p&gt;The Alloy service needs &lt;code&gt;extra_hosts:
[&quot;host.docker.internal:host-gateway&quot;]&lt;/code&gt; in its definition. &lt;a class=&quot;footnote-backref&quot; href=&quot;https://vincent.bernat.ch#fnref-hostdocker&quot; title=&quot;Jump back to footnote 3 in the text&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;/div&gt; </description> 
	<pubDate>Thu, 05 Mar 2026 15:40:24 +0000</pubDate>

</item> 
<item>
	<title>Sean Whitton: Southern Biscuits with British ingredients</title>
	<guid>https://spwhitton.name//blog/entry/southernbiscuits/</guid>
	<link>https://spwhitton.name//blog/entry/southernbiscuits/</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/spwhitton.png&quot; width=&quot;65&quot; height=&quot;85&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;I miss the US more and more, and have recently been trying to perfect Southern
Biscuits using British ingredients.  It took me eight or nine tries before I
was consistently getting good results.  Here is my recipe.&lt;/p&gt;

&lt;h2&gt;Ingredients&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;190g plain flour&lt;/li&gt;
&lt;li&gt;60g strong white bread flour&lt;/li&gt;
&lt;li&gt;4 tsp baking powder&lt;/li&gt;
&lt;li&gt;¼ tsp bicarbonate of soda&lt;/li&gt;
&lt;li&gt;1 tsp cream of tartar (optional)&lt;/li&gt;
&lt;li&gt;1 tsp salt&lt;/li&gt;
&lt;li&gt;100g unsalted butter&lt;/li&gt;
&lt;li&gt;180ml buttermilk, chilled

&lt;ul&gt;
&lt;li&gt;If your buttermilk is thicker than the consistency of ordinary milk,
you’ll need around 200ml.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;extra buttermilk for brushing&lt;/li&gt;
&lt;/ul&gt;


&lt;h2&gt;Method&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Slice and then chill the butter in the freezer for at least fifteen
minutes.&lt;/li&gt;
&lt;li&gt;Preheat oven to 220°C with the fan turned off.&lt;/li&gt;
&lt;li&gt;Twice sieve together the flours, leaveners and salt.  Some salt may not go
through the sieve; just tip it back into the bowl.&lt;/li&gt;
&lt;li&gt;Cut cold butter slices into the flour with a pastry blender until the
mixture resembles &lt;em&gt;coarse&lt;/em&gt; crumbs: some small lumps of fat remaining is
desirable.  In particular, the fine crumbs you are looking for when making
British scones are not wanted here.  Rubbing in with fingertips just won’t
do; biscuits demand keeping things cold even more than shortcrust pastry
does.&lt;/li&gt;
&lt;li&gt;Make a well in the centre, pour in the buttermilk, and stir with a metal
spoon until the dough comes together and pulls away from the sides of the
bowl.  Avoid overmixing, but I’ve found that so long as the ingredients are
cold, you don’t have to be too gentle at this stage and can make sure all
the crumbs are mixed in.&lt;/li&gt;
&lt;li&gt;Flour your hands, turn dough onto a floured work surface, and pat together
into a rectangle.  Some suggest dusting the top of the dough with flour,
too, here.&lt;/li&gt;
&lt;li&gt;Fold the dough in half, then gather any crumbs and pat it back into the
same shape.  Turn ninety degrees and do the same again, until you have
completed a total of eight folds, two in each cardinal direction.  The
dough should now be a little springy.&lt;/li&gt;
&lt;li&gt;Roll to about ½ inch thick.&lt;/li&gt;
&lt;li&gt;Cut out biscuits.  If using a round cutter, do not twist it, as that seals
the edges of the biscuits and so spoils the layering.&lt;/li&gt;
&lt;li&gt;Transfer to a baking sheet, placing the biscuits close together (this
helps them rise).  Flour your thumb and use it to press an indent into the top
of each biscuit (this helps them rise straight), then brush with buttermilk.&lt;/li&gt;
&lt;li&gt;Bake until flaky and golden brown: about fifteen minutes.&lt;/li&gt;
&lt;/ol&gt;


&lt;h2&gt;Gravy&lt;/h2&gt;

&lt;p&gt;It turns out that the “pepper gravy” that one commonly has with biscuits is
just a white/béchamel sauce made with lots of black pepper.  I haven’t got a
recipe I really like for this yet.  Better is a “sausage gravy”; again this
has a white sauce as its base, I believe.  I have a vegetarian recipe for this
to try at some point.&lt;/p&gt;

&lt;h2&gt;Variations&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;These biscuits do come out fluffy but not so flaky.  For that you can try
using lard instead of butter, if you’re not vegetarian (vegetable shortening
is hard to find here).&lt;/li&gt;
&lt;li&gt;If you don’t have a pastry blender and don’t want to buy one you can try not
slicing the butter and instead coarsely grating it into the flour out of the
freezer.&lt;/li&gt;
&lt;li&gt;An alternative to folding is cutting and piling the layers.&lt;/li&gt;
&lt;li&gt;You can try rolling out to 1–1½ inches thick.&lt;/li&gt;
&lt;li&gt;Instead of cutting out biscuits you can just slice the whole piece of dough
into equal pieces.  An advantage of this is that you don’t have to re-roll the
scraps; re-rolling also spoils the layering.&lt;/li&gt;
&lt;li&gt;Instead of brushing with buttermilk, you can take them out after they’ve
started to rise but before they’ve browned, brush them with melted butter
and put them back in.&lt;/li&gt;
&lt;/ul&gt;


&lt;h2&gt;Notes&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;I’ve had more success with Dale Farm’s buttermilk than Sainsbury’s own.  The
former is much runnier.&lt;/li&gt;
&lt;li&gt;Southern culture calls for biscuits to be made the size of cat’s heads.&lt;/li&gt;
&lt;li&gt;Bleached flour is apparently usual in the South, but is illegal(!) here.
Apparently bleaching can have some effect on the development of the gluten
which would affect the texture.&lt;/li&gt;
&lt;li&gt;&lt;p&gt;British plain flour is made from soft wheat and has a lower percentage of
protein/gluten, while American all-purpose flour is often(?) made from
harder wheat and has more protein.  In this recipe I mix plain and strong
white flour, in a ratio of 3:1, to emulate American all-purpose flour.&lt;/p&gt;

&lt;p&gt;I am not sure why this works best.  In the South they have soft wheats too,
and lower protein percentages.  The famous White Lily flour is 9%.
(Apparently you can mix US cake flour and US all-purpose flour in a ratio of
1:1 to achieve that; in the UK, Shipton Mill sell a “soft cake and pastry
flour” which has been recommended to me as similar.)&lt;/p&gt;

&lt;p&gt;This would suggest that British plain flour ought to be closer to Southern
flour than the standard flour available in most of the US.  But my
experience has been that the biscuits taste better with the plain and strong
white 3:1 mix.  Possibly Southerners would disprefer them.  I got some
feedback that good biscuits are about texture and moistness and not flavour.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;Baking powder in the US is usually double-acting but ours is always
single-acting, so we need double quantities of that.&lt;/li&gt;
&lt;/ul&gt; </description> 
	<pubDate>Wed, 04 Mar 2026 20:47:28 +0000</pubDate>

</item> 
<item>
	<title>Sean Whitton: dgit-as-a-service retrospective</title>
	<guid>https://spwhitton.name//blog/entry/tag2upload_retrospective/</guid>
	<link>https://spwhitton.name//blog/entry/tag2upload_retrospective/</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/spwhitton.png&quot; width=&quot;65&quot; height=&quot;85&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;We recently &lt;a href=&quot;https://lists.debian.org/debian-devel-announce/2026/02/msg00002.html&quot;&gt;launched tag2upload&lt;/a&gt;, aka &lt;em&gt;cloud dgit&lt;/em&gt; or &lt;em&gt;dgit-as-a-service&lt;/em&gt;.
This was something of a culmination of work I’ve been doing since 2016 towards
modernising Debian workflows, so I thought I’d write a short personal
retrospective.&lt;/p&gt;

&lt;p&gt;When I started contributing to Debian in 2015, I was not impressed with how
packages were represented in Git by most package maintainers, and wanted a
pure Git workflow.  I read a couple of Joey Hess’s blog posts on the matter,
&lt;a href=&quot;https://joeyh.name/blog/entry/a_rope_ladder_to_the_dgit_treehouse/&quot;&gt;a rope ladder to the dgit treehouse&lt;/a&gt; and &lt;a href=&quot;https://joeyh.name/blog/entry/upstream_git_repositories/&quot;&gt;upstream git repositories&lt;/a&gt;
and made &lt;a href=&quot;https://bugs.debian.org/817951&quot;&gt;a bug report against dgit&lt;/a&gt; hoping to tie some things together.&lt;/p&gt;

&lt;p&gt;The results of that early work were the &lt;a href=&quot;https://manpages.debian.org/git-deborig&quot;&gt;git-deborig(1)&lt;/a&gt; program and the
&lt;a href=&quot;https://manpages.debian.org/dgit-maint-merge&quot;&gt;dgit-maint-merge(7)&lt;/a&gt; tutorial manpage.  Starting with Joey’s workflow
pointers, I developed a complete, pure Git workflow that I thought would be
suitable for all package maintainers in Debian.  It was certainly well-suited
for my own packages.  It took me a while to learn that there are packages for
which this workflow is too simple.  We now also have the
&lt;a href=&quot;https://manpages.debian.org/dgit-maint-debrebase&quot;&gt;dgit-maint-debrebase(7)&lt;/a&gt; workflow which uses git-debrebase, something
which wasn’t invented until several years later.  Where dgit-maint-merge(7)
won’t do, you can use dgit-maint-debrebase(7), and still be doing pretty much
pure Git.  &lt;a href=&quot;https://diziet.dreamwidth.org/20851.html&quot;&gt;Here’s a full, recent guide to modernisation&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The next most significant contribution of my own was the &lt;code&gt;push-source&lt;/code&gt;
subcommand for dgit.  &lt;code&gt;dgit push&lt;/code&gt; required a preexisting &lt;code&gt;.changes&lt;/code&gt; file
produced from the working tree.  I wanted to make &lt;code&gt;dgit push-source&lt;/code&gt; prepare
that &lt;code&gt;.changes&lt;/code&gt; file for you, but &lt;em&gt;also&lt;/em&gt; not use the working tree, instead
consulting &lt;code&gt;HEAD&lt;/code&gt;.  The idea was that you were doing a git push – which
doesn’t care about the working tree – direct to the Debian archive, or as
close as we could get.  I implemented that at DebConf18 in Taiwan, I think,
with Ian, and we also did &lt;a href=&quot;https://debconf18.debconf.org/talks/60-git-debrebase-new-tool-for-managing-debian-packaging-in-git/&quot;&gt;a talk on git-debrebase&lt;/a&gt;.  We ended up having to
change it to look at the working tree in addition to &lt;code&gt;HEAD&lt;/code&gt; to make it work as
well as possible, but I think that the idea of a command which was like doing
a Git push direct to the archive was perhaps foundational for us later wanting
to develop tag2upload.  Indeed, while tag2upload’s client-side tool
git-debpush does look at the working tree, it doesn’t do so in a way that is
essential to its operation.  tag2upload is &lt;code&gt;dgit push-source&lt;/code&gt;-as-a-service.&lt;/p&gt;

&lt;p&gt;And finally we come to tag2upload, a system Ian and I designed in 2019 during
a two-person sprint at his place in Cambridge, while I was visiting the UK
from Arizona.  With tag2upload, appropriately authorised Debian package
maintainers can upload to Debian with only pure Git operations – namely,
making and pushing a signed Git tag to Debian’s GitLab instance.  Although we
had a solid prototype in 2019, we only finally launched it last month,
February 2026.  This was mostly due to &lt;a href=&quot;https://lwn.net/Articles/978324/&quot;&gt;political delays&lt;/a&gt;, but also
because we have put in a lot of hours making it better in various ways.&lt;/p&gt;

&lt;p&gt;Looking back, one thing that seems notable to me is that the core elements of
the pure Git workflows haven’t changed much at all.  Working out all the
details of dgit-maint-merge(7), designing and writing git-debrebase (Ian’s
work), and then working out all the details of dgit-maint-debrebase(7), are
the important parts, to me.  The rest is mostly just large amounts of
compatibility code.  git-debrebase and dgit-maint-debrebase(7) are very novel
but dgit-maint-merge(7) is mostly just an extrapolation of Joey’s thoughts
&lt;em&gt;from 13 years ago&lt;/em&gt;.  And yet, adoption of these workflows remains low.&lt;/p&gt;

&lt;p&gt;People prefer to use what they are used to using, even if the workflows have
significant inconveniences.  That’s completely understandable; I’m really
interested in good workflows, but most other contributors care less about them.
But you would expect enough newcomers to have arrived in 13 years that the new
workflows would have a higher uptake.  That is, packages maintained by
contributors who got involved after these workflows became available would be
maintained using newer workflows, at least.  But the inertia seems to be too
strong even for that.  Instead, new contributors used to working purely out of
Git are told they need to learn Debian’s strange ways of representing things,
tarballs and all.  It doesn’t have to be that way.  We hope that tag2upload
will make the pure Git workflows seem more appealing to people.&lt;/p&gt; </description> 
	<pubDate>Wed, 04 Mar 2026 20:45:47 +0000</pubDate>

</item> 
<item>
	<title>Jonathan Dowland: More lava lamps</title>
	<guid>https://jmtd.net/log/lavalamps/more/</guid>
	<link>https://jmtd.net/log/lavalamps/more/</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/jmtd.png&quot; width=&quot;65&quot; height=&quot;85&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;div class=&quot;image&quot;&gt;
&lt;a href=&quot;https://jmtd.net/log/lavalamps/rocket.jpg&quot;&gt;&lt;img alt=&quot;photograph of a Mathmos Telstar rocket lava lamp with orange wax and purple water&quot; class=&quot;img&quot; height=&quot;225&quot; src=&quot;https://jmtd.net/log/lavalamps/more/300x-rocket.jpg&quot; width=&quot;300&quot; /&gt;&lt;/a&gt;

&lt;/div&gt;


&lt;p&gt;&lt;a href=&quot;https://mathmos.com/&quot;&gt;Mathmos&lt;/a&gt; had a sale on spare Lava lamp bottles around Christmas, so I
bought a couple of new-to-me colour combinations.&lt;/p&gt;

&lt;div class=&quot;image&quot;&gt;
&lt;a href=&quot;https://jmtd.net/log/lavalamps/blue_in_purple.jpg&quot;&gt;&lt;img alt=&quot;photograph of a Mathmos Telstar rocket lava lamp with blue wax in purple water&quot; class=&quot;img&quot; height=&quot;225&quot; src=&quot;https://jmtd.net/log/lavalamps/more/300x-blue_in_purple.jpg&quot; width=&quot;300&quot; /&gt;&lt;/a&gt;

&lt;/div&gt;




&lt;div class=&quot;image&quot;&gt;
&lt;a href=&quot;https://jmtd.net/log/lavalamps/pink_in_clear.jpg&quot;&gt;&lt;img alt=&quot;photograph of a Mathmos Telstar rocket lava lamp with pink wax in clear water&quot; class=&quot;img&quot; height=&quot;400&quot; src=&quot;https://jmtd.net/log/lavalamps/more/300x-pink_in_clear.jpg&quot; width=&quot;300&quot; /&gt;&lt;/a&gt;

&lt;/div&gt;


&lt;p&gt;The lamp I have came with orange wax in purple liquid, which gives a strong red
glow in a dark room. I bought blue wax in purple liquid, which I think looks
fantastic and works really nicely with my &lt;a href=&quot;https://robsheridan.storenvy.com/collections/1237392-analog-glitch-prints&quot;&gt;Rob Sheridan
print&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The other one I bought was pink in clear, which is nice, but I think the coloured
liquids add a lot to the tone of lighting in a room.&lt;/p&gt;

&lt;p&gt;Recently,
UK vid-blogger Techmoan did some really nice videos about Mathmos lava lamps:
&lt;a href=&quot;https://www.youtube.com/watch?v=ipFePT4Z270&quot;&gt;Best Lava Lamp?&lt;/a&gt;
and &lt;a href=&quot;https://www.youtube.com/watch?v=3uWTf74mGGY&amp;amp;&quot;&gt;LAVA LAMPS Giant, Mini &amp;amp; Neo&lt;/a&gt;.&lt;/p&gt; </description> 
	<pubDate>Wed, 04 Mar 2026 16:21:23 +0000</pubDate>

</item> 
<item>
	<title>Matthew Garrett: To update blobs or not to update blobs</title>
	<guid>https://codon.org.uk/~mjg59/blog/p/to-update-blobs-or-not-to-update-blobs/</guid>
	<link>https://codon.org.uk/~mjg59/blog/p/to-update-blobs-or-not-to-update-blobs/</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/mjg59.png&quot; width=&quot;69&quot; height=&quot;85&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;A lot of hardware runs non-free software. Sometimes that non-free software is in ROM. Sometimes it’s in flash. Sometimes it’s not stored on the device at all, it’s pushed into it at runtime by another piece of hardware or by the operating system. We typically refer to this software as “firmware” to differentiate it from the software run on the CPU after the OS has started&lt;sup id=&quot;fnref:1&quot;&gt;&lt;a class=&quot;footnote-ref&quot; href=&quot;https://codon.org.uk/~mjg59/blog/index.xml#fn:1&quot;&gt;1&lt;/a&gt;&lt;/sup&gt;, but a lot of it (and, these days, probably most of it) is software written in C or some other systems programming language and targeting Arm or RISC-V or maybe MIPS and even sometimes x86&lt;sup id=&quot;fnref:2&quot;&gt;&lt;a class=&quot;footnote-ref&quot; href=&quot;https://codon.org.uk/~mjg59/blog/index.xml#fn:2&quot;&gt;2&lt;/a&gt;&lt;/sup&gt;. There’s no real distinction between it and any other bit of software you run, except it’s generally not run within the context of the OS&lt;sup id=&quot;fnref:3&quot;&gt;&lt;a class=&quot;footnote-ref&quot; href=&quot;https://codon.org.uk/~mjg59/blog/index.xml#fn:3&quot;&gt;3&lt;/a&gt;&lt;/sup&gt;. Anyway. It’s code. I’m going to simplify things here and stop using the words “software” or “firmware” and just say “code” instead, because that way we don’t need to worry about semantics.&lt;/p&gt;
&lt;p&gt;A fundamental problem for free software enthusiasts is that almost all of the code we’re talking about here is non-free. In some cases, it’s cryptographically signed in a way that makes it difficult or impossible to replace it with free code. In some cases it’s even encrypted, such that even examining the code is impossible. But because it’s code, sometimes the vendor responsible for it will provide updates, and now you get to choose whether or not to apply those updates.&lt;/p&gt;
&lt;p&gt;I’m now going to present some things to consider. These are not in any particular order and are not intended to form any sort of argument in themselves, but are representative of the opinions you will get from various people and I would like you to read these, think about them, and come to your own set of opinions before I tell you what my opinion is.&lt;/p&gt;
&lt;p&gt;THINGS TO CONSIDER&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;Does this blob do what it claims to do? Does it suddenly introduce functionality you don’t want? Does it introduce security flaws? Does it introduce deliberate backdoors? Does it make your life better or worse?&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;You’re almost certainly being provided with a blob of compiled code, with no source code available. You can’t just diff the source files, satisfy yourself that they’re fine, and then install them. To be fair, even though you (as someone reading this) are probably more capable of doing that than the average human, you’re likely not doing that even if you &lt;strong&gt;are&lt;/strong&gt; capable because you’re also likely installing kernel upgrades that contain vast quantities of code beyond your ability to understand&lt;sup id=&quot;fnref:4&quot;&gt;&lt;a class=&quot;footnote-ref&quot; href=&quot;https://codon.org.uk/~mjg59/blog/index.xml#fn:4&quot;&gt;4&lt;/a&gt;&lt;/sup&gt;. We don’t rely on our personal ability, we rely on the ability of those around us to do that validation, and we rely on an existing (possibly transitive) trust relationship with those involved. You don’t know the people who created this blob, you likely don’t know people who do know the people who created this blob, these people probably don’t have an online presence that gives you more insight. Why should you trust them?&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;If it’s in ROM and it turns out to be hostile then nobody can fix it, ever.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;The people creating these blobs largely work for the same company that built the hardware in the first place. When they built that hardware they could have backdoored it in any number of ways. And if the hardware has a built-in copy of the code it runs, why do you trust that that copy isn’t backdoored? Maybe it isn’t and updates &lt;em&gt;would&lt;/em&gt; introduce a backdoor, but in that case if you buy new hardware that runs new code aren’t you putting yourself at the same risk?&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Designing hardware where you’re able to provide updated code and nobody else can is just a dick move&lt;sup id=&quot;fnref:5&quot;&gt;&lt;a class=&quot;footnote-ref&quot; href=&quot;https://codon.org.uk/~mjg59/blog/index.xml#fn:5&quot;&gt;5&lt;/a&gt;&lt;/sup&gt;. We shouldn’t encourage vendors who do that.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Humans are bad at writing code, and code running on ancillary hardware is no exception. It contains bugs. These bugs are sometimes very bad. &lt;a class=&quot;link&quot; href=&quot;https://cs.ru.nl/~cmeijer/publications/Self_Encrypting_Deception_Weaknesses_in_the_Encryption_of_Solid_State_Drives.pdf&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;This paper&lt;/a&gt; describes a set of vulnerabilities identified in code running on SSDs that made it possible to bypass the drive encryption. The SSD vendors released updates that fixed these issues. If the code couldn’t be replaced then anyone relying on those security features would need to replace the hardware.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Even if blobs are signed and can’t easily be replaced, the ones that aren’t encrypted can still be examined. The SSD vulnerabilities above were identifiable because researchers were able to reverse engineer the updates. It can be more annoying to audit binary code than source code, but it’s still possible.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Vulnerabilities in code running on other hardware &lt;a class=&quot;link&quot; href=&quot;https://i.blackhat.com/us-18/Thu-August-9/us-18-Grassi-Exploitation-of-a-Modern-Smartphone-Baseband-wp.pdf&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;can still compromise the OS&lt;/a&gt;. If someone can compromise the code running on your wifi card then if you don’t have a strong &lt;a class=&quot;link&quot; href=&quot;https://en.wikipedia.org/wiki/Input%E2%80%93output_memory_management_unit&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;IOMMU&lt;/a&gt; setup they’re going to be able to overwrite your running OS.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Replacing one non-free blob with another non-free blob increases the total number of non-free blobs involved in the whole system, but doesn’t increase the number that are actually executing at any point in time.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Ok we’re done with the things to consider. Please spend a few seconds thinking about what the tradeoffs are here and what your feelings are. Proceed when ready.&lt;/p&gt;
&lt;p&gt;I trust my CPU vendor. I don’t trust my CPU vendor because I want to, I trust my CPU vendor because I have no choice. I don’t think it’s likely that my CPU vendor has designed a CPU that identifies when I’m generating cryptographic keys and biases the RNG output so my keys are significantly weaker than they look, but it’s not literally impossible. I generate keys on it anyway, because what choice do I have? At some point I will buy a new laptop because Electron will no longer fit in 32GB of RAM and I will have to make the same affirmation of trust, because the alternative is that I just don’t have a computer. And in any case, I will be communicating with other people who generated their keys on CPUs I have no control over, and I will also be relying on them to be trustworthy. If I refuse to trust my CPU then I don’t get to computer, and if I don’t get to computer then I will be sad. I suspect I’m not alone here.&lt;/p&gt;
&lt;p&gt;Why would I install a code update on my CPU when my CPU’s job is to run my code in the first place? Because it turns out that CPUs are complicated and messy and they have their own bugs, and those bugs may be functional (for example, some performance counter functionality was broken on Sandybridge at release, and was then &lt;a class=&quot;link&quot; href=&quot;https://groups.google.com/g/linux.kernel/c/Bk3lNiC0Ys0&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;fixed&lt;/a&gt; with a microcode blob update) and if you update it your hardware works better. Or it might be that you’re running a CPU with &lt;a class=&quot;link&quot; href=&quot;https://en.wikipedia.org/wiki/Transient_execution_CPU_vulnerability&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;speculative execution bugs&lt;/a&gt; and there’s a microcode update that provides a mitigation for that even if your CPU is slower when you enable it, but at least now you can run virtual machines without code in those virtual machines being able to reach outside the hypervisor boundary and extract secrets from other contexts. When it’s put that way, why would I &lt;em&gt;not&lt;/em&gt; install the update?&lt;/p&gt;
&lt;p&gt;And the straightforward answer is that theoretically it could include new code that doesn’t act in my interests, either deliberately or not. And, yes, this is theoretically possible. Of course, if you don’t trust your CPU vendor, why are you buying CPUs from them, but well maybe they’ve been corrupted (in which case don’t buy any new CPUs from them either) or maybe they’ve just introduced a new vulnerability by accident, and also you’re in a position to determine whether the alleged security improvements matter to you at all. Do you care about speculative execution attacks if all software running on your system is trustworthy? Probably not! Do you need to update a blob that fixes something you don’t care about and which might introduce some sort of vulnerability? Seems like no!&lt;/p&gt;
&lt;p&gt;But there’s a difference between a recommendation for a fully informed device owner who has a full understanding of threats, and a recommendation for an average user who just wants their computer to work and to not be ransomwared. A code update on a wifi card may introduce a backdoor, or it may fix the ability for someone to compromise your machine with a hostile access point. Most people are just not going to be in a position to figure out which is more likely, and there’s no single answer that’s correct for everyone. What we &lt;em&gt;do&lt;/em&gt; know is that where vulnerabilities in this sort of code have been discovered, updates have tended to fix them - but nobody has flagged such an update as a real-world vector for system compromise.&lt;/p&gt;
&lt;p&gt;My personal opinion? You should make your own mind up, but also you shouldn’t impose that choice on others, because your threat model is not necessarily their threat model. Code updates are a reasonable default, but they shouldn’t be unilaterally imposed, and nor should they be blocked outright. And the best way to shift the balance of power away from vendors who insist on distributing non-free blobs is to demonstrate the benefits gained from them being free - a vendor who ships free code on their system enables their customers to improve their code and enable new functionality and make their hardware more attractive.&lt;/p&gt;
&lt;p&gt;It’s impossible to say with absolute certainty that your security will be improved by installing code blobs. It’s also impossible to say with absolute certainty that it won’t. So far evidence tends to support the idea that most updates that claim to fix security issues do, and there’s not a lot of evidence to support the idea that updates add new backdoors. Overall I’d say that providing the updates is likely the right default for most users - and that that should never be strongly enforced, because people should be allowed to define their own security model, and whatever set of threats I’m worried about, someone else may have a good reason to focus on different ones.&lt;/p&gt;
&lt;div class=&quot;footnotes&quot;&gt;
&lt;hr /&gt;
&lt;ol&gt;
&lt;li id=&quot;fn:1&quot;&gt;
&lt;p&gt;Code that runs on the CPU &lt;em&gt;before&lt;/em&gt; the OS is still usually described as firmware - UEFI is firmware even though it’s executing on the CPU, which should give a strong indication that the difference between “firmware” and “software” is largely arbitrary &lt;a class=&quot;footnote-backref&quot; href=&quot;https://codon.org.uk/~mjg59/blog/index.xml#fnref:1&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id=&quot;fn:2&quot;&gt;
&lt;p&gt;And, obviously &lt;a class=&quot;link&quot; href=&quot;https://www.google.com/search?q=foone+8051&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;8051&lt;/a&gt; &lt;a class=&quot;footnote-backref&quot; href=&quot;https://codon.org.uk/~mjg59/blog/index.xml#fnref:2&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id=&quot;fn:3&quot;&gt;
&lt;p&gt;Because UEFI makes everything more complicated, UEFI makes this more complicated. Triggering a UEFI runtime service involves your OS jumping into firmware code at runtime, in the same context as the OS kernel. Sometimes this will trigger a jump into System Management Mode, but other times it won’t, and it’s just your kernel executing code that got dumped into RAM when your system booted. &lt;a class=&quot;footnote-backref&quot; href=&quot;https://codon.org.uk/~mjg59/blog/index.xml#fnref:3&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id=&quot;fn:4&quot;&gt;
&lt;p&gt;&lt;em&gt;I&lt;/em&gt; don’t understand most of the diff between one kernel version and the next, and I don’t have time to read all of it either. &lt;a class=&quot;footnote-backref&quot; href=&quot;https://codon.org.uk/~mjg59/blog/index.xml#fnref:4&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id=&quot;fn:5&quot;&gt;
&lt;p&gt;There’s a bunch of reasons to do this, the most reasonable of which is probably not wanting customers to replace the code and break their hardware and deal with the support overhead of that, but not being able to replace code running on hardware I own is always going to be an affront to me. &lt;a class=&quot;footnote-backref&quot; href=&quot;https://codon.org.uk/~mjg59/blog/index.xml#fnref:5&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;/div&gt; </description> 
	<pubDate>Tue, 03 Mar 2026 03:09:48 +0000</pubDate>

</item> 
<item>
	<title>Michael Ablassmeier: pbsindex - file backup index</title>
	<guid>https://abbbi.github.io//pbsindex</guid>
	<link>https://abbbi.github.io//pbsindex/</link>
     <description>  &lt;p&gt;If you take backups using the proxmox-backup-client and wonder which
backup includes a specific file, the only way to find out is to mount each
backup and search for the file.&lt;/p&gt;

&lt;p&gt;For regular file backups, the Proxmox Backup Server frontend provides a pcat1
file for download. Its binary format is somewhat
&lt;a href=&quot;https://bugzilla.proxmox.com/show_bug.cgi?id=5748&quot;&gt;undocumented&lt;/a&gt;, but it
includes a listing of the files backed up.&lt;/p&gt;

&lt;p&gt;A Proxmox Backup Server datastore includes the same pcat1 file as a blob index
(.pcat1.didx). So to be able to tell which backup contains which
files, one needs to:&lt;/p&gt;

&lt;p&gt;1) Open the .pcat1.didx file and determine the required blobs, see &lt;a href=&quot;https://pbs.proxmox.com/docs/file-formats.html#dynamic-index-format-didx&quot;&gt;format documentation&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;2) Reconstruct the .pcat1 file from the blobs&lt;/p&gt;

&lt;p&gt;3) Parse the pcat1 file and output the directory listing.&lt;/p&gt;
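&lt;p&gt;As a rough illustration of step 1, a dynamic index can be read with a few lines of Python. The field layout below follows the linked format documentation; the function name and the lack of any magic or checksum validation are my own simplifications, not part of pbsindex:&lt;/p&gt;

```python
# Hypothetical sketch of step 1: reading a .didx (dynamic index) file.
# Header layout per the PBS file-formats documentation: magic (8 bytes),
# uuid (16 bytes), ctime (i64 LE), index checksum (32 bytes), zero
# padding up to 4096 bytes. Chunk entries then follow as pairs of
# (cumulative end offset as u64 LE, SHA-256 digest as 32 raw bytes).
import struct

HEADER_SIZE = 4096
ENTRY_FMT = "<Q32s"
ENTRY_SIZE = struct.calcsize(ENTRY_FMT)  # 40 bytes per chunk entry

def parse_didx(data):
    """Return (uuid_hex, ctime, chunks); chunks are (start, end, digest_hex)."""
    uuid_hex = data[8:24].hex()
    (ctime,) = struct.unpack_from("<q", data, 24)
    chunks, start = [], 0
    for off in range(HEADER_SIZE, len(data), ENTRY_SIZE):
        end, digest = struct.unpack_from(ENTRY_FMT, data, off)
        chunks.append((start, end, digest.hex()))
        start = end  # each entry stores the cumulative end offset
    return uuid_hex, ctime, chunks
```

&lt;p&gt;The digests of the returned chunks name the blobs in the .chunks directory, from which the .pcat1 file can be reconstructed (step 2).&lt;/p&gt;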

&lt;p&gt;I’ve implemented this in &lt;a href=&quot;https://github.com/abbbi/pbsindex&quot;&gt;pbsindex&lt;/a&gt;
which lets you create a central file index for your backups by scanning a
complete PBS datastore.&lt;/p&gt;

&lt;p&gt;Let&#39;s say you want a file listing for a specific backup;
use:&lt;/p&gt;

&lt;figure class=&quot;highlight&quot;&gt;&lt;pre&gt;&lt;code class=&quot;language-bash&quot;&gt; pbsindex &lt;span class=&quot;nt&quot;&gt;--chunk-dir&lt;/span&gt; /backup/.chunks/ /backup/host/vm178/2026-03-02T10:47:57Z/catalog.pcat1.didx
 didx &lt;span class=&quot;nv&quot;&gt;uuid&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;7e4086a9-4432-4184-a21f-0aeec2b2de93 &lt;span class=&quot;nv&quot;&gt;ctime&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;2026-03-02T10:47:57Z &lt;span class=&quot;nv&quot;&gt;chunks&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;2 &lt;span class=&quot;nv&quot;&gt;total_size&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;1037386
 chunk[0] &lt;span class=&quot;nv&quot;&gt;start&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;0 &lt;span class=&quot;nv&quot;&gt;end&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;344652 &lt;span class=&quot;nv&quot;&gt;size&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;344652 &lt;span class=&quot;nv&quot;&gt;digest&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;af3851419f5e74fbb4d7ca6ac3bc7c5cbbdb7c03d3cb489d57742ea717972224
 chunk[1] &lt;span class=&quot;nv&quot;&gt;start&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;344652 &lt;span class=&quot;nv&quot;&gt;end&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;1037386 &lt;span class=&quot;nv&quot;&gt;size&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;692734 &lt;span class=&quot;nv&quot;&gt;digest&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;e400b13522df02641c2d9934c3880ae78ebb397c66f9b4cf3b931d309da1a7cc
 d ./usr.pxar.didx
 d ./usr.pxar.didx/bin
 l ./usr.pxar.didx/bin/Mail
 f ./usr.pxar.didx/bin/[ &lt;span class=&quot;nv&quot;&gt;size&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;55720 &lt;span class=&quot;nv&quot;&gt;mtime&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;2025-06-04T15:14:05Z
 f ./usr.pxar.didx/bin/aa-enabled &lt;span class=&quot;nv&quot;&gt;size&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;18672 &lt;span class=&quot;nv&quot;&gt;mtime&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;2025-04-10T15:06:25Z
 f ./usr.pxar.didx/bin/aa-exec &lt;span class=&quot;nv&quot;&gt;size&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;18672 &lt;span class=&quot;nv&quot;&gt;mtime&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;2025-04-10T15:06:25Z
 f ./usr.pxar.didx/bin/aa-features-abi &lt;span class=&quot;nv&quot;&gt;size&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;18664 &lt;span class=&quot;nv&quot;&gt;mtime&lt;/span&gt;&lt;span class=&quot;o&quot;&gt;=&lt;/span&gt;2025-04-10T15:06:25Z
 l ./usr.pxar.didx/bin/apropos&lt;/code&gt;&lt;/pre&gt;&lt;/figure&gt;
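
&lt;p&gt;For datastore-wide scans, the listings go into SQLite. A minimal sketch of what such an index and a &quot;which backup has this file?&quot; query could look like (the table and column names here are illustrative, not pbsindex&#39;s actual schema):&lt;/p&gt;

```python
import sqlite3

# Illustrative schema: one row per file entry, keyed by the backup snapshot.
conn = sqlite3.connect(":memory:")  # use a file path for a persistent index
conn.execute("""
    CREATE TABLE IF NOT EXISTS files (
        snapshot TEXT,    -- e.g. host/vm178/2026-03-02T10:47:57Z
        type     TEXT,    -- d / f / l, as in the listing above
        path     TEXT,
        size     INTEGER
    )
""")
entries = [
    ("host/vm178/2026-03-02T10:47:57Z", "d", "./usr.pxar.didx/bin", None),
    ("host/vm178/2026-03-02T10:47:57Z", "f", "./usr.pxar.didx/bin/aa-exec", 18672),
]
conn.executemany("INSERT INTO files VALUES (?, ?, ?, ?)", entries)
conn.commit()

# Which backups contain a given file?
rows = conn.execute(
    "SELECT snapshot, size FROM files WHERE path LIKE ?", ("%/aa-exec",)
).fetchall()
```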

&lt;p&gt;It also lets you scan a complete datastore for all existing .pcat1.didx files
and store the directory listings in a SQLite database for easier searching.&lt;/p&gt; </description> 
	<pubDate>Tue, 03 Mar 2026 00:00:00 +0000</pubDate>

</item> 
<item>
	<title>Isoken Ibizugbe: Wrapping Up My Outreachy Internship at Debian</title>
	<guid>http://isokenibizugbe.wordpress.com/?p=22</guid>
	<link>https://isokenibizugbe.wordpress.com/2026/03/02/wrapping-up-my-outreachy-internship-at-debian/</link>
     <description>  &lt;h2 class=&quot;wp-block-heading&quot;&gt;&lt;/h2&gt;



&lt;p class=&quot;has-poppins-font-family wp-block-paragraph&quot;&gt;Twelve weeks ago, I stepped into the Debian ecosystem as an Outreachy intern with a curiosity for Quality Assurance. It feels like just yesterday, and time has flown by so fast! Now, I am wrapping up that journey, not just with a completed project, but with improved technical reasoning.&lt;/p&gt;



&lt;p class=&quot;has-poppins-font-family wp-block-paragraph&quot;&gt;I have learned how to use documentation to understand a complex project, how to be a good collaborator, and that learning is a continuous process. These experiences have helped me grow much more confident in my skills as an engineer.&lt;/p&gt;



&lt;h3 class=&quot;wp-block-heading has-poppins-font-family&quot;&gt;&lt;strong&gt;My Achievements&lt;/strong&gt;&lt;/h3&gt;



&lt;p class=&quot;has-poppins-font-family wp-block-paragraph&quot;&gt;As I close this chapter, I am leaving a permanent “Proof-of-Work” in the Debian repositories:&lt;/p&gt;



&lt;ul class=&quot;wp-block-list has-poppins-font-family&quot;&gt;
&lt;li&gt;&lt;strong&gt;Full Test Coverage:&lt;/strong&gt; I automated apps_startstop tests for Cinnamon, LXQt, and XFCE, covering both Live images and Netinst installations.&lt;/li&gt;



&lt;li&gt;&lt;strong&gt;Synergy:&lt;/strong&gt; I used symbolic links and a single Perl script to handle common application tests across different desktops, which reduces code redundancy.&lt;/li&gt;



&lt;li&gt;&lt;strong&gt;The Contributor Style Guide:&lt;/strong&gt; I created a guide for future contributors to make documentation clearer and reviews faster, helping to reduce the burden on reviewers.&lt;/li&gt;
&lt;/ul&gt;



&lt;h3 class=&quot;wp-block-heading has-poppins-font-family&quot;&gt;&lt;strong&gt;Final Month: Wrap Up&lt;/strong&gt;&lt;/h3&gt;



&lt;p class=&quot;has-poppins-font-family wp-block-paragraph&quot;&gt;In this final month, things became easier as my understanding of the project grew. I focused on stability and finishing my remaining tasks:&lt;/p&gt;



&lt;ul class=&quot;wp-block-list has-poppins-font-family&quot;&gt;
&lt;li&gt;I spent time exploring different QEMU video options like VGA, qxl, and virtio on the KDE desktop environment. This was important to ensure screen rendering remained stable so that our “needles” (visual test markers) wouldn’t fail because of minor glitches.&lt;/li&gt;



&lt;li&gt;I successfully moved from familiarization to test automation for the XFCE desktop. This included writing “prepare” steps and creating the visual needles needed to make the tests reliable.&lt;/li&gt;



&lt;li&gt;One of my final challenges was the app launcher function. Originally, my code used else if blocks for each desktop. I proposed a unified solution, but hit a blocker: XFCE has two ways to launch apps (App Finder and the Application Menu). Because using different methods sometimes caused failures, I chose to use the application menu button across the board.&lt;/li&gt;
&lt;/ul&gt;



&lt;h3 class=&quot;wp-block-heading has-poppins-font-family&quot;&gt;&lt;strong&gt;What’s Next?&lt;/strong&gt;&lt;/h3&gt;



&lt;p class=&quot;has-poppins-font-family wp-block-paragraph&quot;&gt;I don’t want my journey with Debian to end here. I plan to stay involved in the community and extend these same tests to the &lt;strong&gt;LXDE&lt;/strong&gt; desktop to complete the coverage for all major Debian desktop environments. I am excited to keep exploring and learning more about the Debian ecosystem.&lt;/p&gt;



&lt;h3 class=&quot;wp-block-heading has-poppins-font-family&quot;&gt;&lt;strong&gt;Thank You&lt;/strong&gt;&lt;/h3&gt;



&lt;p class=&quot;has-poppins-font-family wp-block-paragraph&quot;&gt;This journey wouldn’t have been possible without the steady guidance of my mentors: &lt;strong&gt;Tassia Camoes Araujo, Roland Clobus, and Philip Hands.&lt;/strong&gt; Thank you for teaching me that in the world of Free and Open Source Software (FOSS), your voice and your code are equally important.&lt;/p&gt;



&lt;p class=&quot;has-poppins-font-family wp-block-paragraph&quot;&gt;To my fellow intern &lt;strong&gt;Hellen&lt;/strong&gt; and the entire Outreachy community, thank you for the shared learning and support. It has been an incredible 12 weeks.&lt;/p&gt; </description> 
	<pubDate>Mon, 02 Mar 2026 20:51:19 +0000</pubDate>

</item> 
<item>
	<title>Hellen Chemtai: The Last Week of My Journey as an Outreachy Intern at Debian OpenQA</title>
	<guid>http://hellenchemtai.wordpress.com/?p=69</guid>
	<link>https://hellenchemtai.wordpress.com/2026/03/02/the-last-week-of-my-journey-as-an-outreachy-intern-at-debian-openqa/</link>
     <description>  &lt;p class=&quot;wp-block-paragraph&quot;&gt;Hello world &lt;img alt=&quot;😀&quot; class=&quot;wp-smiley&quot; src=&quot;https://s0.wp.com/wp-content/mu-plugins/wpcom-smileys/twemoji/2/72x72/1f600.png&quot; style=&quot;height: 1em;&quot; /&gt;. I’m Hellen Chemtai, an intern at Outreachy working with the Debian OpenQA team on Images Testing. This is the final week of the internship, but it is just a start for me, as I will continue contributing to the community. I am grateful for the opportunity to work with the Debian OpenQA team as an Outreachy intern. I have had the most welcoming team into Open Source.&lt;/p&gt;



&lt;h4 class=&quot;wp-block-heading&quot;&gt;My tasks and contributions&lt;/h4&gt;



&lt;p class=&quot;wp-block-paragraph&quot;&gt;I have been working on network install and live image tasks:&lt;/p&gt;



&lt;ol class=&quot;wp-block-list&quot;&gt;
&lt;li&gt;Install live installers (Ventoy, Rufus and BalenaEtcher) and test the live USBs made by these installers. – These tasks were completed and are running on the server.&lt;/li&gt;



&lt;li&gt;Use different file systems (btrfs, jfs, xfs) for installation and then test. – This task was completed and is running on the server. It still needs some changes to ensure automation for each file system.&lt;/li&gt;



&lt;li&gt;Use speech synthesis to capture all audio. – This task is complete. We are refining it to run well on the server.&lt;/li&gt;



&lt;li&gt;Publish temporary assets. – This task is being worked on.&lt;/li&gt;
&lt;/ol&gt;



&lt;p class=&quot;wp-block-paragraph&quot;&gt;I have enjoyed working on testing both live images and net install images. This was one of the goals that I had highlighted in my application. I have also been working with fellow contributors in this project. &lt;/p&gt;



&lt;h4 class=&quot;wp-block-heading&quot;&gt;My team&lt;/h4&gt;



&lt;p class=&quot;wp-block-paragraph&quot;&gt;As I stated, I have had the most welcoming team into Open Source. They have been working with me and ensuring I have the proper resources for my contributions. I am grateful to my three mentors and the work they have done. &lt;/p&gt;



&lt;ol class=&quot;wp-block-list&quot;&gt;
&lt;li&gt;Roland Clobus is a project maintainer. He is in charge of code review, points out what we need to learn, and works on technical issues. He considers every solution we contributors think of and will go into detailed explanations for any issue we have.&lt;/li&gt;



&lt;li&gt;Tassia Camoes is a community coordinator. She is in charge of communication, coordination between contributors and networking within the community. She onboarded us and introduced us to the community.&lt;/li&gt;



&lt;li&gt;Philip Hands is also a project maintainer. He is in charge of the technical code, ensuring the sources work, and also works on the server and its issues. He also gives detailed explanations for any issue we have.&lt;/li&gt;
&lt;/ol&gt;



&lt;p class=&quot;wp-block-paragraph&quot;&gt;I wish to learn more with the team. On my to-do list, I would like to gain more skills with ports and packages so as to contribute more technically. I have enjoyed working on the tasks and learning.&lt;/p&gt;



&lt;h4 class=&quot;wp-block-heading&quot;&gt;The impact of this project&lt;/h4&gt;



&lt;p class=&quot;wp-block-paragraph&quot;&gt;The automated tests done by the team help the community in ways such as the following:&lt;/p&gt;



&lt;ol class=&quot;wp-block-list&quot;&gt;
&lt;li&gt;Check the installation and system behavior of different versions of the Operating System images&lt;/li&gt;



&lt;li&gt;Help developers and users of Operating Systems know which versions of applications (e.g. live installers) run well on the system&lt;/li&gt;



&lt;li&gt;Check for any issues during installation and running of Operating Systems and their flavors&lt;/li&gt;
&lt;/ol&gt;



&lt;p class=&quot;wp-block-paragraph&quot;&gt;I have also networked with the greater community and other contributors. During the contribution phase, I found many friends who were learning together with me. I hope to continue networking with the community and continue learning. &lt;/p&gt;



</description> 
	<pubDate>Mon, 02 Mar 2026 17:23:32 +0000</pubDate>

</item> 
<item>
	<title>Ben Hutchings: FOSS activity in February 2026</title>
	<guid>https://www.decadent.org.uk/ben/blog/2026/03/02/foss-activity-in-february-2026</guid>
	<link>https://www.decadent.org.uk/ben/blog/2026/03/02/foss-activity-in-february-2026.html</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/benh.png&quot; width=&quot;109&quot; height=&quot;100&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;ul&gt;
  &lt;li&gt;Debian packages:
    &lt;ul&gt;
      &lt;li&gt;&lt;a href=&quot;https://tracker.debian.org/pkg/firmware-free&quot;&gt;firmware-free&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/src:firmware-free&quot;&gt;Bugs&lt;/a&gt;:
            &lt;ul&gt;
              &lt;li&gt;closed &lt;a href=&quot;https://bugs.debian.org/890601&quot;&gt;#890601: firmware-linux-free uses prebuilt blobs instead of building from source&lt;/a&gt;&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
          &lt;li&gt;&lt;a href=&quot;https://tracker.debian.org/pkg/firmware-free/news/&quot;&gt;Uploads&lt;/a&gt;:
            &lt;ul&gt;
              &lt;li&gt;uploaded version 20241210-3 to unstable&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://tracker.debian.org/pkg/firmware-nonfree&quot;&gt;firmware-nonfree&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/src:firmware-nonfree&quot;&gt;Bugs&lt;/a&gt;:
            &lt;ul&gt;
              &lt;li&gt;closed &lt;a href=&quot;https://bugs.debian.org/481234&quot;&gt;#481234: firmware-nonfree: Include firmware for p54 driver&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;closed &lt;a href=&quot;https://bugs.debian.org/484177&quot;&gt;#484177: firmware-nonfree: keyspan&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;closed &lt;a href=&quot;https://bugs.debian.org/534379&quot;&gt;#534379: [firmware-nonfree] Please consider including  dvb-usb-af9015.fw&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;closed &lt;a href=&quot;https://bugs.debian.org/548745&quot;&gt;#548745: firmware-linux: Fix licence and include edgeport firmware&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;closed &lt;a href=&quot;https://bugs.debian.org/588142&quot;&gt;#588142: Add r8192u_usb (aka rtl8192u) firmware&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;closed &lt;a href=&quot;https://bugs.debian.org/597897&quot;&gt;#597897: RFP: alsa-firmware – firmware binaries used by each alsa-firmware-loader program&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;closed &lt;a href=&quot;https://bugs.debian.org/999485&quot;&gt;#999485: Please add brcmfmac43456-sdio.* files as it’s not just used in RPi devices&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;opened and closed &lt;a href=&quot;https://bugs.debian.org/1126794&quot;&gt;#1126794: Undistributable file under qcom/qdu100&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;closed &lt;a href=&quot;https://bugs.debian.org/1126846&quot;&gt;#1126846: Qualcomm AudioReach topology files are covered by separate licence&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;replied to &lt;a href=&quot;https://bugs.debian.org/1126896&quot;&gt;#1126896: firmware-nvidia-graphics: Cannot upgrade from bookworm-backports to trixie-backports&lt;/a&gt;&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
          &lt;li&gt;Merge requests:
            &lt;ul&gt;
              &lt;li&gt;closed &lt;a href=&quot;https://salsa.debian.org/kernel-team/firmware-nonfree/-/merge_requests/128&quot;&gt;!128: Draft: Add Provides: based ABI versioning mechanism&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;merged &lt;a href=&quot;https://salsa.debian.org/kernel-team/firmware-nonfree/-/merge_requests/134&quot;&gt;!134: Update to 20251125&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;reviewed and merged &lt;a href=&quot;https://salsa.debian.org/kernel-team/firmware-nonfree/-/merge_requests/135&quot;&gt;!135: Drop DSP firmware, migrated to hexagon-dsp-binaries source&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;reviewed and merged &lt;a href=&quot;https://salsa.debian.org/kernel-team/firmware-nonfree/-/merge_requests/136&quot;&gt;!136: debian/copyright: correct licence issues&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;opened and closed &lt;a href=&quot;https://salsa.debian.org/kernel-team/firmware-nonfree/-/merge_requests/137&quot;&gt;!137: d/copyright, qcom-soc: Exclude undistributable QDU100 firmware&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;opened and merged &lt;a href=&quot;https://salsa.debian.org/kernel-team/firmware-nonfree/-/merge_requests/138&quot;&gt;!138: Update to 20260110&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;opened and merged &lt;a href=&quot;https://salsa.debian.org/kernel-team/firmware-nonfree/-/merge_requests/139&quot;&gt;!139: Update to 20260221&lt;/a&gt;&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
          &lt;li&gt;&lt;a href=&quot;https://tracker.debian.org/pkg/firmware-nonfree/news/&quot;&gt;Uploads&lt;/a&gt;:
            &lt;ul&gt;
              &lt;li&gt;uploaded version 20251111-1~bpo13+1 to trixie-backports&lt;/li&gt;
              &lt;li&gt;uploaded version 20251125-1 to unstable&lt;/li&gt;
              &lt;li&gt;uploaded version 20260110-1 to unstable&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://tracker.debian.org/pkg/hexagon-dsp-binaries&quot;&gt;hexagon-dsp-binaries&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/src:hexagon-dsp-binaries&quot;&gt;Bugs&lt;/a&gt;:
            &lt;ul&gt;
              &lt;li&gt;opened &lt;a href=&quot;https://bugs.debian.org/1129001&quot;&gt;#1129001: Missing binaries - should this package use XS-Autobuild?&lt;/a&gt;&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://tracker.debian.org/pkg/initramfs-tools&quot;&gt;initramfs-tools&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/src:initramfs-tools&quot;&gt;Bugs&lt;/a&gt;:
            &lt;ul&gt;
              &lt;li&gt;closed &lt;a href=&quot;https://bugs.debian.org/1126611&quot;&gt;#1126611: mkinitramfs: failed to determine device for /&lt;/a&gt;&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
          &lt;li&gt;Merge requests:
            &lt;ul&gt;
              &lt;li&gt;merged &lt;a href=&quot;https://salsa.debian.org/kernel-team/initramfs-tools/-/merge_requests/191&quot;&gt;!191: tests fail on arm64 because they call qemu-system-arm64&lt;/a&gt;&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://tracker.debian.org/pkg/iptables&quot;&gt;iptables&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/src:iptables&quot;&gt;Bugs&lt;/a&gt;:
            &lt;ul&gt;
              &lt;li&gt;replied to &lt;a href=&quot;https://bugs.debian.org/1128561&quot;&gt;#1128561: iptables: virsh net-start no longer works: Failed to run firewall command iptables -w –table filter –list-rules&lt;/a&gt;&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://tracker.debian.org/pkg/ktls-utils&quot;&gt;ktls-utils&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;Merge requests:
            &lt;ul&gt;
              &lt;li&gt;merged &lt;a href=&quot;https://salsa.debian.org/kernel-team/ktls-utils/-/merge_requests/3&quot;&gt;!3: d/t/test-common: Move inclusion of extensions when signing the certificate&lt;/a&gt;&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://tracker.debian.org/pkg/libvirt&quot;&gt;libvirt&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/src:libvirt&quot;&gt;Bugs&lt;/a&gt;:
            &lt;ul&gt;
              &lt;li&gt;replied to &lt;a href=&quot;https://bugs.debian.org/1124549&quot;&gt;#1124549: libvirt passes invalid flags for network interface deletion&lt;/a&gt;&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://tracker.debian.org/pkg/linux&quot;&gt;linux&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/src:linux&quot;&gt;Bugs&lt;/a&gt;:
            &lt;ul&gt;
              &lt;li&gt;replied to &lt;a href=&quot;https://bugs.debian.org/1121192&quot;&gt;#1121192: kworker: Events_unbound, kworker processes, continually using CPU.&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;replied to &lt;a href=&quot;https://bugs.debian.org/1126710&quot;&gt;#1126710: linux-image-6.18.5+deb14-amd64: unable to mount existing XFS V4 filesystem because kernel CONFIG_XFS_SUPPORT_V4 is not set&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;replied to &lt;a href=&quot;https://bugs.debian.org/1128397&quot;&gt;#1128397: linux-image-6.18.10+deb14-amd64: open(/proc/$pid/maps) is empty after $pid exec()s, unless you read a partial line from the fd before, in which case it has the rest of the line only&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;replied to and closed &lt;a href=&quot;https://bugs.debian.org/1128567&quot;&gt;#1128567: linux-image-6.18.5+deb13-amd64: amdgpu.dc=0 causes Xorg 1:7.7+24 error “no screens found”&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;closed &lt;a href=&quot;https://bugs.debian.org/1129029&quot;&gt;#1129029: Bug on VirtualBox and KVM conflict kernel 6.12 (Debian 12)&lt;/a&gt;&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
          &lt;li&gt;Merge requests:
            &lt;ul&gt;
              &lt;li&gt;reviewed &lt;a href=&quot;https://salsa.debian.org/kernel-team/linux/-/merge_requests/1682&quot;&gt;!1682: Unsplit configs for some kernel architectures&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;reviewed &lt;a href=&quot;https://salsa.debian.org/kernel-team/linux/-/merge_requests/1821&quot;&gt;!1821: riscv64 config update for linux 6.19&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;reviewed and merged &lt;a href=&quot;https://salsa.debian.org/kernel-team/linux/-/merge_requests/1824&quot;&gt;!1824: db-mok: Remove unused function&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;opened &lt;a href=&quot;https://salsa.debian.org/kernel-team/linux/-/merge_requests/1831&quot;&gt;!1831: CI: Update build job to work after another common pipeline change&lt;/a&gt;&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
          &lt;li&gt;&lt;a href=&quot;https://tracker.debian.org/pkg/linux/news/&quot;&gt;Uploads&lt;/a&gt;:
            &lt;ul&gt;
              &lt;li&gt;(LTS) uploaded version 5.10.249-1 to bullseye-security&lt;/li&gt;
              &lt;li&gt;uploaded version 6.12.63-1~bpo12+1 to bookworm-backports&lt;/li&gt;
              &lt;li&gt;uploaded version 6.12.69-1~bpo12+1 to bookworm-backports&lt;/li&gt;
              &lt;li&gt;uploaded version 6.12.73-1~bpo12+1 to bookworm-backports&lt;/li&gt;
              &lt;li&gt;uploaded version 6.18.12-1~bpo13+1 to trixie-backports&lt;/li&gt;
              &lt;li&gt;uploaded version 6.18.5-1~bpo13+1 to trixie-backports&lt;/li&gt;
              &lt;li&gt;uploaded version 6.18.9-1~bpo13+1 to trixie-backports&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
          &lt;li&gt;(LTS) updated the bullseye-security branch to 5.10.251, but did not upload it&lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
      &lt;li&gt;(LTS) &lt;a href=&quot;https://tracker.debian.org/pkg/linux-6.1&quot;&gt;linux-6.1&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;&lt;a href=&quot;https://tracker.debian.org/pkg/linux-6.1/news/&quot;&gt;Uploads&lt;/a&gt;:
            &lt;ul&gt;
              &lt;li&gt;uploaded version 6.1.162-1~deb11u1 to bullseye-security&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://tracker.debian.org/pkg/linux-base&quot;&gt;linux-base&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/src:linux-base&quot;&gt;Bugs&lt;/a&gt;:
            &lt;ul&gt;
              &lt;li&gt;closed &lt;a href=&quot;https://bugs.debian.org/1128355&quot;&gt;#1128355: linux-base: indirectly missing perl dependency?&lt;/a&gt;&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://tracker.debian.org/pkg/nfs-utils&quot;&gt;nfs-utils&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;Merge requests:
            &lt;ul&gt;
              &lt;li&gt;reviewed and merged &lt;a href=&quot;https://salsa.debian.org/kernel-team/nfs-utils/-/merge_requests/36&quot;&gt;!36: Drop installation of blkmapd and nfs-blkmap.service systemd service&lt;/a&gt;&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://tracker.debian.org/pkg/wireless-regdb&quot;&gt;wireless-regdb&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/src:wireless-regdb&quot;&gt;Bugs&lt;/a&gt;:
            &lt;ul&gt;
              &lt;li&gt;replied to and closed &lt;a href=&quot;https://bugs.debian.org/1104022&quot;&gt;#1104022: wireless-regdb: Consider importing setregdomain and udev rule from Fedora&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;closed &lt;a href=&quot;https://bugs.debian.org/1122785&quot;&gt;#1122785: wireless-regdb: Please remove/replace usage of dh_movetousr&lt;/a&gt;&lt;/li&gt;
              &lt;li&gt;closed &lt;a href=&quot;https://bugs.debian.org/1126431&quot;&gt;#1126431: wireless-regdb: Unnecessary Build-Depends: python3-m2crypto&lt;/a&gt;&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
          &lt;li&gt;&lt;a href=&quot;https://tracker.debian.org/pkg/wireless-regdb/news/&quot;&gt;Uploads&lt;/a&gt;:
            &lt;ul&gt;
              &lt;li&gt;uploaded version 2026.02.04-1 to unstable&lt;/li&gt;
              &lt;li&gt;uploaded version 2026.02.04-1~deb12u1 to bookworm&lt;/li&gt;
              &lt;li&gt;uploaded version 2026.02.04-1~deb13u1 to trixie&lt;/li&gt;
            &lt;/ul&gt;
          &lt;/li&gt;
          &lt;li&gt;(LTS) updated the bullseye-security branch to 2026.02.04-1, but did not upload it&lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
  &lt;li&gt;Debian non-package bugs:
    &lt;ul&gt;
      &lt;li&gt;&lt;a href=&quot;https://bugs.debian.org/release.debian.org&quot;&gt;release.debian.org&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;opened &lt;a href=&quot;https://bugs.debian.org/1128507&quot;&gt;#1128507: trixie-pu: package wireless-regdb/2026.02.04-1~deb13u1&lt;/a&gt;&lt;/li&gt;
          &lt;li&gt;opened &lt;a href=&quot;https://bugs.debian.org/1128510&quot;&gt;#1128510: bookworm-pu: package wireless-regdb/2026.02.04-1~deb12u1&lt;/a&gt;&lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
  &lt;li&gt;Mailing lists:
    &lt;ul&gt;
      &lt;li&gt;&lt;a href=&quot;https://lists.debian.org/debian-kernel/&quot;&gt;debian-kernel&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;posted &lt;a href=&quot;https://lists.debian.org/a08f3fe6a71e72d2331c9ec67665202967b89b7c.camel@decadent.org.uk&quot;&gt;Agenda items for kernel-team meeting on 2026-02-04&lt;/a&gt;&lt;/li&gt;
          &lt;li&gt;posted &lt;a href=&quot;https://lists.debian.org/47040935c4dfc30df75fe884fa052e4bccd258e4.camel@decadent.org.uk&quot;&gt;Agenda items for kernel-team meeting on 2026-02-25&lt;/a&gt;&lt;/li&gt;
          &lt;li&gt;(LTS) replied to &lt;a href=&quot;https://lists.debian.org/4d93621100ce80eef243f5cf888a55822b3ee198.camel@decadent.org.uk&quot;&gt;Discrepancies between Commits list in changelog of debian and upstream linux git repo.&lt;/a&gt;&lt;/li&gt;
          &lt;li&gt;(LTS) replied to &lt;a href=&quot;https://lists.debian.org/2239236482b61aa744ea6e2eb6af13c45c45385f.camel@debian.org&quot;&gt;[Pkg-libvirt-maintainers] Processed: retitle 1124549 to libvirt passes invalid flags for network interface deletion …, tagging 1124549&lt;/a&gt;&lt;/li&gt;
          &lt;li&gt;replied to &lt;a href=&quot;https://lists.debian.org/cef74b23084521476b1e3531dfe18439c2feb3da.camel@decadent.org.uk&quot;&gt;linux 7.0&lt;/a&gt;&lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://lists.debian.org/debian-lts-announce/&quot;&gt;debian-lts-announce&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;posted &lt;a href=&quot;https://lists.debian.org/aYx6fIVKGUd8vm_0@decadent.org.uk&quot;&gt;[SECURITY] [DLA 4475-1] linux security update&lt;/a&gt;&lt;/li&gt;
          &lt;li&gt;posted &lt;a href=&quot;https://lists.debian.org/aYx6kDKKWKgYTGwo@decadent.org.uk&quot;&gt;[SECURITY] [DLA 4476-1] linux-6.1 security update&lt;/a&gt;&lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://lists.zytor.com/archives/klibc/&quot;&gt;klibc&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;replied to [PATCH 1/2] [klibc] explicitly close arm64 syscall stub generator output&lt;/li&gt;
          &lt;li&gt;replied to [PATCH] [klibc] fix arm stub alignment&lt;/li&gt;
          &lt;li&gt;replied to [PATCH] [klibc] remove unneeded syscalls.mk dependencies&lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://lore.kernel.org/linux-hwmon/&quot;&gt;linux-hwmon&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;replied to &lt;a href=&quot;https://lore.kernel.org/linux-hwmon/f6710a1f44d2b32df1cb9b09cddc6695bf76eec2.camel@decadent.org.uk/T/&quot;&gt;[PATCH] hwmon: (max16065) Use READ/WRITE_ONCE to avoid compiler optimization induced race&lt;/a&gt;&lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://lore.kernel.org/linux-wireless/&quot;&gt;linux-wireless&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;posted &lt;a href=&quot;https://lore.kernel.org/linux-wireless/aZN3thlmaiBxYVQQ@decadent.org.uk/T/&quot;&gt;[PATCH] wireless-regdb: Fix regulatory.bin signing with new M2Crypto&lt;/a&gt;&lt;/li&gt;
          &lt;li&gt;posted &lt;a href=&quot;https://lore.kernel.org/linux-wireless/aZN4FltUUWKUh6rp@decadent.org.uk/T/&quot;&gt;[PATCH] wireless-regdb: Replace M2Crypto with cryptography package&lt;/a&gt;&lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://lore.kernel.org/platform-driver-x86/&quot;&gt;platform-driver-x86&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;replied to &lt;a href=&quot;https://lore.kernel.org/platform-driver-x86/46bc9091ac8d36a237c575a0cd140752872b44bc.camel@decadent.org.uk/T/&quot;&gt;[PATCH] platform/x86: hp-bioscfg: Support allocations of larger data&lt;/a&gt;&lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
      &lt;li&gt;&lt;a href=&quot;https://lore.kernel.org/stable/&quot;&gt;stable&lt;/a&gt;:
        &lt;ul&gt;
          &lt;li&gt;(LTS) replied to &lt;a href=&quot;https://lore.kernel.org/stable/781f4e83b6a111cfd3c8a331ea75824d9238fe0f.camel@decadent.org.uk/T/&quot;&gt;Please apply commit 9990ddf47d41 (“net: tunnel: make skb_vlan_inet_prepare() return drop reasons”) down to 6.1.y at least&lt;/a&gt;&lt;/li&gt;
          &lt;li&gt;(LTS) reviewed and replied to
&lt;a href=&quot;https://lore.kernel.org/stable/03a74299797f4864d0e563cd9517276f690a4bf0.camel@decadent.org.uk/T/&quot;&gt;various&lt;/a&gt;
&lt;a href=&quot;https://lore.kernel.org/stable/5cee4d2e571b3132a95cca6f6230c769b8618836.camel@decadent.org.uk/T/&quot;&gt;patches&lt;/a&gt;
&lt;a href=&quot;https://lore.kernel.org/stable/24fb4c47ea6f4c8025f6b0592088c1a9d10741a4.camel@decadent.org.uk/T/&quot;&gt;for&lt;/a&gt;
&lt;a href=&quot;https://lore.kernel.org/stable/ec318a7c1b9a06836b8694a1b63e187d3f53bd80.camel@decadent.org.uk/T/&quot;&gt;5.10&lt;/a&gt;
&lt;a href=&quot;https://lore.kernel.org/stable/eb892614c9cd28aa03922567f8a6d75ed2f594bc.camel@decadent.org.uk/T/&quot;&gt;…&lt;/a&gt;
&lt;a href=&quot;https://lore.kernel.org/stable/bce38fd1f10ecc0ae3ec3ccf95da89f58ca3e623.camel@decadent.org.uk/T/&quot;&gt;…&lt;/a&gt;
&lt;a href=&quot;https://lore.kernel.org/stable/1a11526ae3d8664f705b541b8d6ea57b847b49a8.camel@decadent.org.uk/T/&quot;&gt;…&lt;/a&gt;&lt;/li&gt;
          &lt;li&gt;(LTS) posted &lt;a href=&quot;https://lore.kernel.org/stable/aZyfxkHromvUPszw@decadent.org.uk/T/&quot;&gt;[PATCH 5.10,5.15] ip6_tunnel: Fix usage of skb_vlan_inet_prepare()&lt;/a&gt;&lt;/li&gt;
          &lt;li&gt;replied to &lt;a href=&quot;https://lore.kernel.org/stable/94cad986396d5a231a60d41cb6f86da146a6b435.camel@decadent.org.uk/T/&quot;&gt;[PATCH 6.12 519/567] gpiolib: acpi: Move quirks to a separate file&lt;/a&gt;&lt;/li&gt;
        &lt;/ul&gt;
      &lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
&lt;/ul&gt; </description> 
	<pubDate>Mon, 02 Mar 2026 16:28:58 +0000</pubDate>

</item> 
<item>
	<title>Valhalla&#39;s Things: A Pen Case (or a Few)</title>
	<guid>https://blog.trueelena.org/blog/2026/03/02-a_pen_case_or_a_few/index.html</guid>
	<link>https://blog.trueelena.org/blog/2026/03/02-a_pen_case_or_a_few/index.html</link>
     <description>  &lt;article&gt;
    &lt;section class=&quot;header&quot;&gt;
        Posted on March  2, 2026
        &lt;br /&gt;
        
        Tags: &lt;a href=&quot;https://blog.trueelena.org/tags/madeof%3Aatoms.html&quot; title=&quot;All pages tagged &#39;madeof:atoms&#39;.&quot;&gt;madeof:atoms&lt;/a&gt;, &lt;a href=&quot;https://blog.trueelena.org/tags/FreeSoftWear.html&quot; title=&quot;All pages tagged &#39;FreeSoftWear&#39;.&quot;&gt;FreeSoftWear&lt;/a&gt;, &lt;a href=&quot;https://blog.trueelena.org/tags/craft%3Asewing.html&quot; title=&quot;All pages tagged &#39;craft:sewing&#39;.&quot;&gt;craft:sewing&lt;/a&gt;
        
    &lt;/section&gt;
    &lt;section&gt;
        &lt;p&gt;&lt;img alt=&quot;A pen case made of two pieces of a relatively stiff black material with a flat base and three separate channels on top, plus a flap covering everything and a band to keep the flap closed; there is visible light blue stitching all around the channels.&quot; class=&quot;align-center&quot; src=&quot;https://blog.trueelena.org/blog/2026/03/02-a_pen_case_or_a_few/pen_case.jpg&quot; style=&quot;width: 80.0%;&quot; /&gt;&lt;/p&gt;
&lt;p&gt;For my birthday, I’ve bought myself a fancy new expensive&lt;a class=&quot;footnote-ref&quot; href=&quot;https://blog.trueelena.org#fn1&quot; id=&quot;fnref1&quot;&gt;&lt;sup&gt;1&lt;/sup&gt;&lt;/a&gt;
fountain pen.&lt;/p&gt;
&lt;p&gt;&lt;img alt=&quot;A two slot pen case in the same material as above, but brown: the flap is too short to cover the pens, and there isn&#39;t a band to keep it closed.&quot; class=&quot;align-center&quot; src=&quot;https://blog.trueelena.org/blog/2026/03/02-a_pen_case_or_a_few/failed_prototype.jpg&quot; style=&quot;width: 80.0%;&quot; /&gt;&lt;/p&gt;
&lt;p&gt;Such a fancy pen, of course, requires a suitable case: I couldn’t use the
failed prototype of a case I’ve been keeping my Preppys in, so I had to
get out the nice vegetable tanned leather… Yeah, nope, I don’t have that
(yet). I got out the latex and cardboard material that is sold as a
(cheaper) leather substitute; it doesn’t look like leather at all, but is
quite nice (and easy) to work with. The project is not vegan anyway,
because I used waxed linen thread, waxing it myself with a lot of very
nicely smelling beeswax.&lt;/p&gt;
&lt;p&gt;&lt;img alt=&quot;a case similar to the one above, but this one only has two slots, and there is a Faber Castell pen nested on top of the case between the two slots. Here the stitches are white, and in a coarser thread.&quot; class=&quot;align-center&quot; src=&quot;https://blog.trueelena.org/blog/2026/03/02-a_pen_case_or_a_few/2_pen_case.jpg&quot; style=&quot;width: 80.0%;&quot; /&gt;&lt;/p&gt;
&lt;p&gt;I got the measurements&lt;a class=&quot;footnote-ref&quot; href=&quot;https://blog.trueelena.org#fn2&quot; id=&quot;fnref2&quot;&gt;&lt;sup&gt;2&lt;/sup&gt;&lt;/a&gt; from the less failed prototype
where I keep my desktop pens, and this time I made a &lt;a href=&quot;https://sewing-patterns.trueelena.org/accessories/cases/pen_case/index.html&quot;&gt;proper pattern I
could share online&lt;/a&gt;,
under the usual Free Culture license.&lt;/p&gt;
&lt;p&gt;&lt;img alt=&quot;A case like the one above, except that the stitches are in black, and not as regular. This one has also been scrunched up a bit for a different look, and now the band is a bit too wide.&quot; class=&quot;align-center&quot; src=&quot;https://blog.trueelena.org/blog/2026/03/02-a_pen_case_or_a_few/second_case.jpg&quot; style=&quot;width: 80.0%;&quot; /&gt;&lt;/p&gt;
&lt;p&gt;From the width of the material I could conveniently cut two cases, so
that’s what I did. I started sewing the first one, realized that I had
got the order of stitching wrong, and also that light blue thread
instead of the black one would look nice and be easier to see in the
pictures for the published pattern; then I started sewing the second one
and kept alternating between the two, depending on the availability of
light for taking pictures.&lt;/p&gt;
&lt;p&gt;&lt;img alt=&quot;The open pen case, showing two pens, a blue Preppy and a gunmetal Plaisir cosily nested in the two outer slots, while the middle slot is ominously empty.&quot; class=&quot;align-center&quot; src=&quot;https://blog.trueelena.org/blog/2026/03/02-a_pen_case_or_a_few/case_with_a_missing_pen.jpg&quot; style=&quot;width: 80.0%;&quot; /&gt;&lt;/p&gt;
&lt;p&gt;One of the two took the place of my desktop case, which had one more
pen than slots; one of the old prototypes was moved to hold my bedside
pen; and the other new case went into my handbag with the new pen,
together with a Preppy. Now I have a free slot, and you can see how this
is going to go wrong, right? :D&lt;/p&gt;
&lt;section class=&quot;footnotes footnotes-end-of-document&quot;&gt;
&lt;hr /&gt;
&lt;ol&gt;
&lt;li id=&quot;fn1&quot;&gt;&lt;p&gt;16€, plus a 9€ converter, and another 6€ pen to get the
EF nib from, since it wasn’t available for the expensive pen.&lt;a class=&quot;footnote-back&quot; href=&quot;https://blog.trueelena.org#fnref1&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li id=&quot;fn2&quot;&gt;&lt;p&gt;I have them written down somewhere. I couldn’t find
them. So I measured the real thing, with some approximation.&lt;a class=&quot;footnote-back&quot; href=&quot;https://blog.trueelena.org#fnref2&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
&lt;/section&gt;
    &lt;/section&gt;
&lt;/article&gt; </description> 
	<pubDate>Mon, 02 Mar 2026 00:00:00 +0000</pubDate>

</item> 
<item>
	<title>Benjamin Mako Hill: Pronunciation</title>
	<guid>https://mako.cc/copyrighteous/?p=3344</guid>
	<link>https://mako.cc/copyrighteous/pronunciation-lesson</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/mako.gif&quot; width=&quot;65&quot; height=&quot;93&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;Had a discussion about how to pronounce the name of Google’s chatbot. Turns out, we were all wrong.&lt;/p&gt;



&lt;figure class=&quot;wp-block-video&quot;&gt;&lt;video controls=&quot;controls&quot; height=&quot;360&quot; src=&quot;https://mako.cc/copyrighteous/wp-content/uploads/2026/03/IMG_5667.mp4&quot; width=&quot;640&quot;&gt;&lt;/video&gt;&lt;/figure&gt; </description> 
	<pubDate>Sun, 01 Mar 2026 18:51:22 +0000</pubDate>

</item> 
<item>
	<title>Junichi Uekawa: The next Debconf happens in Japan.</title>
	<guid>http://www.netfort.gr.jp/~dancer/diary/daily/2026-Mar-1.html.en#2026-Mar-1-13:16:01</guid>
	<link>http://www.netfort.gr.jp/~dancer/diary/daily/2026-Mar-1.html.en#2026-Mar-1-13:16:01</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/dancer.png&quot; width=&quot;75&quot; height=&quot;97&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  The next Debconf happens in Japan. Great news. Feels like we came a long way, but I didn&#39;t personally do much, I just made the first moves.
        &lt;p&gt;&lt;/p&gt; </description> 
	<pubDate>Sun, 01 Mar 2026 04:16:01 +0000</pubDate>

</item> 
<item>
	<title>Mike Gabriel: Debian Lomiri Tablets 2025-2027 - Project Report (Q3/2025)</title>
	<guid>https://sunweavers.net/151 at https://sunweavers.net/blog</guid>
	<link>https://sunweavers.net/blog/node/151</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/sunweaver.png&quot; width=&quot;82&quot; height=&quot;82&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;h3&gt;Debian Lomiri for Debian 13 (previous project)&lt;/h3&gt;

&lt;p&gt;In our previous project around Debian and Lomiri (lasting until July
2025), we managed to get Lomiri 0.5.0 (and with it another 130
packages) into Debian (with two minor exceptions [1]) just in time
for the Debian 13 release in August 2025.&lt;/p&gt;

&lt;h3&gt;Debian Lomiri for Debian 14&lt;/h3&gt;

&lt;p&gt;At DebConf in Brest, a follow-up project was designed between the
project sponsor and Fre(i)e Software GmbH [2]. The new project (on paper)
started on 1st August 2025, and the project duration was agreed to be
two years, allowing our company to work with an equivalent of ~5 FTE on
Lomiri, targeting the Debian 14 release some time in the second half of
2027 (an assumed date; let&#39;s see what happens).&lt;/p&gt;

&lt;p&gt;Ongoing work would be covered from day one of the new project, and once
all contract details had been properly put on paper at the end of
September, Fre(i)e Software GmbH started hiring a new team of software
developers and (future) Debian maintainers. (More on that new team in
our next Q4/2025 report.)&lt;/p&gt;

&lt;p&gt;The ongoing work of Q3/2025 consisted mainly of Guido Berhörster and
myself working on Morph Browser Qt6 (mostly Guido, together with Bhushan
from MiraLab [3]) and on package maintenance in Debian (mostly me).&lt;/p&gt;

&lt;h3&gt;Morph Browser Qt6&lt;/h3&gt;

&lt;p&gt;The first milestone of the Qt6 porting of Morph Browser [4]
and related components (LUITK aka lomiri-ui-toolkit (a big chunk! [5]),
lomiri-content-hub, lomiri-download-manager and a few other components)
was reached on 21st Sep 2025 with an upload of Morph Browser
1.2.0~git20250813.1ca2aa7+dfsg-1~exp1 to Debian experimental and the
Lomiri PPA [6].&lt;/p&gt;

&lt;h3&gt;Preparation of Debian 13 Updates (still pending)&lt;/h3&gt;

&lt;p&gt;In the background, various Lomiri updates for Debian 13 have been
prepared during Q3/2025 (with a huge patchset), but publishing them to
Debian 13 is still pending, as the test results are not yet satisfactory.&lt;/p&gt;

&lt;p&gt;[1] lomiri-push-service and nuntium&lt;br /&gt;
[2] &lt;a href=&quot;https://freiesoftware.gmbh&quot; title=&quot;https://freiesoftware.gmbh&quot;&gt;https://freiesoftware.gmbh&lt;/a&gt;&lt;br /&gt;
[3] &lt;a href=&quot;https://miralab.one/&quot; title=&quot;https://miralab.one/&quot;&gt;https://miralab.one/&lt;/a&gt;&lt;br /&gt;
[4] &lt;a href=&quot;https://gitlab.com/ubports/development/core/morph-browser/-/merge_requests/591&quot; title=&quot;https://gitlab.com/ubports/development/core/morph-browser/-/merge_requests/591&quot;&gt;https://gitlab.com/ubports/development/core/morph-browser/-/merge_reques...&lt;/a&gt; et al.&lt;br /&gt;
[5] &lt;a href=&quot;https://gitlab.com/ubports/development/core/lomiri-ui-toolkit/-/merge_requests/94&quot; title=&quot;https://gitlab.com/ubports/development/core/lomiri-ui-toolkit/-/merge_requests/94&quot;&gt;https://gitlab.com/ubports/development/core/lomiri-ui-toolkit/-/merge_re...&lt;/a&gt; et al.&lt;br /&gt;
[6] &lt;a href=&quot;https://launchpad.net/~lomiri&quot; title=&quot;https://launchpad.net/~lomiri&quot;&gt;https://launchpad.net/~lomiri&lt;/a&gt;&lt;/p&gt; </description> 
	<pubDate>Sat, 28 Feb 2026 18:36:37 +0000</pubDate>

</item> 
<item>
	<title>Utkarsh Gupta: FOSS Activities in February 2026</title>
	<guid>https://utkarsh2102.org/posts/foss-in-feb-26/</guid>
	<link>https://utkarsh2102.org/posts/foss-in-feb-26/</link>
     <description>  &lt;p&gt;Here’s my monthly but brief update about the activities I’ve done in the FOSS world.&lt;/p&gt;
&lt;h2 id=&quot;debian&quot;&gt;Debian&lt;/h2&gt;
&lt;figure&gt;
&lt;img src=&quot;https://utkarsh2102.org/images/debian-logo-small.png&quot; /&gt;
&lt;/figure&gt;
&lt;p&gt;Whilst I didn’t get a chance to do much, here are a few things that I still worked on:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Uploaded node-lodash/4.17.21+dfsg+~cs8.31.198.20210220-10 to fix CVE-2025-13465 in unstable. Pinged Xavier and Praveen to see how they feel about the backport for the stable releases.&lt;/li&gt;
&lt;li&gt;Assisted a few folks in getting their patches submitted via Salsa.
&lt;ul&gt;
&lt;li&gt;Reviewed the pyenv MR for Ujjwal.&lt;/li&gt;
&lt;li&gt;Reviewed and assisted Anshul with his Golang stuff.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Mentoring for newcomers.&lt;/li&gt;
&lt;li&gt;Started to be a bit involved with DC 26 bits.&lt;/li&gt;
&lt;li&gt;Moderation of -project mailing list.&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h2 id=&quot;ubuntu&quot;&gt;Ubuntu&lt;/h2&gt;
&lt;figure&gt;
&lt;img src=&quot;https://utkarsh2102.org/images/ubuntu-logo-small.png&quot; /&gt;
&lt;/figure&gt;
&lt;p&gt;I joined &lt;a href=&quot;https://utkarsh2102.org/posts/hello-canonical/&quot;&gt;Canonical to work on Ubuntu full-time&lt;/a&gt; back in February 2021.&lt;/p&gt;
&lt;p&gt;Whilst I can’t give a full, detailed list of things I did, here’s a quick TL;DR of what I did:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Successfully released &lt;a href=&quot;https://discourse.ubuntu.com/t/resolute-snapshot-4-released/77687&quot;&gt;Resolute Snapshot 4&lt;/a&gt;!
&lt;ul&gt;
&lt;li&gt;This one was also done without the ISO tracker and cdimage access.&lt;/li&gt;
&lt;li&gt;We also worked very hard to build and promote all the images in due time.&lt;/li&gt;
&lt;li&gt;This was the first release done with the Test Observer.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Worked further on the whole artifact signing story for cdimage.&lt;/li&gt;
&lt;li&gt;Assisted a bunch of folks with my Archive Admin and Release team hats to:
&lt;ul&gt;
&lt;li&gt;Review and grant FFes.&lt;/li&gt;
&lt;li&gt;Coordinate weekly syncs.&lt;/li&gt;
&lt;li&gt;Promote binaries to main.&lt;/li&gt;
&lt;li&gt;Take care of package removals and so on.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Was pretty occupied with the new release process architecture.&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;h2 id=&quot;debian-elts&quot;&gt;Debian (E)LTS&lt;/h2&gt;
&lt;figure&gt;
&lt;img src=&quot;https://utkarsh2102.org/images/debian-lts-small.png&quot; /&gt;
&lt;/figure&gt;
&lt;p&gt;This month I have worked 42 hours
on &lt;a href=&quot;https://www.freexian.com/lts/debian/&quot;&gt;Debian Long Term Support (LTS)&lt;/a&gt;
and on its sister &lt;a href=&quot;https://www.freexian.com/lts/extended/&quot;&gt;Extended LTS&lt;/a&gt;
project and did the following things:&lt;/p&gt;
&lt;h3 id=&quot;released-security-updates&quot;&gt;Released Security Updates&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;xrdp&lt;/strong&gt;: Unauthenticated stack-based buffer overflow in the RDP server.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;[LTS]&lt;/strong&gt;: Fixed &lt;a href=&quot;https://security-tracker.debian.org/tracker/CVE-2025-68670&quot;&gt;CVE-2025-68670&lt;/a&gt; via &lt;a href=&quot;https://tracker.debian.org/news/1712208/accepted-xrdp-09211-1deb11u3-source-into-oldoldstable-security/&quot;&gt;&lt;strong&gt;0.9.21.1-1~deb11u3&lt;/strong&gt;&lt;/a&gt; for bullseye. This has been released as &lt;a href=&quot;https://www.debian.org/lts/security/2026/DLA-4464-1&quot;&gt;&lt;strong&gt;DLA 4464-1&lt;/strong&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;[ELTS]&lt;/strong&gt;: Fixed &lt;a href=&quot;https://security-tracker.debian.org/tracker/CVE-2025-68670&quot;&gt;CVE-2025-68670&lt;/a&gt; for buster via &lt;strong&gt;0.9.9-1+deb10u5&lt;/strong&gt;. This has been released as &lt;a href=&quot;https://www.freexian.com/lts/extended/updates/ela-1636-1-xrdp/&quot;&gt;&lt;strong&gt;ELA 1636-1&lt;/strong&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;phpunit&lt;/strong&gt;: Unsafe deserialization of code coverage data in PHPT test execution.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;[LTS]&lt;/strong&gt;: Fixed &lt;a href=&quot;https://security-tracker.debian.org/tracker/CVE-2026-24765&quot;&gt;CVE-2026-24765&lt;/a&gt; via &lt;a href=&quot;https://tracker.debian.org/news/1713511/accepted-phpunit-952-1deb11u1-source-into-oldoldstable-security/&quot;&gt;&lt;strong&gt;9.5.2-1+deb11u1&lt;/strong&gt;&lt;/a&gt; for bullseye. This has been released as &lt;a href=&quot;https://www.debian.org/lts/security/2026/DLA-4470-1&quot;&gt;&lt;strong&gt;DLA 4470-1&lt;/strong&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;[ELTS]&lt;/strong&gt;: Fixed &lt;a href=&quot;https://security-tracker.debian.org/tracker/CVE-2026-24765&quot;&gt;CVE-2026-24765&lt;/a&gt; for buster via &lt;strong&gt;7.5.6-1+deb10u1&lt;/strong&gt;. This has been released as &lt;a href=&quot;https://www.freexian.com/lts/extended/updates/ela-1638-1-phpunit/&quot;&gt;&lt;strong&gt;ELA 1638-1&lt;/strong&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id=&quot;work-in-progress&quot;&gt;Work in Progress&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;knot-resolver&lt;/strong&gt;: Affected by CVE-2023-26249, CVE-2023-46317, and CVE-2022-40188, leading to Denial of Service.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;[LTS]&lt;/strong&gt;: Some back and forth discussion with maintainers on the best way to proceed for the bullseye upload.
&lt;ul&gt;
&lt;li&gt;Git repository for bullseye: &lt;a href=&quot;https://salsa.debian.org/lts-team/packages/knot-resolver/-/tree/debian/bullseye&quot;&gt;https://salsa.debian.org/lts-team/packages/knot-resolver/-/tree/debian/bullseye&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;ruby-rack&lt;/strong&gt;: There were multiple vulnerabilities reported in Rack, leading to DoS (memory exhaustion) and proxy bypass.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;[ELTS]&lt;/strong&gt;: I’ve partially reviewed patches by Bastien, but whilst doing so, new CVEs came up for all the releases.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;[sid]&lt;/strong&gt;: Uploaded ruby-rack/3.2.5-1 to unstable to fix the new CVEs, CVE-2026-25500 and CVE-2026-22860.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;[stable]&lt;/strong&gt;: Prepared the upload for stable releases, as requested by the Security Team. Will upload them in March.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;[bullseye]&lt;/strong&gt;: The backport for CVE-2026-22860 hasn’t been as trivial as it appeared to be. There are test failures, which aren’t really ignorable:
&lt;blockquote&gt;
&lt;p&gt;The test requests &lt;code&gt;/cgi/../test&lt;/code&gt;, which &lt;code&gt;File.expand_path&lt;/code&gt; resolves to &lt;code&gt;&amp;lt;root&amp;gt;/test&lt;/code&gt; - firmly inside &lt;code&gt;@root&lt;/code&gt;. With the old code, it hit the &lt;code&gt;return unless path_info.include? &quot;..&quot;&lt;/code&gt; guard and fell through to 403. But it would also have returned early via &lt;code&gt;start_with?(@root)&lt;/code&gt; being true…&lt;/p&gt;
&lt;/blockquote&gt;
&lt;/li&gt;
&lt;li&gt;…anyway, this needs more investigation - I’ll continue to do so in March as P1.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;node-lodash&lt;/strong&gt;: Affected by CVE-2025-13465, prototype pollution in the baseUnset function.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;[stable]&lt;/strong&gt;: The patches for trixie and bookworm are ready and are awaiting maintainer review.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;[LTS]&lt;/strong&gt;: The bullseye upload will follow once the stable uploads are in and ACK’d by the SRMs.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;vlc&lt;/strong&gt;: Affected by CVE-2025-51602, an out-of-bounds read and denial of service via a crafted 0x01 response from an MMS server.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;[LTS]&lt;/strong&gt;: 3.0.23 backport is ready but not tested. I’ll get this over the line in March.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;[ELTS]&lt;/strong&gt;: 3.0.23 backport is ready but not very clean. Would like to complete LTS and get back to this.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
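The ruby-rack test failure described above can be reproduced outside Rack. This is a minimal sketch, not Rack's actual code; &lt;code&gt;root&lt;/code&gt; and &lt;code&gt;path_info&lt;/code&gt; are hypothetical stand-ins for the values in the failing test:

```ruby
# Not Rack's code: a minimal illustration of the two guards discussed
# above, using hypothetical stand-in values for root and path_info.
root = "/srv/app"
path_info = "/cgi/../test"

# Old-style guard: any ".." in the raw path is rejected (403),
# even though the normalized path stays inside the root.
puts path_info.include?("..")

# File.expand_path resolves the ".." segment first...
expanded = File.expand_path(File.join(root, path_info))
puts expanded

# ...so a containment check on the expanded path would accept it.
puts expanded.start_with?(root)
```

This is why the backport is tricky: the raw-path guard and the expanded-path guard disagree on requests like this one.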
&lt;h3 id=&quot;other-activities&quot;&gt;Other Activities&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;[LTS]&lt;/strong&gt; Coordinated the libvirt situation with Ben Hutchings. The upload has already been prepped; I will upload it in March.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;[ELTS]&lt;/strong&gt; Continued to help Bastien and Markus with the tomcat9 regression for buster.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;[ELTS]&lt;/strong&gt; Partially reviewed ruby-rack CVEs to help Bastien. This took more time than expected, with some more developments happening at the end of the month - new CVEs that aren’t as easy to backport as I expected them to be. Will continue with these in March.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;[E/LTS] Front Desk duties&lt;/strong&gt;: Performed a large batch of CVE triage, marking numerous packages for bullseye, buster, and stretch as either &lt;code&gt;postponed&lt;/code&gt;, &lt;code&gt;end-of-life&lt;/code&gt;, &lt;code&gt;not-affected&lt;/code&gt;, or added them to the Xla-needed.txt files.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Also replied to certain mails and IRC texts around FD related tasks.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;[E/LTS]&lt;/strong&gt; Monitored discussions on mailing lists, IRC, and all the documentation updates.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;[E/LTS]&lt;/strong&gt; Attended the monthly LTS meeting on Jitsi. &lt;a href=&quot;https://lists.debian.org/debian-lts/2026/02/msg00026.html&quot;&gt;Summary here&lt;/a&gt;.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;hr /&gt;
&lt;p&gt;Until next time.&lt;br /&gt;
&lt;code&gt;:wq&lt;/code&gt; for today.&lt;/p&gt; </description> 
	<pubDate>Sat, 28 Feb 2026 05:41:11 +0000</pubDate>

</item> 
<item>
	<title>Daniel Baumann: Debian Fast Forward: An alternative backports repository</title>
	<guid>https://blog.daniel-baumann.ch/posts/20260228-1.html</guid>
	<link>https://blog.daniel-baumann.ch/posts/20260228-1.html</link>
     <description>  &lt;section id=&quot;debian-fast-forward-an-alternative-backports-repository&quot;&gt;

&lt;p&gt;The &lt;a class=&quot;reference external&quot; href=&quot;https://debian.org&quot;&gt;Debian&lt;/a&gt; project releases a new &lt;code class=&quot;docutils literal notranslate&quot;&gt;&lt;span class=&quot;pre&quot;&gt;stable&lt;/span&gt;&lt;/code&gt; version of its &lt;a class=&quot;reference external&quot; href=&quot;https://en.wikipedia.org/wiki/Linux&quot;&gt;Linux&lt;/a&gt; distribution approximately every two years. During its lifetime, a &lt;code class=&quot;docutils literal notranslate&quot;&gt;&lt;span class=&quot;pre&quot;&gt;stable&lt;/span&gt;&lt;/code&gt; release usually gets security updates only, but in general no feature updates.&lt;/p&gt;
&lt;p&gt;For some packages it is desirable to get feature updates earlier than with the next &lt;code class=&quot;docutils literal notranslate&quot;&gt;&lt;span class=&quot;pre&quot;&gt;stable&lt;/span&gt;&lt;/code&gt; release. Some new packages included in Debian after the initial release of a &lt;code class=&quot;docutils literal notranslate&quot;&gt;&lt;span class=&quot;pre&quot;&gt;stable&lt;/span&gt;&lt;/code&gt; distribution are desirable for &lt;code class=&quot;docutils literal notranslate&quot;&gt;&lt;span class=&quot;pre&quot;&gt;stable&lt;/span&gt;&lt;/code&gt; too.&lt;/p&gt;
&lt;p&gt;Both use-cases can be solved by recompiling the newer version of a package from &lt;code class=&quot;docutils literal notranslate&quot;&gt;&lt;span class=&quot;pre&quot;&gt;testing/unstable&lt;/span&gt;&lt;/code&gt; on &lt;code class=&quot;docutils literal notranslate&quot;&gt;&lt;span class=&quot;pre&quot;&gt;stable&lt;/span&gt;&lt;/code&gt; (aka &lt;a class=&quot;reference external&quot; href=&quot;https://en.wikipedia.org/wiki/Backporting&quot;&gt;backporting&lt;/a&gt;). Packages are backported together with only the minimal amount of required build-depends or depends not already fulfilled in &lt;code class=&quot;docutils literal notranslate&quot;&gt;&lt;span class=&quot;pre&quot;&gt;stable&lt;/span&gt;&lt;/code&gt; (if any), and without any changes unless required to fix building on &lt;code class=&quot;docutils literal notranslate&quot;&gt;&lt;span class=&quot;pre&quot;&gt;stable&lt;/span&gt;&lt;/code&gt; (if needed).&lt;/p&gt;
&lt;p&gt;There are official &lt;a class=&quot;reference external&quot; href=&quot;https://backports.debian.org&quot;&gt;Debian Backports&lt;/a&gt; available, as well as several well-known unofficial backports repositories. I have been involved in one of these unofficial repositories since 2005, which in 2010 turned into its own &lt;a class=&quot;reference external&quot; href=&quot;https://debian.org/derivatives&quot;&gt;Debian derivative&lt;/a&gt;, mixing both backports and modified packages in one repository for simplicity.&lt;/p&gt;
&lt;p&gt;Starting with the &lt;a class=&quot;reference external&quot; href=&quot;https://www.debian.org/releases/trixie&quot;&gt;Debian 13 (trixie)&lt;/a&gt; release, the (otherwise unmodified) backports of this derivative have been split out from the derivative distribution into a separate repository. This way the backports are more accessible and useful for all interested Debian users too.&lt;/p&gt;
&lt;section id=&quot;tl-dr-debian-fast-forward-https-fastforward-debian-net&quot;&gt;
&lt;h2&gt;TL;DR: &lt;a class=&quot;reference external&quot; href=&quot;https://fastforward.debian.net&quot;&gt;Debian Fast Forward&lt;/a&gt; - &lt;a class=&quot;reference external&quot; href=&quot;https://fastforward.debian.net&quot;&gt;https://fastforward.debian.net&lt;/a&gt;&lt;/h2&gt;
&lt;blockquote&gt;
&lt;div&gt;&lt;ul class=&quot;simple&quot;&gt;
&lt;li&gt;&lt;p&gt;is an alternative Debian repository containing complementary backports from &lt;code class=&quot;docutils literal notranslate&quot;&gt;&lt;span class=&quot;pre&quot;&gt;testing/unstable&lt;/span&gt;&lt;/code&gt; to &lt;code class=&quot;docutils literal notranslate&quot;&gt;&lt;span class=&quot;pre&quot;&gt;stable&lt;/span&gt;&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;with packages organized in a curated, self-contained selection of coherent sets&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;supporting &lt;code class=&quot;docutils literal notranslate&quot;&gt;&lt;span class=&quot;pre&quot;&gt;amd64&lt;/span&gt;&lt;/code&gt;, &lt;code class=&quot;docutils literal notranslate&quot;&gt;&lt;span class=&quot;pre&quot;&gt;i386&lt;/span&gt;&lt;/code&gt;, and &lt;code class=&quot;docutils literal notranslate&quot;&gt;&lt;span class=&quot;pre&quot;&gt;arm64&lt;/span&gt;&lt;/code&gt; architectures&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;containing around 400 &lt;a class=&quot;reference external&quot; href=&quot;https://packages.fastforward.debian.net&quot;&gt;packages&lt;/a&gt; in &lt;code class=&quot;docutils literal notranslate&quot;&gt;&lt;span class=&quot;pre&quot;&gt;trixie-fastforward-backports&lt;/span&gt;&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;with 1’800 &lt;a class=&quot;reference external&quot; href=&quot;https://changes.fastforward.debian.net&quot;&gt;uploads&lt;/a&gt; since July 2025&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;&lt;/blockquote&gt;
&lt;p&gt;End-user documentation on how to enable Debian Fast Forward is &lt;a class=&quot;reference external&quot; href=&quot;https://fastforward.debian.net/doc/installation&quot;&gt;available&lt;/a&gt;.&lt;/p&gt;
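&lt;p&gt;As a rough sketch of what such a setup can look like (the URIs path, component and keyring location below are illustrative assumptions; the official installation documentation is authoritative), a deb822-style apt source entry might be:&lt;/p&gt;
&lt;pre&gt;# /etc/apt/sources.list.d/fastforward.sources (hypothetical values)
Types: deb
URIs: https://fastforward.debian.net/debian
Suites: trixie-fastforward-backports
Components: main
Signed-By: /usr/share/keyrings/fastforward-archive-keyring.gpg
&lt;/pre&gt;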
&lt;p&gt;Have fun!&lt;/p&gt;
&lt;/section&gt;
&lt;/section&gt; </description> 
	<pubDate>Sat, 28 Feb 2026 00:00:00 +0000</pubDate>

</item> 
<item>
	<title>Petter Reinholdtsen: Free software toolchain for the simplest RISC-V CPU in a small FPGA?</title>
	<guid>http://www.hungry.com/~pere/blog/Free_software_toolchain_for_the_simplest_RISC_V_CPU_in_a_small_FPGA_.html</guid>
	<link>http://www.hungry.com/~pere/blog/Free_software_toolchain_for_the_simplest_RISC_V_CPU_in_a_small_FPGA_.html</link>
     <description>  &lt;p&gt;On Wednesday I had the pleasure of attending a presentation organized
by &lt;a href=&quot;https://www.nuug.no/&quot;&gt;the Norwegian Unix Users Group&lt;/a&gt; on
&lt;a href=&quot;https://nuug.no/aktiviteter/20260225-Open-RISC-V-prosessor-i-FPGA/&quot;&gt;implementing
RISC-V using a small FPGA&lt;/a&gt;.  This project is the result of a
university teacher wanting to teach students assembly programming
using a real instruction set, while still providing a simple and
transparent CPU environment.  The CPU in question implements the
smallest set of opcodes needed to still call the CPU a RISC-V CPU,
&lt;a href=&quot;https://docs.riscv.org/reference/isa/unpriv/rv32.html&quot;&gt;the
RV32I base set&lt;/a&gt;.  The author and presenter, Kristoffer Robin
Stokke, demonstrated how to build both the FPGA setup and a small
startup code providing a &quot;Hello World&quot; message over both serial port
and a small LCD display.  The FPGA is programmed using VHDL, and
&lt;a href=&quot;https://github.com/memstick/fpga.git&quot;&gt;the entire source
code&lt;/a&gt; is available from GitHub, but unfortunately the target FPGA
setup is compiled using the proprietary tool Quartus.  It is such
a pity that such a cool little piece of free software should be
chained down by non-free software, so my friend Jon Nordby set out to
see if we can liberate this small RISC-V CPU.  After all, it would be
an unforgivable sin to force students to use non-free software to study
at the University of Oslo.&lt;/p&gt;

&lt;p&gt;The VHDL code for the CPU instructions itself is only 1138 lines,
if I am to believe &lt;tt&gt;wc -l lib/riscv_common/* lib/rv32i/*&lt;/tt&gt;.  On
the small FPGA used during the talk, the entire CPU, ROM, display and
serial port driver only used up half the capacity.  These days, there
exists a free software toolchain for FPGA programming not only in
Verilog but also in VHDL, and we hope the support in
&lt;a href=&quot;https://tracker.debian.org/pkg/yosys&quot;&gt;yosys&lt;/a&gt;,
&lt;a href=&quot;https://tracker.debian.org/pkg/ghdl&quot;&gt;ghdl&lt;/a&gt;, and
&lt;a href=&quot;https://tracker.debian.org/pkg/yosys-plugin-ghdl&quot;&gt;yosys-plugin-ghdl&lt;/a&gt;
(sadly and strangely enough, removed from Debian unstable) is complete
enough to at least build this small and simple project with some minor
portability fixes.  Or perhaps there are other approaches that work
better?  The first patches are already floating on github, to make the
VHDL code more portable and to test out the build.  If you are
interested in running your own little RISC-V CPU on an FPGA chip,
please get in touch.&lt;/p&gt;
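
&lt;p&gt;For reference, a free-software VHDL synthesis flow usually chains
GHDL into Yosys via the plugin.  A minimal sketch, assuming a made-up
top-level entity name and file extensions (the real project layout
differs):&lt;/p&gt;

&lt;blockquote&gt;&lt;pre&gt;# analyse the VHDL sources with GHDL (VHDL-2008)
ghdl -a --std=08 lib/riscv_common/*.vhd lib/rv32i/*.vhd
# elaborate and synthesise the design through the GHDL plugin in Yosys
yosys -m ghdl -p 'ghdl --std=08 top_entity; synth_ice40 -json top.json'
&lt;/pre&gt;&lt;/blockquote&gt;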

&lt;p&gt;At the moment we sadly have hit a GHDL bug, which we do not quite
know how to work around or fix:&lt;/p&gt;

&lt;blockquote&gt;&lt;pre&gt;******************** GHDL Bug occurred ***************************
Please report this bug on https://github.com/ghdl/ghdl/issues
GHDL release: 5.0.1 (Debian 5.0.1+dfsg-1+b1) [Dunoon edition]
Compiled with unknown compiler version
Target: x86_64-linux-gnu
/scratch/pere/src/fpga/memstick-fpga-riscv-upstream/
Command line:

Exception CONSTRAINT_ERROR raised
Exception information:
raised CONSTRAINT_ERROR : synth-vhdl_expr.adb:1763 discriminant check failed
******************************************************************
&lt;/pre&gt;&lt;/blockquote&gt;

&lt;p&gt;Thus more work is needed.  For me, this simple project is the first
stepping stone for a larger dream I have of converting &lt;a href=&quot;http://wiki.linuxcnc.org/cgi-bin/wiki.pl?Mesa_Cards&quot;&gt;the MESA
machine controller system&lt;/a&gt; to build its firmware using a free
software toolchain.  I just need to learn more FPGA programming
first. :)&lt;/p&gt;

&lt;p&gt;As usual, if you use Bitcoin and want to show your support of my
activities, please send Bitcoin donations to my address
&lt;b&gt;&lt;a&gt;15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b&lt;/a&gt;&lt;/b&gt;.&lt;/p&gt; </description> 
	<pubDate>Fri, 27 Feb 2026 21:30:00 +0000</pubDate>

</item> 
<item>
	<title>Sahil Dhiman: Publicly Available NKN Data Traffic Graphs</title>
	<guid>https://blog.sahilister.in/2026/02/publicly-available-nkn-data-traffic-graphs/</guid>
	<link>https://blog.sahilister.in/2026/02/publicly-available-nkn-data-traffic-graphs/</link>
     <description>  &lt;p&gt;National Knowledge Network (NKN) is one of India’s main National Research and Educational Network (NREN). The other being the less prevalent Education and Research Network (ERNET).&lt;/p&gt;
&lt;p&gt;This post grew out of &lt;a href=&quot;https://toots.sahilister.in/@sahil/115308911066583184&quot;&gt;this Mastodon thread&lt;/a&gt;, where I kept adding public graphs from various global research and education entities that peer or connect with NKN, to get an overview of the traffic flowing between them and NKN.&lt;/p&gt;
&lt;h3 id=&quot;cern&quot;&gt;CERN&lt;/h3&gt;
&lt;p&gt;CERN, birthplace of the World Wide Web (WWW) and home of the Large Hadron Collider (LHC).&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Graph - &lt;a href=&quot;https://monit-grafana-open.cern.ch/d/XoND3VQ4k/nkn?orgId=16&amp;amp;from=now-30d&amp;amp;to=now&amp;amp;timezone=browser&amp;amp;var-source=raw&amp;amp;var-bin=5m&quot;&gt;https://monit-grafana-open.cern.ch/d/XoND3VQ4k/nkn?orgId=16&amp;amp;from=now-30d&amp;amp;to=now&amp;amp;timezone=browser&amp;amp;var-source=raw&amp;amp;var-bin=5m&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Connection seems to be 2x10 G through CERNLight.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;India participates in the LHCONE project, which carries LHC data over these links for scientific research purposes. &lt;a href=&quot;https://indico.cern.ch/event/1411901/contributions/6017912/attachments/2919948/5125000/KolkataTier2_ATCFatTIFR2024.pdf&quot;&gt;This presentation from Vikas Singhal&lt;/a&gt; from Variable Energy Cyclotron Centre (VECC), Kolkata, at the 8th Asian Tier Center Forum in 2024 gives some details.&lt;/p&gt;
&lt;h3 id=&quot;géant&quot;&gt;GÉANT&lt;/h3&gt;
&lt;p&gt;GÉANT is the pan-European collaboration of NRENs.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Graph - &lt;a href=&quot;https://public-brian.geant.org/d/ZyVgYFgVk/nkn?orgId=5&amp;amp;from=now-30d&amp;amp;to=now&quot;&gt;https://public-brian.geant.org/d/ZyVgYFgVk/nkn?orgId=5&amp;amp;from=now-30d&amp;amp;to=now&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;NKN connects at Amsterdam POP.&lt;/li&gt;
&lt;li&gt;From the &lt;a href=&quot;https://web.archive.org/web/20161013145202/https://www.geant.org/News_and_Events/Pages/New-NKN-international-network-to-boost-EU-India-collaboration.aspx&quot;&gt;2016 press release&lt;/a&gt; from GÉANT, NKN seems to have 2x10 G capacity towards GÉANT. Things might have changed since.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id=&quot;learn&quot;&gt;LEARN&lt;/h3&gt;
&lt;p&gt;Lanka Education and Research Network (LEARN) is Sri Lanka’s NREN.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Graph - &lt;a href=&quot;https://kirigal.learn.ac.lk/traffic/details.php?desc=slt-b1-new&amp;amp;data_id=1788&amp;amp;data_id_v6=&amp;amp;sub_bw=1000&amp;amp;name=National%20Knowledge%20Network%20of%20India%20(1G)&amp;amp;swap_inout=&quot;&gt;https://kirigal.learn.ac.lk/traffic/details.php?desc=slt-b1-new&amp;amp;data_id=1788&amp;amp;data_id_v6=&amp;amp;sub_bw=1000&amp;amp;name=National%20Knowledge%20Network%20of%20India%20(1G)&amp;amp;swap_inout=&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Connection seems to be 1 G.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id=&quot;nordunet&quot;&gt;NORDUnet&lt;/h3&gt;
&lt;p&gt;NORDUnet is the NREN for the Nordic countries.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Graph - &lt;a href=&quot;https://stats.nordu.net/stat-q/r-all?q=all&amp;amp;name=NKN&quot;&gt;https://stats.nordu.net/stat-q/r-all?q=all&amp;amp;name=NKN&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;I couldn’t find any public live data transfer graphs from NKN’s side. If you know of any other graphs, do let me know.&lt;/p&gt;
	<pubDate>Wed, 25 Feb 2026 13:38:36 +0000</pubDate>

</item> 
<item>
	<title>Joachim Breitner: Vibe-coding a debugger for a DSL</title>
	<guid>https://www.joachim-breitner.de/blog/819-Vibe-coding_a_debugger_for_a_DSL</guid>
	<link>https://www.joachim-breitner.de/blog/819-Vibe-coding_a_debugger_for_a_DSL</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/nomeata.png&quot; width=&quot;64&quot; height=&quot;64&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;Earlier this week a colleague of mine, Emilio Jesús Gallego Arias, shared a demo of something he built as an experiment, and I felt the desire to share this and add a bit of reflection. (Not keen on watching a 5 min video? Read on below.)&lt;/p&gt;


&lt;h3 id=&quot;what-was-that&quot;&gt;What was that?&lt;/h3&gt;
&lt;p&gt;So what did you just see (or skip watching)? You could see Emilio’s screen, running VSCode and editing a Lean file. He designed a small programming language that he embedded into Lean, including an evaluator. So far, so standard, but a few things stick out already:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Using Lean’s very extensible syntax this embedding is rather elegant and pretty.&lt;/li&gt;
&lt;li&gt;Furthermore, he can run this DSL code right there, in the source code, using commands like &lt;a href=&quot;https://lean-lang.org/doc/reference/latest/find/?domain=Verso.Genre.Manual.section&amp;amp;name=hash-eval&quot;&gt;&lt;code&gt;#eval&lt;/code&gt;&lt;/a&gt;. This is a bit like the interpreter found in Haskell or Python, but without needing a separate process, or like using a Jupyter notebook, but without the stateful cell management.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;This is already a nice demonstration of Lean’s abilities and strengths, as we know them. But what blew my mind the first time was what happened next: He had a visual debugger that allowed him to &lt;em&gt;debug his DSL program&lt;/em&gt;. It appeared on the right, in Lean’s “Info View”, into which various Lean tools can hook to show information and allow the user to interact.&lt;/p&gt;
&lt;p&gt;But it did not stop there, and my mind was blown a second time: Emilio opened VSCode’s „Debugger“ pane on the left, and was able to properly use VSCode’s full-fledged debugger frontend for his own little embedded programming language! Complete with highlighting the executed line, with the ability to set breakpoints there, and showing the state of local variables in the debugger.&lt;/p&gt;
&lt;p&gt;Having a good debugger is not to be taken for granted even for serious, practical programming languages. Having it for a small embedded language that you just built yourself? I wouldn’t have even considered that.&lt;/p&gt;
&lt;h3 id=&quot;did-it-take-long&quot;&gt;Did it take long?&lt;/h3&gt;
&lt;p&gt;If I were Emilio’s manager I would applaud the demo and then would have to ask how many weeks he spent on that. Coming up with the language, getting the syntax extension right, writing the evaluator and especially learning how the debugger integration into VSCode (using the &lt;a href=&quot;https://microsoft.github.io/debug-adapter-protocol/&quot;&gt;DAP protocol&lt;/a&gt;) works, and then instrumenting his evaluator to speak that protocol – that is a sizeable project!&lt;/p&gt;
&lt;p&gt;It turns out the answer isn’t measured in weeks: it took just one day of coding together with GPT-Codex 5.3. My mind was blown a third time.&lt;/p&gt;
&lt;h3 id=&quot;why-does-lean-make-a-difference&quot;&gt;Why does Lean make a difference?&lt;/h3&gt;
&lt;p&gt;I am sure this post is just one of many stories you have read in recent weeks about how new models like Claude Opus 4.6 and GPT-Codex 5.3 built impressive things in hours that would have taken days or more before. But have you seen something like this? Agentic coding is powerful, but limited by what the underlying platform exposes. I claim that Lean is a particularly well-suited platform to unleash the agents’ versatility.&lt;/p&gt;
&lt;p&gt;Here we are using Lean as a programming language, not as a theorem prover (which brings other immediate benefits when using agents, e.g. the produced code can be verified rather than being merely plausible, but that’s a story to be told elsewhere).&lt;/p&gt;
&lt;p&gt;But arguably because Lean is &lt;em&gt;also&lt;/em&gt; a theorem prover, and because of the requirements that stem from that, its architecture is different from that of a conventional programming language implementation:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;As a theorem prover, it needs extensible syntax to allow formalizing mathematics in an ergonomic way, but it can also be used for embedding syntax.&lt;/li&gt;
&lt;li&gt;As a theorem prover, it needs the ability to run “tactics” written by the user, hence the ability to evaluate the code right there in the editor.&lt;/li&gt;
&lt;li&gt;As a theorem prover, it needs to give access to information such as tactic state, and such introspection abilities unlock many other features – such as a debugger for an embedded language.&lt;/li&gt;
&lt;li&gt;As a theorem prover, it has to allow tools to present information like the tactic state, so it has the concept of &lt;a href=&quot;https://lean-lang.org/examples/1900-1-1-widgets/&quot;&gt;interactive “Widgets”&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;So Lean’s design has always made such a feat &lt;em&gt;possible&lt;/em&gt;. But it was no easy feat. The Lean API is large, and documentation never ceases to be improvable. In the past, it would take an expert (or someone willing to become one) to pull off that stunt. These days, coding assistants have no issue digesting, understanding and using the API, as Emilio’s demo shows.&lt;/p&gt;
&lt;p&gt;The combination of Lean’s extensibility and the ability of coding agents to make use of that is a game changer to how we can develop software, with rich, deep, flexible and bespoke ways to interact with our code, created on demand.&lt;/p&gt;
&lt;h3 id=&quot;where-does-that-lead-us&quot;&gt;Where does that lead us?&lt;/h3&gt;
&lt;p&gt;Emilio actually shared more such demos (&lt;a href=&quot;https://github.com/ejgallego/imp-lab&quot;&gt;GitHub repository&lt;/a&gt;). A visual explorer for the compiler output (&lt;a href=&quot;https://www.joachim-breitner.de/various/lean-compiler-explorer.png&quot;&gt;have a look at the screenshot&lt;/a&gt;). A browser-devtool-like inspection tool for Lean’s “InfoTree”. Any of these provides a significant productivity boost. Any of these would have been a sizeable project half a year ago. Now it’s just a few hours of chatting with the agent.&lt;/p&gt;
&lt;p&gt;So allow me to try and extrapolate into a future where coding agents have continued to advance at the current pace, and are used ubiquitously. Is there then even a point in polishing these tools, shipping them to our users, documenting them? Why build a compiler explorer for our users, if they can just ask their agent to build one for them, right when they need it, tailored to precisely their use case, with no unnecessary or confusing features? The code would be single-use, as the next time the user needs something like that, the agent can just re-create it, maybe slightly differently, because every use case is different.&lt;/p&gt;
&lt;p&gt;If that comes to pass then Lean may no longer get praise for its nice out-of-the-box user experience, but instead because it is such a powerful framework for ad-hoc UX improvements.&lt;/p&gt;
&lt;p&gt;And Emilio wouldn’t post demos about his debugger. He’d just use it.&lt;/p&gt; </description> 
	<pubDate>Wed, 25 Feb 2026 10:53:30 +0000</pubDate>
  <author>mail@joachim-breitner.de (Joachim Breitner)</author>  
</item> 
<item>
	<title>Louis-Philippe Véronneau: Montreal&#39;s Debian &amp; Stuff - February 2026</title>
	<guid>tag:veronneau.org,2026-02-25:/montreals-debian-stuff-february-2026.html</guid>
	<link>https://veronneau.org/montreals-debian-stuff-february-2026.html</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/pollo.png&quot; width=&quot;65&quot; height=&quot;70&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;Our Debian User Group met on February 22&lt;sup&gt;nd&lt;/sup&gt; for our first meeting of
the year!&lt;/p&gt;
&lt;p&gt;Here&#39;s what we did:&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;pollo&lt;/strong&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;reviewed and merged Lintian contributions:&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://salsa.debian.org/lintian/lintian/-/merge_requests/664&quot;&gt;salsa MR !664&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://salsa.debian.org/lintian/lintian/-/merge_requests/665&quot;&gt;salsa MR !665&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://salsa.debian.org/lintian/lintian/-/merge_requests/666&quot;&gt;salsa MR !666&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;released lintian version &lt;code&gt;2.130.0&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;upstreamed &lt;a href=&quot;https://github.com/GjjvdBurg/wilderness/pull/16&quot;&gt;a patch&lt;/a&gt; for python-wilderness, fixed a
   few things and released version &lt;code&gt;0.1.10-3&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;updated python-clevercsv to version &lt;code&gt;0.8.4&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;updated python-mediafile to version &lt;code&gt;0.14.0&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;lelutin&lt;/strong&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;opened up an &lt;a href=&quot;https://salsa.debian.org/debian/smokeping/-/merge_requests/9&quot;&gt;RFH for co-maintenance of smokeping&lt;/a&gt; and added Marc Haber,
   who responded really quickly to the call&lt;/li&gt;
&lt;li&gt;with &lt;strong&gt;mjeanson&lt;/strong&gt;&#39;s help: prepped and uploaded a new smokeping version to
   release pending work&lt;/li&gt;
&lt;li&gt;opened an &lt;a href=&quot;https://nm.debian.org/person/lelutin/&quot;&gt;NM request&lt;/a&gt; to become DM&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;viashimo&lt;/strong&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;fixed freshrss timer&lt;/li&gt;
&lt;li&gt;updated freshrss&lt;/li&gt;
&lt;li&gt;installed new navidrome container&lt;/li&gt;
&lt;li&gt;configured backups for new host (beelink mini s12)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;tvaz&lt;/strong&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;did NM work&lt;/li&gt;
&lt;li&gt;learned more about debusine and tested it&lt;/li&gt;
&lt;li&gt;uploaded antimony to debusine&lt;/li&gt;
&lt;li&gt;(co-)convinced lelutin to apply for DM (yay!)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;lavamind&lt;/strong&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;worked on autopkgtests for a new version of jruby&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Pictures&lt;/h2&gt;
&lt;p&gt;This time around, we held our meeting at &lt;a href=&quot;https://cvm.qc.ca&quot;&gt;cégep du Vieux Montréal&lt;/a&gt;, the
college where I currently work. Here is the view we had:&lt;/p&gt;
&lt;p&gt;&lt;img alt=&quot;View from my office&quot; src=&quot;https://veronneau.org/media/blog/2026-02-25/window.jpg&quot; style=&quot;margin-left: 15%;&quot; title=&quot;View from my office&quot; width=&quot;70%&quot; /&gt;&lt;/p&gt;
&lt;p&gt;We also ordered some delicious pizzas from &lt;a href=&quot;https://www.pizzeriadeicompari.com/&quot;&gt;Pizzeria dei Compari&lt;/a&gt;, a
nice pizzeria on Saint-Denis street that&#39;s been there forever.&lt;/p&gt;
&lt;p&gt;&lt;img alt=&quot;The pizzas we ate&quot; src=&quot;https://veronneau.org/media/blog/2026-02-25/pizza.jpg&quot; style=&quot;margin-left: 15%;&quot; title=&quot;The pizzas we ate&quot; width=&quot;70%&quot; /&gt;&lt;/p&gt;
&lt;p&gt;Some of us ended up grabbing a drink after the event at &lt;a href=&quot;https://amereaboire.com/&quot;&gt;l&#39;Amère à boire&lt;/a&gt;,
a pub right next to the venue, but I didn&#39;t take any pictures.&lt;/p&gt; </description> 
	<pubDate>Tue, 24 Feb 2026 21:45:18 +0000</pubDate>

</item> 
<item>
	<title>John Goerzen: Screen Power Saving in the Linux Console</title>
	<guid>https://changelog.complete.org/?p=42061</guid>
	<link>https://changelog.complete.org/archives/42061-screen-power-saving-in-the-linux-console</link>
     <description>  &lt;p&gt;I just made up a Debian trixie setup that has no need for a GUI.  In fact, I rarely use the text console either.  However, because the machine is dual boot and also serves another purpose, it’s connected to my main monitor and KVM switch.&lt;/p&gt;
&lt;p&gt;The monitor has three inputs, and when whatever display it’s set to goes into powersave mode, it will seek out another one that’s active and automatically switch to it.&lt;/p&gt;
&lt;p&gt;You can probably see where this is heading: it’s really inconvenient if one of the inputs never goes into powersave mode.  And, of course, it wastes energy.&lt;/p&gt;
&lt;p&gt;I have concluded that the Linux text console has lost the ability to enter powersave mode after an inactivity timeout.  It can still do screen blanking — setting every pixel to black — but that is a distinct and much less useful thing.&lt;/p&gt;
&lt;p&gt;You can do a lot of searching online that will tell you what to do.  Almost all of it is wrong these days.  For instance, none of these work:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Anything involving &lt;tt&gt;vbetool&lt;/tt&gt;.  This is really, really old advice.&lt;/li&gt;
&lt;li&gt;Anything involving &lt;tt&gt;xset&lt;/tt&gt;, unless you’re actually running a GUI, which is not the point of this post.&lt;/li&gt;
&lt;li&gt;Anything involving &lt;tt&gt;setterm&lt;/tt&gt; or the kernel parameters &lt;tt&gt;video=DPMS&lt;/tt&gt; or &lt;tt&gt;consoleblank&lt;/tt&gt;.&lt;/li&gt;
&lt;li&gt;Anything involving writing to paths under &lt;tt&gt;/sys&lt;/tt&gt;, such as ones ending in &lt;tt&gt;dpms&lt;/tt&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Why is this?&lt;/p&gt;
&lt;p&gt;Well, we are on at least the third generation of Linux text console display subsystems.  (Maybe more than 3, depending on how you count.)  The three major ones were:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;The VGA text console&lt;/li&gt;
&lt;li&gt;fbdev&lt;/li&gt;
&lt;li&gt;DRI/KMS&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;As I mentioned recently in my post about &lt;a href=&quot;https://changelog.complete.org/archives/10907-running-an-accurate-80x25-dos-style-console-on-modern-linux-is-possible-after-all&quot;&gt;running an accurate 80×25 DOS-style console on modern Linux&lt;/a&gt;, the VGA text console mode is pretty much gone these days.  It relied on hardware rendering of the text fonts, and that capability simply isn’t present on systems that aren’t PCs — or even on PCs that are UEFI, which is most of them now.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://en.wikipedia.org/wiki/Linux_framebuffer&quot;&gt;fbdev&lt;/a&gt;, or a framebuffer console under earlier names, has been in Linux since the late 1990s.  It was the default for most distros until more recently.  It supported DPMS powersave modes, and most of the instructions you will find online reference it.&lt;/p&gt;
&lt;p&gt;Nowadays, the &lt;a href=&quot;https://en.wikipedia.org/wiki/Direct_Rendering_Infrastructure&quot;&gt;DRI&lt;/a&gt;/&lt;a href=&quot;https://en.wikipedia.org/wiki/Direct_Rendering_Manager#Kernel_Mode_Setting&quot;&gt;KMS&lt;/a&gt; system is used for graphics.  Unfortunately, it is targeted mainly at X11 and Wayland.  It is also used for the text console, but things like DPMS-enabled timeouts were never implemented there.&lt;/p&gt;
&lt;p&gt;You can find some manual workarounds — for instance, using &lt;tt&gt;ddcutil&lt;/tt&gt; or similar for an external monitor, or adjusting the &lt;tt&gt;backlight&lt;/tt&gt; files under &lt;tt&gt;/sys&lt;/tt&gt; on a laptop.  But these have a number of flaws — making unwanted brightness adjustments, and not automatically waking up on keypress among them.&lt;/p&gt;
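&lt;p&gt;For instance, a manual &lt;tt&gt;ddcutil&lt;/tt&gt; approach might look like the following sketch (VCP feature D6 is the MCCS power mode, but the supported values vary per monitor; check &lt;tt&gt;ddcutil capabilities&lt;/tt&gt; first, and treat the value below as a placeholder):&lt;/p&gt;
&lt;pre&gt;ddcutil getvcp d6     # query the current power mode
ddcutil setvcp d6 4   # ask the monitor to power down (value is monitor-dependent)
&lt;/pre&gt;
&lt;p&gt;Nothing will wake the display on keypress afterwards, which is one of the flaws just mentioned.&lt;/p&gt;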
&lt;p&gt;&lt;b&gt;My workaround&lt;/b&gt;&lt;/p&gt;
&lt;p&gt;I finally gave up and ran &lt;tt&gt;apt-get install xdm&lt;/tt&gt;.  Then in &lt;tt&gt;/etc/X11/xdm/Xsetup&lt;/tt&gt;, I added one line:&lt;/p&gt;
&lt;pre&gt;xset dpms 0 0 120
&lt;/pre&gt;
&lt;p&gt;Now the system boots into an xdm login screen, and shuts down the screen after 2 minutes of inactivity (the three &lt;tt&gt;xset dpms&lt;/tt&gt; values are the standby, suspend, and off timeouts in seconds; 0 disables that stage).  On the rare occasion when I want a text console from it, I can switch to it and it won’t have a timeout, but I can live with that.&lt;/p&gt;
&lt;p&gt;Thus, quite hopefully, concludes my series of way too much information about the Linux text console!&lt;/p&gt; </description> 
	<pubDate>Tue, 24 Feb 2026 14:22:10 +0000</pubDate>

</item> 
<item>
	<title>Antoine Beaupré: PSA: North america changes time forward soon, Europe next</title>
	<guid>https://anarc.at/blog/2026-02-23-dst-warning/</guid>
	<link>https://anarc.at/blog/2026-02-23-dst-warning/</link>
     <description>  &lt;blockquote&gt;&lt;p&gt;This is a copy of an email I used to send internally at work and now
&lt;a href=&quot;https://lists.torproject.org/mailman3/hyperkitty/list/tor-project@lists.torproject.org/thread/HR3ISDIVLOR5NNAN24F2TCHMPHFOI2XR/&quot;&gt;made public&lt;/a&gt;. I&#39;m not sure I&#39;ll make a habit of posting it here,
especially not &lt;em&gt;twice a year&lt;/em&gt;, unless people really like it. Right
now, it&#39;s mostly here to keep my current writing spree going.&lt;/p&gt;&lt;/blockquote&gt;

&lt;p&gt;This is your twice-yearly reminder that time is changing soon!&lt;/p&gt;

&lt;h1 id=&quot;whats-happening&quot;&gt;What&#39;s happening?&lt;/h1&gt;

&lt;p&gt;For people not on tor-internal, you should know that I&#39;ve been sending
semi-regular announcements when daylight saving changes occur. Starting
now, I&#39;m making those announcements public so they can be shared with
the wider community because, after all, this affects everyone (kind of).&lt;/p&gt;

&lt;p&gt;For those of you lucky enough to have no idea what I&#39;m talking about,
you should know that some places in the world implement what is called
&lt;a href=&quot;https://en.wikipedia.org/wiki/Daylight_saving_time&quot;&gt;Daylight saving time or DST&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Normally, you shouldn&#39;t have to do anything: computers automatically
change time following local rules, assuming they are correctly
configured and, in the case of a recent change in said rules, have had
recent updates applied (because yes, this happens).&lt;/p&gt;

&lt;p&gt;Appliances, of course, will likely &lt;em&gt;not&lt;/em&gt; change time and will need to be
adjusted unless they are so-called &quot;smart&quot; (also known as &quot;part of a bot
net&quot;).&lt;/p&gt;

&lt;p&gt;If your clock is flashing &quot;0:00&quot; or &quot;12:00&quot;, you have no action to take,
congratulations on having the right time once or twice a day.&lt;/p&gt;

&lt;p&gt;If you haven&#39;t changed those clocks in six months, congratulations, they
will be accurate again!&lt;/p&gt;

&lt;p&gt;In any case, you should still consider DST because it might affect some
of your meeting schedules, particularly if you set up a new meeting
schedule in the last 6 months and forgot to consider this
change.&lt;/p&gt;

&lt;h1 id=&quot;if-your-location-does-not-have-dst&quot;&gt;If your location does not have DST&lt;/h1&gt;

&lt;p&gt;Properly scheduled meetings affecting multiple time zones are set in UTC
time, which does &lt;em&gt;not&lt;/em&gt; change. So if your location does not observe
time changes, your (local!) meeting time will &lt;em&gt;not&lt;/em&gt; change.&lt;/p&gt;

&lt;p&gt;But be aware that some other folks attending your meeting &lt;em&gt;might&lt;/em&gt; have
the DST bug and &lt;em&gt;their&lt;/em&gt; meeting times will change. They might miss
entire meetings or arrive late as you frantically ping them over IRC,
Matrix, Signal, SMS, Ricochet, Mattermost, SimpleX, Whatsapp, Discord,
Slack, Wechat, Snapchat, Telegram, XMPP, Briar, Zulip, RocketChat,
DeltaChat, talk(1), write(1), actual telegrams, Meshtastic, Meshcore,
Reticulum, APRS, snail mail, and, finally, flying a remote presence
drone to their house, asking what&#39;s going on.&lt;/p&gt;

&lt;p&gt;(Sorry if I forgot your preferred messaging client here, I tried my
best.)&lt;/p&gt;

&lt;p&gt;Be kind; those poor folks might be more sleep deprived as DST &lt;em&gt;steals&lt;/em&gt;
one hour of sleep from them on the night that implements the change.&lt;/p&gt;

&lt;h1 id=&quot;if-you-do-observe-dst&quot;&gt;If you do observe DST&lt;/h1&gt;

&lt;p&gt;If you are affected by the DST bug, your &lt;em&gt;local&lt;/em&gt; meeting times &lt;em&gt;will&lt;/em&gt;
change across the board. Normally, you can trust that your meetings are
scheduled to take this change into account and the new time should still
be reasonable.&lt;/p&gt;

&lt;p&gt;Trust, but verify; make sure the new times &lt;em&gt;are&lt;/em&gt; adequate and there are
no scheduling conflicts.&lt;/p&gt;

&lt;p&gt;Do this &lt;em&gt;now&lt;/em&gt;: take a look at your calendar in two weeks &lt;em&gt;and&lt;/em&gt; in
April. See if any meetings need to be rescheduled because of an
impossible or conflicting time.&lt;/p&gt;

&lt;h1 id=&quot;when-does-time-change-how-and-where&quot;&gt;When does time change, how and where?&lt;/h1&gt;

&lt;p&gt;Notice how I mentioned &quot;North America&quot; in the subject? That&#39;s a
lie. (&quot;The doctor lies&quot;, as they say on the BBC.) Other places,
including Europe, also change times, just not all at once (and not all
of North America does).&lt;/p&gt;

&lt;p&gt;We&#39;ll get into &quot;where&quot; soon, but first let&#39;s look at the &quot;how&quot;. As you might
already know, the trick is:&lt;/p&gt;

&lt;blockquote&gt;&lt;p&gt;Spring forward, fall backwards.&lt;/p&gt;&lt;/blockquote&gt;

&lt;p&gt;This northern-centric (sorry!) proverb says that clocks will move
&lt;em&gt;forward&lt;/em&gt; by an hour this &quot;spring&quot;, after moving &lt;em&gt;backwards&lt;/em&gt; last
&quot;fall&quot;. This is why we lose an hour of work, sorry, sleep. It sucks, to
put it bluntly. I want it to stop and will keep writing those advisories
until it does.&lt;/p&gt;

&lt;p&gt;To see where and when, we, unfortunately, still need to go into politics.&lt;/p&gt;

&lt;h2 id=&quot;usa-and-canada&quot;&gt;USA and Canada&lt;/h2&gt;

&lt;p&gt;First, we start with &quot;North America&quot; which, really, is just some &lt;em&gt;parts&lt;/em&gt;
of USA[1] and Canada[2]. As usual, on the Second Sunday in March (the
8th) at 02:00 local (not UTC!), the clocks will move forward.&lt;/p&gt;

&lt;p&gt;This means that properly set clocks will flip from 1:59 to 3:00, coldly
depriving us of an hour of sleep that was perniciously granted 6
months ago and making calendar software stupidly hard to write.&lt;/p&gt;
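&lt;p&gt;(A sanity check, not part of the original advisory: Python&#39;s &lt;code&gt;zoneinfo&lt;/code&gt; module, assuming an up-to-date tzdata, can confirm the flip and the stolen hour:)&lt;/p&gt;

```python
# Sanity-check sketch (not from the original advisory; assumes an
# up-to-date tzdata): US Eastern clocks flip from 1:59 to 3:00 on
# 2026-03-08, and only one minute of real time passes in between.
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

eastern = ZoneInfo("America/New_York")
before = datetime(2026, 3, 8, 1, 59, tzinfo=eastern)  # still EST, UTC-5
after = datetime(2026, 3, 8, 3, 0, tzinfo=eastern)    # already EDT, UTC-4

assert before.utcoffset() == timedelta(hours=-5)
assert after.utcoffset() == timedelta(hours=-4)

# Convert through UTC to measure elapsed real time (same-zone
# subtraction in Python is wall-clock and would report 61 minutes):
utc = timezone.utc
assert after.astimezone(utc) - before.astimezone(utc) == timedelta(minutes=1)
print("confirmed: 1:59 EST jumps straight to 3:00 EDT")
```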

&lt;p&gt;Practically, set your wrist watch and alarm clocks[3] forward one hour
before going to bed and go to bed early.&lt;/p&gt;

&lt;p&gt;[1] except Arizona (except the Navajo nation), US territories, and
    Hawaii&lt;/p&gt;

&lt;p&gt;[2] except Yukon, most of Saskatchewan, and parts of British Columbia
    (northeast), one island in Nunavut (Southampton Island), one town in
    Ontario (Atikokan) and small parts of Quebec (Le
    Golfe-du-Saint-Laurent), a list which I keep recopying because I
    find it just so amazing how chaotic it is. When your clock has its
    &lt;a href=&quot;https://en.wikipedia.org/wiki/Time_in_Saskatchewan&quot;&gt;own Wikipedia page&lt;/a&gt;, you know something is wrong.&lt;/p&gt;

&lt;p&gt;[3] hopefully not managed by a botnet, otherwise kindly ask your botnet
    operator to apply proper software upgrades in a timely manner&lt;/p&gt;

&lt;h2 id=&quot;europe&quot;&gt;Europe&lt;/h2&gt;

&lt;p&gt;Next we look at our dear Europe, which will change time on the last
Sunday in March (the 29th) at 01:00 &lt;em&gt;UTC&lt;/em&gt; (not local!). I &lt;em&gt;think&lt;/em&gt; it
means that, Amsterdam-time, the clocks will flip from 1:59 to 3:00 AM
&lt;em&gt;local&lt;/em&gt; on that night.&lt;/p&gt;

&lt;p&gt;(Every time I write this, I have doubts. I would welcome independent
confirmation from night owls that observe that funky behavior
experimentally.)&lt;/p&gt;
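&lt;p&gt;(If, like me, you prefer machine-checked doubts: here is a quick sketch, not part of the original advisory, that asks Python&#39;s &lt;code&gt;zoneinfo&lt;/code&gt;, assuming an up-to-date tzdata, to confirm the Amsterdam flip:)&lt;/p&gt;

```python
# Verification sketch (not from the original advisory; assumes an
# up-to-date tzdata): on 2026-03-29, Amsterdam clocks should jump
# from 1:59 CET straight to 3:00 CEST, at 01:00 UTC.
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

ams = ZoneInfo("Europe/Amsterdam")
before = datetime(2026, 3, 29, 1, 59, tzinfo=ams)  # still CET, UTC+1
after = datetime(2026, 3, 29, 3, 0, tzinfo=ams)    # already CEST, UTC+2

assert before.utcoffset() == timedelta(hours=1)
assert after.utcoffset() == timedelta(hours=2)

# The transition instant is indeed 01:00 UTC, and only one minute of
# real time separates 1:59 and 3:00 on the local wall clock:
utc = timezone.utc
assert after.astimezone(utc) == datetime(2026, 3, 29, 1, 0, tzinfo=utc)
assert after.astimezone(utc) - before.astimezone(utc) == timedelta(minutes=1)
print("confirmed: 1:59 CET jumps straight to 3:00 CEST")
```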

&lt;p&gt;Just like your poor fellows out west, fix your old-school clocks
before going to bed, and go to sleep early; it&#39;s good for you.&lt;/p&gt;

&lt;h2 id=&quot;rest-of-the-world-with-dst&quot;&gt;Rest of the world with DST&lt;/h2&gt;

&lt;p&gt;Renewed and recurring apologies again to the people of Cuba, Mexico,
Moldova, Israel, Lebanon, Palestine, Egypt, Chile (except Magallanes
Region), parts of Australia, and New Zealand which &lt;em&gt;all&lt;/em&gt; have their own
&lt;em&gt;individual&lt;/em&gt; DST rules, omitted here for brevity.&lt;/p&gt;

&lt;p&gt;In general, changes also happen in March, but at different
times or on different days, except in the southern hemisphere, where they
happen in April.&lt;/p&gt;

&lt;h2 id=&quot;rest-of-the-world-without-dst&quot;&gt;Rest of the world without DST&lt;/h2&gt;

&lt;p&gt;All of you other folks without DST, rejoice! Thank you for reminding us
how to manage calendars and clocks normally. Sometimes, doing nothing is
precisely the right thing to do. You&#39;re an inspiration for us all.&lt;/p&gt;

&lt;h1 id=&quot;changes-since-last-time&quot;&gt;Changes since last time&lt;/h1&gt;

&lt;p&gt;There were, again, no changes since last year on daylight savings that
I&#39;m aware of. It seems the &lt;a href=&quot;https://www.usatoday.com/story/news/nation/2026/02/19/daylight-act-of-2026-proposing-half-daylight-saving-time/88760725007/&quot;&gt;US Congress is debating switching to a
&quot;half-daylight&quot; time zone&lt;/a&gt;, which is a half-baked idea that I
should have expected from current US politics.&lt;/p&gt;

&lt;p&gt;The plan is to, say, switch from &quot;Eastern is UTC-4 in the summer&quot; to
&quot;Eastern is UTC-4.5&quot;. The bill also proposes to do this 90 days after
enactment, which is dangerously optimistic about our capacity to deploy
any significant change in human society.&lt;/p&gt;

&lt;p&gt;In general, I rely on the &lt;a href=&quot;https://en.wikipedia.org/wiki/Daylight_saving_time_by_country&quot;&gt;Wikipedia time nerds&lt;/a&gt; for this, and on Paul
Eggert, who seems to singlehandedly keep everything in order
for all of us, on the &lt;a href=&quot;https://lists.iana.org/hyperkitty/list/tz-announce@iana.org/latest&quot;&gt;tz-announce mailing list&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;This time, I&#39;ve also looked at the &lt;a href=&quot;https://lists.iana.org/hyperkitty/list/tz@iana.org/latest&quot;&gt;tz mailing list&lt;/a&gt; which is where
I learned about the congress bill.&lt;/p&gt;

&lt;p&gt;If your country has changed time and no one above noticed, now would
be an extremely late time to do something about this, typically by
writing to the above list. (Incredibly, &lt;em&gt;I&lt;/em&gt; need to write to the list
because of &lt;a href=&quot;https://lists.iana.org/hyperkitty/list/tz@iana.org/thread/6HN5SWD2BJA7OVTPFR3VB42JIA6PFLPG/&quot;&gt;this post&lt;/a&gt;.)&lt;/p&gt;

&lt;p&gt;One thing that &lt;em&gt;did&lt;/em&gt; change since last year is that I&#39;ve implemented
what I hope to be a robust calendar for this, which was surprisingly
tricky.&lt;/p&gt;

&lt;p&gt;If you have access to our Nextcloud, it should be visible under the
heading &quot;Daylight saving times&quot;. If you don&#39;t, you can access it using
&lt;a href=&quot;https://gitlab.torproject.org/tpo/tpa/team/-/wikis/howto/time/dst.ics&quot;&gt;this direct link&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The procedures around how this calendar was created, how this email
was written, and curses found along the way, are also documented in
&lt;a href=&quot;https://gitlab.torproject.org/tpo/tpa/team/-/wikis/howto/time&quot;&gt;this wiki page&lt;/a&gt;, if someone ever needs to pick up the Time Lord
duty.&lt;/p&gt; </description> 
	<pubDate>Mon, 23 Feb 2026 19:31:58 +0000</pubDate>

</item> 
<item>
	<title>Wouter Verhelst: On Free Software, Free Hardware, and the firmware in between</title>
	<guid>https://grep.be/blog//en/computer/cluebat/On_Free_Software_Hardware_Firmware/</guid>
	<link>https://grep.be/blog//en/computer/cluebat/On_Free_Software_Hardware_Firmware/</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/wouter3.png&quot; width=&quot;85&quot; height=&quot;80&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;When the Free Software movement started in the 1980s, most of the world
had just made a transition from free university-written software to
non-free, proprietary, company-written software. Because of that, the
initial ethical standpoint of the Free Software Foundation was that it&#39;s
fine to run a non-free operating system, as long as all the software you
&lt;em&gt;run&lt;/em&gt; on that operating system is free.&lt;/p&gt;

&lt;p&gt;Initially this was just the
&lt;a href=&quot;https://en.wikipedia.org/wiki/Emacs&quot;&gt;editor&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;But as time went on, and the FSF managed to write more and more parts of
the software stack, their ethical stance moved with the times. This was
a very reasonable, pragmatic stance: if you don&#39;t accept using a
non-free operating system and there isn&#39;t a free operating system yet,
then obviously you can&#39;t &lt;em&gt;write&lt;/em&gt; that free operating system, and the
world won&#39;t move towards a point where free operating systems exist.&lt;/p&gt;

&lt;p&gt;In the early 1990s, when
&lt;a href=&quot;https://en.wikipedia.org/wiki/Linus_Torvalds&quot;&gt;Linus&lt;/a&gt; initiated the
Linux kernel, the situation reached the point where the original dream
of a fully free software stack was complete.&lt;/p&gt;

&lt;p&gt;Or so it would appear.&lt;/p&gt;

&lt;p&gt;Because, in fact, this was not the case. Computers are physical objects,
composed of bits of technology that we refer to as &quot;hardware&quot;, but in
order for these bits of technology to communicate with other bits of
technology in the same computer system, they need to interface with
each other, usually using some form of bus protocol. These bus protocols
can get very complicated, which means that a bit of software is required
in order to make all the bits communicate with each other properly.
Generally, this software is referred to as &quot;firmware&quot;, but don&#39;t let
that name deceive you; it&#39;s really just a bit of low-level software that
is very specific to one piece of hardware. Sometimes it&#39;s written in an
imperative high-level language; sometimes it&#39;s just a set of very simple
initialization vectors. But whatever the case might be, it&#39;s always a
bit of software.&lt;/p&gt;

&lt;p&gt;And although we largely had a free system, &lt;em&gt;this&lt;/em&gt; bit of low-level
software was not yet free.&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://grep.be/blog//en/computer/cluebat/fhw_early.png&quot;&gt;&lt;img class=&quot;img&quot; height=&quot;504&quot; src=&quot;https://grep.be/blog//en/computer/cluebat/fhw_early.png&quot; width=&quot;550&quot; /&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Initially, storage was expensive, so computers couldn&#39;t store as much
data as today, and so most of this software was stored in ROM chips on
the exact bits of hardware they were meant for. Due to this fact, it was
easy to deceive yourself that the firmware wasn&#39;t there, because you
never &lt;em&gt;directly&lt;/em&gt; interacted with it. We knew it was there; in fact, for
some larger &lt;a href=&quot;https://en.wikipedia.org/wiki/BIOS&quot;&gt;pieces of this type of
software&lt;/a&gt; it was possible, even in
those days, to install updates. But that was rarely if ever done at the
time, and it was easily forgotten.&lt;/p&gt;

&lt;p&gt;And so, when the free software movement slapped itself on the back and
declared victory once a fully free operating system was available, and
decided that the work of &lt;em&gt;creating&lt;/em&gt; a free software environment was
finished, that all that remained was keeping it current, and that we
must reject any further non-free encroachments on our fully free
software stack, it was deceiving itself.&lt;/p&gt;

&lt;p&gt;Because a computing environment can never be &lt;em&gt;fully&lt;/em&gt; free if the
low-level pieces of software that form the foundations of that computing
environment are not free. It would have been one thing if the Free
Software Foundation declared it ethical to use non-free low-level
software on a computing environment if free alternatives were not
available. But unfortunately, they did not.&lt;/p&gt;

&lt;p&gt;In fact, something very strange happened.&lt;/p&gt;

&lt;p&gt;In order for some free software hacker to be able to write a free
replacement for some piece of non-free software, they obviously need to
be able to actually install that theoretical free replacement. This
isn&#39;t just a random thought; in fact it &lt;a href=&quot;https://en.wikipedia.org/wiki/Coreboot&quot;&gt;has
happened&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Now, it&#39;s possible to install software on a piece of rewritable storage
such as flash memory inside the hardware and boot the hardware from
that, but if there is a bug in your software -- not at all unlikely if
you&#39;re trying to write software for a piece of hardware that you don&#39;t
have documentation for -- then it&#39;s not unfathomable that the
replacement piece of software will not work, thereby reducing your
expensive piece of technology to something about as useful as a
paperweight.&lt;/p&gt;

&lt;p&gt;Here&#39;s the good part.&lt;/p&gt;

&lt;p&gt;In the late 1990s and early 2000s, the bits of technology that made up
computers became so complicated, and the storage and memory available to
computers so much larger and cheaper, that it became economically more
feasible to create a small, tiny, piece of software stored in a ROM chip
on the hardware, with &lt;em&gt;just&lt;/em&gt; enough knowledge of the bus protocol to
download the rest from the main computer.&lt;/p&gt;

&lt;p&gt;This is awesome for free software. If you now write a replacement for
the non-free software that comes with the hardware, and you make a
mistake, no wobbles! You just remove power from the system, let the DRAM
chips on the hardware component fully drain, return power, and try
again. You &lt;em&gt;might&lt;/em&gt; still end up with a brick of useless silicon if some
of the things you sent to your technology make it do things that it was
not designed to do and therefore you burn through some critical bits of
metal or plastic, but the chance of this happening is significantly
lower than the chance of you writing something that impedes the boot
process of the piece of hardware and you are unable to fix it because
the flash is overwritten. There is &lt;a href=&quot;https://nondeterministic.computer/@mjg59/116108564771446668&quot;&gt;anecdotal
evidence&lt;/a&gt;
that there are free software hackers out there who do so. So, yay,
right? You&#39;d think the Free Software Foundation would jump at the
possibility to get more free software? After all, a large part of why we
even have a Free Software Foundation in the first place, was because of
&lt;a href=&quot;https://www.gnu.org/philosophy/rms-nyu-2001-transcript.txt&quot;&gt;some piece of hardware that was
misbehaving&lt;/a&gt;,
so you would think that the foundation&#39;s founders would understand the
need for hardware to be controlled by software that is free.&lt;/p&gt;

&lt;p&gt;The strange thing, what has always been strange to me, is that this is
not what happened.&lt;/p&gt;

&lt;p&gt;The Free Software Foundation instead decided that non-free software on
ROM or flash chips is fine, but non-free software -- the very same
non-free software, mind -- that touches the general storage device that
you as a user use, is not. Never mind the fact that the non-free
software is always there, whether it sits on your storage device or not.&lt;/p&gt;

&lt;p&gt;Misguidedness aside, if some people decide they would rather not update
the non-free software in their hardware and use the hardware with the
old and potentially buggy version of the non-free software that it came
with, then of course that&#39;s their business.&lt;/p&gt;

&lt;p&gt;Unfortunately, it didn&#39;t quite stop there. If it had, I wouldn&#39;t have
written this blog post.&lt;/p&gt;

&lt;p&gt;You see, even though the Free Software Foundation was about Software,
they decided that they needed to create a &lt;a href=&quot;https://ryf.fsf.org&quot;&gt;&lt;em&gt;hardware&lt;/em&gt;
certification&lt;/a&gt; program. And this hardware
certification program ended up embedding the strange concept that if
something is stored in ROM it&#39;s fine, but if something is stored on a
hard drive it&#39;s not. Same hardware, same software, but different
storage. By that logic, Windows respects your freedom as long as the
software is written to ROM. Because this way, the Free Software
Foundation could come to a standstill and pretend they were still living
in the 90s.&lt;/p&gt;

&lt;p&gt;An unfortunate result of the &quot;RYF&quot; program is that it means that
companies who otherwise would have been inclined to create hardware that
was truly free, top to bottom, are now more incentivised by the RYF
program to create hardware in which the non-free low-level software
can&#39;t be replaced.&lt;/p&gt;

&lt;p&gt;Meanwhile, the rest of the world did &lt;em&gt;not&lt;/em&gt; pretend to still be living in
the nineties, and free hardware communities now exist. Because of how
the FSF has marketed themselves out of the world, these communities call
themselves &quot;Open Hardware&quot; communities, rather than &quot;Free Hardware&quot;
ones, but the principle is the same: the designs are there; if you have
the skill, you can modify them, but you don&#39;t have to.&lt;/p&gt;

&lt;p&gt;In the meantime, the open hardware community has evolved to a point
where even &lt;a href=&quot;https://en.wikipedia.org/wiki/RISC-V&quot;&gt;CPUs&lt;/a&gt; are designed in
the open, and you can design your own version of them.&lt;/p&gt;

&lt;p&gt;But not all hardware can be implemented as RISC-V, and so if you want a
full system built around RISC-V, you may still need components that
were originally built for other architectures but that work with
RISC-V, such as a network card or a GPU. And because the FSF
has done everything in their power to disincentivise people who would
otherwise be well situated to build free versions of the low-level
software required to support your hardware, you may now be in the weird
position where we seem to have somehow skipped a step.&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://grep.be/blog//en/computer/cluebat/fhw_now.png&quot;&gt;&lt;img class=&quot;img&quot; height=&quot;504&quot; src=&quot;https://grep.be/blog//en/computer/cluebat/fhw_now.png&quot; width=&quot;180&quot; /&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
  &lt;p&gt;My own suspicion is that the universe is not only queerer than we
  suppose, but queerer than we can suppose.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;-- J.B.S. Haldane&lt;/p&gt;

&lt;p&gt;(comments for this post will not pass moderation. Use your own blog!)&lt;/p&gt; </description> 
	<pubDate>Mon, 23 Feb 2026 16:51:00 +0000</pubDate>

</item> 
<item>
	<title>Benjamin Mako Hill: What makes online groups vulnerable to governance capture?</title>
	<guid>https://mako.cc/copyrighteous/?p=3337</guid>
	<link>https://mako.cc/copyrighteous/what-makes-online-groups-vulnerable-to-governance-capture</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/mako.gif&quot; width=&quot;65&quot; height=&quot;93&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p class=&quot;has-small-font-size&quot;&gt;&lt;em&gt;&lt;strong&gt;Note:&lt;/strong&gt; I have not published blog posts about my academic papers over the past few years. To ensure that my blog contains a more comprehensive record of my published papers and to surface these for folks who missed them, I will be periodically (re)publishing blog posts about some “older” published projects. This post is closely based on &lt;a href=&quot;https://blog.communitydata.science/governance-capture/&quot;&gt;a previously published post&lt;/a&gt; by &lt;a href=&quot;https://zarine.net/&quot;&gt;Zarine Kharazian&lt;/a&gt; on &lt;a href=&quot;https://blog.communitydata.science/&quot;&gt;the Community Data Science Blog&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;



&lt;p&gt;For nearly a decade, the Croatian language version of Wikipedia was run by a cabal of far-right nationalists who edited articles in ways that promoted fringe political ideas and involved cases of &lt;a href=&quot;https://perma.cc/YJ3B-V2YP&quot;&gt;historical revisionism&lt;/a&gt; related to the &lt;a href=&quot;https://en.wikipedia.org/wiki/Usta%C5%A1e&quot;&gt;Ustaše regime&lt;/a&gt;, a fascist movement that ruled the Nazi puppet state called the Independent State of Croatia during World War II. This cabal seized complete control of the encyclopedia’s governance, banned and blocked those who disagreed with them, and operated a network of fake accounts to create the appearance of grassroots support for their policies.&lt;/p&gt;



&lt;p&gt;Thankfully, Croatian Wikipedia appears to be an outlier. Though both the Croatian and Serbian language editions have been documented to contain nationalist bias and historical revisionism, Croatian Wikipedia seems unique among Wikipedia editions in the extent to which its governance institutions were captured by a small group of users.&lt;/p&gt;

&lt;p&gt;The situation in Croatian Wikipedia was &lt;a href=&quot;https://meta.wikimedia.org/wiki/File:Croatian_WP_Disinformation_Assessment_-_Final_Report_EN.pdf&quot; target=&quot;_blank&quot;&gt;well documented&lt;/a&gt; and is now largely fixed, but we still know very little about why it was taken over, while other language editions seem to have rebuffed similar capture attempts. In a paper published in the Proceedings of the ACM: Human-Computer Interaction (CSCW), Zarine Kharazian, Kate Starbird, and I present an interview-based study that provides an explanation for why Croatian was captured while several other editions facing similar contexts and threats fared better.&lt;/p&gt;



&lt;figure class=&quot;wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-4-3 wp-has-aspect-ratio&quot;&gt;&lt;div class=&quot;wp-block-embed__wrapper&quot;&gt;

&lt;/div&gt;Short video presentation of the work given at Wikimania in August 2023.&lt;/figure&gt;



&lt;p&gt;Based on insights from interviews with 15 participants from both the Croatian and Serbian Wikipedia projects and from the broader Wikimedia movement, we arrived at three propositions that, together, help explain why Croatian Wikipedia succumbed to capture while Serbian Wikipedia did not: &lt;/p&gt;



&lt;ol class=&quot;wp-block-list&quot;&gt;
&lt;li&gt;&lt;em&gt;Perceived Value as a Target.&lt;/em&gt; Is the project worth expending the effort to capture?&lt;/li&gt;



&lt;li&gt;&lt;em&gt;Bureaucratic Openness.&lt;/em&gt; How easy is it for contributors outside the core founding team to ascend to local governance positions?&lt;/li&gt;



&lt;li&gt;&lt;em&gt;Institutional Formalization.&lt;/em&gt; To what degree does the project prefer personalistic, informal forms of organization over formal ones?&lt;/li&gt;
&lt;/ol&gt;



&lt;figure class=&quot;wp-block-image size-full&quot;&gt;&lt;img alt=&quot;&quot; class=&quot;wp-image-3339&quot; height=&quot;393&quot; src=&quot;https://mako.cc/copyrighteous/wp-content/uploads/2026/02/Screenshot-2024-01-11-at-3.17.08-PM-1024x393-1.png&quot; width=&quot;1024&quot; /&gt;&lt;em&gt;The conceptual model from our paper, visualizing possible institutional configurations among Wikipedia projects that affect the risk of governance capture. &lt;/em&gt;&lt;br /&gt;&lt;/figure&gt;



&lt;p&gt;We found that both Croatian and Serbian Wikipedias were attractive targets for far-right nationalist capture due to their sizable readership and resonance with national identity. However, we also found that the two projects diverged early in their trajectories in how open they remained to new contributors ascending to local governance positions and in the degree to which they privileged informal relationships over formal rules and processes as the project’s organizing principles. Ultimately, Croatian’s relative lack of bureaucratic openness and rules constraining administrator behavior created a window of opportunity for a motivated contingent of editors to seize control of the governance mechanisms of the project. &lt;/p&gt;



&lt;p&gt;Though our empirical setting was Wikipedia, our theoretical model may offer insight into the challenges faced by self-governed online communities more broadly. As interest in decentralized alternatives to Facebook and X (formerly Twitter) grows, communities on these sites will likely face similar threats from motivated actors. Understanding the vulnerabilities inherent in these self-governing systems is crucial to building resilient defenses against threats like disinformation. &lt;/p&gt;



&lt;p&gt;For more details on our findings, take a look at the &lt;a href=&quot;https://dl.acm.org/doi/10.1145/3637338&quot;&gt;published version of our paper&lt;/a&gt;.&lt;/p&gt;



&lt;hr class=&quot;wp-block-separator has-alpha-channel-opacity&quot; /&gt;



&lt;p class=&quot;has-small-font-size&quot;&gt;&lt;em&gt;Citation for the full paper:&lt;/em&gt; Kharazian, Zarine, Kate Starbird, and Benjamin Mako Hill. 2024. “Governance Capture in a Self-Governing Community: A Qualitative Comparison of the Croatian, Serbian, Bosnian, and Serbo-Croatian Wikipedias.” &lt;em&gt;Proceedings of the ACM on Human-Computer Interaction&lt;/em&gt; 8 (CSCW1): 61:1-61:26. &lt;a href=&quot;https://doi.org/10.1145/3637338&quot;&gt;https://doi.org/10.1145/3637338&lt;/a&gt;.&lt;/p&gt;



&lt;p class=&quot;has-small-font-size&quot;&gt;&lt;em&gt; This blog post and the paper it describes are collaborative work by Zarine Kharazian, Benjamin Mako Hill, and Kate Starbird.&lt;/em&gt;&lt;/p&gt; </description> 
	<pubDate>Sun, 22 Feb 2026 21:12:41 +0000</pubDate>

</item> 
<item>
	<title>Otto Kekäläinen: Do AI models still keep getting better, or have they plateaued?</title>
	<guid>https://optimizedbyotto.com/post/ai-models-plateaued-or-not/</guid>
	<link>https://optimizedbyotto.com/post/ai-models-plateaued-or-not/</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/otto.png&quot; width=&quot;64&quot; height=&quot;90&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;img alt=&quot;Featured image of post Do AI models still keep getting better, or have they plateaued?&quot; src=&quot;https://optimizedbyotto.com/post/ai-models-plateaued-or-not/flagship-ai-mini-benchmark.png&quot; /&gt;&lt;p&gt;The AI hype is based on the assumption that the frontier AI labs are producing better and better foundational models &lt;em&gt;at an accelerating pace&lt;/em&gt;. Is that really true, or are people just in sort of a mass psychosis because AI models have become so good at mimicking human behavior that we unconsciously attribute increasing intelligence to them? I decided to conduct a mini-benchmark of my own to find out if the latest and greatest AI models are actually really good or not.&lt;/p&gt;
&lt;h2 id=&quot;the-problem-with-benchmarks&quot;&gt;&lt;a class=&quot;header-anchor&quot; href=&quot;https://optimizedbyotto.com/index.xml#the-problem-with-benchmarks&quot;&gt;&lt;/a&gt;The problem with benchmarks
&lt;/h2&gt;&lt;p&gt;Every time any team releases a new LLM, they boast how well it performs on various industry benchmarks such as &lt;a class=&quot;link&quot; href=&quot;https://agi.safe.ai/&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;Humanity’s Last Exam&lt;/a&gt;, &lt;a class=&quot;link&quot; href=&quot;https://www.swebench.com/&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;SWE-Bench&lt;/a&gt; and &lt;a class=&quot;link&quot; href=&quot;https://allenai.org/data/arc&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;Ai2 ARC&lt;/a&gt; or &lt;a class=&quot;link&quot; href=&quot;https://arcprize.org/leaderboard&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;ARC-AGI&lt;/a&gt;. An overall leaderboard can be viewed at &lt;a class=&quot;link&quot; href=&quot;https://llm-stats.com/&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;LLM-stats&lt;/a&gt;. This incentivizes teams to optimize for specific benchmarks, which might make them excel on specific tasks while general abilities degrade. &lt;strong&gt;Also, the older a benchmark dataset is, the more online material there is discussing the questions and best answers,&lt;/strong&gt; which in turn increases the chances of newer models trained on more recent web content scoring better.&lt;/p&gt;
&lt;p&gt;Thus I prefer looking at real-time leaderboards such as the &lt;a class=&quot;link&quot; href=&quot;https://arena.ai/leaderboard&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;LM Arena leaderboard&lt;/a&gt; (or &lt;a class=&quot;link&quot; href=&quot;https://rank.opencompass.org.cn/leaderboard-llm&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;OpenCompass&lt;/a&gt; for Chinese models that might be missing from LM Arena). However, even though the LM Arena Elo score is rated by humans in real-time, the benchmark can still be played. For example, &lt;a class=&quot;link&quot; href=&quot;https://www.heise.de/en/news/Meta-cheats-on-Llama-4-benchmark-10344087.html&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;Meta reportedly&lt;/a&gt; used a special chat-optimized model instead of the actual Llama 4 model when getting scored on the LM Arena.&lt;/p&gt;
&lt;p&gt;Therefore I trust my own first-hand experience more than the benchmarks for gaining intuition. Intuition, however, is not a compelling argument in discussions on whether or not new flagship AI models have plateaued. Thus, I decided to devise my own mini-benchmark so that no model could have possibly seen it in its training data or be specifically optimized for it in any way.&lt;/p&gt;
&lt;h2 id=&quot;my-mini-benchmark&quot;&gt;&lt;a class=&quot;header-anchor&quot; href=&quot;https://optimizedbyotto.com/index.xml#my-mini-benchmark&quot;&gt;&lt;/a&gt;My mini-benchmark
&lt;/h2&gt;&lt;p&gt;I crafted 6 questions based on my own experience using various LLMs for several years and having developed some intuition about what kinds of questions LLMs typically struggle with.&lt;/p&gt;
&lt;p&gt;I conducted the benchmark using the &lt;a class=&quot;link&quot; href=&quot;https://openrouter.ai/chat?models=anthropic%2Fclaude-opus-4.6%2Copenai%2Fgpt-5.2%2Cx-ai%2Fgrok-4.1-fast%2Cgoogle%2Fgemini-3.1-pro-preview%2Cz-ai%2Fglm-5%2Cminimax%2Fminimax-m2.5%2Cqwen%2Fqwen3.5-plus-02-15%2Cmoonshotai%2Fkimi-k2.5&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;OpenRouter.ai chat playroom&lt;/a&gt; with the following state-of-the-art models:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class=&quot;link&quot; href=&quot;https://openrouter.ai/anthropic/claude-opus-4.6&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;Claude Opus 4.6 (Anthropic)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class=&quot;link&quot; href=&quot;https://openrouter.ai/openai/gpt-5.2&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;GPT-5.2 (OpenAI)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class=&quot;link&quot; href=&quot;https://openrouter.ai/x-ai/grok-4.1-fast&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;Grok 4.1 (xAI)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class=&quot;link&quot; href=&quot;https://openrouter.ai/google/gemini-3.1-pro-preview&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;Gemini 3.1 Pro Preview (Google)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class=&quot;link&quot; href=&quot;https://openrouter.ai/z-ai/glm-5&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;GLM 5 (Z.ai)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class=&quot;link&quot; href=&quot;https://openrouter.ai/minimax/minimax-m2.5&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;MiniMax M2.5 (MiniMax)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class=&quot;link&quot; href=&quot;https://openrouter.ai/qwen/qwen3.5-plus-02-15&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;Qwen3.5 Plus 2026-02-15 (Alibaba)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class=&quot;link&quot; href=&quot;https://openrouter.ai/moonshotai/kimi-k2.5&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;Kimi K2.5 (Moonshot.ai)&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;OpenRouter.ai is great as it makes it very easy to get responses from multiple models in parallel to a single question. It also allows turning off web search, forcing the models to answer purely from their embedded knowledge.&lt;/p&gt;
&lt;p&gt;&lt;img alt=&quot;OpenRouter.ai Chat playroom&quot; class=&quot;gallery-image&quot; height=&quot;679&quot; src=&quot;https://optimizedbyotto.com/post/ai-models-plateaued-or-not/flagship-ai-mini-benchmark.gif&quot; width=&quot;800&quot; /&gt;
&lt;/p&gt;
&lt;p&gt;Common to all the test questions is that they are fairly straightforward and have a clear answer, yet the answer isn’t common knowledge or the statistically most obvious one, and instead requires a bit of reasoning to get right.&lt;/p&gt;
&lt;p&gt;Some of these questions are also based on my having witnessed a flagship model fail miserably to answer them.&lt;/p&gt;
&lt;h3 id=&quot;1-which-cities-have-hosted-the-olympics-more-than-just-once&quot;&gt;&lt;a class=&quot;header-anchor&quot; href=&quot;https://optimizedbyotto.com/index.xml#1-which-cities-have-hosted-the-olympics-more-than-just-once&quot;&gt;&lt;/a&gt;1. Which cities have hosted the Olympics more than just once?
&lt;/h3&gt;&lt;p&gt;This question requires accounting for both summer and winter Olympics, and for Olympics hosted across multiple cities.&lt;/p&gt;
&lt;p&gt;The variance in responses comes from whether the model understands that Beijing should be counted, as it has hosted both summer and winter Olympics. Interestingly, GPT was the only model not to mention Beijing at all. Some variance also comes from how models account for co-hosted Olympics. For example, Cortina should be counted as having hosted the Olympics twice, in 1956 and 2026, but only Claude, Gemini and Kimi pointed this out. Stockholm’s 1956 hosting of the equestrian games during the Melbourne Olympics is a special case, which GPT, Gemini and Kimi pointed out in a side note. Some models seem to have old training material; for example, Grok assumes the current year is 2024. All models that accounted for awarded future Olympics (e.g. Los Angeles 2028) marked them clearly as upcoming.&lt;/p&gt;
&lt;p&gt;Overall I would judge that only GPT and MiniMax gave incomplete answers, while all other models replied as well as the best humans reasonably could have.&lt;/p&gt;
&lt;h3 id=&quot;2-if-eurusd-continues-to-slide-to-15-by-mid-2026-what-is-the-likely-effect-on-bmws-stock-price-by-end-of-2026&quot;&gt;&lt;a class=&quot;header-anchor&quot; href=&quot;https://optimizedbyotto.com/index.xml#2-if-eurusd-continues-to-slide-to-15-by-mid-2026-what-is-the-likely-effect-on-bmws-stock-price-by-end-of-2026&quot;&gt;&lt;/a&gt;2. If EUR/USD continues to slide to 1.5 by mid-2026, what is the likely effect on BMW’s stock price by end of 2026?
&lt;/h3&gt;&lt;p&gt;This question requires mapping the exchange rate to its historic range, dodging the misleading word “slide”, and reasoning about where a company’s revenue comes from and how a weaker US dollar affects it in multiple ways. I’ve frequently witnessed flagship models get wrong how interest rates and exchange rates work. Apparently the binary choice between “up” and “down” is somehow challenging for the LLM’s internal statistical model on a topic where plenty of training material argues for both outcomes, and choosing between them requires reasoning specifically about the scenario at hand while disregarding general knowledge of the situation.&lt;/p&gt;
&lt;p&gt;However, this time all the models concluded correctly that a weak dollar would have a negative overall effect on BMW’s stock price. Gemini, GLM, Qwen and Kimi also mentioned the potential hedging effect of BMW’s X-series production in South Carolina for worldwide export.&lt;/p&gt;
&lt;h3 id=&quot;3-what-is-the-unicode-code-point-for-the-traffic-cone-emoji&quot;&gt;&lt;a class=&quot;header-anchor&quot; href=&quot;https://optimizedbyotto.com/index.xml#3-what-is-the-unicode-code-point-for-the-traffic-cone-emoji&quot;&gt;&lt;/a&gt;3. What is the Unicode code point for the traffic cone emoji?
&lt;/h3&gt;&lt;p&gt;This was the first question where the flagship models clearly still struggle in 2026. The trap here is that there is no traffic cone emoji, so an advanced model should simply refuse to give any Unicode numbers at all. Most LLMs, however, have an urge to give some answer, leading to hallucinations. Also, as the answer has a graphical element to it, the LLM might not understand how the emoji “looks” in ways that would be obvious to a human, and thus many models claim the construction sign emoji is a traffic cone, which it is not.&lt;/p&gt;
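As a quick illustration (my own stdlib-only sketch, not part of the benchmark itself), Python's unicodedata module confirms that the character models typically reach for is the construction sign, and that no traffic cone character exists to look up:

```python
import unicodedata

# U+1F6A7 is the emoji models often mislabel as a "traffic cone".
# Its official Unicode name shows it is actually a construction sign.
print(unicodedata.name("\U0001F6A7"))  # CONSTRUCTION SIGN

# Looking up a genuine traffic cone code point fails: none is defined.
try:
    unicodedata.lookup("TRAFFIC CONE")
except KeyError:
    print("no such character")
```

So the honest answer, which only a few models gave, is a refusal rather than a code point.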
&lt;p&gt;By far the worst response was from GPT, which simply hallucinated an answer and stopped there:&lt;/p&gt;
&lt;p&gt;&lt;img alt=&quot;OpenAIs GPT-5.2 completely wrong answer to traffic cone emoji question&quot; class=&quot;gallery-image&quot; height=&quot;117&quot; src=&quot;https://optimizedbyotto.com/post/ai-models-plateaued-or-not/gpt-5.2-traffic-cone-emoji.png&quot; width=&quot;899&quot; /&gt;
&lt;/p&gt;
&lt;p&gt;While Gemini and Grok were among the three models that did not fall into this trap, the response from Claude was exemplary:&lt;/p&gt;
&lt;p&gt;&lt;img alt=&quot;Claude Opus 4.6 exemplary good answer to traffic cone emoji question&quot; class=&quot;gallery-image&quot; height=&quot;387&quot; src=&quot;https://optimizedbyotto.com/post/ai-models-plateaued-or-not/claude-opus-4.6-traffic-cone-emoji.png&quot; width=&quot;899&quot; /&gt;
&lt;/p&gt;
&lt;h3 id=&quot;4-which-languages-are-the-10-most-commonly-spoken-in-the-world-for-each-language-count-from-1-to-5&quot;&gt;&lt;a class=&quot;header-anchor&quot; href=&quot;https://optimizedbyotto.com/index.xml#4-which-languages-are-the-10-most-commonly-spoken-in-the-world-for-each-language-count-from-1-to-5&quot;&gt;&lt;/a&gt;4. Which languages are the 10 most commonly spoken in the world? For each language, count from 1 to 5.
&lt;/h3&gt;&lt;p&gt;This seemingly simple question becomes quite complex when you account for the fact that other languages might have their own writing system, both for words and numerals (e.g. Arabic واحد/one vs ١/1), that numbers might vary depending on what is being counted (e.g. in French, object gender: un/une), and that plain numbers and occurrences may have their own words (e.g. Russian один/odin vs раз/raz). I’ve also witnessed older models frequently mix up which parts of the reply should be in which language.&lt;/p&gt;
&lt;p&gt;For a human this question is trivial, assuming they know the languages. For LLMs it is surprisingly challenging. Grok and MiniMax only gave a list of the most popular languages and stopped there, without showing any words in those languages. Qwen got a bit further by trying to list numbers in each language, but its lists contained only the plain numerals 1 to 5:&lt;/p&gt;
&lt;p&gt;&lt;img alt=&quot;Qwen3.5 Plus count to five in top 10 languages&quot; class=&quot;gallery-image&quot; height=&quot;562&quot; src=&quot;https://optimizedbyotto.com/post/ai-models-plateaued-or-not/qwen3.5-plus-count-to-five.png&quot; width=&quot;899&quot; /&gt;
&lt;/p&gt;
&lt;p&gt;I have seen Grok do something similar before, which in fact inspired me to include this question in my mini-benchmark.&lt;/p&gt;
&lt;p&gt;GPT got a bit further, but for Hindi, Arabic and Bengali it listed the numerals in local script, not the number words. Gemini, GLM and Kimi gave complete and correct answers as lists, while the best answer and presentation by far came from Claude, which gave the table below:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Claude Opus 4.6&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;10 Most Commonly Spoken Languages (by Total Speakers)&lt;/p&gt;
&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;#&lt;/th&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;1&lt;/th&gt;
&lt;th&gt;2&lt;/th&gt;
&lt;th&gt;3&lt;/th&gt;
&lt;th&gt;4&lt;/th&gt;
&lt;th&gt;5&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;English&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;One&lt;/td&gt;
&lt;td&gt;Two&lt;/td&gt;
&lt;td&gt;Three&lt;/td&gt;
&lt;td&gt;Four&lt;/td&gt;
&lt;td&gt;Five&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Mandarin Chinese&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;一 (yī)&lt;/td&gt;
&lt;td&gt;二 (èr)&lt;/td&gt;
&lt;td&gt;三 (sān)&lt;/td&gt;
&lt;td&gt;四 (sì)&lt;/td&gt;
&lt;td&gt;五 (wǔ)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Hindi&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;एक (ek)&lt;/td&gt;
&lt;td&gt;दो (do)&lt;/td&gt;
&lt;td&gt;तीन (tīn)&lt;/td&gt;
&lt;td&gt;चार (chār)&lt;/td&gt;
&lt;td&gt;पाँच (pā̃ch)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;4&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Spanish&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Uno&lt;/td&gt;
&lt;td&gt;Dos&lt;/td&gt;
&lt;td&gt;Tres&lt;/td&gt;
&lt;td&gt;Cuatro&lt;/td&gt;
&lt;td&gt;Cinco&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;5&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;French&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Un&lt;/td&gt;
&lt;td&gt;Deux&lt;/td&gt;
&lt;td&gt;Trois&lt;/td&gt;
&lt;td&gt;Quatre&lt;/td&gt;
&lt;td&gt;Cinq&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;6&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Arabic&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;واحد (wāḥid)&lt;/td&gt;
&lt;td&gt;اثنان (ithnān)&lt;/td&gt;
&lt;td&gt;ثلاثة (thalātha)&lt;/td&gt;
&lt;td&gt;أربعة (arbaʿa)&lt;/td&gt;
&lt;td&gt;خمسة (khamsa)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;7&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Bengali&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;এক (ek)&lt;/td&gt;
&lt;td&gt;দুই (dui)&lt;/td&gt;
&lt;td&gt;তিন (tin)&lt;/td&gt;
&lt;td&gt;চার (chār)&lt;/td&gt;
&lt;td&gt;পাঁচ (pā̃ch)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;8&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Portuguese&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Um&lt;/td&gt;
&lt;td&gt;Dois&lt;/td&gt;
&lt;td&gt;Três&lt;/td&gt;
&lt;td&gt;Quatro&lt;/td&gt;
&lt;td&gt;Cinco&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;9&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Russian&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Один (odin)&lt;/td&gt;
&lt;td&gt;Два (dva)&lt;/td&gt;
&lt;td&gt;Три (tri)&lt;/td&gt;
&lt;td&gt;Четыре (chetyre)&lt;/td&gt;
&lt;td&gt;Пять (pyat&#39;)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;10&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Japanese&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;一 (ichi)&lt;/td&gt;
&lt;td&gt;二 (ni)&lt;/td&gt;
&lt;td&gt;三 (san)&lt;/td&gt;
&lt;td&gt;四 (shi)&lt;/td&gt;
&lt;td&gt;五 (go)&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&lt;em&gt;&lt;strong&gt;Note:&lt;/strong&gt; Rankings are based on estimated &lt;em&gt;total&lt;/em&gt; speakers (native + non-native). Rankings can vary depending on the source and methodology used.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id=&quot;5-count-the-number-of-drone-launch-pads-in-the-picture&quot;&gt;&lt;a class=&quot;header-anchor&quot; href=&quot;https://optimizedbyotto.com/index.xml#5-count-the-number-of-drone-launch-pads-in-the-picture&quot;&gt;&lt;/a&gt;5. Count the number of drone launch pads in the picture.
&lt;/h3&gt;&lt;p&gt;Together with the question, I uploaded this picture:&lt;/p&gt;
&lt;p&gt;&lt;img alt=&quot;Drone pad picture attached to question above&quot; class=&quot;gallery-image&quot; height=&quot;1584&quot; src=&quot;https://optimizedbyotto.com/post/ai-models-plateaued-or-not/drone-pad-counting-task.jpg&quot; width=&quot;2196&quot; /&gt;
&lt;/p&gt;
&lt;p&gt;A human can easily count that there are 10 rows and 30+ columns in the grid, but because the picture resolution isn’t good enough, the exact number of columns can’t be counted, and the answer should be that there are at least 300 launch pads in the picture.&lt;/p&gt;
&lt;p&gt;GPT and Grok both guessed that the count is zero. Instead of hallucinating some number they said zero, but it would have been better not to give any number at all and simply state that they were unable to perform the task. Gemini gave “101” as its answer, which is quite odd, but reading the reasoning section, it seems to have tried counting items in the image without reasoning much about what it was actually counting, or noticing that there is clearly a grid that makes the counting much easier. Both Qwen and Kimi stated they could see four parallel structures, but were unable to count drone launch pads.&lt;/p&gt;
&lt;p&gt;The best answer by far was given by Claude, which counted 10-12 rows and 30-40+ columns, and concluded that there must be 300-500 drone launch pads. Very close to the best human level - impressive!&lt;/p&gt;
&lt;p&gt;This question applied only to multi-modal models that can see images, so GLM and MinMax could not give any response.&lt;/p&gt;
&lt;h3 id=&quot;6-explain-why-i-am-getting-the-error-below-and-what-is-the-best-way-to-fix-it&quot;&gt;&lt;a class=&quot;header-anchor&quot; href=&quot;https://optimizedbyotto.com/index.xml#6-explain-why-i-am-getting-the-error-below-and-what-is-the-best-way-to-fix-it&quot;&gt;&lt;/a&gt;6. Explain why I am getting the error below, and what is the best way to fix it?
&lt;/h3&gt;&lt;p&gt;Together with the question above, I gave this code block:&lt;/p&gt;
&lt;div class=&quot;codeblock &quot;&gt;
&lt;pre&gt;&lt;code&gt;$ SH_SCRIPTS=&quot;$(mktemp; grep -Irnw debian/ -e &#39;^#!.*/sh&#39; | sort -u | cut -d &#39;:&#39; -f 1 || true)&quot;
$ shellcheck -x --enable=all --shell=sh &quot;$SH_SCRIPTS&quot;
/tmp/tmp.xQOpI5Nljx
debian/tests/integration-tests: /tmp/tmp.xQOpI5Nljx
debian/tests/integration-tests: openBinaryFile: does not exist (No such file or directory)&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;p&gt;Older models would easily be misled by the last error message thinking that a file went missing, and focus on suggesting changes to the complex-looking first line. In reality the error is simply caused by having the quotes around the &lt;code&gt;$SH_SCRIPTS&lt;/code&gt;, resulting in the entire multi-line string being passed as a single argument to &lt;code&gt;shellcheck&lt;/code&gt;. So instead of receiving two separate file paths, &lt;code&gt;shellcheck&lt;/code&gt; tries to open one file literally named &lt;code&gt;/tmp/tmp.xQOpI5Nljx\ndebian/tests/integration-tests&lt;/code&gt;.&lt;/p&gt;
&lt;p&gt;Incorrect argument expansion is fairly easy for an experienced human programmer to notice, but tricky for an LLM. Indeed, Grok, MiniMax, and Qwen fell for this trap and focused on the &lt;code&gt;mktemp&lt;/code&gt;, assuming it somehow failed to create a file. Interestingly, GLM failed to produce an answer at all: its reasoning step seemed to loop, dwelling on the missing file without understanding why it would be missing when there is nothing wrong with how &lt;code&gt;mktemp&lt;/code&gt; is executed.&lt;/p&gt;
&lt;p&gt;Claude, Gemini, and Kimi immediately spotted the real root cause of passing the variable quoted, and suggested correct fixes that involve either removing the quotes, or using Bash arrays or &lt;code&gt;xargs&lt;/code&gt; in a way that also makes the whole command handle filenames with spaces correctly.&lt;/p&gt;
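For readers who want to see the failure mode concretely, here is a small Python sketch (my own illustration, not taken from any model's answer) of how the shell's word splitting changes the argument vector that shellcheck ultimately receives:

```python
# What the command substitution produced: two paths separated by a newline.
sh_scripts = "/tmp/tmp.xQOpI5Nljx\ndebian/tests/integration-tests"

# Quoted expansion ("$SH_SCRIPTS"): the whole string stays ONE argument,
# so shellcheck tries to open a single file whose name contains a newline.
argv_quoted = ["shellcheck", sh_scripts]

# Unquoted expansion ($SH_SCRIPTS): the shell splits on whitespace,
# so shellcheck receives two separate file paths.
argv_unquoted = ["shellcheck", *sh_scripts.split()]

print(len(argv_quoted) - 1, len(argv_unquoted) - 1)  # 1 2
```

The quoted form hands shellcheck one bogus path; the unquoted form hands it the two real ones (at the cost of breaking on filenames containing spaces, which is why the array/xargs fixes are preferable).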
&lt;h2 id=&quot;conclusion&quot;&gt;&lt;a class=&quot;header-anchor&quot; href=&quot;https://optimizedbyotto.com/index.xml#conclusion&quot;&gt;&lt;/a&gt;Conclusion
&lt;/h2&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Model&lt;/th&gt;
&lt;th&gt;Sports&lt;/th&gt;
&lt;th&gt;Economics&lt;/th&gt;
&lt;th&gt;Emoji&lt;/th&gt;
&lt;th&gt;Languages&lt;/th&gt;
&lt;th&gt;Visual&lt;/th&gt;
&lt;th&gt;Shell&lt;/th&gt;
&lt;th&gt;Score&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Claude Opus 4.6&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;6/6&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;GPT-5.2&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;~&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;2.5/6&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Grok 4.1&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;3/6&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Gemini 3.1 Pro&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;5/6&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;GLM 5&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;?&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;N/A&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;3/5&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;MiniMax M2.5&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;N/A&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;1/5&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Qwen3.5 Plus&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;~&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;2.5/6&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Kimi K2.5&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;4/6&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;My mini-benchmark had only 6 questions, and I ran it only once, so it was obviously not scientifically rigorous. However, it was systematic enough to trump a &lt;em&gt;mere feeling&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;The main finding for me personally is that Claude Opus 4.6, the flagship model by &lt;a class=&quot;link&quot; href=&quot;https://www.anthropic.com/&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;Anthropic&lt;/a&gt;, seems to give great answers consistently. The answers are not only correct, but also well scoped, giving enough information to cover everything that seems relevant without spouting unnecessary filler.&lt;/p&gt;
&lt;p&gt;I used Claude extensively in 2023-2024 when it was the main model available at my day job, but for the past year I had been using other models that I felt were better at the time. Now Claude seems to be the best-of-the-best again, with Gemini and Kimi as close runners-up. &lt;a class=&quot;link&quot; href=&quot;https://openrouter.ai/compare/anthropic/claude-opus-4.6/google/gemini-3.1-pro-preview/moonshotai/kimi-k2.5&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;Comparing their pricing at OpenRouter.ai&lt;/a&gt;, the Kimi K2.5 price of $0.6 per million tokens is almost 90% cheaper than Claude Opus 4.6’s $5.0 per million tokens, suggesting that Kimi K2.5 offers the best &lt;strong&gt;price-per-performance ratio&lt;/strong&gt;. Claude might be cheaper with a monthly subscription directly from Anthropic, potentially narrowing the price gap.&lt;/p&gt;
&lt;p&gt;Overall I do feel that Anthropic, Google and Moonshot.ai have been pushing the envelope with their latest models in a way that means &lt;strong&gt;one can’t really claim that AI models have plateaued&lt;/strong&gt;. In fact, one could claim that at least Claude has now climbed over the hill of &lt;a class=&quot;link&quot; href=&quot;https://en.wikipedia.org/wiki/AI_slop&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;“AI slop”&lt;/a&gt; and consistently produces valuable results. If and when AI usage expands from here, &lt;strong&gt;we might actually not drown in AI slop&lt;/strong&gt; as the chances of accidentally crappy results decrease. This makes me optimistic about the future.&lt;/p&gt;
&lt;p&gt;I am also really happy to see that there wasn’t just one model crushing everybody else, but that there are &lt;strong&gt;at least three models doing very well&lt;/strong&gt;. As an open source enthusiast I am particularly glad to see that &lt;a class=&quot;link&quot; href=&quot;https://www.moonshot.ai/&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;Moonshot.ai’s&lt;/a&gt; Kimi K2.5 is published with an open license. Given the hardware, anyone can run it on their own. OpenRouter.ai currently lists &lt;a class=&quot;link&quot; href=&quot;https://openrouter.ai/moonshotai/kimi-k2.5/providers&quot; rel=&quot;noopener&quot; target=&quot;_blank&quot;&gt;9 independent providers&lt;/a&gt; alongside Moonshot.ai itself, showcasing the potential of open-weight models in practice.&lt;/p&gt;
&lt;p&gt;If the pattern holds and flagship models continue improving at this pace, we might look back at 2026 as the year AI stopped feeling like a call center associate and started to resemble a scientific researcher. As new models become available, we need to keep testing, keep questioning, and keep our expectations grounded in actual performance rather than press releases.&lt;/p&gt;
&lt;p&gt;Thanks to OpenRouter.ai for providing a great service that makes testing various models incredibly easy!&lt;/p&gt; </description> 
	<pubDate>Sun, 22 Feb 2026 00:00:00 +0000</pubDate>

</item> 
<item>
	<title>Jonathan Dowland: Lanzarote</title>
	<guid>https://jmtd.net/log/lanzarote/</guid>
	<link>https://jmtd.net/log/lanzarote/</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/jmtd.png&quot; width=&quot;65&quot; height=&quot;85&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;I want to get back into the habit of blogging, but I&#39;ve struggled.
I&#39;ve had several ideas of topics to try and write about, but I&#39;ve
not managed to put aside the time to do it. I thought I&#39;d try and
bash out a one-take, stream-of-consciousness-style post now, to get
back into the swing.&lt;/p&gt;

&lt;p&gt;I&#39;m writing from the lounge of my hotel room in Lanzarote, where
my family have gone for the school break. The weather at home has
been pretty awful this year, and this week is traditionally quite
miserable at the best of times. It&#39;s been dry with highs of around
25℃ .&lt;/p&gt;

&lt;p&gt;It&#39;s been an unusual holiday in one respect: one of my kids is
struggling with Autistic Burnout. We were really unsure whether
taking her was a good idea: and certainly towards the beginning
of the holiday felt we may have made a mistake. Writing now, at
the end, I&#39;m not so sure. But we&#39;re very unlikely to have anything
resembling a traditional summer holiday for the foreseeable future.&lt;/p&gt;

&lt;p&gt;Managing Autistic Burnout, and the ways the UK healthcare and
education systems manage it (or fail to), has been a huge part of my
recent life. Perhaps I should write more about that. This coming
week the government are likely to publish &lt;a href=&quot;https://www.theguardian.com/education/2026/feb/20/former-education-secretaries-urge-labour-mps-government-send-reform&quot;&gt;plans for reforming
Special Needs support in
Education&lt;/a&gt;.
Like many other parents, we wait in hope and fear to see what they
plan.&lt;/p&gt;

&lt;p&gt;In anticipation of spending a lot of time in the hotel room with my
preoccupied daughter I (unusually) packed &lt;a href=&quot;https://jmtd.net/hardware/yoga_260/&quot;&gt;a laptop&lt;/a&gt; and set
myself a nerd-task: writing a &lt;a href=&quot;https://pandoc.org&quot;&gt;Pandoc&lt;/a&gt; parser
(&quot;reader&quot;) for the &lt;a href=&quot;https://moinmo.in&quot;&gt;MoinMoin Wiki&lt;/a&gt; markup
language. There&#39;s some &lt;a href=&quot;https://github.com/jgm/pandoc/tree/moinmoin2&quot;&gt;unfinished prior art from around
2011&lt;/a&gt; by Simon Michael
(of &lt;a href=&quot;https://hledger.org&quot;&gt;hledger&lt;/a&gt;) to work from.&lt;/p&gt;

&lt;p&gt;The motivation was &lt;a href=&quot;https://wiki.debian.org/DebianWiki/WikiRevamp&quot;&gt;our plan to migrate the Debian Wiki away from
MoinMoin&lt;/a&gt;. We&#39;ve
since &lt;a href=&quot;https://lists.debian.org/debian-wiki/2026/02/msg00004.html&quot;&gt;decided to approach that
differently&lt;/a&gt;
but I might finish the Reader anyway: it&#39;s been an interesting
project (and a nice excuse to write Haskell), and it will be useful for others.&lt;/p&gt;

&lt;p&gt;Unusually (for me) I&#39;ve not been reading fiction on this trip: I
took with me &lt;a href=&quot;https://en.wikipedia.org/wiki/Human_Compatible&quot;&gt;Human Compatible by Prof Stuart
Russell&lt;/a&gt;:
discussing how to solve the problem of controlling a future
Artificial Intelligence. I&#39;ve largely avoided the LLM hype cycle
we&#39;re suffering through at the moment, and I have several big
concerns about it (moral, legal, etc.), and felt it was time to try
and make my concerns more well-formed and test them. This book has
been a big help in doing so, although it doesn&#39;t touch on the issue
of copyright, which is something I am particularly interested in at
the moment.&lt;/p&gt; </description> 
	<pubDate>Sat, 21 Feb 2026 19:00:30 +0000</pubDate>

</item> 
<item>
	<title>Vasudev Kamath: Learning Notes: Debsecan MCP Server</title>
	<guid>tag:copyninja.in,2026-02-21:/blog/debsecan-mcp.html</guid>
	<link>https://copyninja.in/blog/debsecan-mcp.html</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/vasudev.png&quot; width=&quot;65&quot; height=&quot;85&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;Since Generative AI is currently the most popular topic, I wanted to get my
hands dirty and learn something new. I was learning about the Model Context
Protocol at the time and wanted to apply it to build something simple.&lt;/p&gt;
&lt;div class=&quot;section&quot; id=&quot;idea&quot;&gt;
&lt;h2&gt;Idea&lt;/h2&gt;
&lt;p&gt;On Debian systems, we use &lt;cite&gt;debsecan&lt;/cite&gt; to find vulnerabilities. However, the tool
currently provides a simple list of vulnerabilities and packages with no
indication of the system&#39;s security posture—meaning no criticality information
is exposed and no executive summary is provided regarding what needs to be
fixed. Of course, one can simply run the following to install existing fixes and
be done with it:&lt;/p&gt;
&lt;div class=&quot;highlight&quot;&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;apt-get&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;install&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;k&quot;&gt;$(&lt;/span&gt;debsecan&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;--suite&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;sid&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;--format&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;packages&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;--only-fixed&lt;span class=&quot;k&quot;&gt;)&lt;/span&gt;
&lt;/pre&gt;&lt;/div&gt;
&lt;p&gt;But this is not how things work in corporate environments; you need to provide a
report showing the system&#39;s previous state and the actions taken to bring it to
a safe state. It is all about metrics and reports.&lt;/p&gt;
&lt;p&gt;My goal was to use &lt;cite&gt;debsecan&lt;/cite&gt; to generate a list of vulnerabilities, find more
detailed information on them, and prioritize them as critical, high, medium, or
low. By providing this information to an AI, I could ask it to generate an
executive summary report detailing what needs to be addressed immediately and
the overall security posture of the system.&lt;/p&gt;
&lt;/div&gt;
&lt;div class=&quot;section&quot; id=&quot;initial-take&quot;&gt;
&lt;h2&gt;Initial Take&lt;/h2&gt;
&lt;p&gt;My initial thought was to use an existing LLM, either self-hosted or a
cloud-based LLM like Gemini (which provides an API with generous limits via AI
Studio). I designed functions to output the list of vulnerabilities on the
system and provide detailed information on each. The idea was to use these as
&quot;tools&quot; for the LLM.&lt;/p&gt;
&lt;div class=&quot;section&quot; id=&quot;learnings&quot;&gt;
&lt;h3&gt;Learnings&lt;/h3&gt;
&lt;ol class=&quot;arabic simple&quot;&gt;
&lt;li&gt;I learned about open-source LLMs using Ollama, which allows you to download
and use models on your laptop.&lt;/li&gt;
&lt;li&gt;I used Llama 3.1, Llama 3.2, and Granite 4 on my laptop without a GPU. I
managed to run my experiments, even though they were time-consuming and
occasionally caused my laptop to crash.&lt;/li&gt;
&lt;li&gt;I learned about Pydantic and how to use it to parse custom JSON schemas with
minimal effort.&lt;/li&gt;
&lt;li&gt;I learned about osv.dev, an open-source initiative by Google that aggregates
vulnerability information from various sources and provides data in a
well-documented JSON schema format.&lt;/li&gt;
&lt;li&gt;I learned about the EPSS (Exploit Prediction Scoring System) and how it is
used alongside static CVSS scoring to detect truly critical vulnerabilities.
The EPSS score provides an idea of the probability of a vulnerability being
exploited in the wild based on actual real-world attacks.&lt;/li&gt;
&lt;/ol&gt;
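As a rough sketch of what parsing an osv.dev record can look like (here with stdlib dataclasses standing in for the Pydantic models the notebooks actually use, and modelling only a tiny, illustrative subset of the OSV schema):

```python
import json
from dataclasses import dataclass, field

@dataclass
class Severity:
    type: str   # e.g. "CVSS_V3"
    score: str  # a CVSS vector string

@dataclass
class OsvRecord:
    """A minimal slice of an OSV vulnerability record."""
    id: str
    summary: str = ""
    severity: list = field(default_factory=list)

    @classmethod
    def from_json(cls, raw: str) -> "OsvRecord":
        doc = json.loads(raw)
        sev = [Severity(**s) for s in doc.get("severity", [])]
        return cls(id=doc["id"], summary=doc.get("summary", ""), severity=sev)

raw = ('{"id": "CVE-2024-0001", "summary": "example flaw", '
       '"severity": [{"type": "CVSS_V3", "score": "CVSS:3.1/AV:N/AC:L"}]}')
rec = OsvRecord.from_json(raw)
print(rec.id, rec.severity[0].type)
```

Pydantic does the same job with far less boilerplate (validation, nested models, and helpful error messages come for free), which is what made it such a pleasant discovery.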
&lt;p&gt;These experiments led to a collection of &lt;a class=&quot;reference external&quot; href=&quot;https://github.com/copyninja/notebooks/tree/main/langchain&quot;&gt;notebooks&lt;/a&gt;. One key takeaway
was that when defining tools, I cannot simply output massive amounts of text
because it consumes tokens and increases costs for paid models (though it is
fine for local models using your own hardware and energy). Self-hosted models
require significant prompting to produce proper output, which helped me
understand the real-world application of prompt engineering.&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;div class=&quot;section&quot; id=&quot;change-of-plans&quot;&gt;
&lt;h2&gt;Change of Plans&lt;/h2&gt;
&lt;p&gt;Despite extensive experimentation, I felt I was nowhere close to a full
implementation. While using a Gemini learning tool to study MCP, it suddenly
occurred to me: why not write the entire thing as an MCP server? This would save
me from implementing the agent side and allow me to hook it into any
IDE-based LLM.&lt;/p&gt;
&lt;div class=&quot;section&quot; id=&quot;design&quot;&gt;
&lt;h3&gt;Design&lt;/h3&gt;
&lt;p&gt;This MCP server is primarily a mix of a &quot;tool&quot; (which executes on the server
machine to identify installed packages and their vulnerabilities) and a
&quot;resource&quot; (which exposes read-only information for a specific CVE ID).&lt;/p&gt;
&lt;p&gt;The MCP exposes two tools:&lt;/p&gt;
&lt;ol class=&quot;arabic simple&quot;&gt;
&lt;li&gt;List Vulnerabilities: This tool identifies vulnerabilities in the packages
installed on the system, categorizes them using CVE and EPSS scores, and
provides a dictionary of critical, high, medium, and low vulnerabilities.&lt;/li&gt;
&lt;li&gt;Research Vulnerabilities: Based on the user prompt, the LLM can identify
relevant vulnerabilities and pass them to this function to retrieve details
such as whether a fix is available, the fixed version, and criticality.&lt;/li&gt;
&lt;/ol&gt;
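&lt;p&gt;The severity bucketing performed by the first tool can be sketched roughly as follows. This is a minimal illustration only: the thresholds, field names and function name are made up, not taken from the actual debsecan-mcp code.&lt;/p&gt;

```python
# Hypothetical sketch of bucketing CVEs by CVSS and EPSS scores.
# The thresholds and input format are illustrative, not debsecan-mcp's.
def categorize(cves):
    """Group CVE IDs into severity buckets.

    `cves` maps a CVE ID to a dict with a `cvss` score (0-10) and an
    `epss` score (estimated probability of exploitation, 0-1).
    """
    buckets = {"critical": [], "high": [], "medium": [], "low": []}
    for cve_id, info in cves.items():
        cvss, epss = info["cvss"], info["epss"]
        # A high EPSS score can escalate an otherwise mid-range CVSS,
        # which is the point of combining the two signals.
        if cvss >= 9.0 or epss >= 0.5:
            buckets["critical"].append(cve_id)
        elif cvss >= 7.0 or epss >= 0.1:
            buckets["high"].append(cve_id)
        elif cvss >= 4.0:
            buckets["medium"].append(cve_id)
        else:
            buckets["low"].append(cve_id)
    return buckets
```

&lt;p&gt;Returning only this dictionary of IDs, rather than full vulnerability records, is what keeps the tool output small enough to be token-friendly.&lt;/p&gt;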
&lt;/div&gt;
&lt;div class=&quot;section&quot; id=&quot;vibe-coding&quot;&gt;
&lt;h3&gt;Vibe Coding&lt;/h3&gt;
&lt;p&gt;&quot;Vibe coding&quot; is the latest trend, with many claiming that software engineering
jobs are a thing of the past. Without going into too much detail, I decided to
give it a try. While this is not my first &quot;vibe coded&quot; project (I have done this
previously at work using corporate tools), it is my first attempt to vibe code a
hobby/learning project.&lt;/p&gt;
&lt;p&gt;I chose Antigravity because it seemed to be the only editor providing a
sufficient amount of free tokens. For every vibe coding project, I spend time
thinking about the barebones skeleton: the modules, function return values, and
data structures. This allows me to maintain control over the LLM-generated code
so it doesn&#39;t become overly complicated or incomprehensible.&lt;/p&gt;
&lt;p&gt;As a first step, I wrote down my initial design in a requirements document. In
that document, I explicitly called for using &lt;a class=&quot;reference external&quot; href=&quot;https://github.com/copyninja/debsecan-mcp/blob/main/docs/requirement.md&quot;&gt;debsecan&lt;/a&gt; as
the model for various components. Additionally, I asked the AI to reference my
specific &lt;a class=&quot;reference external&quot; href=&quot;https://github.com/copyninja/notebooks/blob/main/langchain/secscan-common.ipynb&quot;&gt;code for the EPSS logic&lt;/a&gt;.
The reasons were:&lt;/p&gt;
&lt;ol class=&quot;arabic simple&quot;&gt;
&lt;li&gt;&lt;cite&gt;debsecan&lt;/cite&gt; already solves the core problem; I am simply rebuilding it.
&lt;cite&gt;debsecan&lt;/cite&gt; uses a single file generated by the Debian Security team
containing all necessary information, which prevents us from needing multiple
external sources.&lt;/li&gt;
&lt;li&gt;This provides the flexibility to categorize vulnerabilities within the
listing tool itself since all required information is readily available,
unlike my original notebook-based design.&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;I initially used Gemini 3 Flash as the model because I was concerned about
exceeding my free limits.&lt;/p&gt;
&lt;/div&gt;
&lt;div class=&quot;section&quot; id=&quot;hiccups&quot;&gt;
&lt;h3&gt;Hiccups&lt;/h3&gt;
&lt;p&gt;Although it initially seemed successful, I soon noticed discrepancies between
the local &lt;cite&gt;debsecan&lt;/cite&gt; outputs and the outputs generated by the tools. I asked the
AI to fix this, but after two attempts, it still could not match the outputs. I
realized it was writing its own version-comparison logic and failing
significantly.&lt;/p&gt;
&lt;p&gt;Finally, I instructed it to depend entirely on the &lt;em&gt;python-apt&lt;/em&gt; module for
version comparison; since it is not on PyPI, I asked it to pull directly from
the Git source. This solved some issues, but the problem persisted. By then, my
weekly quota was exhausted, and I stopped debugging.&lt;/p&gt;
&lt;p&gt;A week later, I resumed debugging with the Claude 3.5 Sonnet model. Within 20-25
minutes, it found the fix, which involved &lt;a class=&quot;reference external&quot; href=&quot;https://github.com/copyninja/debsecan-mcp/commit/4fdf5a2ab139f3c0c335d24973892ddfaf2b08e0#diff-9d3ed702945a5c91f0ed3e54404324fecfca1fbd7ae5fb44508081c6040e9276&quot;&gt;four lines of changes&lt;/a&gt;
in the parsing logic. However, I ran out of limits again before I could proceed
further.&lt;/p&gt;
&lt;p&gt;In the requirements, I specified that the &lt;em&gt;list vulnerabilities&lt;/em&gt; tool should
only provide a dictionary of CVE IDs divided by severity. However, the AI
instead provided full text for all vulnerability details, resulting in excessive
data—including negligible vulnerabilities—being sent to the LLM. Consequently,
it never called the &lt;em&gt;research vulnerabilities&lt;/em&gt; tool. Since I had run out of
limits, I manually fixed this in a &lt;a class=&quot;reference external&quot; href=&quot;https://github.com/copyninja/debsecan-mcp/commit/32f291d5ec4966b1349d39cfcb5d154e64ad844d&quot;&gt;follow-up commit&lt;/a&gt;.&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;div class=&quot;section&quot; id=&quot;how-to-use&quot;&gt;
&lt;h2&gt;How to Use&lt;/h2&gt;
&lt;p&gt;I have published the current work in the &lt;a class=&quot;reference external&quot; href=&quot;https://github.com/copyninja/debsecan-mcp&quot;&gt;debsecan-mcp&lt;/a&gt; repository. I have used the same
license as the original &lt;em&gt;debsecan&lt;/em&gt;. I am not entirely sure how to interpret
licenses for vibe-coded projects, but here we are.&lt;/p&gt;
&lt;p&gt;To use this, you need to install the tool in a virtual environment and configure
your IDE to use the MCP. Here is how I set it up for Visual Studio Code:&lt;/p&gt;
&lt;ol class=&quot;arabic simple&quot;&gt;
&lt;li&gt;Follow the &lt;a class=&quot;reference external&quot; href=&quot;https://code.visualstudio.com/docs/copilot/customization/mcp-servers#_add-an-mcp-server&quot;&gt;guide from the VS Code documentation&lt;/a&gt;
regarding adding an MCP server.&lt;/li&gt;
&lt;li&gt;My global &lt;cite&gt;mcp.json&lt;/cite&gt; looks like this:&lt;/li&gt;
&lt;/ol&gt;
&lt;div class=&quot;highlight&quot;&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;nt&quot;&gt;&quot;servers&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;:&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;      &lt;/span&gt;&lt;span class=&quot;nt&quot;&gt;&quot;debsecan-mcp&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;:&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;{&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;          &lt;/span&gt;&lt;span class=&quot;nt&quot;&gt;&quot;command&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;:&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;uv&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;              &lt;/span&gt;&lt;span class=&quot;nt&quot;&gt;&quot;args&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;:&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;[&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;                  &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;--directory&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;                  &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;/home/vasudev/Documents/personal/FOSS/debsecan-mcp/debsecan-mcp&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;                  &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;run&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;,&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;                  &lt;/span&gt;&lt;span class=&quot;s2&quot;&gt;&quot;debsecan-mcp&quot;&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;              &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;]&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;          &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;},&lt;/span&gt;
&lt;span class=&quot;w&quot;&gt;  &lt;/span&gt;&lt;span class=&quot;nt&quot;&gt;&quot;inputs&quot;&lt;/span&gt;&lt;span class=&quot;p&quot;&gt;:&lt;/span&gt;&lt;span class=&quot;w&quot;&gt; &lt;/span&gt;&lt;span class=&quot;p&quot;&gt;[]&lt;/span&gt;
&lt;span class=&quot;p&quot;&gt;}&lt;/span&gt;
&lt;/pre&gt;&lt;/div&gt;
&lt;ol class=&quot;arabic simple&quot; start=&quot;3&quot;&gt;
&lt;li&gt;I am running it directly from my local codebase using a virtualenv created
with &lt;em&gt;uv&lt;/em&gt;. You may need to tweak the path based on your installation.&lt;/li&gt;
&lt;li&gt;To use the MCP server in the Copilot chat window, reference it using
&lt;cite&gt;#debsecan-mcp&lt;/cite&gt;. The LLM will then use the server for the query.&lt;/li&gt;
&lt;li&gt;Use a prompt like: &lt;em&gt;&quot;Give an executive summary of the system security status
and immediate actions to be taken.&quot;&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;You can observe the LLM using &lt;cite&gt;list_vulnerabilities&lt;/cite&gt; followed by
&lt;cite&gt;research_cves&lt;/cite&gt;. Because the first tool only provides CVE IDs based on
severity, the LLM is smart enough to research only high and critical
vulnerabilities, thereby saving tokens.&lt;/li&gt;
&lt;/ol&gt;
&lt;/div&gt;
&lt;div class=&quot;section&quot; id=&quot;what-s-next&quot;&gt;
&lt;h2&gt;What&#39;s Next?&lt;/h2&gt;
&lt;p&gt;This MCP is not yet perfect and has the following issues:&lt;/p&gt;
&lt;ol class=&quot;arabic simple&quot;&gt;
&lt;li&gt;The &lt;cite&gt;list_vulnerabilities&lt;/cite&gt; dictionary contains duplicate CVE IDs because the
code used a list instead of a set. While the LLM is smart enough to
deduplicate these, it still costs extra tokens.&lt;/li&gt;
&lt;li&gt;Because I initially modeled this on &lt;cite&gt;debsecan&lt;/cite&gt;, it uses a raw method for
parsing &lt;cite&gt;/var/lib/dpkg/status&lt;/cite&gt; instead of &lt;cite&gt;python-apt&lt;/cite&gt;. I am considering
switching to &lt;cite&gt;python-apt&lt;/cite&gt; to reduce maintenance overhead.&lt;/li&gt;
&lt;li&gt;Interestingly, the AI did not add a single unit test, which is disappointing.
I will add these once my limits are restored.&lt;/li&gt;
&lt;li&gt;I need to create a cleaner README with usage instructions.&lt;/li&gt;
&lt;li&gt;I need to determine if the MCP can be used via HTTP as well as stdio.&lt;/li&gt;
&lt;/ol&gt;
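&lt;p&gt;The first point above has a small fix: deduplicate the IDs while preserving their order. A minimal sketch, not the actual repository code:&lt;/p&gt;

```python
# Order-preserving deduplication of CVE IDs; a plain set() would
# deduplicate too, but it loses the original ordering.
def dedupe(cve_ids):
    return list(dict.fromkeys(cve_ids))
```

&lt;p&gt;For example, deduplicating a list containing &quot;CVE-1&quot;, &quot;CVE-2&quot;, &quot;CVE-1&quot; yields just the first two, in order.&lt;/p&gt;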
&lt;/div&gt;
&lt;div class=&quot;section&quot; id=&quot;conclusion&quot;&gt;
&lt;h2&gt;Conclusion&lt;/h2&gt;
&lt;p&gt;Vibe coding is interesting, but things can get out of hand if not managed
properly. Even with a good process, code must be reviewed and tested; you cannot
blindly trust an AI to handle everything. Even if it adds tests, you must
validate them, or you are doomed!&lt;/p&gt;
&lt;/div&gt; </description> 
	<pubDate>Sat, 21 Feb 2026 11:43:00 +0000</pubDate>

</item> 
<item>
	<title>Thomas Goirand: Seamlessly upgrading a production OpenStack cluster in 4 hours: with a 2K-line shell script</title>
	<guid>http://thomas.goirand.fr/blog/?p=426</guid>
	<link>http://thomas.goirand.fr/blog/?p=426</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/zigo.png&quot; width=&quot;65&quot; height=&quot;85&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;&lt;br /&gt;&lt;strong&gt;tl;dr:&lt;/strong&gt;&lt;/p&gt;



&lt;p&gt;To the question “what does it take to upgrade OpenStack?”, my personal answer is: less than 2K lines of dash script. I’ll describe its internals here, and why I believe it is the correct solution.&lt;/p&gt;



&lt;p&gt;&lt;strong&gt;Why write this blog post&lt;/strong&gt;&lt;/p&gt;



&lt;p&gt;During FOSDEM 2024, I was asked “how do you handle upgrades?”. I answered with a big smile and a short “&lt;em&gt;with a very small shell script&lt;/em&gt;”, as I couldn’t explain in 2 minutes how it was done. But just saying “it is great this way” doesn’t give readers enough to trust the claim. Why and how did I do it the right way? This blog post is an attempt to answer that question more thoroughly.&lt;/p&gt;



&lt;p&gt;&lt;strong&gt;Upgrading OpenStack in production&lt;/strong&gt;&lt;/p&gt;



&lt;p&gt;I wrote this script maybe 2 or 3 years ago, though I’m only blogging about it today because … I did such an upgrade in a public cloud in production last Tuesday evening (ie: the first region of the Infomaniak public cloud). I’d say the cluster is moderately large (as of today: about 8K+ VMs running, 83 compute nodes, 12 network nodes, … for a total of 10880 physical CPU cores and 125 TB of RAM if I only count the compute servers). It took “only” 4 hours to do the upgrade (though I already wrote some more code to speed this up for next time…). It went super smooth, without a glitch. I mostly just sat reading the script output… and went to bed once it finished running. The next day, all my colleagues at Infomaniak were nicely congratulating me that it went that smooth (a big thanks to all of you who did). I couldn’t dream of an upgrade that smooth! :)&lt;/p&gt;



&lt;p&gt;Still not impressed? Boring read? Yeah… let’s dive into more technical details.&lt;/p&gt;



&lt;p&gt;&lt;strong&gt;Intention behind the implementation&lt;/strong&gt;&lt;/p&gt;



&lt;p&gt;My script isn’t perfect, and I won’t ever pretend it is. But at least, it minimizes the downtime of every OpenStack service. It is also a “by the book” implementation of what’s written in the OpenStack documentation, following every upstream advice. As a result, it is fully seamless for some OpenStack services, and as HA as OpenStack can be for others. The upgrade process is of course idempotent and can be re-run in case of failure. Here’s why.&lt;/p&gt;



&lt;p&gt;&lt;strong&gt;General idea&lt;/strong&gt;&lt;/p&gt;



&lt;p&gt;My upgrade script does things in a certain order, respecting what is documented about upgrades in the OpenStack documentation. It basically does:&lt;/p&gt;



&lt;ul&gt;&lt;li&gt;Upgrade all dependencies&lt;/li&gt;&lt;li&gt;Upgrade all services one by one, across the whole cluster&lt;/li&gt;&lt;/ul&gt;



&lt;p&gt;&lt;strong&gt;Installing dependencies&lt;/strong&gt;&lt;/p&gt;



&lt;p&gt;The first thing the upgrade script does is:&lt;/p&gt;



&lt;ul&gt;&lt;li&gt;disable puppet on all nodes of the cluster&lt;/li&gt;&lt;li&gt;switch the APT repository&lt;/li&gt;&lt;li&gt;apt-get update on all nodes&lt;/li&gt;&lt;li&gt;install library dependencies on all nodes&lt;/li&gt;&lt;/ul&gt;



&lt;p&gt;For this last step, a static list of all needed dependency upgrades is maintained between each release of OpenStack, and for each type of node. Then, for each package in this list, the script checks with dpkg-query that the package really is installed, and with apt-cache policy that it really is going to be upgraded (maybe there’s an easier way to do this?). This way, no package is marked as manually installed by mistake during the upgrade process. This ensures that “apt-get --purge autoremove” really does what it should, and that the script is really idempotent.&lt;/p&gt;



&lt;p&gt;The idea, then, is that once all dependencies are installed, upgrading and restarting the leaf packages (ie: OpenStack services like Nova, Glance, Cinder, etc.) is very fast, because the apt-get command doesn’t need to install all the dependencies. So at this point, doing “apt-get install python3-cinder” for example (which will also, thanks to dependencies, upgrade cinder-api and cinder-scheduler if on a controller node) only takes a few seconds. This principle applies to all nodes (controller nodes, network nodes, compute nodes, etc.), which helps a lot in speeding up the upgrade and reducing unavailability.&lt;/p&gt;



&lt;p&gt;&lt;strong&gt;hapc&lt;/strong&gt;&lt;/p&gt;



&lt;p&gt;At its core, the oci-cluster-upgrade-openstack-release script uses haproxy-cmd (ie: /usr/bin/hapc) to drain each API server to be upgraded from haproxy. Hapc is a simple Python wrapper around the haproxy admin socket: it sends commands to it with an easy-to-understand CLI. So it is possible to reliably upgrade an API service only after it has been drained away. Draining means simply waiting for the last query to finish and the client to disconnect from http before giving the backend server any more queries. If you do not know hapc / haproxy-cmd, I recommend trying it: it’s going to be hard for you to stop using it once you’ve tested it. Its bash-completion script makes it VERY easy to use, and it is helpful in production. But not only that: it is also nice to have when writing this type of upgrade script. Let’s dive into haproxy-cmd.&lt;/p&gt;



&lt;p&gt;&lt;strong&gt;Example on how to use haproxy-cmd&lt;/strong&gt;&lt;/p&gt;



&lt;p&gt;Let me show you. First, ssh into one of the 3 controllers and find where the virtual IP (VIP) is located with “crm resource locate openstack-api-vip” or with a (simpler) “crm status”. Let’s ssh into the server that holds the VIP, and now, let’s drain it away from haproxy.&lt;/p&gt;



&lt;p&gt;&lt;code&gt;$ hapc list-backends&lt;br /&gt;$ hapc drain-server --backend glancebe --server cl1-controller-1.infomaniak.ch --verbose --wait --timeout 50&lt;br /&gt;$ apt-get install glance-api&lt;br /&gt;$ hapc enable-server --backend glancebe --server cl1-controller-1.infomaniak.ch&lt;/code&gt;&lt;/p&gt;



&lt;p&gt;&lt;strong&gt;Upgrading the control plane&lt;/strong&gt;&lt;/p&gt;



&lt;p&gt;My upgrade script leverages hapc just like above. For each OpenStack project, it’s done in this order on the first node holding the VIP:&lt;/p&gt;



&lt;ul&gt;&lt;li&gt;“hapc drain-server” of the API, so haproxy gracefully stops querying it&lt;/li&gt;&lt;li&gt;stop all services on that node (including non-API services): stop, disable and mask with systemd.&lt;/li&gt;&lt;li&gt;upgrade that service’s Python code. For example: “apt-get install python3-nova”, which will also pull nova-api, nova-conductor, nova-novncproxy, etc., but the services won’t start automatically, as they’ve been stopped + disabled + masked in the previous bullet point.&lt;/li&gt;&lt;li&gt;perform the db_sync so that the db is up-to-date [1]&lt;/li&gt;&lt;li&gt;start all services (unmask, enable and start with systemd)&lt;/li&gt;&lt;li&gt;re-enable the API backend with “hapc enable-server”&lt;/li&gt;&lt;/ul&gt;



&lt;p&gt;Starting at [1], the risk is that other nodes may have a new version of the database schema, but an old version of the code that isn’t compatible with it. But it doesn’t take long, because the next step is to take care of the other (usually 2) nodes of the OpenStack control plane:&lt;/p&gt;



&lt;ul&gt;&lt;li&gt;“hapc drain-server” of the API of the other 2 controllers&lt;/li&gt;&lt;li&gt;stop of all services on these 2 controllers [2]&lt;/li&gt;&lt;li&gt;upgrade of the package&lt;/li&gt;&lt;li&gt;start of all services&lt;/li&gt;&lt;/ul&gt;



&lt;p&gt;So while there’s technically zero downtime, some issues may still happen between [1] and [2] above, because the new DB schema and the old code (both for the API and the other services) are up and running at the same time. These are however supposed to be rare cases (some OpenStack projects don’t even have db changes between some OpenStack releases, and most queries often continue to work with the upgraded db), and the cluster will be like that for a very short time, so that’s fine, and better than a full API downtime.&lt;/p&gt;



&lt;p&gt;&lt;strong&gt;Satellite services&lt;/strong&gt;&lt;/p&gt;



&lt;p&gt;Then there are the satellite services that need to be upgraded, like Neutron, Nova and Cinder. Nova is the least offender, as it has all the code to rewrite JSON object schemas on-the-fly so that it continues to work during an upgrade. Though it’s a known issue that Cinder doesn’t have this feature (last time I checked), and it’s probably the same for Neutron (maybe recent-ish versions of OpenStack do use oslo.versionedobjects?). Anyway, upgrades on these nodes are done right after the control plane for each service.&lt;/p&gt;



&lt;p&gt;&lt;strong&gt;Parallelism and upgrade timings&lt;/strong&gt;&lt;/p&gt;



&lt;p&gt;As we’re dealing with potentially hundreds of nodes per cluster, a lot of operations are performed in parallel. I chose to simply use the shell’s &amp;amp; operator with some “wait” calls so that not too many jobs run in parallel. For example, when disabling SSH on all nodes, this is done 24 nodes at a time, which is fine. The batch size depends on the type of thing being done: while it’s perfectly OK to disable puppet on 24 nodes at the same time, it is not OK to do that with Neutron services. In fact, each time a Neutron agent is restarted, the script explicitly waits for 30 seconds. This conveniently avoids a hailstorm of messages in RabbitMQ, and keeps neutron-rpc-server from becoming too busy. All of this waiting is necessary, and it is one of the reasons why it can sometimes take that long to upgrade a (moderately big) cluster.&lt;/p&gt;
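&lt;p&gt;The batching pattern described above (launch jobs in the background, then wait for the whole batch before starting the next) can be sketched as follows. This is an illustrative Python equivalent with made-up names, not the actual dash script:&lt;/p&gt;

```python
from concurrent.futures import ThreadPoolExecutor

def upgrade_all(nodes, run_on_node, batch=24):
    """Run `run_on_node` for every node, at most `batch` at a time.

    Mirrors the shell pattern of backgrounding up to `batch` jobs and
    then waiting for all of them before launching the next batch.
    `run_on_node` stands in for the real per-node work (e.g. ssh).
    """
    results = []
    for i in range(0, len(nodes), batch):
        chunk = nodes[i:i + batch]
        # Leaving the `with` block joins all workers, like `wait` in sh.
        with ThreadPoolExecutor(max_workers=batch) as pool:
            results.extend(pool.map(run_on_node, chunk))
    return results
```

&lt;p&gt;A per-item pause (like the 30-second wait after each Neutron agent restart) would simply go inside run_on_node.&lt;/p&gt;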



&lt;p&gt;&lt;strong&gt;Not using config management tooling&lt;/strong&gt;&lt;/p&gt;



&lt;p&gt;Some of my colleagues would have preferred that I used something like Ansible. However, there’s no reason to use such a tool if the idea is just to run some shell commands on every server. It is way more efficient (in terms of programming) to just use bash / dash to do the work. And if you want my point of view about Ansible: using YAML for this kind of programming would be crazy. YAML is simply not suited to a job where ifs, cases, and loops are needed. I am well aware that Ansible has workarounds and it could be done, but it wasn’t my choice.&lt;/p&gt; </description> 
	<pubDate>Sat, 21 Feb 2026 00:44:05 +0000</pubDate>

</item> 
<item>
	<title>Bits from Debian: Proxmox Platinum Sponsor of DebConf26</title>
	<guid>tag:bits.debian.org,2026-02-20:/2026/02/proxmox-platinum-debconf26.html</guid>
	<link>https://bits.debian.org/2026/02/proxmox-platinum-debconf26.html</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/dwn.png&quot; width=&quot;77&quot; height=&quot;85&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;&lt;a href=&quot;https://www.proxmox.com/&quot;&gt;&lt;img alt=&quot;proxmox-logo&quot; src=&quot;https://bits.debian.org/images/proxmox.png&quot; /&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;We are pleased to announce that &lt;strong&gt;&lt;a href=&quot;https://www.proxmox.com&quot;&gt;Proxmox&lt;/a&gt;&lt;/strong&gt; has
committed to sponsor &lt;a href=&quot;https://debconf26.debconf.org/&quot;&gt;DebConf26&lt;/a&gt; as a
&lt;strong&gt;Platinum Sponsor&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;Proxmox develops powerful, yet easy-to-use open-source server solutions. The
comprehensive open-source ecosystem is designed to manage diverse IT landscapes,
from single servers to large-scale distributed data centers. Our unified
platform integrates server virtualization, easy backup, and rock-solid email
security, ensuring seamless interoperability across the entire portfolio. With
the Proxmox Datacenter Manager, the ecosystem also offers a &quot;single pane of
glass&quot; for centralized management across different locations. &lt;/p&gt;
&lt;p&gt;Since 2005, all Proxmox solutions have been built on the rock-solid Debian
platform. We are proud to return to DebConf26 as a sponsor because the Debian
community provides the foundation that makes our work possible. We believe in
keeping IT simple, open, and under your control.&lt;/p&gt;
&lt;p&gt;Thank you very much, Proxmox, for your support of DebConf26!&lt;/p&gt;
&lt;h2&gt;Become a sponsor too!&lt;/h2&gt;
&lt;p&gt;&lt;a href=&quot;https://debconf26.debconf.org/&quot;&gt;DebConf26&lt;/a&gt; will take place &lt;strong&gt;from 20th to
25th July 2026 in Santa Fe, Argentina,&lt;/strong&gt; and will be preceded by DebCamp, from 13th
to 19th July 2026.&lt;/p&gt;
&lt;p&gt;DebConf26 is accepting sponsors! Interested companies and organizations may
contact the DebConf team through
&lt;a href=&quot;mailto:sponsors@debconf.org&quot;&gt;sponsors@debconf.org&lt;/a&gt;, and visit the DebConf26
website at
&lt;a href=&quot;https://debconf26.debconf.org/sponsors/become-a-sponsor/&quot;&gt;https://debconf26.debconf.org/sponsors/become-a-sponsor/&lt;/a&gt;.&lt;/p&gt; </description> 
	<pubDate>Fri, 20 Feb 2026 17:26:00 +0000</pubDate>

</item> 
<item>
	<title>Reproducible Builds (diffoscope): diffoscope 313 released</title>
	<guid>https://diffoscope.org/news/diffoscope-313-released/</guid>
	<link>https://diffoscope.org/news/diffoscope-313-released/</link>
     <description>  &lt;p&gt;The diffoscope maintainers are pleased to announce the release of diffoscope
version &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;313&lt;/code&gt;. This version includes the following changes:&lt;/p&gt;

&lt;div class=&quot;language-plaintext highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;[ Chris Lamb ]
* Don&#39;t fail the entire pipeline if deploying to PyPI automatically fails.

[ Vagrant Cascadian ]
* Update external tool reference for 7z on guix.
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;You can find out more by &lt;a href=&quot;https://diffoscope.org&quot;&gt;visiting the project homepage&lt;/a&gt;.&lt;/p&gt; </description> 
	<pubDate>Fri, 20 Feb 2026 00:00:00 +0000</pubDate>

</item> 
<item>
	<title>Peter Pentchev: Ringlet release: fnmatch-regex 0.3.0</title>
	<guid>https://extelligence.ringlet.net/roam/2026/02/19/fnmatch-regex-0.3.0/</guid>
	<link>https://extelligence.ringlet.net/roam/2026/02/19/fnmatch-regex-0.3.0/</link>
     <description>  &lt;p&gt;Version 0.3.0 of the &lt;a href=&quot;https://crates.io/crate/fnmatch-regex&quot; title=&quot;The crates.io fnmatch-regex page&quot;&gt;fnmatch-regex&lt;/a&gt; Rust crate is now available. The major new addition is the &lt;a href=&quot;https://docs.rs/fnmatch-regex/0.3.0/fnmatch_regex/glob/fn.glob_to_regex_pattern.html&quot; title=&quot;The glob_to_regex_pattern function&quot;&gt;glob_to_regex_pattern&lt;/a&gt; function that only converts the glob pattern to a regular expression one without building a regular expression matcher. Two new features - &lt;code&gt;regex&lt;/code&gt; and &lt;code&gt;std&lt;/code&gt; - are also added, both enabled by default.&lt;/p&gt; &lt;p&gt;For more information, see the changelog at &lt;a href=&quot;https://devel.ringlet.net/textproc/fnmatch-regex/&quot; title=&quot;The Ringlet fnmatch-regex homepage&quot;&gt;the homepage&lt;/a&gt;.&lt;/p&gt; </description> 
	<pubDate>Thu, 19 Feb 2026 11:18:00 +0000</pubDate>

</item> 
<item>
	<title>Clint Adams: Holger says</title>
	<guid>https://xana.scru.org/posts/quanks/sqnetworkkeyserversearch.html</guid>
	<link>https://xana.scru.org/posts/quanks/sqnetworkkeyserversearch.html</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/clint.png&quot; width=&quot;80&quot; height=&quot;88&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;div class=&quot;inlinecontent&quot;&gt;
&lt;p&gt;&lt;code&gt;sq network keyserver search $id ; sq cert export --cert=$id &amp;gt; $id.asc&lt;/code&gt;&lt;/p&gt;
&lt;/div&gt;

&lt;div class=&quot;info&quot;&gt;
    Posted on 2026-02-18
    
&lt;/div&gt;
&lt;div class=&quot;info&quot;&gt;
    
    Tags: &lt;a href=&quot;https://xana.scru.org/tags/quanks.html&quot; rel=&quot;tag&quot; title=&quot;All pages tagged &#39;quanks&#39;.&quot;&gt;quanks&lt;/a&gt;
    
&lt;/div&gt; </description> 
	<pubDate>Wed, 18 Feb 2026 22:59:00 +0000</pubDate>

</item> 
<item>
	<title>Antoine Beaupré: net-tools to iproute cheat sheet</title>
	<guid>https://anarc.at/blog/2026-02-18-iproute2/</guid>
	<link>https://anarc.at/blog/2026-02-18-iproute2/</link>
     <description>  &lt;p&gt;This is also known as: &quot;&lt;code&gt;ifconfig&lt;/code&gt; is not installed by default
anymore, how do I do this only with the &lt;code&gt;ip&lt;/code&gt; command?&quot;&lt;/p&gt;

&lt;p&gt;I have been slowly training my brain to use the new commands, but I
sometimes forget some. So, here&#39;s a couple of equivalences from the old
&lt;code&gt;net-tools&lt;/code&gt; package to the new &lt;code&gt;iproute2&lt;/code&gt;, about 10 years late:&lt;/p&gt;

&lt;table class=&quot;table&quot;&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt; &lt;code&gt;net-tools&lt;/code&gt;                 &lt;/th&gt;
&lt;th&gt; &lt;code&gt;iproute2&lt;/code&gt;                                   &lt;/th&gt;
&lt;th&gt; shorter form                 &lt;/th&gt;
&lt;th&gt; what it does                            &lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt; &lt;code&gt;arp -an&lt;/code&gt;                   &lt;/td&gt;
&lt;td&gt; &lt;code&gt;ip neighbor&lt;/code&gt;                                &lt;/td&gt;
&lt;td&gt; &lt;code&gt;ip n&lt;/code&gt;                       &lt;/td&gt;
&lt;td&gt;                                         &lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; &lt;code&gt;ifconfig&lt;/code&gt;                  &lt;/td&gt;
&lt;td&gt; &lt;code&gt;ip address&lt;/code&gt;                                 &lt;/td&gt;
&lt;td&gt; &lt;code&gt;ip a&lt;/code&gt;                       &lt;/td&gt;
&lt;td&gt; show current IP address                 &lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; &lt;code&gt;ifconfig&lt;/code&gt;                  &lt;/td&gt;
&lt;td&gt; &lt;code&gt;ip link&lt;/code&gt;                                    &lt;/td&gt;
&lt;td&gt; &lt;code&gt;ip l&lt;/code&gt;                       &lt;/td&gt;
&lt;td&gt; show link stats (up/down/packet counts) &lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; &lt;code&gt;route&lt;/code&gt;                     &lt;/td&gt;
&lt;td&gt; &lt;code&gt;ip route&lt;/code&gt;                                   &lt;/td&gt;
&lt;td&gt; &lt;code&gt;ip r&lt;/code&gt;                       &lt;/td&gt;
&lt;td&gt; show or modify the routing table        &lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; &lt;code&gt;route add default GATEWAY&lt;/code&gt; &lt;/td&gt;
&lt;td&gt; &lt;code&gt;ip route add default via GATEWAY&lt;/code&gt;           &lt;/td&gt;
&lt;td&gt; &lt;code&gt;ip r a default via GATEWAY&lt;/code&gt; &lt;/td&gt;
&lt;td&gt; add default route to &lt;code&gt;GATEWAY&lt;/code&gt;          &lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; &lt;code&gt;route del ROUTE&lt;/code&gt;           &lt;/td&gt;
&lt;td&gt; &lt;code&gt;ip route del ROUTE&lt;/code&gt;                         &lt;/td&gt;
&lt;td&gt; &lt;code&gt;ip r d ROUTE&lt;/code&gt;               &lt;/td&gt;
&lt;td&gt; remove &lt;code&gt;ROUTE&lt;/code&gt; (e.g. &lt;code&gt;default&lt;/code&gt;)         &lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; &lt;code&gt;netstat -anpe&lt;/code&gt;             &lt;/td&gt;
&lt;td&gt; &lt;code&gt;ss --all --numeric  --processes --extended&lt;/code&gt; &lt;/td&gt;
&lt;td&gt; &lt;code&gt;ss -anpe&lt;/code&gt;                   &lt;/td&gt;
&lt;td&gt; list listening processes, less pretty   &lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;


&lt;p&gt;Note that I wrote a &lt;a href=&quot;https://anarc.at/blog/2023-03-10-listening-processes/&quot;&gt;whole
article&lt;/a&gt; about the latter.&lt;/p&gt;

&lt;h1 id=&quot;another-trick&quot;&gt;Another trick&lt;/h1&gt;

&lt;p&gt;Also note that I often alias &lt;code&gt;ip&lt;/code&gt; to &lt;code&gt;ip -br -c&lt;/code&gt; as it provides a
much prettier output.&lt;/p&gt;

&lt;p&gt;Compare, before:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;anarcat@angela:~&amp;gt; ip a
1: lo: &amp;lt;LOOPBACK,UP,LOWER_UP&amp;gt; mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host noprefixroute 
       valid_lft forever preferred_lft forever
2: wlan0: &amp;lt;NO-CARRIER,BROADCAST,MULTICAST,UP&amp;gt; mtu 1500 qdisc noqueue state DOWN group default qlen 1000
    link/ether xx:xx:xx:xx:xx:xx brd ff:ff:ff:ff:ff:ff permaddr xx:xx:xx:xx:xx:xx
    altname wlp166s0
    altname wlx8cf8c57333c7
4: virbr0: &amp;lt;NO-CARRIER,BROADCAST,MULTICAST,UP&amp;gt; mtu 1500 qdisc noqueue state DOWN group default qlen 1000
    link/ether xx:xx:xx:xx:xx:xx brd ff:ff:ff:ff:ff:ff
    inet 192.168.122.1/24 brd 192.168.122.255 scope global virbr0
       valid_lft forever preferred_lft forever
20: eth0: &amp;lt;BROADCAST,MULTICAST,UP,LOWER_UP&amp;gt; mtu 1500 qdisc fq_codel state UP group default qlen 1000
    link/ether xx:xx:xx:xx:xx:xx brd ff:ff:ff:ff:ff:ff
    inet 192.168.0.108/24 brd 192.168.0.255 scope global dynamic noprefixroute eth0
       valid_lft 40699sec preferred_lft 40699sec
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;After:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;anarcat@angela:~&amp;gt; ip -br -c a
lo               UNKNOWN        127.0.0.1/8 ::1/128 
wlan0            DOWN           
virbr0           DOWN           192.168.122.1/24 
eth0             UP             192.168.0.108/24 
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;I don&#39;t even need to redact MAC addresses! It also affects the display
of the other commands, which look similarly neat.&lt;/p&gt;

&lt;p&gt;Also imagine pretty colors above.&lt;/p&gt;

&lt;p&gt;Finally, I don&#39;t have a cheat sheet for &lt;code&gt;iw&lt;/code&gt; vs &lt;code&gt;iwconfig&lt;/code&gt; (from
&lt;code&gt;wireless-tools&lt;/code&gt;) yet. I just use NetworkManager now and rarely have
to mess with wireless interfaces directly.&lt;/p&gt;

&lt;h1 id=&quot;background-and-history&quot;&gt;Background and history&lt;/h1&gt;

&lt;p&gt;For context, there are traditionally two ways of configuring the
network in Linux:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;the old way, with commands like &lt;code&gt;ifconfig&lt;/code&gt;, &lt;code&gt;arp&lt;/code&gt;, &lt;code&gt;route&lt;/code&gt; and
&lt;code&gt;netstat&lt;/code&gt;, those are part of the &lt;a href=&quot;https://sourceforge.net/projects/net-tools/&quot;&gt;net-tools&lt;/a&gt; package&lt;/li&gt;
&lt;li&gt;the new way, mostly (but not entirely!) wrapped in a single &lt;code&gt;ip&lt;/code&gt;
command, that is the &lt;a href=&quot;https://wiki.linuxfoundation.org/networking/iproute2&quot;&gt;iproute2&lt;/a&gt; package&lt;/li&gt;
&lt;/ul&gt;
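&lt;p&gt;To make the difference concrete, here is the same task done both ways. This is an illustrative sketch: the interface name &lt;code&gt;eth0&lt;/code&gt; and the address are made up, and both variants require root:&lt;/p&gt;

```shell
# old way (net-tools): assign an address and bring the interface up
ifconfig eth0 192.168.1.10 netmask 255.255.255.0 up

# new way (iproute2): the same, in two explicit steps
ip addr add 192.168.1.10/24 dev eth0
ip link set eth0 up
```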


&lt;p&gt;It seems like the latter was made &quot;important&quot; in Debian &lt;a href=&quot;https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=487533&quot;&gt;in 2008&lt;/a&gt;,
which means every release since Debian 5 &quot;lenny&quot; &lt;img alt=&quot;(!)&quot; src=&quot;https://anarc.at/smileys/idea.png&quot; /&gt; has featured the
&lt;code&gt;ip&lt;/code&gt; command.&lt;/p&gt;

&lt;p&gt;The former &lt;code&gt;net-tools&lt;/code&gt; package was &lt;a href=&quot;https://lists.debian.org/debian-devel/2016/12/msg00775.html&quot;&gt;demoted in December 2016&lt;/a&gt; which
means every release since Debian 9 &quot;stretch&quot; ships &lt;em&gt;without&lt;/em&gt; an
&lt;code&gt;ifconfig&lt;/code&gt; command unless explicitly requested. Note that this was
mentioned &lt;a href=&quot;https://www.debian.org/releases/stretch/amd64/release-notes&quot;&gt;in the release notes&lt;/a&gt; in a similar (but, IMHO, less
useful) table.&lt;/p&gt;

&lt;p&gt;(Technically, the &lt;code&gt;net-tools&lt;/code&gt; Debian package source still indicates it
is &lt;code&gt;Priority: important&lt;/code&gt; but that&#39;s &lt;a href=&quot;https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1128342&quot;&gt;a bug I have just filed&lt;/a&gt;.)&lt;/p&gt;

&lt;p&gt;Finally, and perhaps more importantly, the name &lt;code&gt;iproute&lt;/code&gt; is hilarious
if you are a bilingual French speaker: it can be read as &quot;I proute&quot;,
which can be interpreted as &quot;I fart&quot;, since &quot;prout!&quot; is the sound a fart
makes. The fact that it&#39;s called &lt;code&gt;iproute2&lt;/code&gt; only makes it
more hilarious.&lt;/p&gt; </description>
	<pubDate>Wed, 18 Feb 2026 16:30:46 +0000</pubDate>

</item> 
<item>
	<title>Thomas Lange: 42.000 FAI.me jobs created</title>
	<guid>http://blog.fai-project.org/posts/42000/</guid>
	<link>http://blog.fai-project.org/posts/42000/</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/mrfai.png&quot; width=&quot;76&quot; height=&quot;100&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;The &lt;a href=&quot;https://fai-project.org/FAIme&quot;&gt;FAI.me service&lt;/a&gt; has reached another milestone:&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;42.000th&lt;/strong&gt; job has been submitted via the web interface
since this service began in 2017.&lt;/p&gt;

&lt;p&gt;The idea was to give end users a simple web interface for creating
the configuration for a fully automatic installation, asking only
minimal questions and requiring no knowledge of the configuration file syntax.
Thanks a lot for using this service and for all your feedback.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The next job can be yours!&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;P.S.: I would like to get more feedback on the FAI.me service. What do you
like most? What&#39;s missing? Do you have a success story about how you use the
customized ISO for your deployment? Please fill out the
&lt;a href=&quot;https://fai-project.org/questionnaire&quot;&gt;FAI questionnaire&lt;/a&gt;
or send feedback via email to fai.me@fai-project.org.&lt;/p&gt;

&lt;h3&gt;About &lt;a href=&quot;https://fai-project.org/FAIme&quot;&gt;FAI.me&lt;/a&gt;&lt;/h3&gt;

&lt;p&gt;FAI.me is a service for building your own customized images via a web
interface. You can create an installation ISO, a live ISO, or a cloud
image.
For Debian, multiple release versions can be chosen, as well as
installations of Ubuntu Server, Ubuntu Desktop, or Linux Mint.&lt;/p&gt;

&lt;p&gt;Multiple options are available, such as selecting a desktop
environment, the language and keyboard layout, and adding a user with a
password.
Optional settings include adding your own package list,
choosing a backports kernel, adding a postinst script,
adding an SSH public key, choosing a partition
layout, and more.&lt;/p&gt; </description>
	<pubDate>Wed, 18 Feb 2026 14:55:11 +0000</pubDate>

</item> 
<item>
	<title>Russell Coker: Links February 2026</title>
	<guid>https://etbe.coker.com.au/?p=5960</guid>
	<link>https://etbe.coker.com.au/2026/02/17/links-february-2026/</link>
     <description>  &lt;p&gt;&lt;a href=&quot;http://www.antipope.org/charlie/blog-static/2025/12/barnums-law-of-ceos.html&quot;&gt;Charles Stross has a good theory of why “AI” is being pushed on corporations, really we need to just replace CEOs with LLMs [1]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://futurism.com/openai-investor-chatgpt-mental-health&quot;&gt;This disturbing and amusing article describes how an OpenAI investor appears to be having psychological problems related to SCP-based text generated by ChatGPT [2]&lt;/a&gt;. Definitely going to be a recursive problem as people who believe in it invest in it.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://blog.vaxry.net/articles/2025-dbusSucks&quot;&gt;An interesting analysis of dbus and a design for a more secure replacement [3]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=1fZTOjd_bOQ&quot;&gt;Scott Jenson gave an insightful lecture for Canonical about future potential developments in the desktop UX [4]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://ploum.net/2026-01-05-unteaching_github.html&quot;&gt;Ploum wrote an insightful article about the problems caused by the GitHub monopoly [5]&lt;/a&gt;. Radicle sounds interesting.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://tonsky.me/blog/tahoe-icons/&quot;&gt;Niki Tonsky wrote an interesting article about the UI problems in Tahoe (the latest macOS release) caused by trying to make an icon for everything [6]&lt;/a&gt;. They have a really good writing style, and the article is well researched.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://lwn.net/Articles/1042938/&quot;&gt;Fil-C is an interesting project to compile C/C++ programs in a memory safe way, some of which can be considered a software equivalent of CHERI [7]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://krebsonsecurity.com/2025/12/dismantling-defenses-trump-2-0-cyber-year-in-review/&quot;&gt;Brian Krebs wrote a long list of the ways that Trump has enabled corruption and a variety of other crimes including child sex abuse in the last year [8]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=H5QQ0ECfwyE&quot;&gt;This video about designing a C64 laptop is a masterclass in computer design [9]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.salon.com/2017/10/22/the-twitter-thought-experiment-that-exposes-pro-life-hypocrisy/&quot;&gt;Salon has an interesting article about the abortion thought experiment that conservatives can’t handle [10]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://blog.rongarret.info/2017/10/the-utter-absurdity-of-pro-life-position.html&quot;&gt;Ron Garrett wrote an insightful blog post about abortion [11]&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.schneier.com/blog/archives/2026/01/could-chatgpt-convince-you-to-buy-something.html&quot;&gt;Bruce Schneier and Nathan E. Sanders wrote an insightful article about the potential of LLM systems for advertising and enshittification [12]&lt;/a&gt;. We need serious legislation about this ASAP!&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;[1]&lt;a href=&quot;http://www.antipope.org/charlie/blog-static/2025/12/barnums-law-of-ceos.html&quot;&gt; https://tinyurl.com/27q8xtuv&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[2]&lt;a href=&quot;https://futurism.com/openai-investor-chatgpt-mental-health&quot;&gt; https://futurism.com/openai-investor-chatgpt-mental-health&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[3]&lt;a href=&quot;https://blog.vaxry.net/articles/2025-dbusSucks&quot;&gt; https://blog.vaxry.net/articles/2025-dbusSucks&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[4]&lt;a href=&quot;https://www.youtube.com/watch?v=1fZTOjd_bOQ&quot;&gt; https://www.youtube.com/watch?v=1fZTOjd_bOQ&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[5]&lt;a href=&quot;https://ploum.net/2026-01-05-unteaching_github.html&quot;&gt; https://ploum.net/2026-01-05-unteaching_github.html&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[6]&lt;a href=&quot;https://tonsky.me/blog/tahoe-icons/&quot;&gt; https://tonsky.me/blog/tahoe-icons/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[7]&lt;a href=&quot;https://lwn.net/Articles/1042938/&quot;&gt; https://lwn.net/Articles/1042938/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[8]&lt;a href=&quot;https://krebsonsecurity.com/2025/12/dismantling-defenses-trump-2-0-cyber-year-in-review/&quot;&gt; https://tinyurl.com/2b4sh2s9&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[9]&lt;a href=&quot;https://www.youtube.com/watch?v=H5QQ0ECfwyE&quot;&gt; https://www.youtube.com/watch?v=H5QQ0ECfwyE&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[10]&lt;a href=&quot;https://www.salon.com/2017/10/22/the-twitter-thought-experiment-that-exposes-pro-life-hypocrisy/&quot;&gt; https://tinyurl.com/2d9l8wqm&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[11]&lt;a href=&quot;https://blog.rongarret.info/2017/10/the-utter-absurdity-of-pro-life-position.html&quot;&gt; https://tinyurl.com/2yp94bpo&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;[12]&lt;a href=&quot;https://www.schneier.com/blog/archives/2026/01/could-chatgpt-convince-you-to-buy-something.html&quot;&gt; https://tinyurl.com/29o67syo&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;div class=&quot;yarpp yarpp-related yarpp-related-rss yarpp-template-list&quot;&gt;

&lt;p&gt;Related posts:&lt;/p&gt;&lt;ol&gt;
&lt;li&gt;&lt;a href=&quot;https://etbe.coker.com.au/2024/02/29/links-february-2024/&quot; rel=&quot;bookmark&quot; title=&quot;Links February 2024&quot;&gt;Links February 2024&lt;/a&gt; &lt;small&gt;In 2018 Charles Stross wrote an insightful blog post Dude...&lt;/small&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://etbe.coker.com.au/2025/02/21/links-february-2025/&quot; rel=&quot;bookmark&quot; title=&quot;Links February 2025&quot;&gt;Links February 2025&lt;/a&gt; &lt;small&gt;Oliver Lindburg wrote an interesting article about Designing for Crisis...&lt;/small&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://etbe.coker.com.au/2021/02/28/links-february-2021/&quot; rel=&quot;bookmark&quot; title=&quot;Links February 2021&quot;&gt;Links February 2021&lt;/a&gt; &lt;small&gt;Elastic Search gets a new license to deal with AWS...&lt;/small&gt;&lt;/li&gt;
&lt;/ol&gt;
&lt;/div&gt; </description> 
	<pubDate>Tue, 17 Feb 2026 08:09:17 +0000</pubDate>

</item> 
<item>
	<title>Freexian Collaborators: Monthly report about Debian Long Term Support, January 2026 (by Santiago Ruano Rincón)</title>
	<guid>https://www.freexian.com/blog/debian-lts-report-2026-01/</guid>
	<link>https://www.freexian.com/blog/debian-lts-report-2026-01/</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/freexian.png&quot; width=&quot;215&quot; height=&quot;101&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;img src=&quot;https://www.freexian.com/images/debian-lts-logo.png&quot; style=&quot;float: right;&quot; /&gt;
&lt;p&gt;The Debian LTS Team, funded by &lt;a href=&quot;https://www.freexian.com/lts/debian/&quot;&gt;Freexian’s Debian LTS offering&lt;/a&gt;,
is pleased to report its activities for January.&lt;/p&gt;
&lt;h3 id=&quot;activity-summary&quot;&gt;Activity summary&lt;/h3&gt;
&lt;p&gt;During the month of January, 20 contributors were
paid to work on &lt;a href=&quot;https://wiki.debian.org/LTS&quot;&gt;Debian LTS&lt;/a&gt; (links to individual
contributor reports are located below).&lt;/p&gt;
&lt;p&gt;The team released &lt;a href=&quot;https://lists.debian.org/debian-lts-announce/2026/01/threads.html&quot;&gt;33 DLAs&lt;/a&gt;
fixing 216 CVEs.&lt;/p&gt;
&lt;p&gt;The team continued preparing security updates at its usual rhythm. Beyond the
updates targeting Debian 11 (“bullseye”), the current release under LTS,
the team also proposed updates for more recent releases (&lt;a href=&quot;https://www.debian.org/releases/bookworm/&quot;&gt;Debian 12 (“bookworm”)&lt;/a&gt;
and &lt;a href=&quot;https://www.debian.org/releases/trixie/&quot;&gt;Debian 13 (“trixie”)&lt;/a&gt;) as well as &lt;a href=&quot;https://www.debian.org/releases/sid/&quot;&gt;Debian unstable&lt;/a&gt;. Several notable
security updates are highlighted below.&lt;/p&gt;
&lt;p&gt;Notable security updates:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;python3.9, prepared by Andrej Shadura
(&lt;a href=&quot;https://security-tracker.debian.org/tracker/DLA-4455-1&quot;&gt;DLA-4455-1&lt;/a&gt;),
fixing multiple vulnerabilities in the Python interpreter.&lt;/li&gt;
&lt;li&gt;php, prepared by Guilhem Moulin
(&lt;a href=&quot;https://security-tracker.debian.org/tracker/DLA-4447-1&quot;&gt;DLA-4447-1&lt;/a&gt;),
fixing two vulnerabilities that could lead to request forgery or denial of
service.&lt;/li&gt;
&lt;li&gt;apache2, prepared by Bastien Roucariès
(&lt;a href=&quot;https://security-tracker.debian.org/tracker/DLA-4452-1&quot;&gt;DLA-4452-1&lt;/a&gt;), fixing
four CVEs.&lt;/li&gt;
&lt;li&gt;linux-6.1, prepared by Ben Hutchings
(&lt;a href=&quot;https://security-tracker.debian.org/tracker/DLA-4436-1&quot;&gt;DLA-4436-1&lt;/a&gt;), as a
regular update of the linux 6.1 backport to Debian 11.&lt;/li&gt;
&lt;li&gt;python-django, prepared by Chris Lamb
(&lt;a href=&quot;https://security-tracker.debian.org/tracker/DLA-4458-1&quot;&gt;DLA-4458-1&lt;/a&gt;),
resolving multiple vulnerabilities.&lt;/li&gt;
&lt;li&gt;firefox-esr prepared by Emilio Pozuelo Monfort
(&lt;a href=&quot;https://security-tracker.debian.org/tracker/DLA-4439-1&quot;&gt;DLA-4439-1&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;gnupg2, prepared by Roberto Sánchez
(&lt;a href=&quot;https://security-tracker.debian.org/tracker/DLA-4437-1&quot;&gt;DLA-4437-1&lt;/a&gt;),
fixing multiple issues, including
&lt;a href=&quot;https://security-tracker.debian.org/tracker/CVE-2025-68973&quot;&gt;CVE-2025-68973&lt;/a&gt;
that could potentially be exploited to execute arbitrary code.&lt;/li&gt;
&lt;li&gt;apache-log4j2, prepared by Markus Koschany
(&lt;a href=&quot;https://security-tracker.debian.org/tracker/DLA-4444-1&quot;&gt;DLA-4444-1&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;ceph, prepared by Utkarsh Gupta
(&lt;a href=&quot;https://security-tracker.debian.org/tracker/DLA-4460-1&quot;&gt;DLA-4460-1&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;inetutils, prepared by Andreas Henriksson
(&lt;a href=&quot;https://security-tracker.debian.org/tracker/DLA-4453-1&quot;&gt;DLA-4453-1&lt;/a&gt;),
fixing an authentication bypass in telnetd.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Moreover, Sylvain Beucler studied the security support status of p7zip, a fork
of 7zip that has become unmaintained upstream. To avoid leaving users with
an unsupported package, Sylvain has investigated a path forward
in collaboration with the security team and the 7zip maintainer, looking to
replace p7zip with 7zip. Note, however, that the 7zip developers don’t
disclose which patches fix CVEs, making it difficult
to backport individual patches to fix vulnerabilities in released Debian versions.&lt;/p&gt;
&lt;p&gt;Contributions from outside the LTS Team:&lt;/p&gt;
&lt;p&gt;Thunderbird, prepared by maintainer Christoph Goehre. The DLA
(&lt;a href=&quot;https://security-tracker.debian.org/tracker/DLA-4442-1&quot;&gt;DLA-4442-1&lt;/a&gt;) was
published by Emilio.&lt;/p&gt;
&lt;p&gt;The LTS Team has also contributed with updates to the latest Debian releases:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Bastien uploaded gpsd to
&lt;a href=&quot;https://tracker.debian.org/news/1708618/accepted-gpsd-3275-01-source-into-unstable/&quot;&gt;unstable&lt;/a&gt;,
and proposed updates for trixie &lt;a href=&quot;https://bugs.debian.org/1126121&quot;&gt;#1126121&lt;/a&gt;
and bookworm &lt;a href=&quot;https://bugs.debian.org/1126168&quot;&gt;#1126168&lt;/a&gt; to fix two CVEs.&lt;/li&gt;
&lt;li&gt;Bastien also prepared the imagemagick updates for trixie and bookworm,
released as
&lt;a href=&quot;https://security-tracker.debian.org/tracker/DSA-6111-1&quot;&gt;DSA-6111-1&lt;/a&gt;, along
with the bullseye update
&lt;a href=&quot;https://security-tracker.debian.org/tracker/DLA-4448-1&quot;&gt;DLA-4448-1&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Chris proposed a trixie point update for python-django
(&lt;a href=&quot;https://bugs.debian.org/1126461&quot;&gt;#1126461&lt;/a&gt;), and the work for bookworm was
completed in February (&lt;a href=&quot;https://bugs.debian.org/1079454&quot;&gt;#1079454&lt;/a&gt;). The
longstanding bookworm update required tracking down a regression in the
django-storages package.&lt;/li&gt;
&lt;li&gt;Markus prepared tomcat10 updates for trixie and bookworm
(&lt;a href=&quot;https://security-tracker.debian.org/tracker/DSA-6120-1&quot;&gt;DSA-6120-1&lt;/a&gt;), and
tomcat11 for trixie
(&lt;a href=&quot;https://security-tracker.debian.org/tracker/DSA-6121-1&quot;&gt;DSA-6121-1&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Thorsten Alteholz prepared bookworm point updates for zvbi
(&lt;a href=&quot;https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1126167&quot;&gt;#1126167&lt;/a&gt;) to
fix five CVEs; taglib
(&lt;a href=&quot;https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1126273&quot;&gt;#1126273&lt;/a&gt;) to fix
one CVE; and libuev
(&lt;a href=&quot;https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1126370&quot;&gt;#1126370&lt;/a&gt;) to fix
one CVE.&lt;/li&gt;
&lt;li&gt;Utkarsh prepared an unstable update of
&lt;a href=&quot;https://tracker.debian.org/news/1712164/accepted-node-lodash-41721dfsgcs83119820210220-10-source-into-unstable/&quot;&gt;node-lodash&lt;/a&gt;
to fix one CVE.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Other than the work related to updates, Sylvain made several improvements to
the documentation and tooling used by the team.&lt;/p&gt;
&lt;h3 id=&quot;individual-debian-lts-contributor-reports&quot;&gt;Individual Debian LTS contributor reports&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://people.debian.org/~abhijith/reports/LTS_ELTS-January-2026.txt&quot;&gt;Abhijith PA&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://lists.debian.org/debian-lts/2026/01/msg00039.html&quot;&gt;Andreas Henriksson&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://lists.debian.org/msgid-search/be496500-b710-457f-abb7-f4f1800c2295@app.fastmail.com&quot;&gt;Andrej Shadura&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://lists.debian.org/debian-lts/2026/02/msg00000.html&quot;&gt;Bastien Roucariès&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.decadent.org.uk/ben/blog/2026/02/04/foss-activity-in-january-2026.html&quot;&gt;Ben Hutchings&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://lists.debian.org/debian-lts/2026/02/msg00009.html&quot;&gt;Carlos Henrique Lima Melara&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://chris-lamb.co.uk/posts/free-software-activities-in-january-2026&quot;&gt;Chris Lamb&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://lists.debian.org/msgid-search/58009db585a1ac053be172759a8de669e3aa4e1c.camel@debian.org&quot;&gt;Daniel Leidert&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://people.debian.org/~pochu/lts/reports/2026-01.txt&quot;&gt;Emilio Pozuelo Monfort&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://lists.debian.org/msgid-search/?m=ypuFIxTelo32Y6%2B4@debian.org&quot;&gt;Guilhem Moulin&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://lists.debian.org/msgid-search/aYBdUZR43qSh4GDL@vis&quot;&gt;Jochen Sprickerhof&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://lists.debian.org/debian-lts/2026/02/msg00017.html&quot;&gt;Lee Garrett&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://dl.gambaru.de/blog/202601_LTS_ELTS_report.txt&quot;&gt;Markus Koschany&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://lists.debian.org/msgid-search/fe825aed-7cc9-4024-ac57-8b47e880752d@debian.org&quot;&gt;Paride Legovini&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://people.debian.org/~roberto/lts_elts_reports/2026-01.txt&quot;&gt;Roberto C. Sánchez&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://people.debian.org/~santiago/lts-elts-reports/report-2026-01.txt&quot;&gt;Santiago Ruano Rincón&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://lists.debian.org/debian-lts/2026/02/msg00010.html&quot;&gt;Sylvain Beucler&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://blog.alteholz.eu/2026/02/2779/&quot;&gt;Thorsten Alteholz&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://lists.debian.org/debian-lts/2026/02/msg00013.html&quot;&gt;Tobias Frost&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://utkarsh2102.org/posts/foss-in-jan-26/&quot;&gt;Utkarsh Gupta&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id=&quot;thanks-to-our-sponsors&quot;&gt;Thanks to our sponsors&lt;/h3&gt;
&lt;p&gt;Sponsors that joined recently are in bold.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Platinum sponsors:
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://www.global.toshiba/ww/top.html&quot;&gt;Toshiba Corporation&lt;/a&gt; (for 124 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://cip-project.org&quot;&gt;Civil Infrastructure Platform (CIP)&lt;/a&gt; (for 92 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://vyos.io&quot;&gt;VyOS Inc&lt;/a&gt; (for 56 months)&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Gold sponsors:
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://www.roche.com/about/business/diagnostics.htm&quot;&gt;F. Hoffmann-La Roche AG&lt;/a&gt; (for 134 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.conet.de/&quot;&gt;CONET Deutschland GmbH&lt;/a&gt; (for 118 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.plathome.com&quot;&gt;Plat’Home&lt;/a&gt; (for 117 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;http://www.ox.ac.uk&quot;&gt;University of Oxford&lt;/a&gt; (for 74 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.edf.fr&quot;&gt;EDF SA&lt;/a&gt; (for 46 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.dataport.de&quot;&gt;Dataport AöR&lt;/a&gt; (for 21 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://home.cern/&quot;&gt;CERN&lt;/a&gt; (for 19 months)&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Silver sponsors:
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://domainnameshop.com/&quot;&gt;Domeneshop AS&lt;/a&gt; (for 139 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://metropole.nantes.fr/&quot;&gt;Nantes Métropole&lt;/a&gt; (for 133 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.akamai.com/&quot;&gt;Akamai - Linode&lt;/a&gt; (for 129 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;http://www.univention.de&quot;&gt;Univention GmbH&lt;/a&gt; (for 125 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;http://portail.univ-st-etienne.fr/&quot;&gt;Université Jean Monnet de St Etienne&lt;/a&gt; (for 125 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://ribboncommunications.com/&quot;&gt;Ribbon Communications, Inc.&lt;/a&gt; (for 119 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.exonet.nl&quot;&gt;Exonet B.V.&lt;/a&gt; (for 109 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.lrz.de&quot;&gt;Leibniz Rechenzentrum&lt;/a&gt; (for 103 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.diplomatie.gouv.fr&quot;&gt;Ministère de l’Europe et des Affaires Étrangères&lt;/a&gt; (for 87 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://dinahosting.com&quot;&gt;Dinahosting SL&lt;/a&gt; (for 74 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://upsun.com&quot;&gt;Upsun Formerly Platform.sh&lt;/a&gt; (for 68 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://deveryware.com&quot;&gt;Deveryware&lt;/a&gt; (for 62 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.moxa.com&quot;&gt;Moxa Inc.&lt;/a&gt; (for 62 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://sipgate.de&quot;&gt;sipgate GmbH&lt;/a&gt; (for 60 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://ovhcloud.com&quot;&gt;OVH US LLC&lt;/a&gt; (for 58 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.tilburguniversity.edu/&quot;&gt;Tilburg University&lt;/a&gt; (for 58 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.gsi.de&quot;&gt;GSI Helmholtzzentrum für Schwerionenforschung GmbH&lt;/a&gt; (for 49 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.cesky-hosting.cz/&quot;&gt;THINline s.r.o.&lt;/a&gt; (for 22 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.cph.dk&quot;&gt;Copenhagen Airports A/S&lt;/a&gt; (for 16 months)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href=&quot;https://www.isere.fr&quot;&gt;Conseil Départemental de l’Isère&lt;/a&gt;&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Bronze sponsors:
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;http://www.seznam.cz&quot;&gt;Seznam.cz, a.s.&lt;/a&gt; (for 140 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;http://www.evolix.fr&quot;&gt;Evolix&lt;/a&gt; (for 139 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;http://linuxhotel.de&quot;&gt;Linuxhotel GmbH&lt;/a&gt; (for 137 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;http://intevation.de&quot;&gt;Intevation GmbH&lt;/a&gt; (for 136 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://waays.fr&quot;&gt;Daevel SARL&lt;/a&gt; (for 135 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;http://www.megaspace.de&quot;&gt;Megaspace Internet Services GmbH&lt;/a&gt; (for 134 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;http://www.greenbone.net&quot;&gt;Greenbone AG&lt;/a&gt; (for 133 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;http://numlog.fr&quot;&gt;NUMLOG&lt;/a&gt; (for 133 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;http://www.wingo.ch/&quot;&gt;WinGo AG&lt;/a&gt; (for 132 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.entrouvert.com/&quot;&gt;Entr’ouvert&lt;/a&gt; (for 124 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://adfinis.com&quot;&gt;Adfinis AG&lt;/a&gt; (for 121 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;http://www.legi.grenoble-inp.fr&quot;&gt;Laboratoire LEGI - UMR 5519 / CNRS&lt;/a&gt; (for 116 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.tesorion.nl/&quot;&gt;Tesorion&lt;/a&gt; (for 116 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;http://bearstech.com&quot;&gt;Bearstech&lt;/a&gt; (for 107 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;http://lihas.de&quot;&gt;LiHAS&lt;/a&gt; (for 107 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;http://www.catalyst.net.nz&quot;&gt;Catalyst IT Ltd&lt;/a&gt; (for 102 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://demarcq.net&quot;&gt;Demarcq SAS&lt;/a&gt; (for 96 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.univ-grenoble-alpes.fr&quot;&gt;Université Grenoble Alpes&lt;/a&gt; (for 82 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.touchweb.fr&quot;&gt;TouchWeb SAS&lt;/a&gt; (for 74 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.spin-ag.de&quot;&gt;SPiN AG&lt;/a&gt; (for 71 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.corefiling.com&quot;&gt;CoreFiling&lt;/a&gt; (for 67 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.osug.fr/&quot;&gt;Observatoire des Sciences de l’Univers de Grenoble&lt;/a&gt; (for 58 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.werfen.com&quot;&gt;Tem Innovations GmbH&lt;/a&gt; (for 53 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://wordfinder.pro&quot;&gt;WordFinder.pro&lt;/a&gt; (for 53 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.resif.fr&quot;&gt;CNRS DT INSU Résif&lt;/a&gt; (for 51 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.soliton.co.jp&quot;&gt;Soliton Systems K.K.&lt;/a&gt; (for 47 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.alterway.fr&quot;&gt;Alter Way&lt;/a&gt; (for 44 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://math.univ-lyon1.fr&quot;&gt;Institut Camille Jordan&lt;/a&gt; (for 34 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;http://www.sobis.com/&quot;&gt;SOBIS Software GmbH&lt;/a&gt; (for 19 months)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.tuxera.com&quot;&gt;Tuxera Inc.&lt;/a&gt; (for 10 months)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href=&quot;https://opm-op.com&quot;&gt;OPM-OP AS&lt;/a&gt;&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt; </description> 
	<pubDate>Tue, 17 Feb 2026 00:00:00 +0000</pubDate>

</item> 
<item>
	<title>Antoine Beaupré: Keeping track of decisions using the ADR model</title>
	<guid>https://anarc.at/blog/2026-02-12-recording-decisions/</guid>
	<link>https://anarc.at/blog/2026-02-12-recording-decisions/</link>
     <description>  &lt;p&gt;In the Tor Project system administrators&#39; team (colloquially known as
TPA), we&#39;ve recently changed how we make decisions, which means you&#39;ll
get clearer communications from us about upcoming changes or
&lt;em&gt;targeted&lt;/em&gt; questions about a proposal.&lt;/p&gt;

&lt;p&gt;Note that this change only affects the TPA team. At Tor, each team has
its own way of coordinating and making decisions, and so far this
process is only used inside TPA. We encourage other teams inside and
outside Tor to evaluate this process to see if it can improve your
processes and documentation.&lt;/p&gt;

&lt;h1 id=&quot;the-new-process&quot;&gt;The new process&lt;/h1&gt;

&lt;p&gt;We had traditionally been using an &quot;RFC&quot; (&quot;Request For Comments&quot;)
process and have recently switched to an &quot;ADR&quot; (&quot;Architecture Decision
Record&quot;) process.&lt;/p&gt;

&lt;p&gt;The ADR process is, for us, pretty simple. It consists of three
things:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;a simpler template&lt;/li&gt;
&lt;li&gt;a simpler process&lt;/li&gt;
&lt;li&gt;communication guidelines separate from the decision record&lt;/li&gt;
&lt;/ol&gt;


&lt;h2 id=&quot;the-template&quot;&gt;The template&lt;/h2&gt;

&lt;p&gt;As team lead, the first thing I did was to propose a new template (in
&lt;a href=&quot;https://gitlab.torproject.org/tpo/tpa/team/-/wikis/policy/0100-adr-template&quot;&gt;ADR-100&lt;/a&gt;), a variation of the &lt;a href=&quot;https://github.com/joelparkerhenderson/architecture-decision-record/blob/main/locales/en/templates/decision-record-template-by-michael-nygard/index.md&quot;&gt;Nygard template&lt;/a&gt;. The &lt;a href=&quot;https://gitlab.torproject.org/tpo/tpa/team/-/wikis/policy/template&quot;&gt;TPA
variation of the template&lt;/a&gt; is similarly simple, as it has only 5
headings, and is worth quoting in full:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Context&lt;/strong&gt;: What is the issue that we&#39;re seeing that is motivating
this decision or change?&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Decision&lt;/strong&gt;: What is the change that we&#39;re proposing and/or doing?&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Consequences&lt;/strong&gt;: What becomes easier or more difficult to do
because of this change?&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;More Information&lt;/strong&gt; (optional): What else should we know? For
larger projects, consider including a timeline and cost estimate,
along with the impact on affected users (perhaps including existing
Personas). Generally, this includes a short evaluation of
alternatives considered.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Metadata&lt;/strong&gt;: status, decision date, decision makers, consulted,
informed users, and link to a discussion forum&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
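&lt;p&gt;To make the template concrete, here is a short, entirely hypothetical
example of what a filled-in record could look like (the decision, tool
name, and roles are invented for illustration, not taken from an actual
TPA record):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# ADR-XXX: adopt example-tool for log aggregation

## Context

Logs are scattered across hosts and hard to search during incidents.

## Decision

Deploy example-tool on a central VM and ship all syslog there.

## Consequences

Easier incident debugging; one more service to operate and secure.

## Metadata

Status: proposed. Decision date: TBD. Decision maker: team lead.
Consulted: service admins. Informed: all of TPA. Discussion: issue link.
&lt;/code&gt;&lt;/pre&gt;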


&lt;p&gt;The &lt;a href=&quot;https://gitlab.torproject.org/tpo/tpa/wiki-replica/-/blob/d52de1828d3ee406996345704d12663dd30f5513/policy/template.md&quot;&gt;previous RFC template&lt;/a&gt; had &lt;strong&gt;17&lt;/strong&gt; (seventeen!) headings, which
encouraged much longer documents. Now, the decision record will be
easier to read and digest at a glance.&lt;/p&gt;

&lt;p&gt;An immediate effect of this is that I&#39;ve started using GitLab issues
more for comparisons and brainstorming. Instead of dumping all sorts
of details, like pricing or in-depth comparisons of alternatives, into
the document, we record those in the discussion issue, keeping the
document shorter.&lt;/p&gt;

&lt;h2 id=&quot;the-process&quot;&gt;The process&lt;/h2&gt;

&lt;p&gt;The whole process is simple enough that it&#39;s worth quoting in full as
well:&lt;/p&gt;

&lt;blockquote&gt;&lt;p&gt;Major decisions are introduced to stakeholders in a meeting, smaller
ones by email. A delay allows people to submit final comments before
adoption.&lt;/p&gt;&lt;/blockquote&gt;

&lt;p&gt;Now, of course, the devil is in the details (and &lt;a href=&quot;https://gitlab.torproject.org/tpo/tpa/team/-/wikis/policy/0101-adr-process&quot;&gt;ADR-101&lt;/a&gt;), but the
point is to keep things simple.&lt;/p&gt;

&lt;p&gt;A crucial aspect of the proposal, which Jacob Kaplan-Moss calls the
&lt;a href=&quot;https://jacobian.org/2023/dec/5/how-to-decide/&quot;&gt;one weird trick&lt;/a&gt;, is to &quot;decide who decides&quot;. Our previous process
was vague about who makes the decision; the new template (and
process) explicitly names the decision makers for each decision.&lt;/p&gt;

&lt;p&gt;Conversely, some decisions degenerate into endless discussions around
trivial issues because &lt;em&gt;too many&lt;/em&gt; stakeholders are consulted, a
problem known as the &lt;a href=&quot;https://en.wikipedia.org/wiki/Bike_shedding&quot;&gt;Law of triviality&lt;/a&gt;, also known as the &quot;Bike
Shed syndrome&quot;.&lt;/p&gt;

&lt;p&gt;The new process better identifies stakeholders:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&quot;informed&quot; users (previously &quot;affected users&quot;)&lt;/li&gt;
&lt;li&gt;&quot;consulted&quot; (previously undefined!)&lt;/li&gt;
&lt;li&gt;&quot;decision maker&quot; (instead of the vague &quot;approval&quot;)&lt;/li&gt;
&lt;/ul&gt;


&lt;p&gt;Picking those stakeholders is still tricky, but our definitions are
more explicit and aligned to the classic &lt;a href=&quot;https://en.wikipedia.org/wiki/Responsibility_assignment_matrix&quot;&gt;RACI matrix&lt;/a&gt; (Responsible,
Accountable, Consulted, Informed).&lt;/p&gt;

&lt;h2 id=&quot;communication-guidelines&quot;&gt;Communication guidelines&lt;/h2&gt;

&lt;p&gt;Finally, a crucial part of the process (&lt;a href=&quot;https://gitlab.torproject.org/tpo/tpa/team/-/wikis/policy/0102-adr-communications&quot;&gt;ADR-102&lt;/a&gt;) is to decouple
the act of making and recording decisions from &lt;em&gt;communicating&lt;/em&gt; about
the decision. Those are two &lt;em&gt;radically&lt;/em&gt; different problems to
solve. We have found that a single document can&#39;t serve both purposes.&lt;/p&gt;

&lt;p&gt;Because ADRs can affect a wide range of things, we don&#39;t have a
specific template for communications. We suggest the &lt;a href=&quot;https://en.wikipedia.org/wiki/Five_Ws&quot;&gt;Five Ws&lt;/a&gt;
method (Who? What? When? Where? Why?) and, again, keeping things simple.&lt;/p&gt;

&lt;h1 id=&quot;how-we-got-there&quot;&gt;How we got there&lt;/h1&gt;

&lt;p&gt;The &lt;a href=&quot;https://adr.github.io/&quot;&gt;ADR process&lt;/a&gt; is not something I invented. I first stumbled upon
it in the &lt;a href=&quot;https://github.com/thunderbird/thunderbird-android/blob/be2af5c6a0bce08385fc3f654c1185ccf9db3859/docs/architecture/adr/README.md&quot;&gt;Thunderbird Android project&lt;/a&gt;. Then, in parallel, I was in
the &lt;a href=&quot;https://gitlab.torproject.org/tpo/tpa/team/-/issues/41428&quot;&gt;process of reviewing the RFC process&lt;/a&gt;, following Jacob
Kaplan-Moss&#39;s &lt;a href=&quot;https://jacobian.org/2023/dec/1/against-rfcs/&quot;&gt;criticism of the RFC process&lt;/a&gt;. Essentially, he argues
that:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;the RFC process &quot;doesn&#39;t include any sort of decision-making framework&quot;&lt;/li&gt;
&lt;li&gt;&quot;RFC processes tend to lead to endless discussion&quot;&lt;/li&gt;
&lt;li&gt;the process &quot;rewards people who can write to exhaustion&quot;&lt;/li&gt;
&lt;li&gt;&quot;these processes are insensitive to expertise&quot;, &quot;power dynamics and
power structures&quot;&lt;/li&gt;
&lt;/ol&gt;


&lt;p&gt;And, indeed, I have been guilty of a lot of those issues. A verbose
writer, I have written &lt;a href=&quot;https://gitlab.torproject.org/tpo/tpa/team/-/wikis/policy/tpa-rfc-33-monitoring&quot;&gt;extremely long proposals&lt;/a&gt; that I suspect no
one has ever fully read. Some proposals were adopted by exhaustion, or
ignored because we failed to loop in the right stakeholders.&lt;/p&gt;

&lt;p&gt;Our &lt;a href=&quot;https://gitlab.torproject.org/tpo/tpa/team/-/issues/41428&quot;&gt;discussion issue&lt;/a&gt; on the topic has more details on the issues I
found with our RFC process. But to give credit to the old process, it
did serve us well while it lasted: it&#39;s better than nothing, and it
allowed us to document a staggering number of changes and decisions
(&lt;a href=&quot;https://gitlab.torproject.org/tpo/tpa/team/-/wikis/policy&quot;&gt;95 RFCs&lt;/a&gt;!) made over the course of 6 years of work.&lt;/p&gt;

&lt;h1 id=&quot;whats-next&quot;&gt;What&#39;s next?&lt;/h1&gt;

&lt;p&gt;We&#39;re still experimenting with the communication around decisions, as
this text might suggest. Because it&#39;s a separate step, we also have a
tendency to forget or postpone it, like this post, which comes a
couple of months late.&lt;/p&gt;

&lt;p&gt;Previously, we&#39;d just ship a copy of the RFC to everyone, which was
easy and quick, but incomprehensible to most. Now we need to write a
separate communication, which is more work but, hopefully, worth it,
as the result is more digestible.&lt;/p&gt;

&lt;p&gt;We can&#39;t wait to hear what you think of the new process and how it
works for you, here or in the &lt;a href=&quot;https://gitlab.torproject.org/tpo/tpa/team/-/issues/41428&quot;&gt;discussion issue&lt;/a&gt;! We&#39;re particularly
interested in people who are already using a similar process, or who
will adopt one after reading this.&lt;/p&gt;

&lt;blockquote&gt;&lt;p&gt;Note: this article was also published on the &lt;a href=&quot;https://blog.torproject.org/tpa-adr&quot;&gt;Tor Blog&lt;/a&gt;.&lt;/p&gt;&lt;/blockquote&gt; </description> 
	<pubDate>Mon, 16 Feb 2026 20:21:46 +0000</pubDate>

</item> 
<item>
	<title>Philipp Kern: What is happening with this &quot;connection verification&quot;?</title>
	<guid>tag:blogger.com,1999:blog-5048890463514304208.post-6179376777520248706</guid>
	<link>https://debblog.philkern.de/2026/02/what-is-happening-with-this-connection.html</link>
     <description>  &lt;img src=&quot;http://planet.debian.org/heads/pkern.png&quot; width=&quot;69&quot; height=&quot;85&quot; alt=&quot;&quot; align=&quot;right&quot; style=&quot;float: right;&quot;&gt;  &lt;p&gt;You might see a verification screen pop up on more and more Debian web properties. Unfortunately, the AI world of today is meeting web hosts that use Perl CGIs and are not built as multi-tiered, scalable serving systems. The issues have been at three layers:&lt;/p&gt;&lt;ol style=&quot;text-align: left;&quot;&gt;&lt;li&gt;Apache&#39;s serving capacity runs full - with no threads left to serve requests. This means that your connection will sit around for a long time, not getting accepted. In theory this can be configured, but that would require requests to be handled in time.&lt;/li&gt;&lt;li&gt;Startup costs of request handlers are too high, because we spawn a process for every request. This currently affects the BTS and dgit&#39;s browse interface. packages.debian.org has been fixed, which increased scalability sufficiently.&lt;/li&gt;&lt;li&gt;Requests themselves are too expensive to be served quickly - think git blame without caching.&lt;/li&gt;&lt;/ol&gt;&lt;p style=&quot;text-align: left;&quot;&gt;Ideally we would solve some of the scalability issues with the services themselves; however, there is also the question of how much we &lt;i&gt;want&lt;/i&gt; to be able to serve - as AI scraper demand is just a steady stream of requests that are never shown to humans.&lt;/p&gt;&lt;h3 style=&quot;text-align: left;&quot;&gt;How is it implemented?&lt;/h3&gt;&lt;p style=&quot;text-align: left;&quot;&gt;DSA has now stood up some VMs with Varnish for proxying. Incoming TLS is terminated by hitch, and TLS &quot;on-loading&quot; is done using haproxy. That way TLS goes in and TLS goes out. While Varnish does cache if the content is cacheable (e.g.
does not depend on cookies), that is not the primary reason for using it: it can be used for flexible query and response rewriting.&lt;/p&gt;&lt;p style=&quot;text-align: left;&quot;&gt;If no cookie with a proof of work is provided, the user is redirected to a challenge page that does some webcrypto in Javascript - because that looked similar to what other projects do (e.g. &lt;a href=&quot;https://github.com/dgl/haphash&quot;&gt;haphash&lt;/a&gt;, which originally inspired the solution). However, so far it looks like scrapers generally do not run with Javascript enabled, so this whole crypto proof-of-work business could probably be replaced with just a Javascript-based redirect. The existing solution also has big (security) holes in it. And, as we found out, Firefox is slower at webcrypto than Chrome. I have recently reduced the complexity, so you should notice it blocking you significantly less.&lt;/p&gt;&lt;p style=&quot;text-align: left;&quot;&gt;Once you have the cookie, you can keep accessing the site for as long as the cookie is valid. Please do not make any assumptions about the cookies, or you will be broken in the future.&lt;/p&gt;&lt;p&gt;For legitimate scrapers that obey robots.txt, there is now an automatically generated IP allowlist in place (thanks, Marco d&#39;Itri). It turns out that the search engines do not actually run Javascript either and then loudly complain about the redirect to the challenge page. Other bots are generally exempt.&lt;/p&gt;&lt;h3 style=&quot;text-align: left;&quot;&gt;Conclusion&lt;/h3&gt;&lt;p&gt;I hope that we have now found something like the sweet spot, where the admins can stop spending human time on updating firewall rules and the services are generally available, reasonably fast, and still indexed. In case you see problems or run into a block with your own (legitimate) bots, please let me know.&lt;/p&gt; </description> 
	<pubDate>Mon, 16 Feb 2026 19:55:00 +0000</pubDate>
  <author>noreply@blogger.com (Philipp Kern)</author>  
</item> 
</channel>
</rss>
