Some interesting takeaways, with the caveat that exit polls are not completely accurate and we won't have the full picture for days:
President Trump seems to have won the popular vote, which I believe no Republican has done since George W. Bush in 2004.
Apparently women didn't particularly care about abortion (CNN said only 14% considered it their primary issue). There is a noticeable divide, but it is single versus married rather than women versus men per se.
Hispanics who are here legally voted against Hispanics coming here illegally. Latinx's didn't vote for anything because they don't exist.
The infamous MSG rally joke had no effect on the voting habits of Puerto Ricans.
Republicans have taken the Senate and, if trends continue as they are, will retain control of the House of Representatives.
President Biden may have actually been a better candidate than Border Czar Harris.
The first piece of anti-semitic writing attributed to Adolf Hitler is the
Gemlich letter.
After World War I, Hitler remained in the German army. He was posted
to an intelligence role in Munich. Adolf Gemlich wrote a letter about
the Jewish question. Hitler's superior, Karl Mayr, asked Hitler to write
the response.
The Gemlich letter was written on 16 September 1919, while Hitler was
still an army officer, well before Hitler became Fuhrer.
One of the key points in the letter states that there should be
a Code of Conduct (CoC) for Jewish people:
legally fight and remove the privileges enjoyed by the
Jews as opposed to other foreigners living among us
So there would be one set of laws for everybody else and a second set
of laws, or a CoC, for the Jews.
The other key point in the Gemlich letter is "behavior":
there lives amongst us a non-German, alien race,
unwilling and indeed unable to shed its racial characteristics,
its particular feelings, thoughts and ambitions
On 16 September 2018 Linus Torvalds posted
the email announcing he has to submit himself to the code of conduct
on the Linux Kernel Mailing List and mind his behavior.
Linus tells us he is taking a break, in other words, some of his
privileges are on hold for a while.
We saw the same thing in Afghanistan. When the Taliban took back control
of the country, women had to change their behavior and become better at
listening to the demands from their masters.
From Linus Torvalds
Date Sun, 16 Sep 2018 12:22:43 -0700
Subject Linux 4.19-rc4 released, an apology, and a maintainership note
[ So this email got a lot longer than I initially thought it would
get, but let's start out with the "regular Sunday release" part ]
Another week, another rc.
Nothing particularly odd stands out on the technical side in the
kernel updates for last week - rc4 looks fairly average in size for
this stage in the release cycle, and all the other statistics look
pretty normal too.
We've got roughly two thirds driver fixes (gpu and networking look to
be the bulk of it, but there's smaller changes all over in various
driver subsystems), with the rest being the usual mix: core
networking, perf tooling updates, arch updates, Documentation, some
filesystem, vm and minor core kernel fixes.
So it's all fairly small and normal for this stage. As usual, I'm
appending the shortlog at the bottom for people who want to get an
overview of the details without actually having to go dig in the git
tree.
The one change that stands out and merits mention is the code of
conduct addition...
[ And here comes the other, much longer, part... ]
Which brings me to the *NOT* normal part of the last week: the
discussions (both in public mainly on the kernel summit discussion
lists and then a lot in various private communications) about
maintainership and the kernel community. Some of that discussion came
about because of me screwing up my scheduling for the maintainer
summit where these things are supposed to be discussed.
And don't get me wrong. It's not like that discussion itself is in
any way new to this week - we've been discussing maintainership and
community for years. We've had lots of discussions both in private and
on mailing lists. We have regular talks at conferences - again, both
the "public speaking" kind and the "private hallway track" kind.
No, what was new last week is really my reaction to it, and me being
perhaps introspective (you be the judge).
There were two parts to that.
One was simply my own reaction to having screwed up my scheduling of
the maintainership summit: yes, I was somewhat embarrassed about
having screwed up my calendar, but honestly, I was mostly hopeful that
I wouldn't have to go to the kernel summit that I have gone to every
year for just about the last two decades.
Yes, we got it rescheduled, and no, my "maybe you can just do it
without me there" got overruled. But that whole situation then
started a whole different kind of discussion. And kind of
incidentally to that one, the second part was that I realized that I
had completely mis-read some of the people involved.
This is where the "look yourself in the mirror" moment comes in.
So here we are, me finally on the one hand realizing that it wasn't
actually funny or a good sign that I was hoping to just skip the
yearly kernel summit entirely, and on the other hand realizing that I
really had been ignoring some fairly deep-seated feelings in the
community.
It's one thing when you can ignore these issues. Usually it’s just
something I didn't want to deal with.
This is my reality. I am not an emotionally empathetic kind of person
and that probably doesn't come as a big surprise to anybody. Least of
all me. The fact that I then misread people and don't realize (for
years) how badly I've judged a situation and contributed to an
unprofessional environment is not good.
This week people in our community confronted me about my lifetime of
not understanding emotions. My flippant attacks in emails have been
both unprofessional and uncalled for. Especially at times when I made
it personal. In my quest for a better patch, this made sense to me.
I know now this was not OK and I am truly sorry.
The above is basically a long-winded way to get to the somewhat
painful personal admission that hey, I need to change some of my
behavior, and I want to apologize to the people that my personal
behavior hurt and possibly drove away from kernel development
entirely.
I am going to take time off and get some assistance on how to
understand people’s emotions and respond appropriately.
Put another way: When asked at conferences, I occasionally talk about
how the pain-points in kernel development have generally not been
about the _technical_ issues, but about the inflection points where
development flow and behavior changed.
These pain points have been about managing the flow of patches, and
often been associated with big tooling changes - moving from making
releases with "patches and tar-balls" (and the _very_ painful
discussions about how "Linus doesn't scale" back 15+ years ago) to
using BitKeeper, and then to having to write git in order to get past
the point of that no longer working for us.
We haven't had that kind of pain-point in about a decade. But this
week felt like that kind of pain point to me.
To tie this all back to the actual 4.19-rc4 release (no, really, this
_is_ related!) I actually think that 4.19 is looking fairly good,
things have gotten to the "calm" period of the release cycle, and I've
talked to Greg to ask him if he'd mind finishing up 4.19 for me, so
that I can take a break, and try to at least fix my own behavior.
This is not some kind of "I'm burnt out, I need to just go away"
break. I'm not feeling like I don't want to continue maintaining
Linux. Quite the reverse. I very much *do* want to continue to do
this project that I've been working on for almost three decades.
This is more like the time I got out of kernel development for a while
because I needed to write a little tool called "git". I need to take
a break to get help on how to behave differently and fix some issues
in my tooling and workflow.
And yes, some of it might be "just" tooling. Maybe I can get an email
filter in place so at when I send email with curse-words, they just
won't go out. Because hey, I'm a big believer in tools, and at least
_some_ problems going forward might be improved with simple
automation.
I know when I really look “myself in the mirror” it will be clear it's
not the only change that has to happen, but hey... You can send me
suggestions in email.
I look forward to seeing you at the Maintainer Summit.
Linus
When Donald Trump was elected in 2016, his victory became known
on my birthday. That was 9 November 2016.
I couldn't agree with some of the things Trump says or does, but consider his signature promise to build a wall: while Trump struggled to deliver it, student politicians actually did.
Moreover, 9 November is also the day that Germans
tore down the Berlin Wall in 1989.
By total coincidence, the students did this during my first weeks at
university in 1997. The scene was photographed by
legendary war photographer Ashley Gilbertson and appeared in
many Australian newspapers.
The university commended the students for leaving peacefully.
In September 1997 the students elected me to the house and services
committee responsible for capital works and similar projects.
Not long after this, I was elected to the
National Union of Students (Victoria state branch).
The Age, 10 May 1997: Food flies as hunger hits student protest
by Caroline Milburn
Police dodged flying bananas, food parcels and the occasional salami
yesterday as students tried to throw supplies to colleagues occupying
the second floor of Melbourne University's administration building.
...
The quest for food then turned into a battle of wits between police at
the first-floor windows and the protesters immediately above. As ropes
were lowered to the crowd, police lunged at them, using various implements,
including a long pole with a hook and a large knife, with varying degrees
of success.
...
Professor Gilbert (Vice Chancellor) rejected student criticism that the
number of Government-subsidised university places would be reduced by the
impact of fee-paying places. He said the university would not charge
yesterday's protesters with trespass because they left the building
peacefully.
One of the protestors, Felicity Martin, a Melbourne University student
said the protest had ended while morale was high.
Red Flag, 15 May 2014: Students have always been right to rebel
One of John Howard’s first moves on getting into office was to slash the education budget and introduce up-front fees for undergraduates. In response, students not only demonstrated in their thousands across the country, but occupied administration buildings on their campuses.
...
On 8 May around 2,000 students marched on Melbourne Uni, where about 500 occupied the main administration building; 180 stayed overnight without food, electricity or proper toilets. The next day 1,000 rallied in support outside the building. Engineering students (whom many leftists had previously dismissed as hopelessly apolitical or right wing) helped rig up an elaborate rope and pulley system to transport food and supplies to the occupiers through the second floor windows.
The Melbourne Uni occupation inspired students elsewhere. The following day hundreds stormed the admin building at Sydney Uni, while in Adelaide 300 picketed education minister Amanda Vanstone’s office before occupying their own campus. Other occupations took place in even such far-flung places as Deakin’s Warrnambool campus in far western Victoria.
In August, the longest occupation took place. Hundreds of students occupied the finance division of RMIT in central Melbourne and stayed there for almost four weeks! It was the most sustained student occupation since the 1970s. With police cutting off access to the occupied floors, students again hoisted cigarettes, food and printed materials from the street, from what became a semi-permanent solidarity encampment in Swanston Street. Despite vitriolic hostility in the media, students gained support from unions and other community groups, including the RMIT NTEU branch, which organised stop-works and rallies in support of the occupiers.
I first went to
Switzerland in August 2006. My first photo in
Switzerland was taken in Lausanne with Lake Geneva and the French alps
behind me. At the time this was taken, I did not realize I would
eventually have the task of draining the swamp.
By the end of the 2nd century there is already evidence
that the saints were being venerated: first the holy martyrs,
who were soon joined by the apostles, the official witnesses of the faith.
After the great persecutions under Imperial Rome, other men and
women who had lived heroic Christian lives gradually became the object
of veneration as well. The first non-martyr to be venerated as a saint
was Martin of Tours. Toward the end of the year 1000, due to
the uncontrolled development of "saint making" and the "purchase"
of relics, a process for canonization was developed which required
evidence of miracles.
Some people have come to regard the Swiss passport as a small miracle.
Some said it is a big miracle. It measures 125 x 85 millimeters.
The Swiss flag, however, is one of only two national flags that
are officially square. The other is that of Vatican City.
People notice the same "saint making" phenomena in awkward groups like
Debian and GNOME, where people who are not authors of the software
are given elaborate titles and keynote speaking opportunities
ahead of real Debian Developers like Dr Norbert Preining and I.
Nonetheless, the key point is that All Saints Day began with the
martyrs. Or did it?
People ask how I can be Australian, Irish and Swiss all at the same time.
In fact, we can show how they are related very easily.
On 3 August, the application for citizenship gained its final approval
by the Swiss Confederation in Bern.
On 19 September the Swiss financial regulator FINMA belatedly
published a document about the poor behavior of Parreaux,
Thiebaud & Partners, a Swiss legal insurance and law office.
FINMA redacted the name of the insurance company, the names of
the jurists and even the dates, frustrating attempts by the
20,000 clients to find out where our money went in Geneva.
On 28 September the Canton of Vaud sent me the invitation to a
citizenship ceremony scheduled for 1 November.
At the time, community efforts to investigate and fact-check the
bad faith legal insurance and jurist scandal were well under way.
On 7 October I
published my first blog about the JuristGate scandal.
As I fact-checked all the details, I became concerned that there may
be miscarriages of justice, innocent clients who have lost legal disputes,
lost their business or suffered a loss of confidential
information and subsequently committed suicide.
Shortly after I published the blog, somebody asked me if the Swiss jurists
came with their guns.
I subsequently started the
bilingual JuristGate web site, where more details will appear
as I have time and motivation. Draining the swamp is hard work;
Lake Geneva has a depth of 310 meters.
With the eventual death tolls of
JuristGate and
Debianism on my mind,
the Swiss citizenship ceremony, on the day after Halloween, couldn't have been
a more grim occasion for me. The harm that
Swiss racism has done to black cats, the harm that
Swiss corruption has done to my family and so many other families,
the potential for even more suicides, were the predominant things
on my mind as I took the Swiss citizenship oath.
9 November is my birthday and that was the day the new Swiss passport
was delivered. Switzerland seems to produce passports much faster than
Australia or Ireland.
Coincidentally, 9 November was also the
Kristallnacht, the giant pogrom that kickstarted the Holocaust.
The falsified accusations and insults being distributed on the Debian
web site contain different dates in them. On the first page it refers to
10 November and on the last page they wrote 27 November. Discrepencies
like that are often tell-tale signs of a forgery. That would be nothing new
for Debian, I previously
exposed how these dishonest people falsified harassment claims against another
Google critic, Dr Jacob Appelbaum. Forgeries, falsehoods
and targetting of Google critics is part of a very predictable
pattern for Debian cyberbullies and their social media rent-a-mob.
Nonetheless, the Kristallnacht morning-after, 10 November,
is the date the Jewish business owners woke up to find their
shops and factories had been trashed by the Nazis. Now we can see
that the
Open Source Developer Freedoms are officially in liquidation.
The people celebrating that are closer to the spirit of Hitler
than to the spirit of those who started the free software movement.
Dr Richard Stallman and several other key figures are of Jewish origin.
It is telling that some of the nazi flag bearers creating documents like that
have used this same date, the day after my birthday, the hangover of the
Kristallnacht, to try and trash my life. Hitler may be dead but his
spirit lives on in the incurably racist Swiss women like Pascale Koester who
attack unpaid volunteers with their unholy cocktail of false accusations and
girlish gossip.
People who read the
real history of Debian come to realize that it is not about my
own conduct or skill as a Debian Developer. The long history of suicides
and public humiliations prove the problems originate at the core
of this group and it has always been that way. They use the racist jurists
to cover that up.
There you have it. An Irish-Australian information security expert
exposing Swiss corruption almost in the same breath as taking
the oath of Swiss citizenship on the day of the martyrs, which was
itself modeled on an Irish pagan ritual. So I may be Swiss and Australian
but I'm no less Irish.
This results in a permanent diff because the Google CloudDNS API seems to parse the
record content and stores the ipv6hint expanded (removing the :: notation) and in all
lowercase as 2001:db8:0:0:0:0:0:1. Thus, to fix the permanent diff, we have to use it like
this:
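The exact record rewrite isn't reproduced above, but the normalization itself is easy to pin down. As a sketch (the helper name `clouddns_ipv6` is mine, not part of any API), this reproduces the form the CloudDNS API appears to store: `::` expanded into explicit zero groups, leading zeros dropped, all lowercase:

```python
import ipaddress


def clouddns_ipv6(addr: str) -> str:
    """Normalize an IPv6 address the way the CloudDNS API appears to
    store it: '::' expanded to explicit zero groups, leading zeros
    dropped from each group, everything lowercase."""
    # .exploded gives eight 4-digit groups; re-emit each without padding
    groups = ipaddress.IPv6Address(addr).exploded.split(":")
    return ":".join(format(int(g, 16), "x") for g in groups)


print(clouddns_ipv6("2001:db8::1"))  # 2001:db8:0:0:0:0:0:1
```

Writing the record content in this pre-normalized form should stop the provider from seeing a difference between the configured and the stored value.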
Since WFDF changed their ultimate rules web site
to be less-than-ideal (in the name of putting everything into Wordpress…),
I made my own, at urules.org. It was a fun
journey; I've never fiddled with PWAs
before, and I was a bit surprised how low-level it all was. I assumed that
since my page is just a bunch of HTML files and ~100 lines of JS, I could
just bundle that up—but no, that is something they expect a framework to do
for you.
The only primitive you get is seemingly that you can fire up your own
background service worker (JS running in its own, locked-down context)
and that gets to peek at every HTTP request done and possibly intercept it.
So you can use a Web Cache
(seemingly a separate concept from web local storage?), insert stuff into
that, and then query it to intercept requests. It doesn't feel very elegant,
perhaps?
It is a bit neat that I can use this to make my own bundling, though.
All the pages and images (painfully converted to SVG to save space and
re-flow for mobile screens, mostly by simply drawing over bitmaps by hand
in Inkscape) are stuck into a JSON dictionary, compressed using the slowest
compressor I could find and then downloaded as a single 159 kB bundle.
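As an illustration of the bundling scheme described above (the site itself presumably does this in JavaScript at build time; this Python sketch and its helper name are mine), the pages can be packed into one JSON dictionary keyed by path and compressed in a single pass:

```python
import json
import lzma
import pathlib


def build_bundle(root: str) -> bytes:
    """Pack every HTML/SVG file under `root` into one JSON dictionary
    keyed by relative path, then compress the whole thing at once, so
    the client only ever downloads a single blob."""
    files = {}
    for path in pathlib.Path(root).rglob("*"):
        if path.suffix in (".html", ".svg"):
            files[str(path.relative_to(root))] = path.read_text(encoding="utf-8")
    blob = json.dumps(files).encode("utf-8")
    # lzma at the highest preset stands in for "the slowest compressor
    # I could find"; compressing everything together lets redundancy
    # across pages be exploited, unlike per-file compression.
    return lzma.compress(blob, preset=9 | lzma.PRESET_EXTREME)
```

The service worker would then decompress this once, insert each entry into the Web Cache, and serve navigation requests from there.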
It makes the site actually sort of weird to navigate; since it pretty quickly
downloads the bundle in the background, everything goes offline and the
speed of loading new pages just feels… off somehow. As if it's not a
Serious Web Page if there's no load time.
Of course, this also means that I couldn't cache PNGs, because have you ever
tried to have non-UTF-8 data in a JSON sent through N layers of JavaScript? :-)
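The post leaves the PNG problem unsolved; one common workaround, shown here as a hypothetical Python sketch, is to base64-encode binary assets so that only ASCII ever passes through the JSON and JavaScript layers:

```python
import base64
import json


def pack_binary(name: str, data: bytes) -> str:
    """JSON strings must be valid text, so raw PNG bytes can't be
    embedded directly; base64 turns them into ASCII that survives
    any number of JSON/JavaScript layers."""
    return json.dumps({name: base64.b64encode(data).decode("ascii")})


def unpack_binary(blob: str, name: str) -> bytes:
    """Recover the original bytes on the other side."""
    return base64.b64decode(json.loads(blob)[name])
```

The catch is that base64 adds roughly 33% to the payload, which partly defeats the point of the compressed bundle; that trade-off is presumably part of why the images were converted to SVG instead.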
Another short status update of what happened on my side last month. Besides a phosh bugfix release, improving text input and selection
was a prevalent pattern again, resulting in improvements in the compositor, the OSK and some apps.
Consistent focus style on lock screen and settings (MR). Improves the visual appearance,
as the dotted focus frame doesn't match our otherwise colored focus frames.
Don't focus buttons in settings (MR). Improves the visual appearance, as
attention isn't drawn to the button focus.
Close Phosh's settings when activating a Settings panel (MR)
Collect some of the QCom workarounds in a package (MR). This is not meant to go into Debian proper but it's nicer than doing all the mods by hand and forgetting which files were modified.
Don't take focus when sending messages, adding emojis or attachments (MR). Makes typing faster (as the OSK
won't hide) and thus makes those buttons easier to use.
xdg-desktop-portal
Use categories that work for both xdg-spec and the portal (MR)
Reviews
This is not code by me but reviews of other people's code. The list is
fairly incomplete; I hope to improve on this in the upcoming months:
If you want to support my work see donations. This includes
a list of hardware we want to improve support for. Thanks a lot to all current and past donors.
A hot-fix release 1.0.13-1, consisting of two small PRs relative to
the last regular CRAN release
1.0.13,
just arrived on CRAN. When we
prepared 1.0.13,
we included a change related to the ‘tightening’ of the C API of R
itself. Sadly, we pinned an expected change to the ‘next (minor)
release 4.4.2’ when it in fact comes with the ‘next (normal aka major) release 4.5.0’.
And now that R 4.4.2 is out (as of two days ago) we accidentally broke
building against the header file with that check. Whoops. Bugs happen,
and we are truly sorry—but this is now addressed in 1.0.13-1.
The normal (bi-annual) release cycle will resume with 1.0.14 slated
for January. As you can see from the NEWS
file of the development branch, we have a number of changes coming.
You can safely access that release candidate version, either off the
default branch at github or via r-universe artifacts.
The list below details all changes, as usual. The only other change
concerns the now-mandatory use of Authors@R.
Changes in
Rcpp release version 1.0.13-1 (2024-11-01)
Changes in Rcpp API:
Use read-only VECTOR_PTR and STRING_PTR
only with R 4.5.0 or later (Kevin in #1342 fixing #1341)
Changes in Rcpp Deployment:
Authors@R is now used in DESCRIPTION as mandated by CRAN
Two months ago I bought a Thinkpad X1 Yoga Gen3 [1]. I’m still very happy with it, the screen is a great improvement over the FullHD screen on my previous Thinkpad. I have yet to discover what’s the best resolution to have on a laptop if price isn’t an issue, but it’s at least 1440p for a 14″ display, that’s 210DPI. The latest Thinkpad X1 Yoga is the 7th gen and has up to 3840*2400 resolution on the internal display for 323DPI. Apple apparently uses the term “Retina Display” to mean something in the range of 250DPI to 300DPI, so my current laptop is below “Retina” while the most expensive new Thinkpads are above it.
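The DPI figures quoted above follow directly from the pixel dimensions and the diagonal size; a quick check (assuming "1440p" means 2560x1440 here):

```python
import math


def dpi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in


# The figures quoted above, for a 14-inch panel:
print(round(dpi(2560, 1440, 14)))  # 210
print(round(dpi(3840, 2400, 14)))  # 323
```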
I did some tests on external displays and found that this Thinkpad along with a Dell Latitude of the same form factor and about the same age can only handle one 4K display on a Thunderbolt dock and one on HDMI. On Reddit u/Carlioso1234 pointed out this specs page which says it supports a maximum of 3 displays including the built in TFT [2]. The Thunderbolt/USB-C connection has a maximum resolution of 5120*2880 and the HDMI port has a maximum of 4K. The latest Yoga can support four displays total which means 2*5K over Thunderbolt and one 4K over HDMI. It would be nice if someone made a 8000*2880 ultrawide display that looked like 2*5K displays when connected via Thunderbolt. It would also be nice if someone made a 32″ 5K display, currently they all seem to be 27″ and I’ve found that even for 4K resolution 32″ is better than 27″.
With the typical configuration of Linux and the BIOS, the Yoga Gen3 will have its touch screen stop working after suspend. I have confirmed this for stylus use, but as the finger-touch functionality is broken anyway I couldn’t confirm it for finger touch. On r/thinkpad u/p9k told me how to fix this problem [3]. I had to set the BIOS to Win 10 Sleep aka Hybrid sleep and then put the following in /etc/systemd/system/thinkpad-wakeup-config.service:
Now it works fine, for stylus at least. I still get kernel error messages like the following which don’t seem to cause problems:
wacom 0003:056A:5146.0005: wacom_idleprox_timeout: tool appears to be hung in-prox. forcing it out.
When it wasn’t working I got the above but also kernel error messages like:
wacom 0003:056A:5146.0005: wacom_wac_queue_insert: kfifo has filled, starting to drop events
This change affected the way suspend etc operate. Now when I connect the laptop to power it will leave suspend mode. I’ve configured KDE to suspend when the lid is closed and there’s no monitor connected.
MrWhosTheBoss made a good YouTube video reviewing recent Huawei products [2]. At 2:50 in that video he shows how you can link a phone and tablet, control one from the other, drag and drop of running apps and files between phone and tablet, mirror the screen between devices, etc. He describes playing a video on one device and having it appear on the other, I hope that it actually launches a new instance of the player app as the Google Chromecast failed in the market due to remote display being laggy. At 7:30 in that video he starts talking about the features that are available when you have multiple Huawei devices, starting with the ability to move a Bluetooth pairing for earphones to a different device.
At 16:25 he shows what Huawei is doing to get apps going including allowing apk files to be downloaded and creating what they call “Quick Apps” which are instances of a web browser configured to just use one web site and make it look like a discrete app, we need something like this for FOSS phone distributions – does anyone know of a browser that’s good for it?
Another thing that we need is to have an easy way of transferring open web pages between systems. Chrome allows sending pages between systems but it’s proprietary, limited to Chrome only, and also takes an unreasonable amount of time. KDEConnect allows sharing clipboard contents which can be used to send URLs that can then be pasted into a browser, but the process of copy URL, send via KDEConnect, and paste into other device is unreasonably slow. The design of Chrome with a “Send to your devices” menu option from the tab bar is OK. But ideally we need a “Send to device” for all tabs of a window as well, we need it to run from free software and support using your own server not someone else’s server (AKA “the cloud”). Some of the KDEConnect functionality but using a server rather than direct connection over the same Wifi network (or LAN if bridged to Wifi) would be good.
I recently had someone describe a Mac Mini as a “workstation”, which I strongly disagree with. The Wikipedia page for Workstation [1] says that it’s a type of computer designed for scientific or technical use, for a single user, and would commonly run a multi-user OS.
The Mac Mini runs a multi-user OS and is designed for a single user. The issue is whether it is for “scientific or technical use”. A Mac Mini is a nice little graphical system which could be used for CAD and other engineering work. But I believe that the low capabilities of the system and lack of expansion options make it less of a workstation.
The latest versions of the Mac Mini (to be officially launched next week) have up to 64G of RAM and up to 8T of storage. That is quite decent compute power for a small device. For comparison the HP ML 110 Gen9 workstation I’m currently using was released in 2021 and has 256G of RAM and has 4 * 3.5″ SAS bays so I could easily put a few 4TB NVMe devices and some hard drives larger than 10TB. The HP Z640 workstation I have was released in 2014 and has 128G of RAM and 4*2.5″ SATA drive bays and 2*3.5″ SATA drive bays. Previously I had a Dell PowerEdge T320 which was released in 2012 and had 96G of RAM and 8*3.5″ SAS bays.
In CPU and GPU power the recent Mac Minis will compare well to my latest workstations. But they compare poorly to workstations from as much as 12 years ago for RAM and storage. Which is more important depends on the task, if you have to do calculations on 80G of data with lots of scans through the entire data set then a system with 64G of RAM will perform very poorly and a system with 96G and a CPU less than half as fast will perform better. A Dell PowerEdge T320 from 2012 fully loaded with 192G of RAM will outperform a modern Mac Mini on many tasks due to this and the T420 supported up to 384G.
Another issue is generic expansion options. I expect a workstation to have a number of PCIe slots free for GPUs and other devices. The T320 I used to use had a PCIe power cable for a power hungry GPU and I think all the T320 and T420 models with high power PSUs supported that.
I think that a usable definition of a “workstation” is a system having a feature set that is typical of servers (ECC RAM, lots of storage for RAID, maybe hot-swap storage devices, maybe redundant PSUs, and lots of expansion options) while also being suitable for running on a desktop or under a desk. The Mac Mini is nice for running on a desk but that’s the only workstation criteria it fits. I think that ECC RAM should be a mandatory criteria and any system without it isn’t a workstation. That excludes most Apple hardware. The Mac Mini is more of a thin-client than a workstation.
My main workstation with ECC RAM could run 3 VMs that each have more RAM than the largest Mac Mini that will be sold next week.
If 32G of non-ECC RAM is considered enough for a “workstation” then you could get an Android phone that counts as a workstation – and it will probably cost less than a Mac Mini.
So, a common theme on the Internet is that Debian is so old. And
right, I am getting close to the stage where I feel a little laggy: I
am using a bunch of backports for packages I need, and I'm missing a
bunch of other packages that just landed in unstable and didn't make
it to backports for various reasons.
I disagree that "old" is a bad thing: we definitely run Debian stable
on a fleet of about 100 servers and can barely keep up; I would make
it older. And "old" is a good thing: (port) wine and (any) beer
need time to age properly, and so do humans, although some humans
never seem to grow old enough to find wisdom.
But at this point, on my laptop, I am feeling like I'm missing
out. This page, therefore, is an evolving document that is a twist
on the classic NewIn game. Last time I played seems to be
#newinwheezy
(2013!), so really, I'm due for an update. (To be fair to myself, I do
keep tabs on upgrades quite well at home and
work, which do have their share of "new in", just after the fact.)
New packages to explore
Those tools are shiny new things available in unstable or perhaps
Trixie (testing) already that I am not using yet, but I find
interesting enough to list here.
Those are packages that I have tested because I found them
interesting, but ended up not using, but I think people could find
interesting anyways.
kew: surprisingly fast music player, parsed my entire library
(which is huge) instantaneously and just started playing (I still
use Supersonic, for which I maintain a flatpak on my
Navidrome server)
mdformat: good markdown formatter (think black or gofmt, but
for markdown), but it didn't actually do what I needed, and
it's not quite as opinionated as it should (or could) be
Backports already in use
Those are packages I already use regularly, which have backports or
that can just be installed from unstable:
If you know of cool things I'm missing out on, then by all means let
me know!
That said, overall, this is a pretty short list! I have most of what I
need in stable right now, and if I wasn't a Debian developer, I don't
think I'd be doing the jump now. But considering how much easier it is to
develop Debian (and how important it is to test the next release!),
I'll probably upgrade soon.
Previously, I was running Debian testing (which is why the slug on that
article is why-trixie), but now I'm actually considering just
running unstable on my laptop directly anyways. It's been a long time
since we had any significant instability there, and I can typically
deal with whatever happens, except maybe when I'm traveling, and then
it's easy to prepare for that (just pin testing).
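For what it's worth, "pinning testing" can be as small as one apt preferences snippet that ranks testing above unstable for the duration of a trip; a sketch (file name and priority values are illustrative, see apt_preferences(5)):

```
# /etc/apt/preferences.d/travel (illustrative)
Package: *
Pin: release a=testing
Pin-Priority: 900

Package: *
Pin: release a=unstable
Pin-Priority: 200
```

With this in place a plain upgrade tracks testing, while unstable packages stay installable on explicit request.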
Almost all of my Debian contributions this month were
sponsored by Freexian.
You can also support my work directly via
Liberapay.
Ansible
I noticed that Ansible had fallen out of Debian
testing due to autopkgtest failures. This seemed like a problem worth
fixing: in common with many other people, we use Ansible for configuration
management at Freexian, and it probably wouldn’t make our sysadmins too
happy if they upgraded to trixie after its release and found that Ansible
was gone.
The problems here were really just slogging through test failures in both
the ansible-core and ansible packages, but their test suites are large
and take a while to run so this took some time. I was able to contribute a
few small fixes to various upstreams in the process:
Martin-Éric Racine
reported that ssh-audit
didn’t list the ext-info-s feature as being available in Debian’s OpenSSH
9.2 packaging in bookworm, contrary to what OpenSSH upstream said on their
specifications page at the time. I
spent some time looking into this and realized that upstream was mistakenly
saying that implementations of ext-info-c and ext-info-s were added at
the same time, while in fact ext-info-s was added rather later.
ssh-audit now has clearer output, and the OpenSSH maintainers have
corrected their specifications page.
I looked into a report of an ssh
failure in certain cases when using GSS-API key exchange (which is a Debian
patch). Once again, having integration
tests was a huge win here: the affected
scenario is quite a fiddly one, but I was able to set it up in the
test,
and thereby make sure it doesn’t regress in future. It still took me a
couple of hours to get all the details right, but in the past this sort of
thing took me much longer with a much lower degree of confidence that the
fix was correct.
On upstream’s
advice,
I cherry-picked some key exchange fixes needed for big-endian architectures.
Python team
I packaged python-evalidate, needed for a
new upstream version of buildbot.
tzdata
moved
some timezone definitions to tzdata-legacy, which has broken a number of
packages. I added tzdata-legacy build-dependencies to
alembic and
python-icalendar to deal with this in
those packages, though there are still some other instances of this left.
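The shape of such a fix is just one more build-dependency in debian/control; a sketch with illustrative source and dependency names (not the actual alembic or python-icalendar changes):

```
Source: somepackage
Build-Depends: debhelper-compat (= 13),
               python3-all,
               tzdata-legacy,
```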
I tracked down an nltk regression that
caused build failures in many other packages.
I fixed Rust crate versioning issues in
pydantic-core,
python-bcrypt, and
python-maturin (mostly fixed by Peter
Michael Green and Jelmer Vernooij, but it needed a little extra work).
Overdue is a stand-alone novelette in the Library Trilogy universe. Returns is a collection of two
stories, the novelette "Returns" and the short story "About Pain." All of
them together are about the length of a novella, so I'm combining them
into a single review.
These are ancillary stories in the same universe as the novels, but not
necessarily in the same timeline. (Trying to fit "About Pain" into the
novel timeline will give you a headache and I am choosing to read it as
author's fan fiction.) I'm guessing they're part of the new fad for
releasing short fiction on Amazon to tide readers over and maintain
interest between books in a series, a fad about which I have mixed
feelings. Given the total lack of publisher metadata in either the
stories or on Amazon, I'm assuming they were self-published even though
the novels are published by Ace, but I don't know that for certain.
I found all three of these stories irritating and thuddingly trite.
"Returns" is probably the best of the lot in terms of quality of
storytelling, but I intensely dislike the structural implications of the
nature of the book at its center and am therefore hoping that it's
non-canonical.
I would not waste your time with these even if you are enjoying the
novels.
"Overdue": Three owners of the same bookstore at different
points in time have encounters with an albino man named Yute who is on a
quest. One of the owners is trying to write a book, one of them is older,
depressed, and closed off, and one of them has regular conversations with
her sister's ghost. The nature of the relationship between the three is
too much of a spoiler, but it involves similar shenanigans as The
Book That Wouldn't Burn.
Lawrence uses my least favorite resolution of benign ghost stories. The
story tries very hard to sell it as a good thing, but I thought it was
cruel and prefer fantasy that rejects both branches of that dilemma.
Other than that, it was fine, I guess, although the moral was delivered
with all of the subtlety of the last two minutes of a Saturday morning
cartoon. (5)
"Returns": Livira returns a book deep inside the library and
finds that she can decipher it, which leads her to a story about Yute
going on a trip to recover another library book. This had a lot of great
Yute lines, plus I always like seeing Livira in exploration mode. The
book itself is paradoxical in a causality-destroying way, which is
handwaved away as literal magic. I liked this one the best of the three
stories, but I hope the world-building of the main series does not go in
this direction and I'm a little afraid it might. (6)
"About Pain": A man named Holden runs into a woman named Clovis
at the gym while carrying a book titled Catcher that his dog found
and that he's returning to the library. I thoroughly enjoy Clovis and was
happy to read a few more scenes about her. Other than that, this was
fine, I guess, although it is a story designed to deliver a point and that
point is one that appears in every discussion of classics and re-reading
that has ever happened on the Internet. Also, I know I'm being grumpy,
but Lawrence's puns with authors and character names are chapter-epigraph
amusing but not short-story-length funny. Yes, yes, his name is Holden,
we get it. (5)
Another pure maintenance release 0.2.7 of the gcbd package is now on
CRAN. The gcbd package proposes a
benchmarking framework for LAPACK and BLAS operations (as the libraries
can be exchanged in a plug-and-play sense on suitable OSs) and records
results in a local database. Its original motivation was to also compare to
GPU-based operations. However, packages on CRAN
providing the basic CUDA functionality appear to come and go, so testing the
GPU feature can be challenging. The main point of gcbd is now to actually
demonstrate that ‘yes indeed’ we can just swap BLAS/LAPACK libraries
without any change to R, or R packages. The ‘configure / rebuild R for
xyz’ advice often seen with ‘xyz’ being Goto or MKL is simply plain wrong: you
really can just swap them (on proper operating systems, and R
configs – see the package vignette for more). But no matter how often we
aim to correct this record, it invariably raises its head another
time.
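On Debian, for example, the swap is just a matter of installing another provider and switching the alternatives; a sketch (package and alternative names as on amd64 Debian, no rebuild of R involved):

```shell
$ sudo apt install libopenblas0-pthread
$ sudo update-alternatives --config libblas.so.3-x86_64-linux-gnu
$ sudo update-alternatives --config liblapack.so.3-x86_64-linux-gnu
```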
This release accommodates a CRAN change request as we were
referencing the (now only suggested) package gputools. As
hinted in the previous paragraph, it was once on CRAN but is not right now so we
adjusted our reference.
Just a "warn your brothers" for people foolish enough to
use GKE and run on the Rapid release channel.
Update from version 1.31.1-gke.1146000 to 1.31.1-gke.1678000 is causing
trouble whenever NetworkPolicy resources and a readinessProbe (or health check)
are configured. As a workaround we started to remove the NetworkPolicy
resources. E.g. when kustomize is involved with a patch like this:
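A minimal sketch of such a kustomization.yaml patch, deleting a NetworkPolicy via a strategic-merge $patch: delete (the resource name my-app is a placeholder):

```yaml
patches:
  - patch: |-
      $patch: delete
      apiVersion: networking.k8s.io/v1
      kind: NetworkPolicy
      metadata:
        name: my-app
```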
We tried to update to the latest version - right now 1.31.1-gke.2008000 - which
did not change anything.
Behaviour is pretty much erratic, sometimes it still works and sometimes the traffic
is denied. It also seems that there is some relevant fix in 1.31.1-gke.1678000
because that is now the oldest release of 1.31.1 which I can find in the regular and
rapid release channels. The last known good version 1.31.1-gke.1146000 is not
available to try a downgrade.
The number of FAIme jobs has reached 30.000. Yeah!
At the end of this November the FAIme web service for building customized ISOs turns 7 years old.
It had reached 10.000 jobs in March 2021 and 20.000 jobs were reached in
June 2023. A nice increase in usage.
Here are some statistics for the jobs processed in 2024:
Type of jobs
  cloud image    3%
  live ISO      11%
  install ISO   86%

Distribution
  bullseye       2%
  trixie         8%
  ubuntu 24.04  12%
  bookworm      78%
Misc
18% used a custom postinst script
11% provided their ssh pub key for passwordless root login
50% of the jobs didn't include a desktop environment at
all; the others mostly used GNOME, XFCE, KDE or the Ubuntu desktop.
The biggest ISO was a FAIme job which created a live ISO with a desktop and some additional packages.
This job took 30 min to finish and the resulting ISO was 18G in size.
Execution Times
The cloud and live ISOs need more time for their creation because the
FAIme server needs to unpack and install all packages. For the install
ISO the packages are only downloaded. The number of software
packages also affects the build time.
Every ISO is built in a VM on an old 6-core E5-1650 v2.
Times given are calculated from the jobs of the past two weeks.
Job type             Avg     Max
install no desktop   1 min   2 min
install GNOME        2 min   5 min
The times for Ubuntu without and with desktop are one minute higher than those mentioned above.
Job type             Avg     Max
live no desktop      4 min   6 min
live GNOME           8 min   11 min
The times for cloud images are similar to live images.
A New Feature
For a few weeks now, the system has been showing the number of jobs
ahead of you in the queue when you submit a job that cannot be
processed immediately.
The Next Milestone
At the end of this year the FAI project will be 25 years old.
If you have a success story of your FAI usage to share please post it
to the linux-fai mailing list or send it to me.
Do you know the FAI questionnaire? A lot of
reports are already available.
Here's an overview of what happened in the past 20 years in the FAI
project.
About FAIme
FAIme is the service for building your own customized ISO via a web
interface. You can create an installation or live ISO or a cloud
image. Several Debian releases can be selected and also Ubuntu
server or Ubuntu desktop installation ISOs can be customized.
Multiple options are available like selecting a desktop and the language, adding your own package
list, choosing a partition layout, adding a user, choosing a backports
kernel, adding a postinst script and some more.
This looks straightforward and is far from it. I expect tool support will
improve in the future. Meanwhile, this blog post serves as a step-by-step
explanation of what is going on in code that I'm about to push to my team.
Let's take this relatively straightforward Python code. It has a function
printing an int, and a decorator that makes its argument optional, taking it
from a global default if missing:
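A minimal version of that code, reconstructed from the later examples (before functools.wraps and typing are added):

```python
from unittest import mock

default = 42


def with_default(f):
    # Decorator: make the wrapped method's argument optional,
    # falling back to the global default when it is omitted
    def wrapped(self, value=None):
        if value is None:
            value = default
        return f(self, value)

    return wrapped


class Fiddle:
    @with_default
    def print(self, value):
        assert value is not None
        print("Answer:", value)


fiddle = Fiddle()
fiddle.print(12)  # Answer: 12
fiddle.print()    # Answer: 42


def mocked(self, value=None):
    print("Mocked answer:", value)


# autospec picks up wrapped's (self, value=None) signature, so both calls work
with mock.patch.object(Fiddle, "print", autospec=True, side_effect=mocked):
    fiddle.print(12)  # Mocked answer: 12
    fiddle.print()    # Mocked answer: None
```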
It lacks functools.wraps and typing, though. Let's add them.
Adding functools.wraps
After adding a simple @functools.wraps, mock unexpectedly stops working:
# python3 test1.py
Answer: 12
Answer: 42
Mocked answer: 12
Traceback (most recent call last):
File "/home/enrico/lavori/freexian/tt/test1.py", line 42, in <module>
fiddle.print()
File "<string>", line 2, in print
File "/usr/lib/python3.11/unittest/mock.py", line 186, in checksig
sig.bind(*args, **kwargs)
File "/usr/lib/python3.11/inspect.py", line 3211, in bind
return self._bind(args, kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/inspect.py", line 3126, in _bind
raise TypeError(msg) from None
TypeError: missing a required argument: 'value'
This is the new code, with explanations and a fix:
# Introduce functools
import functools
from unittest import mock

default = 42


def with_default(f):
    @functools.wraps(f)
    def wrapped(self, value=None):
        if value is None:
            value = default
        return f(self, value)

    # Fix:
    # del wrapped.__wrapped__

    return wrapped


class Fiddle:
    @with_default
    def print(self, value):
        assert value is not None
        print("Answer:", value)


fiddle = Fiddle()
fiddle.print(12)
fiddle.print()


def mocked(self, value=None):
    print("Mocked answer:", value)


with mock.patch.object(Fiddle, "print", autospec=True, side_effect=mocked):
    fiddle.print(12)
    # mock's autospec uses inspect.getsignature, which follows __wrapped__ set
    # by functools.wraps, which points to a wrong signature: the idea that
    # value is optional is now lost
    fiddle.print()
Adding typing
For simplicity, from now on let's change Fiddle.print to match its wrapped signature:
# Give up with making value not optional, to simplify things :(
def print(self, value: int | None = None) -> None:
    assert value is not None
    print("Answer:", value)
Typing with ParamSpec
# Introduce typing, try with ParamSpec
import functools
from typing import TYPE_CHECKING, ParamSpec, Callable
from unittest import mock

default = 42

P = ParamSpec("P")


def with_default(f: Callable[P, None]) -> Callable[P, None]:
    # Using ParamSpec we forward arguments, but we cannot use them!
    @functools.wraps(f)
    def wrapped(self, value: int | None = None) -> None:
        if value is None:
            value = default
        return f(self, value)

    return wrapped


class Fiddle:
    @with_default
    def print(self, value: int | None = None) -> None:
        assert value is not None
        print("Answer:", value)
mypy complains inside the wrapper, because while we forward arguments we don't
constrain them, so we can't be sure there is a value in there:
test2.py:17: error: Argument 2 has incompatible type "int"; expected "P.args" [arg-type]
test2.py:19: error: Incompatible return value type (got "_Wrapped[P, None, [Any, int | None], None]", expected "Callable[P, None]") [return-value]
test2.py:19: note: "_Wrapped[P, None, [Any, int | None], None].__call__" has type "Callable[[Arg(Any, 'self'), DefaultArg(int | None, 'value')], None]"
Typing with Callable
We can use explicit Callable argument lists:
# Introduce typing, try with Callable
import functools
from typing import TYPE_CHECKING, Callable, TypeVar
from unittest import mock

default = 42

A = TypeVar("A")


# Callable cannot represent the fact that the argument is optional, so now mypy
# complains if we try to omit it
def with_default(f: Callable[[A, int | None], None]) -> Callable[[A, int | None], None]:
    @functools.wraps(f)
    def wrapped(self: A, value: int | None = None) -> None:
        if value is None:
            value = default
        return f(self, value)

    return wrapped


class Fiddle:
    @with_default
    def print(self, value: int | None = None) -> None:
        assert value is not None
        print("Answer:", value)


if TYPE_CHECKING:
    reveal_type(Fiddle.print)

fiddle = Fiddle()
fiddle.print(12)
# !! Too few arguments for "print" of "Fiddle"  [call-arg]
fiddle.print()


def mocked(self, value=None):
    print("Mocked answer:", value)


with mock.patch.object(Fiddle, "print", autospec=True, side_effect=mocked):
    fiddle.print(12)
    fiddle.print()
Now mypy complains when we try to omit the optional argument, because Callable
cannot represent optional arguments:
test3.py:32: note: Revealed type is "def (test3.Fiddle, Union[builtins.int, None])"
test3.py:37: error: Too few arguments for "print" of "Fiddle" [call-arg]
test3.py:46: error: Too few arguments for "print" of "Fiddle" [call-arg]
Callable cannot express complex signatures such as functions that take a
variadic number of arguments, overloaded functions, or functions that have
keyword-only parameters. However, these signatures can be expressed by
defining a Protocol class with a __call__() method:
Let's do that!
Typing with Protocol, take 1
# Introduce typing, try with Protocol
import functools
from typing import TYPE_CHECKING, Protocol, TypeVar, Generic, cast
from unittest import mock

default = 42

A = TypeVar("A", contravariant=True)


class Printer(Protocol, Generic[A]):
    def __call__(_, self: A, value: int | None = None) -> None:
        ...


def with_default(f: Printer[A]) -> Printer[A]:
    @functools.wraps(f)
    def wrapped(self: A, value: int | None = None) -> None:
        if value is None:
            value = default
        return f(self, value)

    return cast(Printer, wrapped)


class Fiddle:
    # function has a __get__ method to generate bound versions of itself
    # the Printer protocol does not define it, so mypy is now unable to type
    # the bound method correctly
    @with_default
    def print(self, value: int | None = None) -> None:
        assert value is not None
        print("Answer:", value)


if TYPE_CHECKING:
    reveal_type(Fiddle.print)

fiddle = Fiddle()
# !! Argument 1 to "__call__" of "Printer" has incompatible type "int"; expected "Fiddle"
fiddle.print(12)
fiddle.print()


def mocked(self, value=None):
    print("Mocked answer:", value)


with mock.patch.object(Fiddle, "print", autospec=True, side_effect=mocked):
    fiddle.print(12)
    fiddle.print()
New mypy complaints:
test4.py:41: error: Argument 1 to "__call__" of "Printer" has incompatible type "int"; expected "Fiddle" [arg-type]
test4.py:42: error: Missing positional argument "self" in call to "__call__" of "Printer" [call-arg]
test4.py:50: error: Argument 1 to "__call__" of "Printer" has incompatible type "int"; expected "Fiddle" [arg-type]
test4.py:51: error: Missing positional argument "self" in call to "__call__" of "Printer" [call-arg]
What happens with class methods is that the function object has a __get__
method that generates bound versions of itself. Our Printer protocol does not
define it, so mypy is now unable to type the bound method correctly.
Typing with Protocol, take 2
So... we add the function descriptor methods to our Protocol!
# Introduce typing, try with Protocol, harder!
import functools
from typing import TYPE_CHECKING, Protocol, TypeVar, Generic, cast, overload, Union
from unittest import mock

default = 42

A = TypeVar("A", contravariant=True)


# We now produce typing for the whole function descriptor protocol
#
# See https://github.com/python/typing/discussions/1040


class BoundPrinter(Protocol):
    """Protocol typing for bound printer methods."""

    def __call__(_, value: int | None = None) -> None:
        """Bound signature."""


class Printer(Protocol, Generic[A]):
    """Protocol typing for printer methods."""

    # noqa annotations are overrides for flake8 being confused, giving either D418:
    #   Function/ Method decorated with @overload shouldn't contain a docstring
    # or D105:
    #   Missing docstring in magic method
    #
    # F841 is for vulture being confused:
    #   unused variable 'objtype' (100% confidence)

    @overload
    def __get__(  # noqa: D105
        self, obj: A, objtype: type[A] | None = None  # noqa: F841
    ) -> BoundPrinter:
        ...

    @overload
    def __get__(  # noqa: D105
        self, obj: None, objtype: type[A] | None = None  # noqa: F841
    ) -> "Printer[A]":
        ...

    def __get__(
        self, obj: A | None, objtype: type[A] | None = None  # noqa: F841
    ) -> Union[BoundPrinter, "Printer[A]"]:
        """Implement function descriptor protocol for class methods."""

    def __call__(_, self: A, value: int | None = None) -> None:
        """Unbound signature."""


def with_default(f: Printer[A]) -> Printer[A]:
    @functools.wraps(f)
    def wrapped(self: A, value: int | None = None) -> None:
        if value is None:
            value = default
        return f(self, value)

    return cast(Printer, wrapped)


class Fiddle:
    # function has a __get__ method to generate bound versions of itself
    # the Printer protocol now defines it, so mypy can type the bound method
    # correctly
    @with_default
    def print(self, value: int | None = None) -> None:
        assert value is not None
        print("Answer:", value)


fiddle = Fiddle()
fiddle.print(12)
fiddle.print()


def mocked(self, value=None):
    print("Mocked answer:", value)


with mock.patch.object(Fiddle, "print", autospec=True, side_effect=mocked):
    fiddle.print(12)
    fiddle.print()
Oddly enough, as I parked the car, Allie Sherlock's first single was playing
on the radio. I photographed
Allie Sherlock and Zoe Clark four months ago. We can look forward to
the day when Fergus's hit Roadkill comes on the radio while driving.
Again this year, Arm offered to
host us for
a mini-debconf
in Cambridge. Roughly 60 people turned up on 10-13 October to the Arm
campus, where they made us really welcome. They even had some
Debian-themed treats made to spoil us!
Hacking together
For the first two days, we had a "mini-debcamp" with a disparate
group of people working on all sorts of things: Arm support, live
images, browser stuff, package uploads, etc. And (as is traditional)
lots of people doing last-minute work to prepare slides for their
talks.
Sessions and talks
Saturday and Sunday were two days devoted to more traditional
conference sessions. Our talks covered a typical range of Debian
subjects: a DPL "Bits" talk, an update from the Release Team, live
images. We also had some wider topics: handling your own data, what to
look for in the upcoming Post-Quantum Crypto world, and even me
talking about the ups and downs of Secure Boot. Plus a random set of
lightning talks too! :-)
Video team awesomeness
Lots of volunteers from the DebConf video team were on hand too
(both on-site and remotely!), so our talks were both streamed live and
recorded for posterity - see the links from the individual talk pages
in the wiki,
or http://meetings-archive.debian.net/pub/debian-meetings/2024/MiniDebConf-Cambridge/
for the full set if you'd like to see more.
A great time for all
Again, the mini-conf went well and feedback from attendees was very
positive. Thanks to all our helpers, and of course to our
sponsor: Arm for providing the venue
and infrastructure for the event, and all the food and drink too!
Photo credits: Andy Simpkins, Mark Brown, Jonathan Wiltshire. Thanks!
Late last month there was an announcement of a “severity 9.9 vulnerability” allowing remote code execution that affects “all GNU/Linux systems (plus others)” [1]. For something to affect all Linux systems that would have to be either a kernel issue or a sshd issue. The announcement included complaints about the lack of response of vendors and “And YES: I LOVE hyping the sh1t out of this stuff because apparently sensationalism is the only language that forces these people to fix”.
He seems to have a different experience to me of reporting bugs, I have had plenty of success getting bugs fixed without hyping them. I just report the bug, wait a while, and it gets fixed. I have reported potential security bugs without even bothering to try and prove that they were exploitable (any situation where you can make a program crash is potentially exploitable), I just report it and it gets fixed. I was very dubious about his ability to determine how serious a bug is and to accurately report it so this wasn’t a situation where I was waiting for it to be disclosed to discover if it affected me. I was quite confident that my systems wouldn’t be at any risk.
Analysis
Not All Linux Systems Run CUPS
When it was published my opinion was proven to be correct, it turned out to be a series of CUPS bugs [2]. To describe that as “all GNU/Linux systems (plus others)” seems like a vast overstatement, maybe a good thing to say if you want to be a TikTok influencer but not if you want to be known for computer security work.
For the Debian distribution the cups-browsed package (which seems to be the main exploitable one) is recommended by cups-daemon; as I have my Debian systems configured to not install recommended packages by default, it wasn’t installed on any of my systems. Also the vast majority of my systems don’t do printing and therefore don’t have any part of CUPS installed.
CUPS vs NAT
The next issue is that in Australia most home ISPs don’t have IPv6 enabled and CUPS doesn’t do the things needed to allow receiving connections from the outside world via NAT with IPv4. If inbound port 631 is blocked on both TCP and UDP, as is the default on Australian home Internet, or if there is a correctly configured firewall in place, then the network is safe from attack. There is a feature called UPnP port forwarding [3] that allows server programs to ask a router to send inbound connections to them; this is apparently usually turned off by default in router configuration. If it is enabled then there are Debian packages of software to manage this: the miniupnpc package has the client (which can request NAT changes on the router) [4]. That package is not installed on any of my systems and for my home network I don’t use a router that runs UPnP.
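As an aside, the client from the miniupnpc package can be used to check whether a router answers UPnP requests at all and which port mappings it currently has (a sketch; it only returns anything on a UPnP-enabled network):

```shell
$ sudo apt install miniupnpc
$ upnpc -l
```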
The only program I knowingly run that uses UPnP is Warzone 2100, and as I don’t play network games that doesn’t happen. Also, as an aside, in version 4.4.2-1 of warzone2100 in Debian and Ubuntu I made it use Bubblewrap to run the game in a container. So a Remote Code Execution bug in Warzone 2100 won’t be an immediate win for an attacker (exploits via X11 or Wayland are another issue).
To check SE Linux access I first use the “semanage fcontext” command to check the context of the binary, cupsd_exec_t means that the daemon runs as cupsd_t. Then I checked what file access is granted with the sesearch program, mostly just access to temporary files, cupsd config files, the faillog, the Kerberos cache files (not used on the Kerberos client systems I run), Samba run files (might be a possibility of exploiting something there), and the security_t used for interfacing with kernel security infrastructure. I then checked the access to the security class and found that it is permitted to check contexts and access-vectors – not access that can be harmful.
The next test was to use sesearch to discover what capabilities are granted, which unfortunately includes the sys_admin capability, that is a capability that allows many sysadmin tasks that could be harmful (I just checked the Fedora source and Fedora 42 has the same access). Whether the sys_admin capability can be used to do bad things with the limited access cupsd_t has to device nodes etc is not clear. But this access is undesirable.
So the SE Linux policy in Debian and Fedora will stop cupsd_t from writing SETUID programs that can be used by random users for root access and stop it from writing to /etc/shadow etc. But the sys_admin capability might allow it to do hostile things and I have already uploaded a changed policy to Debian/Unstable to remove that. The sys_rawio capability also looked concerning but it’s apparently needed to probe for USB printers and as the domain has no access to block devices it is otherwise harmless. Below are the commands I used to discover what the policy allows and the output from them.
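The commands in question are along these lines (semanage is in policycoreutils-python-utils, sesearch in setools; the output is not reproduced here since it depends on the installed policy version):

```shell
# semanage fcontext -l | grep cups
# sesearch -A -s cupsd_t -c file -p write
# sesearch -A -s cupsd_t -c security
# sesearch -A -s cupsd_t -c capability
```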
This is an example of how not to handle security issues. Some degree of promotion is acceptable but this is very excessive and will result in people not taking security announcements seriously in future. I wonder if this is even a good career move by the researcher in question, will enough people believe that they actually did something good in this that it outweighs the number of people who think it’s misleading at best?
Whilst researching what synth to buy, I learned of the Behringer1
Model-D2: a 2018 clone of the 1970 Moog Minimoog, in a desktop form
factor.
Behringer Model-D
In common with the original Minimoog, it's a monophonic analogue synth,
featuring three audible oscillators3 , Moog's famous 12-ladder filter and
a basic envelope generator. The model-d has lost the keyboard from the
original and added some patch points for the different stages, enabling
some slight re-routing of the audio components.
1970 Moog Minimoog
Since I was focussing on more fundamental, back-to-basics
instruments,
this was very appealing to me. I'm very curious to find out what's so compelling
about the famous Moog sound. The relative lack of features feels like an
advantage: less to master. The additional patch points make it a little
more flexible and offer a potential gateway into the world of modular synthesis.
The Model-D is also very affordable: about £200. I'll never
own a real Moog.
For this to work, I would need to supplement it with some other equipment.
I'd need a keyboard (or press the Micron into service as a controller); I
would want some way of recording and overdubbing (same as with any synth).
There are no post-mix effects on the Model-D, such as delay, reverb or
chorus, so I may also want something to add those.
What stopped me was partly the realisation that there was little chance that a
perennial beginner, such as I, could eke anything novel out of a synthesiser
design that's 54 years old. Perhaps that shouldn't matter, but it gave me
pause. Whilst the Model-D has patch points, I don't have anything to connect
to them, and I'm firmly wanting to avoid the Modular Synthesis money pit.
The lack of effects and polyphony could make it hard to live-sculpt a tone.
I started characterizing the Model-D as the "heart" choice, but it seemed
wise to instead go for a "head" choice.
Maybe another day!
There's a whole other blog post of material I could write about
Behringer and their clones of classic synths, some long out of production,
and others, not so much. But, I decided to skip on that for now.↩
taken from the fact that the Minimoog was a productised version
of Moog's fourth internal prototype, the model D.↩
What benefits do these things offer when a general purpose computer can do so
many things nowadays? Is there a USB keyboard that you can connect to a
laptop or phone to do these things? I presume that all recent phones have the
compute power to do all the synthesis you need if you have the right
software. Is it just a lack of software and infrastructure for doing it on
laptops/phones that makes synthesisers still viable?
I've decided to turn my response into a post of its own.
The issue is definitely not compute power. You can indeed attach a USB keyboard
to a computer and use a plethora of software synthesisers, including very
faithful emulations of all the popular classics. The raw compute power of
modern hardware synths is comparatively small: I’ve been told the modern Korg
digital synths are on a par with a Raspberry Pi. I’ve seen some DSPs which are
32-bit ARMs, and other tools which are roughly equivalent to Arduinos.
I can think of four reasons hardware synths remain popular with some despite
the above:
As I touched on in my original synth post, computing dominates my
life outside of music already. I really wanted something separate from
that to keep mental distance from work.
Synths have hard real-time requirements. They don't have raw power in
compute terms, but they absolutely have to do their job within microseconds
of being instructed to, with no exceptions. Linux still has a long way to go
for hard real-time.
The Linux audio ecosystem is… complex. Dealing with pipewire, pulseaudio,
jack, alsa, oss, and anything else I've forgotten, as well as their failure
modes, is too time consuming.
The last point is to do with creativity and inspiration. A good synth is
more than the sum of its parts: it's an instrument, carefully designed and
its components integrated by musically-minded people who have set out to
create something to inspire. There are plenty of synths which aren't good
instruments, but have loads of features: they’re boxes of "stuff". Good
synths can't do it all: they often have limitations which you have to
respond to, work around or with, creatively. This was expressed better than
I could by Trent Reznor in the video archetype of a synthesiser:
I nearly did, but ultimately I didn't buy an Arturia Microfreak.
The Microfreak is a small form factor hybrid synth with a distinctive style.
It's priced at the low end of the market and it is overflowing with features.
It has a weird 2-octave keyboard which is a stylophone-style capacitive strip
rather than weighted keys. It seems to have plenty of controls, but given the
amount of features it has, much of that functionality is inevitably buried in
menus. The important stuff is front and centre, though. The digital
oscillators are routed through an analog filter. The Microfreak gained sampler
functionality in a firmware update that surprised and delighted its owners.
I watched a load of videos about the Microfreak, but the above review from
musician Stimming stuck
in my mind because it made a comparison between the Microfreak and Teenage
Engineering's OP-1.
The Teenage Engineering OP-1.
I'd been lusting after the OP-1 since it appeared in 2011: a
pocket-sized1 music making machine with eleven synthesis engines, a
sampler, and less conventional features such as an FM radio, a large colour
OLED display, and a four track recorder. That last feature in particular was
really appealing to me: I loved the idea of having an all-in-one machine to try
and compose music. Even then, I was not keen on involving conventional
computers in music making.
Of course, in many ways it is a very compromised machine. I never did buy an
OP-1, and by now they've replaced it with a new model (the OP-1 field)
that costs 50% more (but doesn't seem to do 50% more). I'm still not buying one.
Framing the Microfreak in terms of the OP-1 made the penny drop for me.
The Microfreak doesn't have the four-track functionality, but almost no synth
has: I'm going to have to look at something external to provide that. But it
might capture a similar sense of fun; it's something I could use on the sofa,
in the spare room, on the train, during lunchbreaks at work, etc.
So I didn't buy the Microfreak. Maybe one day in the future once I'm further
down the road. Instead, I started to concentrate my search on more fundamental,
back-to-basics instruments…
A new minor release of the drat package
arrived on CRAN today, which is
just over a year since the previous release. drat stands for
drat R Archive Template, and helps with easy-to-create and
easy-to-use repositories for R packages. Since its inception in
early 2015 it has found reasonably widespread adoption among R users
because repositories with marked releases are the better way to
distribute code.
Because for once it really is as your mother told you: Friends
don’t let friends install random git commit snapshots. Properly
rolled-up releases it is. Just as CRAN shows us: a model that has
demonstrated for over two-and-a-half decades how to do this.
And you can too: drat is easy to use, documented by six
vignettes and just works. Detailed information about
drat is at its documentation site. That
said, and ‘these days’, if you mainly care about GitHub code then r-universe is there too, also
offering the binaries it makes and all that jazz. But sometimes you just
want to, or need to, roll a local repository and drat can help
you there.
This release contains a small PR (made by Arne Holmin just after the
previous release) adding support for an ‘OSflavour’ variable (helpful
for macOS). We also corrected an issue with one test file being
insufficiently careful of using git2r only when installed,
and as usual did a round of maintenance for the package concerning both
continuous integration and documentation.
The NEWS file summarises the release as follows:
Changes in drat
version 0.2.5 (2024-10-21)
Function insertPackage has a new optional argument
OSflavour (Arne Holmin in #142)
A test file conditions correctly about git2r being present (Dirk)
Several smaller packaging updates and enhancements to continuous
integration and documentation have been added (Dirk)
I'm in the unlucky position of having to deal with GitHub. Thus
I have a terraform module in a project which deals with
populating organization secrets in our GitHub organization, and
assigning repositories access to those secrets.
Since the GitHub terraform provider internally works mostly
with repository IDs, not slugs (the human-readable
organization/repo format), we have to do some mapping in between.
In my case it looks like this:
# tfvars input for the module
org_secrets = {
  "SECRET_A" = {
    repos = [
      "infra-foo",
      "infra-baz",
      "deployment-foobar",
    ]
  }
  "SECRET_B" = {
    repos = [
      "job-abc",
      "job-xyz",
    ]
  }
}
# Module Code
/*
Limitation: The GH search API which is queried returns at most 1000
results. Thus whenever we reach that limit this approach will no longer work.
The query is also intentionally limited to internal repositories right now.
*/
data "github_repositories" "repos" {
  query           = "org:myorg archived:false -is:public -is:private"
  include_repo_id = true
}
/*
The properties of the github_repositories.repos data source queried
above contains only lists. Thus we've to manually establish a mapping
between the repository names we need as a lookup key later on, and the
repository id we got in another list from the search query above.
*/
locals {
  # Assemble the set of repository names we need repo_ids for
  repos = toset(flatten([for v in var.org_secrets : v.repos]))

  # Walk through all names in the query result list and check
  # if they're also in our repo set. If yes, add the repo name -> id
  # mapping to our resulting map
  repos_and_ids = {
    for i, v in data.github_repositories.repos.names : v => data.github_repositories.repos.repo_ids[i]
    if contains(local.repos, v)
  }
}
resource "github_actions_organization_secret" "org_secrets" {
  for_each = var.org_secrets

  secret_name = each.key
  visibility  = "selected"
  # the logic for how the secret value is sourced is omitted here
  plaintext_value = data.xxx
  selected_repository_ids = [
    for r in each.value.repos : local.repos_and_ids[r]
    if can(local.repos_and_ids[r])
  ]
}
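To make the mapping logic in the locals block easier to follow, here is a minimal Python sketch of the same zip-and-filter idea, using made-up repository names and IDs (this is purely illustrative and not part of the module):

```python
# Hypothetical stand-in data mimicking the github_repositories data source,
# which returns two parallel lists: repository names and repository IDs.
names = ["infra-foo", "infra-baz", "deployment-foobar", "unrelated-repo"]
repo_ids = [101, 102, 103, 104]

# The set of repository names we actually need IDs for
# (the flattened "repos" lists from org_secrets).
needed = {"infra-foo", "infra-baz", "deployment-foobar"}

# Same idea as the HCL for-expression: walk both lists in lockstep
# and keep only the names we care about.
repos_and_ids = {
    name: rid
    for name, rid in zip(names, repo_ids)
    if name in needed
}

print(repos_and_ids)
# -> {'infra-foo': 101, 'infra-baz': 102, 'deployment-foobar': 103}
```

The repository that is not referenced by any secret simply drops out of the resulting map, exactly as in the HCL version.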
Now if we do something bad, say delete a repository but forget to remove it
from the configuration for the module, we receive an error message that a (numeric)
repository ID could not be found. That is pretty much useless for the average user,
because you have to figure out which repository is still in the configuration list
but was deleted recently.
Luckily, since version 1.2 terraform supports precondition
checks, which we can use in an output block
to report which repository is missing. What we
need is the set of missing repositories and the validation condition:
locals {
  # Debug facility in combination with an output and precondition check.
  # Here we can report which repositories we still have in our configuration
  # but no longer get as a result from the data provider query
  missing_repos = setsubtract(local.repos, data.github_repositories.repos.names)
}

# Debug facility - if we cannot find every repository in our
# search query result, report those repos as an error
output "missing_repos" {
  value = local.missing_repos

  precondition {
    condition     = length(local.missing_repos) == 0
    error_message = format("Repos in config missing from resultset: %v", local.missing_repos)
  }
}
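The validation boils down to a set subtraction plus a check that the result is empty. A tiny Python sketch of the same idea, again with made-up data rather than anything from the real module:

```python
# Hypothetical stand-in data: what the configuration asks for versus
# what the search query actually returned.
configured = {"infra-foo", "infra-baz", "job-abc"}
found_by_query = {"infra-foo", "job-abc"}

# Same idea as setsubtract() in the locals block: anything configured
# but not returned by the query is reported by name, instead of
# surfacing later as an opaque numeric-ID lookup error.
missing_repos = configured - found_by_query

if missing_repos:
    message = f"Repos in config missing from resultset: {sorted(missing_repos)}"
else:
    message = ""

print(message)
# -> Repos in config missing from resultset: ['infra-baz']
```

Reporting names rather than IDs is the whole point: the person who deleted the repository can see immediately which configuration entry is stale.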
Now you only have to be aware that GitHub is GitHub, and the TF provider has open bugs
but is not supported by GitHub, so you will encounter
inconsistent results. But
it works, even if your terraform apply occasionally fails that way.
As usual with these every-two-year posts, probably of direct interest only
to California residents. Maybe the more obscure things we're voting on
will be a minor curiosity to people elsewhere. I'm a bit late this year,
although not as late as last year, so a lot of people may have already
voted, but I've been doing this for a while and wanted to keep it up.
This post will only be about the ballot propositions. I don't have
anything useful to say about the candidates that isn't hyper-local. I
doubt anyone who has read my posts will be surprised by which candidates
I'm voting for.
As always with California ballot propositions, it's worth paying close
attention to which propositions were put on the ballot by the legislature,
usually because there's some state law requirement (often that I disagree
with) that they be voted on by the public, and propositions that were put
on the ballot by voter petition. The latter are often poorly written and
have hidden problems. As a general rule of thumb, I tend to default to
voting against propositions added by petition. This year, one can
conveniently distinguish by number: the single-digit propositions were
added by the legislature, and the two-digit ones were added by petition.
Proposition 2: YES. Issue $10 billion in bonds for public school
infrastructure improvements. I generally vote in favor of spending
measures like this unless they have some obvious problem. The opposition
argument is a deranged rant against immigrants and government debt and
fails to point out actual problems. The opposition argument also claims
this will result in higher property taxes and, seriously, if only that
were true. That would make me even more strongly in favor of it.
Proposition 3: YES. Enshrines the right to marriage without
regard to sex or race into the California state constitution. This is
already the law given US Supreme Court decisions, but fixing California
state law is a long-overdue and obvious cleanup step. One of the quixotic
things I would do if I were ever in government, which I will never be,
would be to try to clean up the laws to make them match reality, repealing
all of the dead clauses that were overturned by court decisions or are
never enforced. I am in favor of all measures in this direction even when
I don't agree with the direction of the change; here, as a bonus, I also
strongly agree with the change.
Proposition 4: YES. Issue $10 billion in bonds for
infrastructure improvements to mitigate climate risk. This is basically
the same argument as Proposition 2. The one drawback of this measure is
that it's kind of a mixed grab bag of stuff and probably some of it should
be supported out of the general budget rather than bonds, but I consider
this a minor problem. We definitely need to ramp up climate risk
mitigation efforts.
Proposition 5: YES. Reduces the required super-majority to pass
local bond measures for affordable housing from 67% to 55%. The fact that
this requires a supermajority at all is absurd, California desperately
needs to build more housing of any kind however we can, and publicly
funded housing is an excellent idea.
Proposition 6: YES. Eliminates "involuntary servitude" (in other
words, "temporary" slavery) as a legally permissible punishment for crimes
in the state of California. I'm one of the people who think the 13th
Amendment to the US Constitution shouldn't have an exception for
punishment for crimes, so obviously I'm in favor of this. This is one
very, very tiny step towards improving the absolutely atrocious prison
conditions in the state.
Proposition 32: YES. Raises the minimum wage to $18 per hour
from the current $16 per hour, over two years, and ties it to inflation.
This is one of the rare petition-based propositions that I will vote in
favor of because it's very straightforward, we clearly should be raising
the minimum wage, and living in California is absurdly expensive because
we refuse to build more housing (see Propositions 5 and 33). The
opposition argument is the standard lie that a higher minimum wage will
increase unemployment, which we know from numerous other natural
experiments is simply not true.
Proposition 33: NO. Repeals Costa-Hawkins, which prohibits local
municipalities from enacting rent control on properties built after 1995.
This one is going to split the progressive vote rather badly, I suspect.
California has a housing crisis caused by not enough housing supply. It
is not due to vacant housing, as much as some people would like you to
believe that; the numbers just don't add up. There are way more people
living here and wanting to live here than there is housing, so we need to
build more housing.
Rent control serves a valuable social function of providing stability to
people who already have housing, but it doesn't help, and can hurt, the
project of meeting actual housing demand. Rent control alone
creates a two-tier system where people who have housing are protected but
people who don't have housing have an even harder time getting housing
than they do today. It's therefore quite consistent with the general
NIMBY playbook of trying to protect the people who already have housing by
making life harder for the people who do not, while keeping the housing
supply essentially static.
I am in favor of rent control in conjunction with real measures to
increase the housing supply. I am therefore opposed to this proposition,
which allows rent control without any effort to increase housing supply.
I am quite certain that, if this passes, some municipalities will use it
to make constructing new high-density housing incredibly difficult by
requiring it all be rent-controlled low-income housing, thus cutting off
the supply of multi-tenant market-rate housing entirely. This is already
a common political goal in the part of California where I live. Local
neighborhood groups advocate for exactly this routinely in local political
fights.
Give me a mandate for new construction that breaks local zoning
obstructionism, including new market-rate housing to maintain a healthy
lifecycle of housing aging into affordable housing as wealthy people move
into new market-rate housing, and I will gladly support rent control
measures as part of that package. But rent control on its own just
allocates winners and losers without addressing the underlying problem.
Proposition 34: NO. This is an excellent example of why I vote
against petition propositions by default. This is a law designed to
affect exactly one organization in the state of California: the AIDS
Healthcare Foundation. The reason for this targeting is disputed; one
side claims it's because of the AHF support for Proposition 33, and
another side claims it's because AHF is a slumlord abusing California
state funding. I have no idea which side of this is true. I also don't
care, because I am fundamentally opposed to writing laws this way. Laws
should establish general, fair principles that are broadly applicable, not
be written with bizarrely specific conditions (health care providers that
operate multifamily housing) that will only be met by a single
organization. This kind of nonsense creates bad legal codes and the legal
equivalent of technical debt. Just don't do this.
Proposition 35: YES. I am, reluctantly, voting in favor of this
even though it is a petition proposition because it looks like a useful
simplification and cleanup of state health care funding, makes an expiring
tax permanent, and is supported by a very wide range of organizations that
I generally trust to know what they're talking about. No opposition
argument was filed, which I think is telling.
Proposition 36: NO. I am resigned to voting down attempts to
start new "war on drugs" nonsense for the rest of my life because the
people who believe in this crap will never, ever, ever stop. This one has
bonus shoplifting fear-mongering attached, something that touches on nasty
local politics that have included large retail chains manipulating crime
report statistics to give the impression that shoplifting is up
dramatically. It's yet another round of the truly horrific California
"three strikes" criminal penalty obsession, which completely
misunderstands both the causes of crime and the (almost nonexistent)
effectiveness of harsh punishment as deterrence.
Ada Lovelace Day was
celebrated on October 8 in 2024, and on this occasion, to celebrate and
raise awareness of the contributions of women to the STEM fields, we
interviewed some of the women in Debian.
Here we share their thoughts, comments, and concerns with the hope of inspiring
more women to become part of the Sciences, and of course, to work inside of
Debian.
This article was simulcast to the debian-women mailing list.
Beatrice Torracca
1. Who are you?
I am Beatrice, I am Italian. Internet technology and everything computer-related
is just a hobby for me, not my line of work or the subject of my academic
studies. I have too many interests and too little time. I would like to do lots
of things and at the same time I am too Oblomovian to do any.
2. How did you get introduced to Debian?
As a user I started using newsgroups when I had my first dialup connection and
there was always talk about this strange thing called
Linux. Since moving from DR DOS to Windows was a shock
for me, feeling like I lost the control of my machine, I tried Linux with
Debian Potato and I never strayed
away from Debian since then for my personal equipment.
3. How long have you been into Debian?
Define "into". As a user... since Potato, too many years to count. As a
contributor, a similar amount of time, since early 2000 I think. My first
archived email about contributing to the translation of the descriptions of
Debian packages dates from 2001.
4. Are you using Debian in your daily life? If yes, how?
Yes!! I use testing. I have it on my desktop PC at home and I have it on my
laptop. The desktop is where I have a local IMAP server that fetches all the
mails of my email accounts, and where I sync and back up all my data. On both I
do day-to-day stuff (from email to online banking, from shopping to taxes), all
forms of entertainment, a bit of work if I have to work from home
(GNU R for statistics,
LibreOffice... the usual suspects). At work I am
required to have another OS, sadly, but I am working on setting up a
Debian Live system to use there too.
Plus if at work we start doing bioinformatics there might be a Linux machine in
our future... I will of course suggest and hope for a Debian system.
5. Do you have any suggestions to improve women's participation in Debian?
This is a tough one. I am not sure. Maybe, more visibility for the women already
in the Debian Project, and make the newcomers feel seen, valued and welcomed. A
respectful and safe environment is key too, of course, but I think Debian made
huge progress in that aspect with the
Code of Conduct. I am a big fan of
promoting diversity and inclusion; there is always room for improvement.
Ileana Dumitrescu (ildumi)
1. Who are you?
I am just a girl in the world who likes cats and packaging
Free Software.
2. How did you get introduced to Debian?
I was tinkering with a computer running Debian a few years ago, and I decided to
learn more about Free Software. After a search or two, I found
Debian Women.
3. How long have you been into Debian?
I started looking into contributing to Debian in 2021. After contacting Debian
Women, I received a lot of information and helpful advice on different ways I
could contribute, and I decided package maintenance was the best fit for me. I
eventually became a Debian Maintainer in 2023, and I continue to maintain a few
packages in my spare time.
4. Are you using Debian in your daily life? If yes, how?
Yes, it is my favourite GNU/Linux operating system! I use it for email,
chatting, browsing, packaging, etc.
5. Do you have any suggestions to improve women's participation in Debian?
The mailing list for Debian Women may
attract more participation if it is utilized more. It is where I started, and I
imagine participation would increase if it is more engaging.
Kathara Sasikumar (kathara)
1. Who are you?
I'm Kathara Sasikumar, 22 years old and a recent Debian user turned Maintainer
from India. I try to become a creative person through sketching or playing
guitar chords, but it doesn't work! xD
2. How did you get introduced to Debian?
When I first started college, I was that overly enthusiastic student who signed
up for every club and volunteered for anything that crossed my path just like
every other fresher.
But then, the pandemic hit, and like many, I hit a low point. COVID depression
was real, and I was feeling pretty down. Around this time, the
FOSS Club at my college suddenly became more active.
My friends, knowing I had a love for free software, pushed me to join the club.
They thought it might help me lift my spirits and get out of the slump I was in.
At first, I joined only out of peer pressure, but once I got involved, the club
really took off. FOSS Club became more and more active during the pandemic, and
I found myself spending more and more time with it.
A year later, we had the opportunity to host a
MiniDebConf at our college, where I got to
meet a lot of Debian developers and maintainers. Attending their talks
and talking with them gave me a wider perspective on Debian, and I loved the
Debian philosophy.
At that time, I had been distro hopping but never quite settled down. I
occasionally used Debian but never stuck around. However, after the MiniDebConf,
I found myself using Debian more consistently, and it truly connected with me.
The community was incredibly warm and welcoming, which made all the difference.
3. How long have you been into Debian?
Now, I've been using Debian as my daily driver for about a year.
4. Are you using Debian in your daily life? If yes, how?
It has become my primary distro, and I use it every day for continuous learning
and working on various software projects with free and open-source tools. Plus,
I've recently become a Debian Maintainer (DM) and have taken on the
responsibility of maintaining a few packages. I'm looking forward to
contributing more to the Debian community 🙂
Rhonda D'Vine (rhonda)
1. Who are you?
My name is Rhonda, my pronouns are she/her, or per/pers. I'm 51 years old,
working in IT.
2. How did you get introduced to Debian?
I was already looking into Linux because of university, first it was
SuSE. And people played around with gtk. But when they
packaged GNOME and it just didn't even install I
looked for alternatives. A working colleague from back then gave me a CD of
Debian. Though I couldn't install from it because
Slink didn't recognize the pcmcia
drive. I had to install it via floppy disks, but apart from that it was
quite well done. And the early GNOME was working, so I never looked back. 🙂
3. How long have you been into Debian?
Even before I was more involved, a colleague asked me whether I could help with
translating the release documentation. That was my first contribution to Debian,
for the slink release in early 1999. And I was using some other software before
on my SuSE systems, and I wanted to continue to use them on Debian obviously. So
that's how I got involved with packaging in Debian. But I continued to help with
translation work, for a long period of time I was almost the only person active
for the German part of the website.
4. Are you using Debian in your daily life? If yes, how?
Being involved with Debian has been a big part of the reason I have gotten my jobs
for a long time now. I have always worked with maintaining Debian (or
Ubuntu) systems.
Privately I run Debian on my laptop, with occasionally switching to Windows in
dual boot when (rarely) needed.
5. Do you have any suggestions to improve women's participation in Debian?
There are factors that we can't influence, like that a lot of women are pushed
into care work because patriarchal structures work that way, and don't have the
time nor energy to invest a lot into other things. But we could learn to
appreciate smaller contributions better, and not focus so much on the quantity
of contributions. When we look at longer discussions on mailing lists, those
that write more mails actually don't contribute more to the discussion, they
often repeat themselves without adding more substance. Through working on our
own discussion patterns this could create a more welcoming environment for a lot
of people.
Sophie Brun (sophieb)
1. Who are you?
I'm a 44-year-old French woman. I'm married and I have two sons.
2. How did you get introduced to Debian?
In 2004 my boyfriend (now my husband) installed Debian on my personal computer
to introduce me to Debian. I knew almost nothing about Open Source. During my
engineering studies, a professor mentioned the existence of Linux,
Red Hat in particular, but without giving any details.
I've been a user since 2004. But I only started contributing to Debian in 2015:
I had quit my job and I wanted to work on something more meaningful. That's why
I joined my husband in Freexian, his company.
Unlike most people I think, I started contributing to Debian for my work. I only
became a DD in 2021 under gentle social pressure and when I felt confident
enough.
4. Are you using Debian in your daily life? If yes, how?
Of course I use Debian in my professional life for almost all the tasks: from
administrative tasks to Debian packaging.
I also use Debian in my personal life. I have very basic needs:
Firefox,
LibreOffice, GnuCash
and Rhythmbox are the main
applications I need.
Sruthi Chandran (srud)
1. Who are you?
A feminist, a librarian turned Free Software advocate and a Debian Developer.
Part of Debian Outreach team and
DebConf Committee.
2. How did you get introduced to Debian?
I got introduced to the free software world and Debian through my husband. I
attended many Debian events with him. During one such event, out of curiosity, I
participated in a Debian packaging workshop. Just after that I visited a Tibetan
community in India and they mentioned that there was no proper Tibetan font in
GNU/Linux. Tibetan font was my first package in Debian.
3. How long have you been into Debian?
I have been contributing to Debian since 2016 and Debian Developer since 2019.
4. Are you using Debian in your daily life? If yes, how?
I haven't used any other distro on my laptop since I got introduced to Debian.
5. Do you have any suggestions to improve women's participation in Debian?
I have been actively mentoring newcomers to Debian since I started
contributing myself. I especially work towards reducing the gender gap inside the
Debian and Free Software community in general. In my experience, I believe that
visibility of already existing women in the community will encourage more women
to participate. Also I think we should reintroduce mentoring through
debian-women.
Tássia Camões Araújo (tassia)
1. Who are you?
Tássia Camões Araújo, a Brazilian living in Canada. I'm a passionate learner who
tries to push myself out of my comfort zone and always find something new to
learn. I also love to mentor people on their learning journey. But I don't
consider myself a typical geek. My challenge has always been to not get
distracted by the next project before I finish the one I have in my hands. That
said, I love being part of a community of geeks and feel empowered by it. I love
Debian for its technical excellence, and it's always reassuring to know that
someone is taking care of the things I don't like or can't do. When I'm not
around computers, one of my favorite things is to feel the wind on my cheeks,
usually while skating or riding a bike; I also love music, and I'm always
singing a melody in my head.
2. How did you get introduced to Debian?
As a student, I was privileged to be introduced to FLOSS at the same time I was
introduced to computer programming. My university could not afford to have labs
in the usual proprietary software model, and what seemed like a limitation at
the time turned out to be a great learning opportunity for me and my colleagues.
I joined this student-led initiative to "liberate" our servers and build
LTSP-based labs - where a single powerful computer could power a few dozen
diskless thin clients. How revolutionary it was at the time! And what an
achievement! From students to students, all using Debian. Most of that group
became close friends; I've married one of them, and a few of them also found
their way to Debian.
3. How long have you been into Debian?
I first used Debian in 2001, but my first real connection with the community was
attending DebConf 2004. Since then, going to DebConfs has become a habit. It is
that moment in the year when I reconnect with the global community and my
motivation to contribute is boosted. And you know, in 20 years I've seen people
become parents, grandparents, children grow up; we've had our own child and had
the pleasure of introducing him to the community; we've mourned the loss of
friends and healed together. I'd say Debian is like family, but not the kind you
get at random once you're born, Debian is my family by choice.
4. Are you using Debian in your daily life? If yes, how?
5. Do you have any suggestions to improve women's participation in Debian?
I think the most effective way to inspire other women is to give visibility to
active women in our community. Speaking at conferences, publishing content,
being vocal about what we do so that other women can see us and see themselves
in those positions in the future. It's not easy, and I don't like being in the
spotlight. It took me a long time to get comfortable with public speaking, so I
can understand the struggle of those who don't want to expose themselves. But I
believe that this space of vulnerability can open the way to new connections. It
can inspire trust and ultimately motivate our next generation. It's with this in
mind that I publish these lines.
Another point we can't neglect is that in Debian we work on a volunteer basis,
and this in itself puts us at a great disadvantage. In our societies, women
usually take a heavier load than their partners in terms of caretaking and other
invisible tasks, so it is hard to afford the free time needed to volunteer. This
is one of the reasons why I bring my son to the conferences I attend, and so far
I have received all the support I need to attend DebConfs with him. It is a way
to share the caregiving burden with our community - it takes a village to raise
a child. Besides allowing us to participate, it also serves to show other women
(and men) that you can have a family life and still contribute to Debian.
My feeling is that we are not doing super well in terms of diversity in Debian
at the moment, but that should not discourage us at all. That's the way it is
now, but that doesn't mean it will always be that way. I feel like we go through
cycles. I remember times when we had many more active female contributors, and
I'm confident that we can improve our ratio again in the future. In the
meantime, I just try to keep going, do my part, attract those I can, reassure
those who are too scared to come closer. Debian is a wonderful community, it is
a family, and of course a family cannot do without us, the women.
These interviews were conducted via email exchanges in October, 2024. Thanks to
all the wonderful women who participated in this interview. We really appreciate
your contributions in Debian and to Free/Libre software.
Cybersecurity is a vital topic for Switzerland and
social engineering attacks are a significant issue in the realm
of cybersecurity.
Organizations like Google, Facebook and LinkedIn could be seen as a
very effective social engineering attack against Swiss culture and
privacy.
Frans Pop, the
Debian Day Volunteer Suicide Victim, had sent at least one of
his suicide notes on debian-private gossip network the
night before Debian Day. If an organization can get into somebody's
head like that, such that decisions about life and death revolve
around this software, we could contemplate the possibility that
Frans Pop died under the influence of a social engineering culture.
Adrian von Bidder died on the same day that Carla and I got married.
Why can't we ask questions about that?
Switzerland reportedly has
a higher per-capita ratio of Debian Developers than any other country
except perhaps Ireland. Yet according to Shuttleworth's
email, many of these people have a loyalty to Debian culture that is above
their loyalty to Swiss employers and Swiss law. This dual
allegiance appears to be a sign that they are under the sway of
social engineering or at risk of external influence.
By way of background, in 2006, Adrian and Diana got
married. In 2007,
the suicide petition to Basel Stadt authorities
was signed by A. von Bidder.
In August 2010, we had the confirmed suicide of Frans Pop, the
warning from Mark Shuttleworth and a sustained period of stress
among volunteers in the Debian Developer world.
In April 2011, Adrian von Bidder died. It was discussed like a suicide
but they told us casually that it could be a heart attack.
There was no comment about whether the couple had any children
during the five years of their marriage.
On 28 April 2011, very soon after von Bidder died, Diana modified his blog,
adding a new post:
Sadly, I have to make an end to this blog. Adrian - my husband - died on april 17th of a heart attack.
Adrian von Bidder had made various blog posts with critical commentary
about the risks of social media and other devious enterprises.
Many of his concerns have been proven correct by the passage of time.
Yet I feel the manner in which Diana writes "I have to make an end to this
blog" has an air of disapproval for Adrian's work. Then again,
this must have been a very disturbing time for Diana and on top
of that, English may not be her native language so the tone
of her comments may not reflect her real thoughts and feelings about
the subject.
Some time later, Diana completely erased the blog, removed the DNS
entry for blog.fortytwo.ch and placed a picture on the main page
at fortytwo.ch.
The picture's metadata tells us
it was taken on 20 January 2011 with a Canon EOS 40D, possibly
the camera Adrian discussed in some of his blog posts.
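Metadata like this can be read programmatically from any photo. Here is a minimal sketch using the Pillow library; the filename photo.jpg is a hypothetical placeholder, not the actual file from fortytwo.ch:

```python
# Minimal sketch: read the camera model and capture date from a photo's
# EXIF metadata using Pillow. "photo.jpg" is a placeholder filename.
from PIL import Image
from PIL.ExifTags import TAGS

def exif_summary(path):
    """Return EXIF tags as a dict keyed by human-readable tag names."""
    with Image.open(path) as img:
        exif = img.getexif()
        # Map numeric EXIF tag IDs (e.g. 272) to names (e.g. "Model")
        return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}
```

A command-line tool such as exiftool reports the same fields without writing any code.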
We know that other Debian Developers in Switzerland were subject
to social engineering attacks involving blackmail and public
humiliation. One of those cases was the blackmail of
Daniel Baumann. Did Adrian von Bidder receive similar messages
in the days before his heart attack?
Did Adrian von Bidder communicate with anybody before his
heart attack, for example, leaving a note? In English-speaking countries,
all these things are published by the coroner's office. In
Switzerland, it is the opposite: evidence is only given to those
in close proximity to the deceased. At the time, Diana may not
have known about the earlier suicide of Frans Pop. She may not
have realized there was the risk of a connection between deaths
in a single community. Now
the suicide cluster is public knowledge, is it time for a fresh
discussion about that?
Most cybersecurity experts around the world believe that
transparency is important for education and mitigating risks.
Here is a photo of Diana and Adrian on their wedding day:
Hitler and the Nazis were obsessed with the idea that Jews
could be identified by a distinctive smell. While America was
building the A-bomb, Hitler
diverted science funding to research the Jewish smell.
The smell was rumored to resemble sulfur.
One account of this history makes the case that there was a shift, beginning in the late nineteenth century, in the way that smell was used: not simply to demarcate groups but, in addition, to supposedly detect 'race' and ethnicity.
Prominent Debian Developer Daniel Pocock has recently released
details of the
Swiss harassment judgment. His former landlady, an organizer of
the SVP senioren (far right Swiss seniors group) had started rumors
about a smell coming from Pocock's cats. Even the judge asked
if it could be acceptable to pose questions about this imaginary smell.
Obviously the judge was not familiar with this awkward similarity
to the persecution of Jewish and African people throughout
history.
For about six years now, people have been creating gossip about
harassment and abuse against various Debian co-authors. Nobody ever
provided any evidence.
Earlier this year, when I nominated in the European elections, the
misfits were desperate to attack me but they didn't have any grounds to do so.
They waited until the last minute before voting began and on
6 June 2024, the day before voting, they
published a document that appears to be invalid, full of forgeries, racism
and nonsense.
But wait, there really was a harassment case and a judgment.
With the Irish General Election approaching, I am considering whether
to nominate again and it is really important that people can see the
truth about who really harassed whom.
Swiss racism, cats of colour, women harassing women and a 10,000 Swiss franc settlement
The only mistake I made was taking black cats to Zurich.
The real Debian harassment story is about women harassing women
and occasionally, a woman harassing our cats and women harassing men.
In Switzerland, both in the law and in the culture, when you have
a harassment problem like this the matter is usually settled privately
and everybody moves on with their life as quickly as possible.
Carla and our black cats, who are also female victims, were subject
to racism from a white Swiss woman. We received a payment of CHF 10,000.
You might assume I would have rushed to publish that on my blog the same
day. But I didn't. When the WeMakeFedora case was resolved,
I immediately put it on my blog. But in the case of the harassment
in Zurich, I wanted to respect all the parties involved and the Swiss
cultural approach to such disputes, so I just put it out of my mind
and got on with serious problems.
Nonetheless, Debianists, including people like Axel Beckert at
ETH Zurich and people at the Google office in Zurich, have been stirring
up rumors about the harassment and paw behavior for six years.
Ironically, the Google engineering headquarters for Europe
is located in Zurich and Google's role in spreading rumors about
the harassment case has actually undermined the privacy that
people used to take for granted in Switzerland.
Women harassing women: a common problem
In the case of serious violent crime against women, the majority
of perpetrators appear to be male.
In the case of less tangible crimes, like harassment, stalking,
racism and even sexism, we can find many cases where women are
either protagonists or associates of an offender.
The recent Netflix series
Baby Reindeer
cast a spotlight on the story of a woman harassing a male
employee at a bar.
In 2021, we saw a female volunteer, Molly de Blanc, start an online
petition
harassing her former boss, Dr Richard Stallman at FSF. Approximately
three thousand people joined the petition but a petition about a person
is not a real petition at all, it is harassment. de Blanc made the petition
more than two years after leaving her job at FSF.
In a previous blog, I looked at the case of another non-developing
Debian volunteer, Laura Arjona,
harassing one of my female interns in the Outreachy program.
After learning that this goes on behind mentors' backs, I didn't
volunteer to be a mentor again.
Then there was Amaya Rodrigo Sastre who helped spread the rumors
that Ted Walther's partner at the DebConf6 dinner was
alleged to be a prostitute. In fact, the woman was a dentist and
these rumors were disastrous for her reputation.
Ariadne Conill from the Alpine Linux project, which has no
relationship to Switzerland as far as I can tell, was spreading
the rumor that my intern in Google Summer of Code was my girlfriend.
The rumor was offensive to me but even more offensive to the intern
because
that was the year she got married.
Shortly before DebConf15, we received
nasty messages from Margarita (Marga) Manterola of Google telling us
that Carla is not welcome to eat the food at DebConf, despite the
fact that other women like Marga go there with their husbands every year.
While waiting for the train to go down the
Uetliberg one day, Carla and I were talking to a British woman
in the playground beside the railway station. The woman told
us about her Swiss landlady, a little old lady, who had been
whinging and whining about the behavior of her small children.
The Swiss landlady had become quite obsessed and had even been
caught at the window taking pictures of the way the children played
inside their home.
Looking at the
invalid and falsified legal documents distributed
by rogue members of Debian, we can find various references to
my Irish heritage. Everybody seems to know that I was born and
raised in Australia. I acquired Irish citizenship because my mother
is from Ireland. We find that the racist women in Switzerland, and
we'll see more of them in this blog, are not classifying people based
on our skills and talents, they are obsessed about little things
like my mother's Irish heritage. In fact, some of these
documents were prepared by two women in Zurich,
Pascale Koster and Albane die Ziegler. The documents don't mention
that I am a citizen of three countries, they emphasize my Irish
heritage as some kind of a hint to their racist colleagues
that my mother and I should be treated badly in Zurich.
What we see here is another example of women being offensive to
other women.
One of the most well known examples of women exhibiting poor
behavior to other women in Zurich was the infamous Oprah Winfrey
handbag incident. A woman in the handbag shop refused to let
Oprah look at a particular handbag. Oprah gives an account of
her experience with the Swiss saleswoman (Kauffrau) in this
video:
This brings us to the point where we will consider the paw behavior
of a Swiss landlady towards Carla and our black cats, who are both
female, so there was a female offender and three female victims.
I don't wish to make the generalization that all women are like
this. I've worked with many professional women who act with
integrity in everything they do. But when we see gossipmongers
making up stories about harassment in groups like Debian, we need
to remember the risk of listening to attention seekers
and their paid lawyers/liars. Gossip and
social engineering attacks go hand in hand and if we care
about cybersecurity, we need to call out gossip behavior.
Harassment and racism are not only Swiss problems
Before rushing to any conclusion about racism in Switzerland,
we need to remember that there is racism in every country.
When we look at the concerns about Brexit in the United Kingdom,
there was a lot of racism during the campaign period before the
referendum. Some of the practical changes in the UK, like
canceling the driving licenses of foreigners, actually happened
before the Brexit referendum. Likewise, whenever there is a Swiss
referendum about the relationship with the EU, some people
may voice racist opinions about the subject but there may be
some valid political or economic discussions that take place
at the same time.
We can also ask the question: are there times when Swiss citizens
are subject to extreme acts of bullying or extreme injustice by
employers, landladies or the public authorities? In fact,
some examples do exist.
Looking at the JuristGate
affair, we can see that the rogue legal protection scheme, which
smells like a Ponzi scheme, had both Swiss customers and foreign
customers. All the customers lost their money at the same time.
When FINMA shut down the rogue insurance, they hid the details
from everybody, both Swiss and foreign clients were kept in the dark
to an equal extent. Therefore, there was extraordinary injustice,
there were some foreign clients but racism wasn't the main theme
in JuristGate.
When I look at
the case of Adrian von Bidder (avbb / cmot),
the Debian Developer who died on our wedding day,
I wonder if he had one of the same bad experiences that
foreigners often complain about in Switzerland. For example, did one
of the health insurance companies bungle a treatment for his wife
or did an employer fail to make contributions to his pension scheme
and then go into liquidation?
Here is a photo of Diana and Adrian on their wedding day:
In Swiss culture, sensitivity about the cause of death is
an important cultural consideration. After blogging the
initial evidence about how the death was discussed in
the debian-private gossip channel, I came to realize that Adrian's
widow, Diana, was listed as a member of the Basel City
parliament. In such cases, there is obviously even more opportunity
to ask questions about the interaction between the death and any
environmental or cultural factors, whether in Debian or in his community.
At the same time, the cultural aversion to asking those questions
is a very steep obstacle.
Real harassment, real evidence ordered chronologically
Some time in 2017 or 2018, Chris Lamb, former leader of the
Debian project, started making mischievous references to harassment.
He didn't provide any facts, dates, victims or evidence.
Most of the larger property management companies in Zurich and
Switzerland are somewhat consistent in their application of tenancy
regulations.
When people find a nice apartment with a responsible landlord, they
usually keep the apartment for a very long time.
Some smaller buildings, usually sized between five and ten apartments,
are owned by a resident landlord/landlady. This gives rise to the
phenomenon where the landlady and tenant may cross paths almost every day.
It goes without saying that the turnover of tenants in some of these
owner-occupier buildings is much higher than in the buildings owned by
a silent investor.
Web sites advertising the apartments sometimes have a checkbox and
filter option for potential tenants to exclude apartments with a resident
landlady (Vermieter wohnt im Mehrfamilienhaus, i.e. "the landlord lives in the apartment building"). Most people
who have had a bad experience with one of these will go out of
their way to avoid them in future.
Due to the very high turnover in buildings with a resident landlady,
advertisements for such apartments appear disproportionately often
compared to their actual share of the housing stock.
Laundry duties & the status of women
Very new buildings in Switzerland have a washing machine and
clothes dryer in every apartment. Most traditional buildings and
some new buildings have a laundry room or drying room shared by all the
tenants. Most buildings have a handwritten roster where the tenants
can reserve the machines for a particular day.
You may only have one reservation to use the laundry every two weeks.
If that reservation falls on a work day and you have multiple loads
of washing to do then it can be very inconvenient. Nonetheless,
nobody sees any urgency to change this system. There is a prevailing
attitude that the wife or girlfriend will stay home on the laundry day
and ensure that all the clothes are nicely washed, dried and folded
and the laundry room is left in a proper state for the tenant
who will use it on the following day.
Switzerland is notable for its neutral status and hosting diplomats
from around the world at the United Nations in Geneva. But if
the washing machine breaks down and one tenant's drying time
runs over into the next day, there is anything but diplomacy
and tenants regress to communicating with each other through handwritten
notes written in
one of the four official Swiss languages.
The application process, religious harassment and cats
When tenants arrive to visit a prospective apartment, they are
given an application form that must be completed for the landlord
or letting manager.
They tend to ask more questions than necessary. It is not unusual
to find questions about your religious affiliations on the form.
We can quickly find examples of these forms in a search engine
by searching for words like Anmeldeformular (application form) and
Konfession (religious denomination) (
Example 1,
Example 2,
Example 3).
In effect, if your religion has been
persecuted in Switzerland,
you may well feel that filling out the application form
is an experience of harassment.
News articles appear from time to time about whether or not
you should declare your religion. (
Example 1,
Example 2,
Example 3).
Not every Anmeldeformular asks about religion but
it is almost certain they will ask about your pets and musical
instruments. It is a good idea to answer those questions honestly
in any country. While some landlords and letting agents will decline
certain requests, others will be quite
happy to direct you to the most suitable apartments for your
lifestyle.
Whenever we applied for any apartment in Switzerland, we did
so with total honesty and integrity. We declared our cats
(Katzen):
Specifically, we have written Hauskatzen, which literally
translates to house cats. In other words, we are not
talking about something exotic like a tiger or panther.
No room for undocumented aliens
The confession of cat ownership led to a flurry of paperwork
mediated by the letting agent. Everybody who rents an apartment
in Switzerland is expected to purchase civil liability insurance
and pay three months of rent as a security deposit.
In our case, that simply wasn't enough. The landlady insisted
that we sign a guarantee against any paw behavior by our cats:
The costs anticipated by this document were already covered
by the security deposit and our civil liability insurance. Therefore,
I feel this additional cat contract was superfluous. Can we call
it harassment or bullying?
Fair wear and tear
Switzerland has high standards for construction and due to
the level of wealth, even the most mundane apartments typically
have very high quality components in their bathrooms and kitchens.
It is typical to have mixer taps on the showers and sinks, good water
pressure and wall mounted toilets.
When tenancies are concluded in Switzerland, the apartment or
house is subject to a forensic examination that may last several
hours.
It is expected that the tenant leaving an apartment will arrange
to have it cleaned back to the original state before the inspection
day.
Even if the bathroom is 30 or 40 years old, the high quality
components still look like new after each cleaning.
Nonetheless, internal components like washers and gaskets don't
last forever, no matter how beautiful the sinks and toilet bowls
appear on the outside.
In this particular apartment we experienced the failure of both
the shower mixer and the gasket joining the cistern to the toilet
bowl. Both of these things failed within a short span of time.
The plumber came promptly to make the necessary repairs.
Nonetheless, after the drama about whether our cats were a national
security risk, we were never on a good footing with this
particular landlady. She was 76 years old and the far right party,
of which she was a member, was constantly warning her to be
on the lookout for mischievous foreigners.
If you look at the far right propaganda circulated in advance of
referendums and elections in Switzerland, the foreigners are
typically depicted in black, like our cats.
At Kaltbad on the Rigi, we found a white cat in the snow:
A large professional landlord company with thousands of
apartments probably wouldn't worry about the cost of repairing
these washers and gaskets. On the other hand, for these owner-occupier
landladies who like to micro-manage their tenancies, some of them
stay up all night worrying about whether
tenants (or cats) do something like this as a prank.
Here is the report about the shower defect about two weeks
after we moved in. There is no way that tenants or cats could
have put rust into the pipes. These are simply the problems of
an old building.
Subject: bath / shower water problems
Date: Thu, 1 Dec 2016 09:39:29 +0100
From: Daniel Pocock <daniel@pocock.pro>
To: Letting agent
Hi [redacted],
The plumber visited today, he replaced the dishwasher door and the
shower hose.
He also looked at the flow from the hot water tap in the shower. He
found a lot of rust inside the tap.
He removed the hot and cold taps, cleaned out the taps and ran the water
directly from the pipes in the wall. A lot of rust came out of both hot
and cold pipes.
- the hot water pipe is now flowing better, but it is still less than normal
- water from both hot and cold pipes still has a slight red colour
He said he will contact you to explain and discuss how it can be fixed.
Regards,
Daniel
Cat smell letter
While I was on a trip to the UK, Carla received this ugly letter:
It says there is an unknown smell in the common areas and
it asks if the smell could come from our cats or deficiencies
in cleanliness.
Carla and the cats were really sad.
We contacted our legal insurance and had a lawyer draft a
response. We hoped that would be the end of the matter.
The window nazi
Then came the windows. There are 11 apartments in the
building and somebody would sometimes open one of the windows
in the stairwell and leave it open.
The landlady became obsessed with closing the windows and
left handwritten notes on them.
Mediation requested
After some months of receiving insults in the post and in the
common areas, it reached a point
where we had to take legal action. We demanded a mediation
session at the tribunal of Zurich.
Our cats were members of our family. Everybody loved our
cats. My Italian cousin came to stay with them on several occasions:
Remarkably, the landlady sent an expensive lawyer to repeat
the accusations about a cat smell, the window in the stairs and
a dirty towel that another tenant found in the washing room.
There were no fingerprints, no paw-prints, no video evidence,
no DNA evidence, not even a whisker to link any
of these problems to us. It was just a witch hunt, and as we had
black cats, were the most recent arrivals in the building
and were foreigners, we felt we had been victimized.
Here is the accusation about a disobedient tenant who opens the window
in the stairs:
Swiss lawyer tried to deceive Swiss judge about far right membership
Early in the mediation session, the lawyer for the landlady claimed
that it wasn't clear whether or not she was really a member of the
far right political party.
We were able to show the judge that the landlady had a web site
promoting the party. Here is one of the photos, she is chairing a meeting
and the poster attached to the table has her name and face on it.
The filename tells us it is a meeting of the far right seniors committee
(SVP senioren):
In this photo, she is standing beside the then president of the
Kanton parliament, Dr. Christian Huber:
Shortly after the photo
was taken, Dr Huber resigned from the parliament and resigned
from the SVP in mysterious circumstances.
Dr Huber and his spouse spent the next ten years traveling around the
European Union by houseboat. This is ironic, of course: a leader
from an anti-immigration/anti-EU party living like a refugee in a boat
in the EU. In Australia the far right uses the term
boat people as a derogatory term for immigrants who travel by boat.
Mystery smell: who is defaming whom?
Here is the accusation about a mysterious smell. The lawyer
is saying it is not clear where it comes from because he doesn't
want to be caught defaming foreigners directly. He doesn't provide any
expert evidence or witnesses, he basically says the landlady has a
hunch about this smell and the judge should trust the landlady.
The letting agent is also in the room and if the rumor were credible
he would surely have commented on it. I don't think he wanted
to comment about the smell at all so it came down to the expensive
lawyer to talk this imaginary smell into existence.
When I hear references to these mysterious smells, I feel it is
a way for the jurists to give each other a wink and a nod and ask
for the foreigners to be punished.
Every time Carla went down to the laundry in the basement,
the little old lady would appear. We don't know if she had
video surveillance cameras or if she spent all her day going
up and down the steps to check on the laundry.
Nonetheless, Carla had become quite upset about the cat letter
and the intrusions in the laundry and at some point I had to
start doing the laundry because it was impossible for Carla to
go down there alone.
The landlady was taken aback by the sight of a man in the laundry.
She started calling Carla's employer. We don't know what she was
hoping to achieve. Was she trying to determine if Carla had absconded?
Or was she trying to find out why the employer expected Carla to work
on laundry day?
The lawyer sent a stern letter demanding that these phone calls to
Carla's employer must cease immediately.
Frau [----] hat letzte Woche beim Arbeitsort meiner Mandantin
angerufen und unter Vorwand, sie wolle mit ihr sprechen,
gegenüber der Chefin meiner Mandnatin während ca. 30 Minuten
meine Mandanten im Zusammenhang mit dem vorliegenden Verfahren
angeschwärzt, resp. diese in ihrer Ehre verletzt.
Ich fordere Sie auf, Ihre Klientin über die Tragweite der
Bestimmungen über strafbare Handlungen gegen üble Nachrede
und Verleumdung zu informieren.
Es gab und gibt keinen Grund der direkten Kontaktaufnahme und
insbesondere keinen Grund für Ihr Klientin, beim Arbeitsort
meiner Mandantin anzurufen.
Sollte es noch einmal vorkommen, dass Ihre Klientin gegenüber
meinen Mandanten oder Dritten ausfällig wird und sich sonst
rassistisch äussert, so wird dies entsprechende Konsequenzen haben.
Ich denke auch nicht, dass das Verhalten Ihrer Klientin die
Verhandlungsbereitschaft meiner Mandanten bezüglich des
vorliegenden Verfahrens erhöht.
and translated into English:
Last week, [----] called my client's place of work and,
under the pretext that she wanted to speak to her,
spent around 30 minutes denigrating my client in connection
with the current proceedings to my client's boss, and insulted her honor.
I request that you inform your client of the scope of the
provisions on criminal offenses against slander and defamation.
There was and is no reason to make direct contact and in particular
no reason for your client to call my client's place of work.
Should your client become abusive towards my client or third parties
or otherwise make racist comments, this will have the appropriate
consequences.
I also do not think that your client's behavior increases my
client's willingness to negotiate with regard to the current proceedings.
Would a female judge in Zurich be any more sympathetic than a
female landlady? Maybe not. Here, the landlady's lawyer is explaining that
if the man (me) is busy with his job, the woman (Carla) can look for
another flat. The judge and the translator are both female.
Nobody calls out the sexism.
The search for a flat in Zurich is not a trivial task. In
German, the press refers to it as the Wohnungslotterie (apartment lottery). When
a new building is about to be completed, hundreds of prospective
tenants line up outside to submit copies of their Anmeldeformular
in person.
What we see here is Swiss feminism, that is feminism for Swiss women.
I don't think it's up to a man to give the definition of feminism.
But I feel it is safe to say that Swiss feminism or Australian feminism
is a contradiction in terms because it is basically privileged women from
rich countries who go to university, become jurists and meddle
in the lives of women from other countries.
One of the reasons we are in court in the first place is because
Carla didn't feel comfortable being that woman from latin America
who does laundry with the Swiss landlady looking over her shoulder.
When Swiss families want to apply for apartments,
they send their foreign nannies to stand in those queues and submit
the forms.
"In the early days ... every client meeting I would be asked to get the coffee. The other male graduates
were never asked to do such things,"
She's right: in more than twenty years since I graduated, nobody ever
asked me to make coffee in the workplace. And when I tried to share
responsibility for doing the laundry in Zurich, the landlady was
opposed to the idea. She seemed to feel that women like Carla were
easier to control.
In Renens, Canton Vaud, a white cat can sleep on the steps at
the railway station and nobody complains about the risk that
somebody might trip over the cat. Every ten minutes, the metro arrives
at the top of the steps and hundreds of people come down the steps to
search for their trains. There is a serious risk that somebody could
trip over the cat and suffer an injury. If it was a black cat, would
the police come with dogs to remove it?
Everybody in west Lausanne seems to know this cat but nobody
knows who it belongs to.
Here is the part of the trial where they talk about the landlady
calling Carla's workplace about the laundry:
Who owns that towel?
Given the lack of evidence about the imaginary cat smell, the
landlady had tried to diversify her legal strategy by introducing
a dirty towel that somebody found in the washing room.
Most landlords would simply provide a basket for
lost property. Even at Swiss prices, the cost of a basket for
these elusive towels and socks would be far less than the cost
of the lawyers.
The cat smell trial consisted of four jurists, an interpreter, the
letting manager and an engineer, myself. The combined cost of our
time was over CHF 2,000 per hour for three hours in court
debating the anxieties of a landlady who didn't show up.
In comparison, many Swiss residents drive over to Germany or
France each weekend for shopping. At
Action in France, you can buy another towel and a lost property
basket for a combined cost of less than ten Swiss francs.
The fact they tried to bring this towelgate affair into the courtroom
only proves that they had no serious case in the first place.
They were clutching at straws.
Speaking English in a Zurich courtroom
I think the judge realized that the landlady had a very weak case
and on top of that, the landlady's lawyer had been somewhat deceptive
about the political connection. The judge decided to continue the
mediation session using the English language.
The far right Swiss landlady was unable to sleep due to the
imaginary smell, the sight of a man doing laundry and our stubborn refusal
to take phone calls during our working hours about every little drama
in the missing towels department. Yet my family had far
more serious concerns due to my father's health. I tried to explain
that in the court but were they listening?
Switzerland is a very small country and many people live in the same
valley where they grew up with their parents. Even if they move from
their valley to a city like Zurich, they can always reach most of
their extended family with a short journey by train.
In the most hostile company where I worked in Switzerland, a line
manager's mother had developed a terminal illness and had less than
six months to live. The manager went back to his country for a number of
months and the company's strategy, organization and culture were totally
unable to cope with this situation.
Nonetheless, in our case, Carla's aunt was getting very old and
my father was very ill. The financial cost of the mediation session
where we spoke about missing towels and the imaginary cat smell was
greater than the financial cost of a trip to Australia to see my father.
The judge and I seem to agree there are cultural differences but
the extent to which some people react to small differences is
extraordinary:
Defending the honor of black cats before a Swiss judge
Vous promettez d’être fidèle à la Constitution fédérale et à la Constitution du Canton de Vaud.
Vous promettez de maintenir et de défendre en toute occasion et de tout votre pouvoir les droits, les libertés et
l’indépendance de votre nouvelle patrie, de procurer et d’avancer son honneur et profit, comme aussi d’éviter tout ce qui
pourrait lui porter perte ou dommage
and translated into English:
You promise to be true to the federal constitution and the constitution
of the Canton of Vaud.
You promise to maintain and defend on every occasion and with all your
powers the rights, freedoms and independence of your new country,
to develop and advance her reputation and wealth and equally to
avoid all that could cause her loss or damage.
What does an oath like this mean in practice? In the Zurich courthouse,
I defended the honor and reputation of our black cats before a
Swiss tribunal:
Remarkably, the judge repeats the question about whether there
could be a smell. This was so offensive to us as a family.
In fact, these rumors about smells have Holocaust origins.
Hitler commissioned significant scientific research to determine
if the Jews have a distinctive smell. When the judge tried
to legitimize these black-cat-smell comments in Zurich, I couldn't believe
what I was hearing.
When the
Albanian whistleblowers came to Zurich, they slept with the
cats. Here is Anisa Kuci from OpenStreetMap, Wikimedia and
GNOME Foundation on our sofa bed with Buffy the black kitten
sleeping beside her:
If people want to confirm the cat smell was a lie, just ask Anisa.
Switzerland vs Australia: which country is more beautiful?
I feel that honesty is always important in any relationship.
When we see courtrooms on television, the witnesses promise to
tell the truth, the whole truth and nothing but the truth.
I guess that mantra stuck in my head. I simply told the tribunal
that I didn't really want that apartment anyway because Australia
is more beautiful. At that very moment, the jurists stopped speaking English
and reverted to German.
In fact, both Switzerland and Australia have some amazing geographic
and cultural features and I think we were just unlucky with this
particular landlady from the SVP senioren (far right seniors) cabal.
Far right dictator or eccentric old lady?
While this landlady was definitely a member of the far right party,
her behavior was rather foolish and I don't think every member of the
far right party behaves like this. Many of the people in the far right
party own small businesses and they don't want to start silly disputes
with their customers and tourists over things like a missing towel.
In this case, I suspect the propaganda of the far right party
has become mixed up with the aging process and contributed to
behavior that is erratic.
Most political parties and religions try to exploit the insecurities
of little old ladies like this in the hope little old ladies
will leave bequests to the party or the religion in question.
With that in mind, I don't blame the landlady alone for the pain
my family experienced in Zurich.
Google and Debian forcing the harassment verdict into the spotlight
While we had to collect a lot of evidence at the time of the dispute,
I never imagined publishing this case on my blog.
The only reason I am publishing this is because of vague rumors
about a harassment case being distributed on the web sites of Debian,
the World Intellectual Property Organization (WIPO) in Geneva and
some other web sites.
I don't want to encourage cat enthusiasts to seek revenge against
this little old lady. If she is still alive today, and I haven't
even bothered to check, she would be well into her eighties and there
would be no benefit whatsoever from harassing her.
The case was resolved with a cash settlement of CHF 10,000, equivalent
to EUR 10,500 or USD 10,000.
The cats were transported in a box to a new home:
Here is the judgment in German. We've redacted parts of it
to avoid identifying anybody. Ultimately, this was another case
of a woman instigating harassment, a lot like Baby Reindeer:
Chris Lamb and Molly de Blanc violated Swiss privacy
Soon after the harassment case was finished, it was Chris Lamb
and Molly de Blanc who started a gossip campaign.
Some of these women spreading rumors in the free software
community are particularly vicious.
One of the cats, Floe, died shortly after the relocation.
de Blanc then showed up at FOSDEM in Brussels with her
infamous speech about
putting cats behind bars:
de Blanc's behavior was a horrible act of trolling after the
death of our beloved cat.
Carla and I did not choose to make the harassment verdict
public. We didn't have any vendetta with that little old lady. We just
wanted to get on with our lives.
The far right landlady paid the compensation money on time. She
has a right to get on with her life too. She is well into her
eighties now and Google is violating her privacy
with the ongoing gossip about harassment.
The ten thousand Swiss francs we received is less than half
the cost of the handbag that Oprah Winfrey wanted to see in
Bahnhofstrasse, Zurich.
What we see is a range of women, both the landlady and Molly de Blanc,
meddling in people's lives. Women and female cats
are victims of these stalkers but the stalkers are women too.
I haven't blogged until now, though I should have been doing so from Thursday onwards.
It's
a joy to be here in Cambridge at ARM HQ. Lots of people I recognise
from last year here: lots *not* here because this mini-conference is a
month before the next one in Toulouse and many people can't attend both.
Two
days worth of chatting, working on bits and pieces, chatting and
informal meetings was a very good and useful way to build relationships
and let teams find some space for themselves.
Lots of quiet hacking going on - a few loud conversations. A new ARM machine in mini-ITX format - see Steve McIntyre's blog on planet.debian.org about Rock 5 ITX.
Two
days worth of talks for Saturday and Sunday. For some people, this is a
first time. Lightning talks are particularly good to break down
barriers - three slides and five minutes (and the chance for a bit of
gamesmanship to break the rules creatively).
Longer talks: a
couple from Steve Capper of ARM were particularly helpful to those
interested in upcoming development. A couple of the talks in the
schedule are traditional: if the release team are here, they tell us
what they are doing, for example.
ARM are main sponsors and have
been very generous in giving us conference and facilities space. Fast
network, coffee and interested people - what's not to like :)
[EDIT/UPDATE - And my talk is finished and went fairly well: slides have now been uploaded and the talk is linked from the Mini-DebConf pages]
The thirteenth release of the qlcal package
arrived at CRAN today.
qlcal
delivers the calendaring parts of QuantLib. It is provided (for the R
package) as a set of included files, so the package is self-contained
and does not depend on an external QuantLib library (which can be
demanding to build). qlcal covers
over sixty country / market calendars and can compute holiday lists, their
complement (i.e. business day lists) and much more. Examples
are in the README at the repository, the package page,
and of course at the CRAN package
page.
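To make concrete what computing the "complement" of a holiday list means, here is the idea in plain Python. This is purely illustrative of the concept; qlcal's actual API (in R, backed by QuantLib's calendars) is different:

```python
from datetime import date, timedelta

def business_days(start, end, holidays):
    """All weekdays in [start, end] that are not holidays -- the
    "complement" of the holiday list. (Concept sketch only, not
    qlcal's API; qlcal also knows market-specific weekend rules.)"""
    days = []
    d = start
    while d <= end:
        if d.weekday() < 5 and d not in holidays:
            days.append(d)
        d += timedelta(days=1)
    return days

# Christmas week 2024: the 25th and 26th are holidays in many markets.
holidays = {date(2024, 12, 25), date(2024, 12, 26)}
print(business_days(date(2024, 12, 23), date(2024, 12, 29), holidays))
# [date(2024, 12, 23), date(2024, 12, 24), date(2024, 12, 27)]
```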
This release synchronizes qlcal with
the QuantLib release 1.36 (made
this week) and contains some minor updates to two calendars.
Changes in version 0.0.13
(2024-10-15)
Synchronized with QuantLib 1.36 released yesterday
Calendar updates for South Korea and Poland
Courtesy of my CRANberries, there
is a diffstat report for this
release. See the project page
and package documentation for more details, and more examples. If you
like this or other open-source work I do, you can sponsor me at
GitHub.
Way back (more than 10 years ago) when I was doing DVD-based backups,
I knew that normal DVDs/Blu-Rays are no long-term archival solution,
and that if I was serious about doing optical media backups, I needed to
switch to M-Disc. I actually
bought a (small) stack of M-Disc Blu-Rays, but never used them.
I then switched to other backups solutions, and forgot about the whole
topic. Until, this week, while sorting stuff, I happened upon a set of
DVD backups from a range of years, and was very curious whether they
are still readable after many years.
And, to my surprise, there were no surprises! Went backward in time, and:
I also found a stack of dual-layer DVD+R from 2012-2014, some for sure
Verbatim, and some unmarked (they were intended to be printed on), but
likely Verbatim as well. All worked just fine. Just that, even at
~8GiB per disk, backing up raw photo files took way too many disks,
even in 2014 😅.
At this point I was happy that all 12+ DVDs I found, ranging from 10
to 14 years old, were good. Then I found a batch of 3 CDs! Here the
results were mixed:
2003: two TDK “CD-R80”, “Metallic”, 700MB: fully readable, after
21 years!
unknown year, likely around 1999-2003, but no later, “Creation”
CD-R, 700MB: read errors to the extent I can’t even read the disk
signature (isoinfo -d).
I think the takeaway is that all the explicitly selected media - TDK,
JVC and Verbatim - held for 10-20 years. Valid reads from summer
2003 are mind-boggling for me, for (IIRC) organic media - not sure
about the “TDK metallic” substrate. And when you just pick whatever
(“Creation”), well, the results are mixed.
Note that in all this, it was about CDs and DVDs. I have no idea how
Blu-Rays behave, since I don’t think I ever wrote a Blu-Ray. In any
case, surprising to me, and makes me rethink a bit my backup
options. Sizes from 25 to 100GB Blu-Rays are reasonable for most
critical data. And they’re WORM, as opposed to most LTO media, which
is re-writable (and to some small extent, prone to accidental wiping).
Now, I should check those M-Discs to see if they can still be written
to, after 10 years 😀
I want to pour praise on some software I recently discovered.
I'm not up to speed on Pipewire—the latest piece of Linux plumbing related
to audio—nor how it relates to the other bits (Pulseaudio, ALSA, JACK, what
else?). I recently tried to plug something into the line-in port on my external
audio interface, and wished to hear it on the machine. A simple task, you'd
think.
I'll refrain from writing about the stuff that didn't work well and
focus on the thing that did: A little tool called Whisper, which
is designed to let you listen to a microphone through your speakers.
Whisper's UI. Screenshot from upstream.
Whisper does a great job of hiding the complexity of what lies beneath and
asking two questions: which microphone, and which speakers? In my case this
alone was not quite enough, as I was presented with two identically-named "SB
Live Extigy" "microphone" devices, but that's easily resolved with trial and
error.
RcppDate wraps
the featureful date
library written by Howard
Hinnant for use with R. This header-only modern C++ library has been
in pretty wide-spread use for a while now, and adds to C++11/C++14/C++17
what will be (with minor modifications) the ‘date’ library in C++20.
This release, the first in 3 1/2 years, syncs the code with the
recent date 3.0.2
release from a few days ago. It also updates a few packaging details
such as URLs, badges or continuous integration.
Changes in version 0.0.4
(2024-10-14)
Updated to upstream version 3.0.2 (and adjusting one
pragma)
Several small updates to overall packaging and testing
When setting up your YubiKey you have the option to require the user to touch the device to authorize an operation (be it signing, decrypting, or authenticating). While web browsers often provide clear prompts for this, other applications like SSH or GPG will not. Instead the operation will just hang without any visual indication that user input is required. The YubiKey itself will blink, but depending on where it is plugged in that is not very visible.
yubikey-touch-detector (fresh in unstable) solves this issue by providing a way for your desktop environment to signal the user that the device is waiting for a touch. It provides an event feed on a socket that other components can consume. It comes with libnotify support and there are some custom integrations for other environments.
For GNOME and KDE libnotify support should be sufficient, however you still need to turn it on:
I would still have preferred a more visible, more modal prompt. I guess that would be an exercise for another time, listening to the socket and presenting a window. But for now, desktop notifications will do for me.
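That exercise needn't be much code: the event feed is easy to consume from Python. A minimal sketch, assuming (from my reading of the project README) that the daemon listens on $XDG_RUNTIME_DIR/yubikey-touch-detector.socket and emits fixed five-byte events such as GPG_1 (touch wanted) and GPG_0 (touch no longer wanted):

```python
import os
import socket

def parse_event(raw: bytes):
    """Split a five-byte event like b'GPG_1' into (source, is_waiting).

    The "NAME_1"/"NAME_0" format is my reading of the
    yubikey-touch-detector README; treat it as an assumption.
    """
    text = raw.decode("ascii")
    name, _, flag = text.partition("_")
    return name, flag == "1"

def watch(handler):
    """Connect to the daemon's socket and call handler per event.

    The socket path is an assumption based on the project README;
    a robust client would also buffer partial reads.
    """
    path = os.path.join(os.environ["XDG_RUNTIME_DIR"],
                        "yubikey-touch-detector.socket")
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as s:
        s.connect(path)
        while True:
            raw = s.recv(5)
            if not raw:
                break
            handler(*parse_event(raw))
```

A `handler` here could pop up that modal window, or just flash the screen.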
PS: I have not managed to get SSH's no-touch-required to work with YubiKey 4, while it works just fine with a YubiKey 5.
A long time ago a computer was a woman (I think almost exclusively a woman, not a man) who was employed to do a lot of repetitive mathematics – typically for accounting and stock / order processing.
Then along came Lyons, who deployed an artificial computer to perform
the same task, only with fewer errors in less time. Modern day
computing was born – we had entered the age of the Digital Computer.
These computers were large, consumed huge amounts of power but were precise, and gave repeatable, verifiable results.
Over time the huge mainframe digital computers have shrunk in size,
increased in performance, and consumed far less power – so much so that
they often didn’t need the specialist CFC-based, refrigerated liquid
cooling systems of their bigger mainframe counterparts, only requiring
forced air flow, and occasionally just convection cooling. They shrank
so far and became cheap enough that the Personal Computer came to be,
replacing the mainframe and its time-shared resources with a machine
per user. Desktop or even portable “laptop” computers were everywhere.
We networked them together, so now we could share information around
the office. A few computers were given the specialist task of being
available all the time so we could share documents or host databases;
these servers were basically PCs designed to operate 24×7, usually more
powerful than their desktop counterparts (or at least with faster
storage and networking).
Next we joined these networks together and the internet was born. The dream of a paperless office might actually become realised – we can now send email (and documents) from one organisation (or individual) to another via email. We can make our specialist computers applications available outside just the office and web servers / web apps come of age.
Fast forward a few years and all of a sudden we need huge data-halls
filled with “Rack scale” machines augmented with exotic GPUs and NPUs
again with refrigerated liquid cooling, all to do the same task that we
were doing previously without the magical buzzword that has been named
AI; because we all need another dot-com bubble or blockchain bandwagon
to jump aboard. Our AI-enabled searches take slightly longer,
consume magnitudes more power, and best of all the results we are given
may or may not be correct….
Progress, less precise answers, taking longer, consuming more power,
without any verification and often giving a different result if you
repeat your question AND we still need a personal computing device to
access this wondrous thing.
Remind me again why we are here?
(timelines and huge swathes of history simply ignored to make an
attempted comic point – this is intended to make a point and not be
scholarly work)
I've been exploring typesetting and formatting code within
text documents such as papers, or my thesis. Up until now,
I've been using the listings package without thinking
much about it. By default, some sample Haskell code
processed by listings looks like this (click any of the
images to see larger, non-blurry versions):
It's formatted with a monospaced font, with some keywords highlighted,
but not syntactic symbols.
There are several other options for typesetting and formatting code in LaTeX
documents. For Haskell in particular, there is the preprocessor lhs2tex,
the default output of which looks like this:
A proportional font, but it's taken pains to preserve vertical alignment, which
is syntactically significant for Haskell. It looks a little cluttered to me,
and I'm not a fan of nearly everything being italic. Again, symbols aren't
differentiated, but it has substituted them for more typographically
pleasing alternatives: -> has become →, and \ is now λ.
Another option is perhaps the newest, the LaTeX package minted, which
leverages the Python Pygments program. Here's the same code again. It
defaults to monospace (the choice of font seems a lot clearer to me than the
default for listings), no symbolic substitution, and liberal use of colour:
An informal survey of the samples so far showed that the minted output was
the most popular.
All of these packages can be configured to varying degrees. Here are some
examples of what I've achieved with a bit of tweaking:
listings adjusted with colour and some symbols substituted (but sadly not the two together)
lhs2tex adjusted to be less italic, sans-serif and use some colour
All of this has got me wondering whether there are straightforward empirical
answers to some of these questions of style.
Firstly, I'm pretty convinced that symbolic substitution is valuable. When
writing Haskell, we write ->, \, /= etc. not because it's most legible,
but because it's most practical to type those symbols on the most widely
available keyboards and popular keyboard layouts.1 Of the three
options listed here, symbolic substitution is possible with listings and
lhs2tex, but I haven't figured out if minted can do it (which is really
the question: can pygments do it?)
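For listings, at least, symbolic substitution goes through its documented `literate` key, which maps source sequences to arbitrary typeset material. A minimal sketch (the substitutions shown are just the examples from above, not a complete set for Haskell):

```latex
\usepackage{listings}
\lstset{language=Haskell,
  % literate={source}{typeset replacement}{column width}
  literate={->}{{$\rightarrow$}}2
           {\\}{{$\lambda$}}1
           {/=}{{$\neq$}}2}
```

The trailing number tells listings how many columns the replacement occupies, which matters for preserving Haskell's layout-sensitive alignment.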
I'm unsure about proportional versus monospaced fonts. We typically use
monospaced fonts for editing computer code, but that's at least partly for
historical reasons. Vertical alignment is often very important in source code,
and it can be easily achieved with monospaced text; it's also sometimes
important to have individual characters (., etc.) not be de-emphasised by being
smaller than any other character.
lhs2tex, at least, addresses vertical alignment whilst using proportional
fonts. I guess the importance of identifying individual significant characters
is just as true in a code sample within a larger document as it is within
plain source code.
From a (brief) scan of research on this topic, it seems that proportional
fonts result in marginally quicker reading times for regular prose. It's
not clear whether those results carry over into reading computer code in
particular, and the margin is slim in any case. The drawbacks of monospaced
text mostly apply when the volume of text is large, which is not the case
for the short code snippets I am working with.
I still have a few open questions:
Is colour useful for formatting code in a PDF document?
Does this open up a can of accessibility worms?
What should be emphasised (or de-emphasised)?
Why is the minted output most popular? Could the choice of font
be key? Aspects of the font other than proportionality (serifs? size
of serifs? etc.)
The Haskell package Data.List.Unicode lets the programmer
use a range of unicode symbols in place of ASCII approximations, such
as ∈ instead of elem, ≠ instead of /=. Sadly, it's not possible
to replace the denotation for an anonymous function, \, with λ this
way.↩
It's been a while since I've posted about arm64 hardware. The last
machine I spent my own money on was
a SolidRun
Macchiatobin, about 7 years ago. It's a small (mini-ITX) board
with a 4-core arm64 SoC (4 * Cortex-A72) on it, along with things like
a DIMM socket for memory, lots of networking, 3 SATA disk interfaces.
The Macchiatobin was a nice machine compared to many earlier
systems, but it took quite a bit of effort to get it working to my
liking. I replaced the on-board U-Boot firmware binary with an EDK2
build, and that helped. After a few iterations we got a new build
including graphical output on a PCIe graphics card. Now it worked much
more like a "normal" x86 computer.
I still have that machine running at home, and it's been a
reasonably reliable little build machine for arm development and
testing. It's starting to show its age, though - the onboard USB ports
no longer work, and so it's no longer useful for doing things like
installation testing. :-/
So...
I was involved in a conversation in the #debian-arm IRC channel a
few weeks ago, and diederik suggested
the Radxa Rock 5
ITX. It's another mini-ITX board, this time using a Rockchip
RK3588 CPU. Things have moved on - the CPU is now an 8-core big.LITTLE
config: 4*Cortex A76 and 4*Cortex A55. The board has NVMe on-board,
4*SATA, built-in Mali graphics from the CPU, soldered-on memory. Just
about everything you need on an SBC for a small low-power desktop, a
NAS or whatever. And for about half the price I paid for the
Macchiatobin. I hit "buy" on one of the listed websites. :-)
A few days ago, the new board landed. I picked the version with
24GB of RAM and bought the matching heatsink and fan. I set it up in
an existing case borrowed from another old machine and tried the Radxa
"Debian" build. All looked OK, but I clearly wasn't going to stay with
that. Onwards to running a native Debian setup!
I installed an EDK2 build
from https://github.com/edk2-porting/edk2-rk3588
onto the onboard SPI flash, then rebooted with a Debian 12.7
(Bookworm) arm64 installer image on a USB stick. How much trouble
could this be?
I was shocked! It Just Worked (TM)
I'm running a standard Debian arm64 system. The graphical installer
ran just fine. I installed onto the NVMe, adding an Xfce desktop for
some simple tests. Everything Just Worked. After many
years of fighting with a range of different arm machines (from simple
SBCs to desktops and servers), this was without doubt the most
straightforward setup I've ever done. Wow!
It's possible to go and spend a lot of money on
an Ampere machine, and
I've seen them work well too. But for a hobbyist user (or even a
smaller business), the Rock 5 ITX is a lovely option. Total cost to me
for the board with shipping fees, import duty, etc. was just over
£240. That's great value, and I can wholeheartedly recommend this
board!
The two things that are missing compared to the Macchiatobin? The
memory is soldered-on (but hey, 24G is plenty for me!) and it doesn't
have a PCIe slot. But it has sufficient onboard network, video and
storage interfaces that I think it will cover most people's needs.
Where's the catch? It seems these are very popular
right now, so it can be difficult to find these machines in stock
online.
FTAOD, I should also point out: I bought this machine entirely with
my own money, for my own use for development and testing. I've had no
contact with the Radxa or Rockchip folks at all here, I'm
just so happy with this machine that I've felt the
need to shout about it! :-)
I finally figured out how to have an application launcher with my usual Emacs
completion keybindings:
This is with Icomplete. If you use another completion framework it will look
different. Crucially, it’s what you are already used to using inside Emacs,
with the same completion style (flex vs. orderless vs. …), bindings, etc.
The dmenu_emacsclient script is
here.
It relies on the function spw/sway-completing-read from my
init.el.
As usual, this code is available for your reuse under the terms of the GNU
GPL. Please see the license and copyright information in the linked files.
You also probably want a for_window directive in your Sway config to enable
floating the window, and perhaps to resize it. Enjoy having your Emacs
completion bindings for application launching, too!
As DebConf22 was coming to an end, in Kosovo, talking with Eeveelweezel they
invited me to prepare a talk to give for the Chicago Python User
Group. I replied that I’m not really that much of a Python
guy… But would think about a topic. Two years passed. I met Eeveelweezel
again at DebConf24 in Busan, South Korea. And the topic came up again. I had
thought of some ideas, but none really pleased me. Again, I do write some Python
when needed, and I teach using Python, as it’s the language I find my students
can best cope with. But delivering a talk to ChiPy?
As I give this filesystem as a project to my students (and not as a mere
homework), I always ask them to try and provide a good, polished, professional
interface, not just the simplistic menu I often get. And I tell them the best
possible interface would be if they provide support for FIUnamFS transparently,
usable by the user without thinking too much about it. With high probability,
that would mean: Use FUSE.
But, in the six semesters I’ve used this project (with 30-40 students per
semester group), only one student has bitten the bullet and presented a FUSE
implementation.
And of course, there isn’t a single interface to work from. In Python only, we
can find
python-fuse,
Pyfuse,
Fusepy… Where to start from?
…So I set out to try and help.
Over the past couple of weeks, I have been slowly working on my own version, and
presenting it as a progressive set of tasks, adding filesystem calls, and
being careful to thoroughly document what I write (but… maybe my documentation
ends up obfuscating the intent? I hope not — and, read on, I’ve provided some
remediation).
I registered a GitLab project for a hand-holding guide to writing FUSE-based
filesystems in Python. This
is a project where I present several working FUSE filesystem implementations,
some of them RAM-based, some passthrough-based, and I intend to add to this also
filesystems backed on pseudo-block-devices (for implementations such as my
FIUnamFS).
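To give a feeling for how small the core of a RAM-based filesystem is, here is the kind of state such an implementation manages, stripped of any particular binding. The class and method names are purely illustrative; mounting it for real would go through one of the interfaces above (e.g. wiring methods like these into fusepy's Operations handlers):

```python
class RamStore:
    """In-memory file table: the heart of a toy RAM-based filesystem.

    A FUSE binding would call methods like these from its create/
    read/write/unlink handlers; the names here are illustrative,
    not any binding's actual API.
    """

    def __init__(self):
        self.files = {}  # path -> bytearray of file contents

    def create(self, path):
        self.files[path] = bytearray()

    def write(self, path, data, offset):
        buf = self.files[path]
        buf[offset:offset + len(data)] = data
        return len(data)  # FUSE write handlers return bytes written

    def read(self, path, size, offset):
        return bytes(self.files[path][offset:offset + size])

    def unlink(self, path):
        del self.files[path]
```

Directories, permissions, and timestamps add bookkeeping but no new ideas, which is what makes the project approachable in a few tens of lines.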
They all provide something that could be seen as useful, in a way that’s easy to
teach, in just some tens of lines. And, in case my comments/documentation are
too long to read, uncommentfs will happily strip all comments and whitespace
automatically! 😉
Of course, I will also share this project with my students in the next couple of
weeks… And hope it manages to lure them into implementing FUSE in Python. At
some point, I shall report!
This month I accepted 441 and rejected 29 packages. The overall number of packages that got accepted was 448.
I couldn’t believe my eyes, but this month I really accepted the same number of packages as last month.
Debian LTS
This was my one-hundred-twenty-third month of doing some work for the Debian LTS initiative, started by Raphael Hertzog at Freexian. During my allocated time I uploaded or worked on:
[unstable] libcupsfilters security update to fix one CVE related to validation of IPP attributes obtained from remote printers
[unstable] cups-filters security update to fix two CVEs related to validation of IPP attributes obtained from remote printers
[unstable] cups security update to fix one CVE related to validation of IPP attributes obtained from remote printers
[DSA 5778-1] prepared package for cups-filters security update to fix two CVEs related to validation of IPP attributes obtained from remote printers
[DSA 5779-1] prepared package for cups security update to fix one CVE related to validation of IPP attributes obtained from remote printers
[DLA 3905-1] cups-filters security update to fix two CVEs related to validation of IPP attributes obtained from remote printers
[DLA 3904-1] cups security update to fix one CVE related to validation of IPP attributes obtained from remote printers
Despite the announcement, the package libppd in Debian is not affected by the CVEs related to CUPS. By pure chance there is an unrelated package with the same name in Debian. I also answered some questions about the CUPS-related uploads. Due to the CUPS issues, I postponed my work on other packages to October.
Last but not least I did a week of FD this month and attended the monthly LTS/ELTS meeting.
Debian ELTS
This month was the seventy-fourth ELTS month. During my allocated time I uploaded or worked on:
[ELA-1186-1] cups-filters security update for two CVEs in Stretch and Buster to fix the IPP attribute related CVEs.
[ELA-1187-1] cups-filters security update for one CVE in Jessie to fix the IPP attribute related CVEs (the version in Jessie was not affected by the other CVE).
I also started to work on updates for cups in Buster, Stretch and Jessie, but their uploads will happen only in October.
I also did a week of FD and attended the monthly LTS/ELTS meeting.
Debian Printing
This month I uploaded …
… libcupsfilters to also fix a dependency and autopkgtest issue besides the security fix mentioned above.
… splix for a new upstream version. This package is managed now by OpenPrinting.
Last but not least I tried to prepare an update for hplip. Unfortunately this is a nerve-stretching task and I need some more time.
Most of the uploads were related to package migration to testing. As some of them are in non-free or contrib, one has to build all binary versions. From my point of view handling packages in non-free or contrib could be very much improved, but well, they are not part of Debian …
Anyway, starting in December there is an Outreachy project that takes care of automatic updates of these packages. So hopefully it will be much easier to keep those packages up to date. I will keep you informed.
Debian IoT
This month I uploaded new upstream or bugfix versions of:
This month I did source uploads of all the packages that were prepared last month by Nathan and started the transition. It went rather smoothly except for a few packages where the new version did not propagate to the tracker and they got stuck with old failing autopkgtests. Anyway, in the end all packages migrated to testing.
I also uploaded new upstream releases or fixed bugs in:
The Sovol SV08
is a 3D printer which is a semi-assembled clone of Voron 2.4,
an open-source design. It's not the cheapest of printers, but for
what you get, it's extremely good value for money—as long as you can
deal with certain, err, quality issues.
Anyway, I have one, and one of the fun things about an open design
is that you can switch out things to your liking. (If you just want a tool,
buy something else. Bambu P1S, for instance, if you can live with
a rather closed ecosystem. It's a bit like an iPhone in that aspect,
really.) So I've put together a spreadsheet with some of the more common
choices:
It doesn't contain any of the really difficult mods, and it also
doesn't cover pure printables. And none of the dreaded macro stuff
that people seem to be obsessing over (it's really like being
in the 90s with people's mIRC scripts all over again sometimes :-/),
except where needed to make hardware work.
This time I seem to be settling on either Commit Mono or Space
Mono. For now I'm using Commit Mono because it's a little more
compressed than Fira and does have an italic version. I don't like how
Space Mono's parentheses (()) are "squarish"; they feel visually
ambiguous with the square brackets ([]), a big no-no for my primary
use case (code).
So here I am using a new font, again. It required changing a bunch of
configuration files in my home directory (which is in a private
repository, sorry) and Emacs configuration (thankfully that's
public!).
One gotcha is I realized I didn't actually have a global font
configuration in Emacs, as some Faces define their own font
family, which overrides the frame defaults.
This is what it looks like, before:
After:
(Notice how those screenshots are not sharp? I'm surprised too. The
originals look sharp on my display, I suspect this is something to
do with the Wayland transition. I've tried with both grim and
flameshot, for what it's worth.)
They are pretty similar! Commit Mono feels a bit more vertically
compressed, maybe too much so, actually -- the line height feels too
low. But it's heavily customizable so that's something that's
relatively easy to fix, if it's really a problem. Its weight is also a
little heavier and wider than Fira, which I find a little distracting
right now, but maybe I'll get used to it.
I like how the ampersand (&) is more traditional, although I'll miss
the exotic one Fira produced... I like how the back quotes (`,
GRAVE ACCENT) drop down low, nicely aligned with the apostrophe. As
I mentioned before, I like how the bar on the "f" aligns with the
tops of the other letters, something that really annoys me in Fira Mono
now that I've noticed it (it's not aligned!).
A UTF-8 test file
Here's the test sheet I've made up to test various characters. I could
have sworn I had a good one like this lying around somewhere but
couldn't find it so here it is, I guess.
So there you have it: got completely nerd-sniped by typography
again. Now I can go back to writing a too-long proposal again.
Sources and inspiration for the above:
the unicode(1) command, to lookup individual characters to
disambiguate, for example, - (U+002D HYPHEN-MINUS, the minus
sign next to zero on US keyboards) and − (U+2212 MINUS SIGN, a
math symbol)
searchable list of characters and their names - roughly
equivalent to the unicode(1) command, but in one page, amazingly
the /usr/share/unicode database doesn't have any one file like
this
UTF-8 encoded plain text file - nice examples of edge cases,
curly quotes example and box drawing alignment test which,
incidentally, showed me I needed specific faces customisation in
Emacs to get the Markdown code areas to display properly, also the
idea of comparing various dashes
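The kind of lookup the unicode(1) command does is also available from Python's standard library, which is handy for generating a test sheet like this programmatically. A small sketch using the hyphen/minus pair above:

```python
import unicodedata

# Two characters that render almost identically but are distinct:
hyphen = "-"        # U+002D, the key next to zero on US keyboards
minus = "\u2212"    # U+2212, the mathematical minus sign

# unicodedata.name() recovers the official character name...
print(unicodedata.name(hyphen))  # HYPHEN-MINUS
print(unicodedata.name(minus))   # MINUS SIGN

# ...and unicodedata.lookup() goes the other way.
print(unicodedata.lookup("EM DASH"))
```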
In my previous blog post about fonts, I
had a list of alternative fonts, but it seems people are not digging
through this, so I figured I would redo the list here to preempt "but
have you tried Jetbrains mono" kind of comments.
My requirements are:
no ligatures: yes, in the previous post, I wanted ligatures, but
I have changed my mind. After testing them, I find them distracting
and confusing, and they often break the monospace nature of the display
(note that some folks wrote Emacs code to selectively enable
ligatures, which is an interesting compromise)
monospace: this is to display code
italics: often used when writing Markdown, where I do make use of
italics... Emacs falls back to underlining text when lacking italics
which is hard to read
free-ish, ultimately should be packaged in Debian
Here is the list of alternatives I have considered in the past and why
I'm not using them:
agave: recommended by tarzeau, not sure I like the lowercase
a, a bit too exotic, packaged as fonts-agave
Cascadia code: optional ligatures, multilingual, not liking the
alignment, ambiguous parenthesis (look too much like square
brackets), new default for Windows Terminal and Visual Studio,
packaged as fonts-cascadia-code
Fira Code: ligatures, was using Fira Mono from which it is derived,
lacking italics except for forks, interestingly, Fira Code succeeds
the alignment test but Fira Mono fails to show the X signs properly!
packaged as fonts-firacode
Hack: no ligatures, very similar to Fira, italics, good
alternative, fails the X test in box alignment, packaged as
fonts-hack
IBM Plex: irritating website, replaces Helvetica as the IBM
corporate font, no ligatures by default, italics, proportional alternatives,
serifs and sans, multiple languages, partial failure in box alignment test (X signs),
fancy curly braces contrast perhaps too much with the rest of the
font, packaged in Debian as fonts-ibm-plex
Inconsolata: no ligatures, maybe italics? more compressed than
others, feels a little out of balance because of that, packaged in
Debian as fonts-inconsolata
Intel One Mono: nice legibility, no ligatures, alignment issues
in box drawing, not packaged in Debian
Iosevka: optional ligatures, italics, multilingual, good
legibility, has a proportional option, serifs and sans, line height
issue in box drawing, fails dash test, not in Debian
Monoid: optional ligatures, feels much "thinner" than
Jetbrains, not liking alignment or spacing on that one, ambiguous
2Z, problems rendering box drawing, packaged as fonts-monoid
Mononoki: no ligatures, looks good, good alternative, suggested
by the Debian fonts team as part of fonts-recommended, problems
rendering box drawing, em dash bigger than en dash, packaged as
fonts-mononoki
spleen: bitmap font, old school, spacing issue in box drawing
test, packaged as fonts-spleen
sudo: personal project, no ligatures, zero originally not
dotted, relied on metrics for legibility, spacing issue in box
drawing, not in Debian
victor mono: italics are cursive by default (distracting),
ligatures by default, looks good, more compressed than commit mono,
good candidate otherwise, has a nice and compact proof sheet
So, if I get tired of Commit Mono, I would probably try, in order:
Hack
Jetbrains Mono
IBM Plex Mono
Iosevka, Mononoki and Intel One Mono are also good options, but have
alignment problems. Iosevka is particularly disappointing, as the EM
DASH metrics are just completely wrong (much too wide).
Also note that there is now a package in Debian called fnt to
manage fonts like this locally, including in-line previews (that don't
work in bookworm but should be improved in trixie and later).
Our reports attempt to outline what we’ve been up to over the past month, highlighting news items from elsewhere in tech where they are related. As ever, if you are interested in contributing to the project, please visit our Contribute page on our website.
Binsider can perform static and dynamic analysis, inspect strings, examine linked libraries, and perform hexdumps, all within a user-friendly terminal user interface!
95% fixed by [merge request] !12680 when -fobject-determinism is enabled. […]
The linked merge request has since been merged, and Rodrigo goes on to say that:
After that patch is merged, there are some rarer bugs in both interface file determinism (eg. #25170) and in object determinism (eg. #25269) that need to be taken care of, but the great majority of the work needed to get there should have been merged already. When merged, I think we should close this one in favour of the more specific determinism issues like the two linked above.
Fay Stegerman let everyone know that she started a thread on the Fediverse about the problems caused by unreproducible zlib/deflate compression in .zip and .apk files and later followed up with the results of her subsequent investigation.
Long-time developer kpcyrd wrote that “there has been a recent public discussion on the Arch Linux GitLab [instance] about the challenges and possible opportunities for making the Linux kernel package reproducible”, all relating to the CONFIG_MODULE_SIG flag. […]
Bernhard M. Wiedemann followed up on an in-person conversation at our recent Hamburg 2024 summit about the potential presence of Reproducible Builds in recognised standards. […]
Fay Stegerman also wrote about her worry about the “possible repercussions for RB tooling of Debian migrating from zlib to zlib-ng” as reproducibility requires identical compressed data streams. […]
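Her underlying concern can be demonstrated in a few lines of Python: two deflate streams may carry identical data yet differ byte-for-byte, which is enough to break bit-for-bit reproducibility of any .zip or .apk that embeds them:

```python
# Two deflate streams can decompress to identical data while differing
# byte-for-byte (different compression levels here stand in for
# different implementations such as zlib vs zlib-ng).
import zlib

data = b"reproducible builds " * 1000
fast = zlib.compress(data, level=1)
best = zlib.compress(data, level=9)

# Same content either way...
assert zlib.decompress(fast) == zlib.decompress(best) == data
# ...but the compressed byte streams themselves need not match,
# so checksums over the compressed container can diverge.
print(len(fast), len(best), fast == best)
```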
Martin Monperrus wrote to the list announcing the latest release of maven-lockfile, which is designed to aid “building Maven projects with integrity”. […]
Lastly, Bernhard M. Wiedemann wrote about the potential role of reproducible builds in combatting silent data corruption, as detailed in a recent Tweet and scholarly paper on faulty CPU cores. […]
This is a report of Part 1 of my journey: building 100% bit-reproducible packages for every package that makes up [openSUSE’s] minimalVM image. This target was chosen as the smallest useful result/artifact. The larger package-sets get, the more disk-space and build-power is required to build/verify all of them.
A hermetic build system manages its own build dependencies, isolated from the host file system, thereby securing the build process. Although, in recent years, new artifact-based build technologies like Bazel offer build hermeticity as a core functionality, no empirical study has evaluated how effectively these new build technologies achieve build hermeticity. This paper studies 2,439 non-hermetic build dependency packages of 70 Bazel-using open-source projects by analyzing 150 million Linux system file calls collected in their build processes. We found that none of the studied projects has a completely hermetic build process, largely due to the use of non-hermetic top-level toolchains. […]
Distribution work
In Debian this month, 14 reviews of Debian packages were added, 12 were updated and 20 were removed, all adding to our knowledge about identified issues. A number of issue types were updated as well. […][…]
In addition, Holger opened 4 bugs against the debrebuild component of the devscripts suite of tools. In particular:
#1081839: Fails with E: mmdebstrap failed to run error.
Last month, an issue was filed to update the Salsa CI pipeline (used by 1,000s of Debian packages) to no longer test for reproducibility with reprotest’s build_path variation. Holger Levsen provided a rationale for this change in the issue, which has already been made to the tests being performed by tests.reproducible-builds.org. This month, this issue was closed by Santiago R. R., nicely explaining that build path variation is no longer the default, and, if desired, how developers may enable it again.
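The effect that build_path variation probes for can be sketched with a toy example (my own illustration, not reprotest's actual mechanism): if the absolute source path leaks into the artifact, two otherwise identical builds produce different checksums.

```python
# Toy illustration of build-path variation: an artifact that embeds
# its absolute source path (as compilers do with __FILE__-style debug
# info) gets a different checksum in every build directory.
import hashlib
import os
import tempfile

def build(srcdir: str) -> bytes:
    # Stand-in for a compiler embedding the absolute source path.
    return f"debug info: {os.path.join(srcdir, 'main.c')}".encode()

with tempfile.TemporaryDirectory() as d1, tempfile.TemporaryDirectory() as d2:
    h1 = hashlib.sha256(build(d1)).hexdigest()
    h2 = hashlib.sha256(build(d2)).hexdigest()
    print(h1 == h2)  # different build paths, different checksums
```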
diffoscope is our in-depth and content-aware diff utility that can locate and diagnose reproducibility issues. This month, Chris Lamb made the following changes, including preparing and uploading version 278 to Debian:
New features:
Add a helpful contextual message to the output if comparing Debian .orig tarballs within .dsc files without the ability to “fuzzy-match” away the leading directory. […]
Bug fixes:
Drop removal of calculated os.path.basename from GNU readelf output. […]
Correctly invert “X% similar” value and do not emit “100% similar”. […]
Misc:
Temporarily remove procyon-decompiler from Build-Depends as it was removed from testing (via #1057532). (#1082636)
disorderfs is our FUSE-based filesystem that deliberately introduces non-determinism into system calls to reliably flush out reproducibility issues. This month, version 0.5.11-4 was uploaded to Debian unstable by Holger Levsen making the following changes:
Replace build-dependency on the obsolete pkg-config package with one on pkgconf, following a Lintian check. […]
Bump Standards-Version field to 4.7.0, with no related changes needed. […]
In addition, reprotest is our tool for building the same source code twice in different environments and then checking the binaries produced by each build for any differences. This month, version 0.7.28 was uploaded to Debian unstable by Holger Levsen including a change by Jelle van der Waa to move away from the pipes Python module to shlex, as the former will be removed in Python version 3.13 […].
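The migration itself is mechanical, since shlex.quote has long been the documented replacement for pipes.quote (the pipes module was removed in Python 3.13 as part of PEP 594):

```python
# pipes.quote was removed in Python 3.13 (PEP 594); shlex.quote is the
# documented replacement and produces the same shell-safe quoting.
import shlex

print(shlex.quote("plain"))      # plain
print(shlex.quote("two words"))  # 'two words'
print(shlex.quote("it's"))       # 'it'"'"'s'
```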
Android toolchain core count issue reported
Fay Stegerman reported an issue with the Android toolchain where a part of the build system generates a different classes.dex file (and thus a different .apk) depending on the number of cores available during the build, thereby breaking Reproducible Builds:
We’ve rebuilt [tag v3.6.1] multiple times (each time in a fresh container): with 2, 4, 6, 8, and 16 cores available, respectively:
With 2 and 4 cores we always get an unsigned APK with SHA-256 14763d682c9286ef….
With 6, 8, and 16 cores we get an unsigned APK with SHA-256 35324ba4c492760… instead.
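The comparison step in this kind of rebuild experiment is simple to sketch: hash every rebuilt artifact and group by digest; the build is reproducible only if a single group remains. (File names in the usage note below are hypothetical placeholders, not from the actual report.)

```python
# Hash each rebuilt artifact and group the builds by digest; a
# reproducible build yields exactly one group.
import hashlib
from collections import defaultdict

def sha256_file(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def group_builds(paths):
    groups = defaultdict(list)
    for path in paths:
        groups[sha256_file(path)].append(path)
    return dict(groups)
```

For example, `group_builds(["app-2core.apk", "app-16core.apk"])` on the builds described above would return two groups, exposing the core-count dependence.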
reproducibility settings [being] applied to some of Gradle’s built-in tasks that should really be the default. Compatible with Java 8 and Gradle 8.3 or later.
Website updates
There were a rather substantial number of improvements made to our website this month, including:
Chris Lamb:
Attempt to use GitLab CI to ‘artifact’ the website; hopefully useful for testing branches. […]
Correct the linting rule whilst building the website. […]
Make a number of small changes to Kees’ post written by Vagrant. […][…][…]
Jelle van der Waa completely modernised the System Images documentation, noting that “a lot has changed since 2017(!); ext4, erofs and FAT filesystems can now be made reproducible”. […]
Developer RyanSquared replaced the continuous integration test link for Arch Linux on our Projects page with an external instance […][…] as well as updated the documentation to reflect the dependencies required to build the website […].
The Reproducible Builds project detects, dissects and attempts to fix as many currently-unreproducible packages as possible. We endeavour to send all of our patches upstream where appropriate. This month, we wrote a large number of such patches, including:
sphobjinv (duplicates fix from Debian bug #1082706)
Reproducibility testing framework
The Reproducible Builds project operates a comprehensive testing framework running primarily at tests.reproducible-builds.org in order to check packages and other artifacts for reproducibility. In September, a number of changes were made by Holger Levsen, including:
Add support for powercycling OpenStack instances. […]
Update the fail2ban to ban hosts for 4 weeks in total […][…] and take care to never ban our own Jenkins instance. […]
In addition, Vagrant Cascadian recorded a disk failure for the virt32b and virt64b nodes […], performed some maintenance of the cbxi4a node […][…] and marked most armhf architecture systems as being back online.
Finally, if you are interested in contributing to the Reproducible Builds project, please visit our Contribute page on our website. You can also get in touch with us via:
I'm pleased to welcome Louis-Philippe Véronneau as a new Lintian
maintainer. He humorously acknowledged his new role, stating,
"Apparently I'm a Lintian maintainer now". I remain confident that
we can, and should, continue modernizing our policy checker, and I see
this as one important step toward that goal.
SPDX name / license tools
There was a discussion about deprecating the unique names for DEP-5 and
migrating to fully compliant SPDX names.
Simon McVittie wrote: "Perhaps our Debian-specific names are
better, but the relevant question is whether they are sufficiently
better to outweigh the benefit of sharing effort and specifications with
the rest of the world (and I don't think they are)." Charles
Plessy also sees the value of deprecating the Debian names and aligning
with SPDX.
The thread on debian-devel list contains several practical hints for
writing debian/copyright files.
I continued reaching out to teams in September. One common pattern I've
noticed is that most teams lack a clear strategy for attracting new
contributors. Here's an example snippet from one of my outreach emails,
which is representative of the typical approach:
Q: Do you have some strategy to gather new contributors for your team?
A: No.
Q: Can I do anything for you?
A: Everything that can help to have more than 3 guys :-D
Well, only the first answer, "No," is typical. To help the JavaScript
team, I'd like to invite anyone with JavaScript experience to join the
team's mailing list and offer to learn and contribute. While I've only
built a JavaScript package once, I know this team has developed
excellent tools that are widely adopted by others. It's an active and
efficient team, making it a great starting point for those looking to
get involved in Debian. You might also want to check out the
"Little tutorial for JS-Team beginners".
Given the lack of a strategy to actively recruit new contributors--a
common theme in the responses I've received--I recommend reviewing my
talk from DebConf23 about teams. The Debian Med team would have
struggled significantly in my absence (I've paused almost all work with
the team since becoming DPL) if I hadn't consistently focused on
bringing in new members. I'm genuinely proud of how the team has managed
to keep up with the workload (thank you, Debian Med team!). Of course,
onboarding newcomers takes time, and there's no guarantee of long-term
success, but if you don't make the effort, you'll never find out.
Open source maintainers underpaid
The Register, in its article titled
"Open Source Maintainers Underpaid, Swamped by Security, Going Gray",
summarizes the 2024 State of the
Open Source Maintainer Report. I find this to be an interesting read,
both in general and in connection with the challenges mentioned in the
previous paragraph about finding new team members.
Freexian specializes in Free Software with a particular focus on Debian
GNU/Linux. Freexian can assist with consulting, training, technical support,
packaging, or software development on projects involving use or development of Free
software.
All of Freexian's employees and partners are well-known contributors in the Free
Software community, a choice that is integral to Freexian's business model.
About the Debian Partners Program
The Debian Partners Program
was created to recognize companies and organizations that help and provide
continuous support to the project with services, finances, equipment, vendor
support, and a slew of other technical and non-technical services.
Partners provide critical assistance, help, and support which has advanced and
continues to further our work in providing the 'Universal Operating System' to
the world.
After (quite) a summer break, here comes the 4th article of the 5-episode blog post series on Polis, written by Guido Berhörster, member of staff at my company Fre(i)e Software GmbH.
Have fun with the read on Guido's work on Polis,
Mike
Creating (a) new frontend(s) for Polis (this article)
Current status and roadmap
4. Creating (a) new frontend(s) for Polis
Why a new frontend was needed...
Our initial experiences of working with Polis, the effort required to implement
more invasive changes, and the desire to iterate changes more rapidly
ultimately led to the decision to create a new foundation for frontend
development that would be independent of, but compatible with, the upstream
project.
Our primary objective was thus not to develop another frontend but rather to
make frontend development more flexible and to facilitate experimentation and
rapid prototyping of different frontends by providing abstraction layers and
building blocks.
This also implied developing a corresponding backend, since the Polis backend
is tightly coupled to the frontend: it is neither intended to be used by
third-party projects nor does it support cross-domain requests, due to the
expectation of being embedded as an iframe on third-party websites.
The long-term plan for achieving our objectives is to provide three abstraction
layers for building frontends:
a stable cross-domain HTTP API
a low-level JavaScript library for interacting with the HTTP API
a high-level library of WebComponents as a framework-neutral way of rapidly
building frontends
The Particiapp Project
Under the umbrella of the Particiapp project we have so far developed two
new components:
the Particiapi server, which provides a stable HTTP API for building frontends
the example frontend project, which currently contains both the client
library and an experimental example frontend built with it
Both the participation frontend and backend are fully compatible with
upstream Polis: they require an existing Polis installation and can be run
alongside the upstream frontend. More specifically, the administration
frontend and common backend are required to administer conversations and
send out notifications, and the statistics processing server is required
for processing the voting results.
Particiapi server
For the backend the Python language and the Flask framework were
chosen as a technological basis mainly due to developer mindshare, a large
community and ecosystem and the smaller dependency chain and maintenance
overhead compared to Node.js/npm. Instead of integrating specific identity
providers, we adopted the OpenID Connect standard as an abstraction
layer for authentication which allows delegating authentication either to a
self-hosted identity provider or a large number of existing external
identity providers.
Particiapp Example Frontend
The experimental example frontend serves both as a test bed for the client
library and as a tool for better understanding the needs of frontend designers.
It also features a completely redesigned user interface and results
visualization in line with our goals. Branded variants are currently used for
evaluation and testing by the stakeholders.
In order to simplify evaluation, development, testing and deployment a Docker
Compose configuration is made available which contains all necessary
components for running Polis with our experimental example frontend. In
addition, a development environment is provided which includes a preconfigured
OpenID Connect identity provider (KeyCloak), SMTP-Server with web
interface (MailDev), and a database frontend (PgAdmin). The new
frontend can also be tested using our public demo server.
Almost all of my Debian contributions this month were
sponsored by Freexian.
You can also support my work directly via
Liberapay.
Pydantic
My main Debian project for the month turned out to be getting
Pydantic back into a good state in
Debian testing. I’ve used Pydantic quite a bit in various projects, most
recently in Debusine, so
I have an interest in making sure it works well in Debian. However, it had
been stalled on 1.10.17 for quite a while due to the complexities of getting
2.x packaged. This was partly making sure everything else could cope with
the transition, but in practice mostly sorting out packaging of its new
Rust dependencies. Several
other people (notably Alexandre Detiste, Andreas Tille, Drew Parsons, and
Timo Röhling) had made some good progress on this, but nobody had quite got
it over the line and it seemed a bit stuck.
Learning Rust is on my to-do list, but merely not knowing a language hasn’t
stopped me before.
So I learned how the Debian Rust team’s packaging
works, upgraded a few
packages to new upstream versions (including
rust-half
and upstream rust-idna test
fixes), and packaged rust-jiter. After a
lot of waiting around for various things and chasing some failures in other
packages I was eventually able to get current versions of both pydantic-core
and pydantic into testing.
I’m looking forward to being able to drop our clunky v1 compatibility code
once debusine can rely on running on trixie!
I upgraded python-yubihsm, yubihsm-connector, and yubihsm-shell to new
upstream versions.
I noticed that I could enable some tests in python-yubihsm and
yubihsm-shell; I’d previously thought the whole test suite required a real
YubiHSM device, but when I looked closer it turned out that this was only
true for some tests.
I fixed yubihsm-shell build failures on some 32-bit architectures (upstream
PRs #431,
#432), and also made it
build
reproducibly.
buildbot was in a bit of a mess due to
being incompatible with SQLAlchemy 2.0. Fortunately by the time I got to it
upstream had committed a workable set of patches, and the main difficulty
was figuring out what to cherry-pick since they haven’t made a new upstream
release with all of that yet. I figured this out and got us up to 4.0.3.
Adrian Bunk asked whether python-zipp
should be removed from trixie. I spent some time investigating this and
concluded that the answer was no, but looking into it was an interesting
exercise anyway.
On the other hand, I looked into flask-appbuilder, concluded that it should
be removed, and filed a removal request.
I upgraded importlib-resources, ipywidgets, jsonpickle, pydantic-settings,
pylint (fixing a test failure),
python-aiohttp-session, python-apptools, python-asyncssh,
python-django-celery-beat, python-django-rules, python-limits,
python-multidict, python-persistent, python-pkginfo, python-rt, python-spur,
python-zipp, stravalib, transmissionrpc, vulture, zodbpickle,
zope.exceptions (adopting it),
zope.i18nmessageid, zope.proxy, and zope.security to new upstream versions.
debmirror
The experimental and *-proposed-updates suites used to not have
Contents-* files, and a long time ago debmirror was changed to just skip
those files in those suites. They were added to the Debian archive some
time ago, but debmirror carried on skipping them anyway. Once I realized
what was going on, I removed these unnecessary special cases
(#819925,
#1080168).
Another short status update of what happened on my side last
month. Besides the usual amount of housekeeping, last month was largely
about getting old issues resolved by finishing some stale merge
requests and work-in-progress MRs. I also pushed out the Phosh 0.42.0
Release
Sanitize versions as this otherwise breaks the libphosh-rs build (MR)
lockscreen: Swap deck and carousel to avoid triggering the plugins page when entering the PIN, and let the lockscreen shrink to smaller sizes (MR) (two more year-old usability issues out of the way)
Ensure we send enough feedback when phone is blanked/locked (MR). This should be way easier now for apps as they
don't need to do anything and we can avoid duplicate feedback sent from e.g. Chatty.
Fix possible use after free when activating notifications on the lock screen (MR)
Don't lose preedit when switching applications, opening menus, etc (MR). This fixes the case (e.g. with word completion in phosh-osk-stub enabled) where it looks to the user as if the last typed word would get lost when switching from a text editor to another app or when opening a menu
Fix word salad with presage completer when entering cursor navigation mode (and in some other cases)
(MR 1). Presage
has the best completion but was marked experimental due to that.
Submit preedit on changes to terminal and emoji layout (MR)
Unbreak and modernize CI a bit (MR). A passing CI is so much more motivating for contributors and reviewers.
Fotema
Fix app-id and hence the icon shown in Phosh's overview (MR)
Help Development
If you want to support my work see donations. This includes
a list of hardware we want to improve support for. Thanks a lot to all current and past donors.
The Reproducible Builds project relies on several projects, supporters and sponsors for financial support, but they are also valued as ambassadors who spread the word about our project and the work that we do.
Vagrant Cascadian: Could you tell me a bit about yourself? What sort
of things do you work on?
Kees Cook: I’m a Free Software junkie living in Portland, Oregon, USA.
I have been focusing on the upstream Linux kernel’s protection
of itself. There is a lot of support that the kernel provides
userspace to defend itself, but when I first started focusing on this
there was not as much attention given to the kernel protecting
itself. As userspace got more hardened the kernel itself became a
bigger target. Almost 9 years ago I formally announced the Kernel Self-Protection Project
because the work necessary was way more than my time and expertise could do
alone. So I just try to get people to help as much as possible: people who
understand the ARM architecture, people who understand the memory management
subsystem, people who understand how to make the kernel less buggy.
Vagrant: Could you describe the path that led you to working on this
sort of thing?
Kees: I have always been interested in security through the aspect of
exploitable flaws. I always thought it was like a magic trick to make a
computer do something that it was very much not designed to do and seeing how
easy it is to subvert bugs. I wanted to improve that fragility. In 2006, I
started working at Canonical on Ubuntu and was mainly focusing on bringing
Debian and Ubuntu up to what was the state of the art for Fedora and Gentoo’s
security hardening efforts. Both had really pioneered a lot of userspace
hardening with compiler flags and ELF
stuff and many other things for hardened
binaries. On the whole, Debian had not really paid attention to it. Debian’s
package-building process at the time was sort of a chaotic free-for-all, as
there wasn’t a centralized build methodology for defining things. Luckily that
did slowly change over the years. In Ubuntu we had the opportunity to apply top
down build rules for hardening all the packages. In 2011 Chrome OS was
following along and took advantage of a bunch of the security hardening work as
they were based on ebuild out of Gentoo
and when they looked for someone to
help out they reached out to me. We recognized the Linux kernel was pretty much
the weakest link in the Chrome OS security posture and I joined them to help
solve that. Their userspace was pretty well handled but the kernel had a lot
of weaknesses, so focusing on hardening was the next place to go. When I
compared notes with other users of the Linux kernel within Google there were a
number of common concerns and desires. Chrome OS already had an “upstream
first” requirement, so I tried to consolidate the concerns and solve them
upstream. It was challenging to land anything in other kernel team repos at
Google, as they (correctly) wanted to minimize their delta from upstream, so I
needed to work on any major improvements entirely in upstream and had a lot of
support from Google to do that. As such, my focus shifted further from working
directly on Chrome OS into being entirely upstream and being more of a
consultant to internal teams, helping with integration or sometimes
backporting. Since the volume of needed work was so gigantic I needed to find
ways to inspire other developers (both inside and outside of Google) to help.
Once I had a budget I tried to get folks paid (or hired) to work on these areas
when it wasn’t already their job.
Vagrant: So my understanding of some of your recent work is basically
defining undefined behavior in the language or compiler?
Kees: I’ve found the term “undefined behavior” to have a really strict
meaning within the compiler community, so I have tried to redefine my goal as
eliminating “unexpected behavior” or “ambiguous language constructs”. At the
end of the day ambiguity leads to bugs, and bugs lead to exploitable security
flaws. I’ve been taking a four-pronged approach: supporting the work people are
doing to get rid of ambiguity, identifying new areas where ambiguity needs to be
removed, actually removing that ambiguity from the C language, and then dealing
with any needed refactoring in the Linux kernel source to adapt to the new
constraints.
None of this is particularly novel; people have recognized how dangerous some
of these language constructs are for decades and decades but I think it is a
combination of hard problems and a lot of refactoring that nobody has the
interest/resources to do. So, we have been incrementally going after the lowest
hanging fruit. One clear example in recent years was the elimination of C’s
“implicit fall-through” in switch statements. The language would just fall
through between adjacent cases if a break (or other code flow directive)
wasn’t present. But this is ambiguous: is the code meant to fall-through, or
did the author just forget a break statement? By defining the “[[fallthrough]]” statement,
and requiring its use in
Linux,
all switch statements now have explicit code flow, and the entire class of
bugs disappeared. During our refactoring we found that 1 in 10 added
“[[fallthrough]]” statements were actually missing break statements. This
was an extraordinarily common bug!
So getting rid of that ambiguity is where we have been. Another area I’ve been
spending a bit of time on lately is looking at how defensive security work has
challenges associated with metrics. How do you measure your defensive security
impact? You can’t say “because we installed locks on the doors, 20% fewer
break-ins have happened.” Much of our signal is always secondary or
retrospective, which is frustrating: “This class of flaw was used X much over
the last decade, and if we have eliminated that class of flaw and will never
see it again, what is the impact?” Is the impact infinity? Attackers will just
move to the next easiest thing. But it means that exploitation gets
incrementally more difficult. As attack surfaces are reduced, the expense of
exploitation goes up.
Vagrant: So it is hard to identify how effective this is… how bad would it be
if people just gave up?
Kees: I think it would be pretty bad, because as we have seen, using
secondary factors, the work we have done in the industry at large, not just the
Linux kernel, has had an impact. What we, Microsoft, Apple, and everyone else
is doing for their respective software ecosystems, has shown that the price of
functional exploits in the black market has gone up. Especially for really
egregious stuff like a zero-click remote code execution.
If those were cheap then obviously we are not doing something right, and it
becomes clear that it’s trivial for anyone to attack the infrastructure that
our lives depend on. But thankfully we have seen over the last two decades that
prices for exploits keep going up and up into millions of dollars. I think it
is important to keep working on that because, as a central piece of modern
computer infrastructure, the Linux kernel has a giant target painted on it. If
we give up, we have to accept that our computers are not doing what they were
designed to do, which I can’t accept. The safety of my grandparents shouldn’t
be any different from the safety of journalists, and political activists, and
anyone else who might be the target of attacks. We need to be able to trust our
devices otherwise why use them at all?
Vagrant: What has been your biggest success in recent years?
Kees: I think with all these things I am not the only actor. Almost
everything that we have been successful at has been because of a lot
of people’s work, and one of the big ones that has been coordinated
across the ecosystem and across compilers was initializing stack variables to 0 by default.
This feature was added in
Clang,
GCC,
and MSVC
across the board even though there were a lot of fears about forking the C language.
The worry was that developers would come to depend on zero-initialized stack
variables, but this hasn’t been the case because we still warn about
uninitialized variables when the compiler can figure that out. So you still
get the warnings at compile time, but now you can count on the contents of
your stack at run-time, and we drop an entire class of uninitialized variable flaws.
While the exploitation of this class has mostly been around memory content
exposure, it has also been used for control flow attacks.
So that was politically and technically a large challenge: convincing people it
was necessary, showing its utility, and implementing it in a way that everyone
would be happy with, resulting in the elimination of a large and persistent
class of flaws in C.
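The class of flaw being eliminated can be sketched in C. The function names below are hypothetical, and the explicit `= 0` stands in for what `-ftrivial-auto-var-init=zero` does automatically for every local:

```c
#include <assert.h>

/* Sketch of the uninitialized-stack-variable class described above.
 * read_config() only writes *len on success, but the caller ignores
 * the return value -- a common bug pattern.  Without auto-init, the
 * failure path would return stack garbage; here len is explicitly
 * zeroed to emulate what -ftrivial-auto-var-init=zero (Clang 8+,
 * GCC 12+) guarantees for every local variable. */
static int read_config(int available, int *len)
{
    if (!available)
        return -1;          /* error path: *len left untouched */
    *len = 42;
    return 0;
}

int buggy_caller(int available)
{
    int len = 0;            /* emulates compiler-provided zero-init */
    read_config(available, &len);  /* bug: return value ignored */
    return len;             /* garbage without auto-init; 0 with it */
}
```

The compiler still warns about such patterns when it can prove the variable is read uninitialized; the zero-init only changes what happens at run time in the cases it cannot prove.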
Vagrant: In a world where things are generally Reproducible do you see ways
in which that might affect your work?
Kees: One of the questions I frequently get is, “What version of the Linux
kernel has feature $foo?” If I know how things are built, I can answer with
just a version number. In a Reproducible Builds scenario I can count on the
compiler version, compiler flags, kernel configuration, etc. all those things
are known, so I can actually answer definitively that a certain feature exists.
So that is an area where Reproducible Builds affects me most directly.
Indirectly, being able to trust that the binaries you are running will
behave the same for the same build environment is critical for sane
testing.
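As a small illustration of that kind of definitive answer, a recorded kernel configuration can be queried directly. The config fragment below is fabricated for the example; on a real system it would come from /boot/config-$(uname -r) or /proc/config.gz:

```shell
# With a fully known build, feature presence can be read straight off
# the recorded configuration.  Fake a config fragment for the demo:
cat > kernel.config <<'EOF'
CONFIG_INIT_STACK_ALL_ZERO=y
CONFIG_STACKPROTECTOR_STRONG=y
EOF

# Does this kernel zero-initialize stack variables?
grep -q '^CONFIG_INIT_STACK_ALL_ZERO=y$' kernel.config \
  && echo "stack auto-init: enabled"
```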
Vagrant: Have you ever used diffoscope?
Kees: I have! One subset of the tree-wide refactoring we do when getting
rid of ambiguous language usage in the kernel is making source-level
changes to satisfy some new compiler requirement, where the binary
output is not expected to change at all. It is mostly about getting the
compiler to understand what is happening and what is intended, in the
cases where the old ambiguity does actually match the new unambiguous
description of what is intended. The binary shouldn’t change. We have
used diffoscope to compare
the before and after binaries to confirm that “yep, there is no change in
binary”.
Vagrant: You cannot just use checksums for that?
Kees: For the most part, we only need to compare the text segments. We try
to hold as much stable as we can, following the
Reproducible Builds documentation for the kernel,
but there are macros in the kernel that are sensitive to source line numbers
and as a result those will change the layout of the data segment (and sometimes
the text segment too). With diffoscope there’s flexibility where I can exclude
or include different comparisons. Sometimes I just go look at what diffoscope
is doing and do that manually, because I can tweak that a little harder, but
diffoscope is definitely the default. Diffoscope is awesome!
Vagrant: Where have reproducible builds affected you?
Kees: One of the notable wins of reproducible builds lately was
dealing with the fallout of the XZ backdoor and just being able to ask
the question “is my build environment running the expected
code?” and to be able to compare the output generated from one
install that never had a vulnerable XZ and one that did have a
vulnerable XZ and compare the results of what you get. That was
important for kernel builds because the XZ threat actor was working to
expand their influence and capabilities to include Linux kernel
builds, but they didn’t finish their work before they were noticed. I
think Debian proving that its build infrastructure was not affected is an
important example of how people would have needed to verify their kernel
builds too.
Vagrant: What do you want to see for the near or distant future in security work?
Kees: For reproducible builds in the kernel, in the work that has been
going on in the ClangBuiltLinux project, one of the driving forces of
code and usability quality has been the
continuous integration work. As soon as something breaks, on the
kernel side, the Clang side, or something in between the two, we get a
fast signal and can chase it and fix the bugs quickly. I would like to
see someone with funding maintain a reproducible kernel build
CI. There have been certain architecture configurations or certain
build configurations where we lose reproducibility, and right now we
have a sort of standard open-source development feedback loop where
those things get fixed, but the time between introduction and fix can
be large. Getting a CI for reproducible kernels would give us the
opportunity to shorten that time.
Vagrant: Well, thanks for that! Any last closing thoughts?
Kees: I am a big fan of reproducible builds, thank you for all your work.
The world is a safer place because of it.
Vagrant: Likewise for your work!
For more information about the Reproducible Builds project, please see our website at
reproducible-builds.org. If you are interested in
ensuring the ongoing security of the software that underpins our civilisation
and wish to sponsor the Reproducible Builds project, please reach out to the
project by emailing
contact@reproducible-builds.org.
During COVID companies suddenly found themselves able to offer remote working where it hadn’t previously been on offer. That’s changed over the past 2 or so years, with most places I’m aware of moving back from a fully remote situation to either some sort of hybrid, or even full-time office attendance. For example, last week Amazon announced a full return to office, having already pulled remote-hired workers in for 3 days a week.
I’ve seen a lot of folk stating they’ll never work in an office again, and that RTO is insanity. Despite being lucky enough to work fully remotely (for a role I’d been approached about before, but was never prepared to relocate for), I feel the objections from those who are pro-remote often fail to consider the nuances involved. So let’s talk about some of the reasons why companies might want to enforce some sort of RTO.
Real estate value
Let’s clear this one up first. It’s not about real estate value, for most companies. City planners and real estate investors might care, but even if your average company owned their building they’d close it in an instant, all other things being equal. An unoccupied building costs a lot less to maintain. And plenty of companies rent and would save money even if there’s a substantial exit fee.
Occupancy levels
That said, once you have anyone in the building the equation changes. If you’re having to provide power, heating, internet, security/front desk staff etc, you want to make sure you’re getting your money’s worth. There’s no point heating a building that can seat 100 when only 10 people are present. One option is to downsize the building, but that leads to not being able to assign everyone a desk, for example. No one I know likes hot desking. There are also scheduling problems around ensuring there are enough desks for everyone who might turn up on a certain day, and you’ve ruled out the option of company/office-wide events.
Coexistence builds relationships
As a remote worker I wish it wasn’t true that most people find it easier to form relationships in person, but it is. Some of this can be worked on with specific “teambuilding” style events, rather than in-office working, but I know plenty of folk who hate those as much as they hate the idea of being in the office. I am lucky in that I work with a bunch of folk who are terminally online, so it’s much easier to have those casual conversations even being remote, but I also accept I miss out on some things because I’m just not in the office regularly enough. You might not care about this (“I just need to put my head down and code, not talk to people”), but don’t discount it as a valid reason why companies might want their workers to be in the office. This often matters even more for folk at the start of their career, where having a bunch of experienced folk around to help them learn and figure things out ends up working much better in person (my first job offered to let me go mostly remote when I moved to Norwich, but I said no as I knew I wasn’t ready for it yet).
Coexistence allows for unexpected interactions
People hate the phrase “water cooler chat”, and I get that, but it covers the idea of casual conversations that just won’t happen the same way when people are remote. I experienced this while running Black Cat; every time Simon and I met up in person we had a bunch of useful conversations even though we were on IRC together normally, and had a VoIP setup that meant we regularly talked too. Equally when I was at Nebulon there were conversations I overheard in the office where I was able to correct a misconception or provide extra context. Some of this can be replicated with the right online chat culture, but I’ve found many places end up with folk taking conversations to DMs, or they happen in “private” channels. It happens more naturally in an office environment.
It’s easier for bad managers to manage bad performers
Again, this falls into the category of things that shouldn’t be true, but are. Remote working has increased the ability for people who want to slack off to do so without being easily detected. Ideally what you want is that these folk, if they fail to perform, are then performance managed out of the organisation. That’s hard, though: there are (rightly) a bunch of rights workers have (I’m writing from a UK perspective) around the procedure that needs to be followed. Managers need organisational support in this to make sure they get it right (and folk are given a chance to improve), which is often lacking.
Summary
Look, I get there are strong reasons why offering remote is a great thing from the company perspective, but what I’ve tried to outline here is that a return-to-office mandate can have some compelling reasons behind it too. Some of those might be things that wouldn’t exist in an ideal world, but unfortunately fixing them is a bigger issue than just changing where folk work from. Not acknowledging that just makes any reaction against office work seem ill-informed, to me.