Saturday, 25 December 2021

New German Sovereign Tech Fund will fund open source digital infrastructure to avert the next log4j

XKCD cartoon of an intricate tower made of blocks, all resting on a tiny block near the bottom, whose removal would topple the building. The top is labelled "All modern digital infrastructure". The tiny block is labelled "A project some random person in Nebraska has been thanklessly maintaining since 2003".

The famous XKCD cartoon has resulted in an open source digital infrastructure fund. Thank you, Randall.

Late in the afternoon, just before a national holiday, is not the best time to get attention. That is probably the main reason the press has not (yet) written about what Franziska Brantner (the new Green deputy minister for the economy) wrote on Twitter:

We will tackle the Sovereign Tech Fund! Log4j has shown that sustainably secured and reliable open source solutions are the basis for the innovative strength and digital sovereignty of the German economy. We will therefore promote open source enabling technologies from 2022 onwards.

Log4j is a 21-year-old Java library that is used a lot; the recently discovered security vulnerability in it is easy to exploit and existed for almost a decade before being noticed. As Free and Open Source Software (FOSS) it is widely used and produces a lot of value, despite there not being much funding for producing FOSS. In this way much of the digital economy depends on the dedication of unpaid hobbyists, as XKCD Explained explains well.

The German Sovereign Tech Fund will step into this gap. We will have to see how the government will implement it, but the name comes from a feasibility study by the Open Knowledge Foundation, which proposed a fund to support "the development, scaling and maintenance of digital and foundational technologies. The goal of the fund could be to sustainably strengthen the open source ecosystem, with a focus on security, resilience, technological diversity, and the people behind the code."

Such a fund had not explicitly made it into the coalition agreement of the new government, to the lament of the FOSS community, although it does fit the spirit of the agreement.

Deputy minister Franziska Brantner copied in Patrick Beuth, a journalist who recently wrote about log4j in the magazine Der Spiegel and mentioned the Sovereign Tech Fund as a solution. So log4j seems to have been the clincher.

This announcement adds to a period of hope for digital rights. For most of my life they have become worse: more privacy for the powerful, more vulnerability for us. Things which were protected in the analogue world (talking to each other, sending a letter) have been criminalized and subjected to surveillance. The fast creation of abusive monopolies is the official business model in Silicon Valley. Social media monopolies sprouted that do not care how much damage they do to society and our democracy, while Europe was increasingly becoming a digital colony.

However, lately, with the EU privacy law, the rise of the Fediverse, the upcoming EU Digital Services Act and a good coalition agreement in Germany, it is starting to look like it is actually possible for digital rights to improve.

This proposal is for a fund of 10 million Euro per year, which is a good start, especially if similar EU proposals also manage to get funded. There is also project funding for new software tools: the Prototype Fund in Germany and the Next Generation Internet (NGI) and NGI Zero initiatives in Europe.

What I feel is still missing are stable public institutions where coders can jointly work on large tasks, such as maintaining Firefox or extending what is possible in the Fediverse. If we compare the situation in software to science, we now have the equivalent of project funding by the National Science Foundation and similar agencies, but there are no equivalents yet of the National Institutes of Health, research institutes or universities.

More generally, we need a real solution for investing in goods and services with enormous societal and economic value that do not have much market value (research and development, security, (preventative) healthcare, weather services, justice, software, (digital) infrastructure, governance, media, ...). We are no longer in the 19th century. These kinds of goods are an increasingly large part of the future economy.

Related reading

Patrick Beuth (Der Spiegel): "Wie löscht man ein brennendes Internet?" Translated: "How do you put out a burning internet?"

XKCD Explained on the XKCD on software dependencies.

The digitization section of the coalition agreement in English.

On Monday the 27th of December there is a session on the Sovereign Tech Fund at the remote Chaos Computer Congress.

Digital Services Act: Greens/EFA successes

Micro-blogging for scientists without nasties and surveillance

Thursday, 6 May 2021

We launched a new group to promote the translation of the scientific literature

Tell your story, tell your journey, they say. Climate Outreach advised: tell how you came to accept that climate change is a problem. Maybe I am too young, but even though I am not yet 50, I already accepted as a kid that climate change was a risk we should care about.

Also otherwise, I do not remember often suddenly changing my mind in a way I could talk about as my journey. Where the word "remember" may do a lot of the work. Is it useful not to remember such things, to make it easier on yourself to change your mind? Or do many people work with really narrow uncertainty intervals even when they do not have a clue yet?

But when it comes to translations of scientific articles, I changed a lot. When I was doing cloud research I used to think that knowing English was just one of the skills a scientist needs. Just like logic, statistics, coding, knowing the literature, public speaking, and so on.

Working on historical climate data changed this. I regularly have to communicate with people from weather services all over the world and many do not speak English (well), while they do work that is crucial for science. Given how hard we make it for them to participate, they do an amazing job; I guess it helps that the World Meteorological Organization translates all its reports into many languages.

The most "journey" moment was at the Data Management Workshop in Peru, where I was the only one not speaking Spanish. A colleague told me that she translated important scientific articles into Spanish and send them by email to her colleagues. Just like Albert Einstein translated scientific articles into English for those who did not master the language of science at the time.

This got me thinking about a database where such translations could be made available: where, when you search for an article, you can see which translations are available, or where you can search for translated articles on a specific topic. Such a resource would make producing translations more worthwhile and would thus hopefully stimulate their production.
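To make the idea concrete, here is a minimal sketch of such a lookup in Python with the built-in sqlite3 module; the table layout, column names and example entry are all hypothetical, not a description of any existing system.

```python
# Minimal sketch of a translation lookup database (all names hypothetical).
# An article is identified by its DOI; each translation row records the
# language and where the translated text can be found.
import sqlite3

conn = sqlite3.connect(":memory:")  # a real service would use a persistent store
conn.executescript("""
CREATE TABLE articles (
    doi   TEXT PRIMARY KEY,  -- identifier of the original article
    title TEXT NOT NULL,
    lang  TEXT NOT NULL      -- language of the original
);
CREATE TABLE translations (
    doi        TEXT REFERENCES articles(doi),
    lang       TEXT NOT NULL,  -- language of the translation
    url        TEXT NOT NULL,  -- where the translation is available
    translator TEXT
);
""")

# Hypothetical example entry: a translated article and how to find it.
conn.execute("INSERT INTO articles VALUES (?, ?, ?)",
             ("10.1000/example", "Data management for climate records", "en"))
conn.execute("INSERT INTO translations VALUES (?, ?, ?, ?)",
             ("10.1000/example", "es", "https://example.org/es.pdf", "A. Colleague"))

# Search for an article and see which translations are available.
for lang, url in conn.execute(
        "SELECT lang, url FROM translations WHERE doi = ?", ("10.1000/example",)):
    print(lang, url)
```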

After gathering literature and bookmarks on this topic, and noticing who else was interested in it, I invited a group of people to see if we could collaborate. After a series of pandemic video calls, we decided to launch as a group, somewhat unimaginatively called "Translate Science". Please find below the part of our launch blog post about why translations are important.

(To be fair to me, and I like being fair to me: for a fundamental science needing expensive instruments, such as cloud studies, it makes more sense to simply work in English, while for sciences that directly impact people (climate, health, agriculture) two-way communication within science, with the orbit around science, and with society is much more important.

But even in the cloud sciences I should probably have paid more attention to studies in other languages. One of our group members works on turbulence and droplets and found many worthwhile papers in Russian. I had never considered that and might have found some turbulent gems there as well.)


The importance of translated articles

English as a common language has made global communication within science easier. However, it has made communication with non-English-speaking communities harder. For English speakers it is easy to overestimate how many people speak English, because we mostly deal with foreigners who do speak English. It is thought that about one billion people speak English; that means that seven billion people do not. For example, at many weather services in the Global South only a few people master English, but they use the translated guidance reports of the World Meteorological Organization (WMO) a lot. For the WMO, as a membership organization of the weather services in which every weather service has one vote, translating all its guidance reports into many languages is a priority.

Non-English and multilingual speakers, in African and non-African countries alike, could participate in science on an equal footing if there were a reliable system in which scientific work written in a non-English language is accepted and translated into English (or any other language), and vice versa. Language barriers should not waste scientific talent.

Translated scientific articles open science to regular people, science enthusiasts, activists, advisors, trainers, consultants, architects, doctors, journalists, planners, administrators, technicians and scientists. Such a lower barrier to participating in science is especially important for topics such as climate change, the environment, agriculture and health. The easier knowledge transfer goes both ways: people benefit from scientific knowledge, and people have knowledge scientists should know about. Translations thus help both science and society. They aid innovation and the tackling of the big global challenges in the fields of climate change, agriculture and health.

Translated scientific articles speed up scientific progress by tapping into more knowledge and avoiding duplicated work. They thus improve the quality and efficiency of science. Translations can improve public discourse, scientific engagement and science literacy. The production of translated scientific articles also creates training datasets for improving automatic translations, which for most languages are still lacking.

The full post at the Translate Science blog explains more about who we are, what we would like to do to promote translations and how you can join.


Thursday, 22 April 2021

The confusing politics behind John Stossel asking Are We Doomed?


As a member of Climate Feedback I just reviewed a YouTube video by John Stossel. In that review I could only respond to factual claims, which were the boring, age-old denier evergreens. Not surprisingly, the video thus got a solid "very low" scientific credibility rating. But it has over 25 million views, so I guess responding was worth it.

The politics of the video were much more "interesting". As in: "May you live in interesting times". Other options would have been: crazy, confusing, weird.

That starts with the title of the video: "Are We Doomed?". Is John Stossel suggesting that damages are irrelevant if they are not world-ending? I would be surprised if that were his general threshold for action. "Shall we build a road?" Well, "Are We Doomed?" "Should we fund the police?" Well, "Are We Doomed?" "Shall I eat an American taco?" Well, "Are We Doomed?"

Are we not to invest in a more prosperous future unless we are otherwise doomed? That does not seem to be the normal criterion for rational investments any sane person or corporation would use.

Then there is his stuff about sea level rise:

"Are you telling me that people in Miami are so dumb that they are just going to sit there and drown?”

That reminds me of a similar dumb statement by public intellectual Ben Shapiro (I hope people hear the sarcasm; in the US you can never be sure) and the wonderful response to it by Hbomberguy:

Hbomberguy also concludes that this, this, ... whatever it is, has nothing to do with science:

"How have things reached a point, where someone thinks they can get away with saying something this ridiculous in front of an audience of people? And how have things reached the point where some people in that audience won't recognize it for the obvious ignorant bullshit that it is?
This led me down a particular hole of discovery. I realized that climate deniers aren't just wrong, they're obviously wrong. In very clear ways, and that makes the whole thing so much more interesting. How does this work if it's so paper thin?"

Politically interesting is that, in this video, Stossel wants Floridians to get lost and Dutch people to pay an enormous price, while the next Stossel video Facebook suggests has the tagline: "Get off my property". And Wikipedia claims that Stossel is a "Libertarian pundit".

So do we have to accept any damages Stossel wants us to suffer? Do we have to leave our houses behind? Does Stossel get to destroy our communities and our family networks? Is Stossel selling authoritarianism, where he gets to decide who suffers? Or is Stossel selling markets with free voluntary transactions and property rights?

In America, lacking a diversity of parties, both ideologies live within the same (Republican) party, but they are two fundamentally different ideas. Either you are a Conservative and believe in property rights, or you are an Authoritarian and think you can destroy other people's property when you have the power.

You can reconcile these two ideas with the third ideological current in the Republican party: childish Libertarianism, where you get to pretend that the actions of person X never affect person Y. An ideology for teenagers and a lived reality for the donor class that funds US politics and media, who never suffer consequences for their terrible behavior.

But in this video Stossel rejects this childish idea and accepts that Florida suffers damages:

"Are you telling me that people in Miami are so dumb that they are just going to sit there and drown?”

So, John Stossel, do you believe in property rights or don't you?

Friday, 16 April 2021

Antigen rapid tests much less effective for screening than previously thought according to top German virologist Drosten

Hidden in a long German-language podcast on the pandemic, Prof. Dr. Christian Drosten talked about an observation that has serious policy implications.

At the moment this is not yet based on any peer-reviewed studies, but mostly on his observations and those of his colleagues running large diagnostic labs. So it is important to note that he is a top diagnostic virologist from Germany who specializes in emerging viruses and coronaviruses and made the first SARS-CoV-2 PCR test.

In the Anglo-American news Drosten is often introduced as the German Fauci. This fits, as he is one of the most trusted national sources of information. But Drosten has much more specific expertise: both coronaviruses and diagnostic testing are his beat.

Tim Lohn wrote an article about this in Bloomberg, "Rapid Covid Tests Are Missing Early Infections, Virologist Says," and found two experts making similar claims.

Let me give a longer and more technical explanation than Tim Lohn's of what Prof. Dr. Christian Drosten claims. Especially because there is no peer-reviewed study yet, I feel the explanation is important.

If you have COVID symptoms (day 0), sleep on it and test the next day, the antigen tests are very reliable. But on day zero itself, and especially on the one or two days before, when you were already infectious, they are not as reliable. So they are good for (self-)diagnosis, but less good for screening, for catching those first days of infectiousness. The PCR tests are sensitive enough for those pre-symptomatic cases, if only people would test with PCR that early and would immediately get the result.
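As a sketch of the claimed gap, the following toy comparison may help; the sensitivity numbers are invented for illustration only and do not come from the podcast or from any study.

```python
# Illustrative sketch of the screening gap. The sensitivity numbers below are
# made up for illustration; they are NOT from the podcast or from any study.
# Day 0 = symptom onset; days -2 and -1 are already infectious.
antigen_sensitivity = {-2: 0.35, -1: 0.45, 0: 0.70, 1: 0.95, 2: 0.95}
pcr_sensitivity     = {-2: 0.90, -1: 0.95, 0: 0.98, 1: 0.99, 2: 0.99}

for day in sorted(antigen_sensitivity):
    print(f"day {day:+d}: antigen misses {1 - antigen_sensitivity[day]:.0%}, "
          f"PCR misses {1 - pcr_sensitivity[day]:.0%} of infectious cases")
```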


Figure from Jitka Polechová et al.

In those pre-symptomatic days there is already a high viral load, but it is mostly active virus. The antigen test detects the presence of the capsid of the virus, its protective shell. The PCR test detects virus RNA. When a cell is infected, the capsid proteins are produced first, before the RNA. So in that respect one might expect the rapid tests to be able to find the virus a few hours earlier.

But here we are talking about a few days. The antigen test can best detect capsids in a sample when epithelial cells die and mix with the mucus, which takes a few days. So the difference between the days before and after symptom onset is the amount of dead virus material, which the rapid tests need to give reliable results. That is the reason why the antigen tests predict infectiousness well in the time after symptom onset, but possibly not in those early days.

This was not detected before because the samples used to study how well the tests work were mostly from symptomatic people; it is hard to get positive samples from people who are infectious before they are symptomatic. Because you do not often have pre-symptomatic cases with both a PCR and an antigen test, the observations of Drosten are also based on just a few cases. He strongly encouraged systematic studies to be made and published, but this will take a few months.

In the Bloomberg article, Tim Lohn quotes Rebecca Smith, who found something similar:

In a paper published in March -- not yet peer reviewed -- researchers led by Rebecca L. Smith at the University of Illinois at Urbana-Champaign found that, among other things, PCR tests were indeed better at detecting infections early on than a Quidel rapid antigen test. But the difference narrowed after a few days, along with when the different tests were repeatedly used on people.

The article also quotes Jitka Polechová of the University of Vienna, who wrote a review comparing PCR tests to antigen tests:

“Given that PCR tests results are usually not returned within a day, both testing methods are similarly effective in preventing spread if used correctly and frequently.”

This is a valid argument when comparing the tests as used for diagnostics, or as an additional precaution for dangerous activities that have to take place anyway.

However, at least in Germany, rapid tests are also used as part of opening up the economy. People can, for example, go to the theatre or a restaurant after having been tested. This is something one would not use a PCR test for, because it would not be fast enough. These people at theatres and restaurants may think they are nearly 100% safe, but actually 3 of the on average 8 infectious days would not be detected. If, in addition, people behave more dangerously because they think they are safe, opening a restaurant this way may not be much less dangerous than opening a restaurant without any testing.
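A back-of-the-envelope calculation with the numbers from the paragraph above shows how quickly the advantage can evaporate; the behaviour factor is a purely hypothetical assumption for illustration.

```python
# Back-of-the-envelope sketch using the numbers from the paragraph above.
infectious_days = 8   # average number of infectious days
undetected_days = 3   # early infectious days a rapid screening test would miss

fraction_missed = undetected_days / infectious_days
print(f"infectious time not caught by screening: {fraction_missed:.0%}")  # 38%

# If feeling "tested and safe" makes guests behave, say, twice as riskily
# (a purely hypothetical factor), the net risk approaches that of not
# testing at all (1.0 = opening without any testing):
behaviour_factor = 2.0
print(f"relative risk vs. no testing: {fraction_missed * behaviour_factor:.2f}")
```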

So we have to rethink this way of opening up activities inside and rather try to meet people outside.

Related reading

Original source: Das Coronavirus-Update von NDR Info, episode 84: "(84) Nicht auf Tests und Impfungen verlassen". Translated: "Do not rely on tests and vaccinations". Time stamp: "00:48:09 Diagnostik-Lücke bei Schnelltests" (diagnostic gap of rapid tests).

Northern German public media (NDR) article: 'Drosten: "Schnelltests sind wohl weniger zuverlässig als gedacht."' Translated: Drosten: "Rapid tests are probably less reliable than expected"

Tim Lohn in Bloomberg: "Rapid Covid Tests Are Missing Early Infections, Virologist Says."

Jitka Polechová, Kory D. Johnson, Pavel Payne, Alex Crozier, Mathias Beiglböck, Pavel Plevka, Eva Schernhammer: Rapid antigen tests: their sensitivity, benefits for epidemic control, and use in Austrian schools. Preprint, not yet peer reviewed.

Friday, 22 January 2021

New paper: Spanish and German climatologists on how to remove errors from observed climate trends

This picture shows three meteorological shelters next to each other in Murcia (Spain). The rightmost shelter is a replica of the Montsouris (French) screen, in use in Spain and many European countries in the late 19th and early 20th century. Leftmost is a Stevenson screen equipped with conventional meteorological instruments, a set-up used globally for most of the 20th century. In the middle is a Stevenson screen equipped with automatic sensors. The Montsouris screen is better ventilated, but because some solar radiation can get onto the thermometer it registers somewhat higher temperatures than a Stevenson screen. Picture: Project SCREEN, Center for Climate Change, Universitat Rovira i Virgili, Spain.

The instrumental climate record is human cultural heritage, the product of the diligent work of many generations of people all over the world. But changes in the way temperature was measured and in the surroundings of weather stations can produce spurious trends. An international team, with participation of the Universitat Rovira i Virgili (Spain), the State Meteorological Agency (AEMET, Spain) and the University of Bonn (Germany), has made a great endeavour to provide reliable tests for the methods used to computationally eliminate such spurious trends. These so-called "homogenization methods" are a key step in turning the enormous effort of the observers into accurate climate change data products. The results have been published in the prestigious Journal of Climate of the American Meteorological Society. The research was funded by the Spanish Ministry of Economy and Competitiveness.

Climate observations often go back more than a century, to times before we had electricity or cars. Such long time spans make it virtually impossible to keep the measurement conditions the same across time. The best-known problem is the growth of cities around urban weather stations. Cities tend to be warmer, for example due to reduced evaporation by plants or because high buildings block cooling. This can be seen by comparing urban stations with surrounding rural stations. It is less talked about, but there are similar problems due to the spread of irrigation.

The most common reason for jumps in the observed data is the relocation of weather stations. Volunteer observers tend to make observations near their homes; when they retire and a new volunteer takes over the task, this can produce temperature jumps. Even for professional observations, keeping the location the same over centuries can be a challenge, either due to urban growth making sites unsuitable or due to organizational changes leading to new premises. Climatologist Dr. Victor Venema from Bonn, one of the authors: “A quite typical organizational change is that weather offices that used to be in cities were transferred to newly built airports, which needed observations and predictions. The weather station in Bonn used to be on a field in the village of Poppelsdorf, which is now a quarter of Bonn, and after several relocations the station is currently at the airport Cologne-Bonn.”

For global trends, the most important changes are technological changes of the same kinds and with similar effects all over the world. Now we are, for instance, in a period with widespread automation of the observational networks.

Appropriate computer programs for the automatic homogenization of climatic time series are the result of several years of development work. They work by comparing nearby stations with each other and looking for changes that only happen in one of them, as opposed to climatic changes that influence all stations.
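As an illustration of this basic principle only, and not of any of the methods actually tested in the paper, here is a minimal sketch of relative homogenization with synthetic data: subtracting a nearby reference from a candidate station removes the shared climate signal, leaving a spurious jump visible.

```python
# Minimal sketch of relative homogenization (a toy method, not one of the
# methods tested in the paper): compare a candidate station with a nearby
# reference and find the most likely jump in their difference series.
import numpy as np

rng = np.random.default_rng(0)
n = 100                                             # years of annual means
climate = rng.normal(0.0, 0.5, n).cumsum() * 0.05   # shared regional signal
reference = climate + rng.normal(0, 0.3, n)         # nearby station
candidate = climate + rng.normal(0, 0.3, n)
candidate[60:] += 0.8                               # spurious jump (relocation)

diff = candidate - reference                        # climate signal cancels out

# Most likely break: the split maximizing the mean difference between segments
scores = [abs(diff[:k].mean() - diff[k:].mean()) for k in range(10, n - 10)]
break_year = int(np.argmax(scores)) + 10
print("detected break at year", break_year)         # should be close to 60

# Adjust the earlier segment so both sides share the same mean difference,
# i.e. bring the past in line with the current measurement set-up.
candidate[:break_year] += diff[break_year:].mean() - diff[:break_year].mean()
```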

To scrutinize these homogenization methods, the research team created a dataset that closely mimics observed climate datasets, including the mentioned spurious changes. In this way the spurious changes are known exactly, and one can study how well they are removed by homogenization. Compared to previous studies, the testing datasets showed much more diversity; real station networks also show a lot of diversity due to differences in their management. The researchers especially took care to produce networks with widely varying station densities; in a dense network it is easier to see a small spurious change at a station. The test dataset was larger than ever, containing 1900 station networks, which allowed the scientists to accurately determine the differences between the top automatic homogenization methods developed by research groups in Europe and the Americas. Because of the large size of the testing dataset, only automatic homogenization methods could be tested.
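A toy version of such a testing dataset can be sketched as follows; the study's benchmark was of course far larger and far more diverse than this.

```python
# Sketch of building a validation dataset with known spurious changes
# (a toy version; the numbers and break statistics are illustrative only).
import numpy as np

rng = np.random.default_rng(1)
n_years, n_stations = 100, 10
regional = rng.normal(0, 0.5, n_years).cumsum() * 0.05   # shared climate signal

truth, inhomogeneous = [], []
for _ in range(n_stations):
    clean = regional + rng.normal(0, 0.3, n_years)       # true station series
    series = clean.copy()
    for _ in range(rng.poisson(2)):                      # a few random breaks
        year = rng.integers(5, n_years - 5)
        series[year:] += rng.normal(0, 0.6)              # jump of random size
    truth.append(clean)
    inhomogeneous.append(series)

# A homogenization method is then scored by how much closer its output is
# to `truth` than the inhomogeneous input is, e.g. by root-mean-square error.
rmse = np.sqrt(np.mean((np.array(inhomogeneous) - np.array(truth)) ** 2))
print(f"RMSE before homogenization: {rmse:.2f} °C")
```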

The international author group found that it is much more difficult to improve the network-mean climate signals than to improve the accuracy of individual station time series.

The Spanish homogenization methods excelled. The method developed at the Centre for Climate Change, Univ. Rovira i Virgili, Vila-seca, Spain, by Hungarian climatologist Dr. Peter Domonkos was found to be the best at homogenizing both individual station series and regional network mean series. The method of the State Meteorological Agency (AEMET), Unit of Islas Baleares, Palma, Spain, developed by Dr. José A. Guijarro was a close second.

When it comes to removing systematic trend errors from many networks, and especially from networks where similar spurious changes happen in many stations at similar dates, the homogenization method of the American National Oceanic and Atmospheric Administration (NOAA) performed best. This is a method that was designed to homogenize station datasets at the global scale, where the main concern is the reliable estimation of global trends.

The open screen formerly used at the station Uccle in Belgium, with two modern closed Stevenson screens with double-louvred walls in the background.

Quotes from participating researchers

Dr. Peter Domonkos, who earlier was a weather observer and now writes a book about time series homogenization: “This study has shown the value of large testing datasets and demonstrates another reason why automatic homogenization methods are important: they can be tested much better, which aids their development.”

Prof. Dr. Manola Brunet, who is the director of the Centre for Climate Change, Univ. Rovira i Virgili, Vila-seca, Spain, Visiting Fellow at the Climatic Research Unit, University of East Anglia, Norwich, UK, and Vice-President of the World Meteorological Services Technical Commission, said: “The study showed how important dense station networks are to make homogenization methods powerful and thus to compute accurate observed trends. Unfortunately, a lot of climate data still needs to be digitized to contribute to even better homogenization and quality control.”

Dr. Javier Sigró from the Centre for Climate Change, Univ. Rovira i Virgili, Vila-seca, Spain: “Homogenization is often a first step that allows us to go into the archives and find out what happened to the observations that produced the spurious jumps. Better homogenization methods mean that we can do this in a much more targeted way.”

Dr. José A. Guijarro: “Not only may the results of the project help users to choose the method best suited to their needs; it also helped developers to improve their software by showing its strengths and weaknesses, and will allow further improvements in the future.”

Dr. Victor Venema: “In a previous similar study we found that homogenization methods designed to handle difficult cases, where a station has multiple spurious jumps, were clearly better. Interestingly, this study did not find this. It may be that it is more a matter of methods being carefully fine-tuned and tested.”

Dr. Peter Domonkos: “The accuracy of homogenization methods will likely improve further; however, we should never forget that spatially dense and high-quality climate observations are the most important pillar of our knowledge about climate change and climate variability.”

Press releases

Spanish weather service, AEMET: Un equipo internacional de climatólogos estudia cómo minimizar errores en las tendencias climáticas observadas

URV university in Tarragona, Catalan: Un equip internacional de climatòlegs estudia com es poden minimitzar errades en les tendències climàtiques observades

URV university, Spanish: Un equipo internacional de climatólogos estudia cómo se pueden minimizar errores en las tendencias climáticas observadas

URV university, English: An international team of climatologists is studying how to minimise errors in observed climate trends

Articles

Tarragona 21: Climatòlegs de la URV estudien com es poden minimitzar errades en les tendències climàtiques observades

Genius Science, French: Une équipe de climatologues étudie comment minimiser les erreurs dans la tendance climatique observée

Phys.org: A team of climatologists is studying how to minimize errors in observed climate trend