Teymour Aldridge

It's me, an armchair [insert whatever I happen to be writing about now]

cognitive complexity and failures of computer programs

One of the most frequently overlooked performance metrics is the cognitive complexity of a codebase. If engineers experience high friction when trying to change a codebase, all efforts to make the code faster will be dramatically hindered. A codebase that is a joy for engineers to work with is a codebase that will see the most long-term optimizations. Codebases that burn people out will not see long-term success unless they receive tons of funding to replace people who flee the project after short periods of activity. Organizational instability is a high-quality predictive metric for the bugginess of a codebase.

https://sled.rs/perf.html

germany, the census, and the fight against racism

Data collection, processing and collation can be effective in fighting racism, but it can also be used to further a racist agenda. The Nazi regime ran large-scale censuses in the 1930s, collecting data on Jewish people (and other groups). The data collected was then used to assist in the orchestration of the Holocaust. [2] The Nazi regime is a good example of how "data-driven" decision making does not excuse the requirement for moral considerations and reflections.

Because of this, German is quite possibly the only language to have a word (Volkszählungsurteil) for the court ruling declaring that a (mandatory) state census is in violation of the constitution. [1] The ban on the census, handed down by the court in Karlsruhe in 1983, effectively stopped a census from being conducted in Germany. The ruling was not an isolated event – it builds on a general feeling in Germany that data protection is important. Germany was the first country to introduce any form of data protection legislation (in Hessen, in 1970), and in 1978 a federal law was ratified which specified that the state could collect and use data only for the purpose specified, with valid reasons, and for limited periods of time.

In more recent times (and this is clearly a completely different case to the Nazi dictatorship) the British Home Office has used data to support its attempts to create a "hostile environment." From collecting data from charities in order to deport rough sleepers [3], to using data from the National Health Service to locate people that it wishes to deport [4], data processing in order to oppress ethnic minorities remains prevalent. This is the sort of data processing that half a century of data protection legislation in Germany attempts to prevent (though I suspect that the authors of the legislation were not very concerned with the rights of people from ethnic minority backgrounds).

The use of data is a double-edged sword; of course it can be used to fight racism, but it can also be used to further racist projects. It can be used to drive police reform that means justice for ethnic minorities; it can also be used by police to justify sending officers to neighbourhoods which are predominantly inhabited by people from ethnic minority backgrounds [5].

When considering how data collection can be used to combat racism, data cannot be considered value-neutral. It can be used to shine a light on racism that we could not otherwise perceive (for example the work of W. E. B. Du Bois), but it can also be used to racist ends. It is important not to forget that modern statisticians stand on the shoulders of giants – racist, eugenicist giants (e.g. Francis Galton, Karl Pearson) – and that the privileging of data above all else is hardly a healthy epistemology.

P.S. This isn't to say that Germany shouldn't collect data on ethnic minorities – it's just that advocating broadly for more data processing is not a sensible strategy. Only data processing conducted transparently, for stated reasons, and in a way that does not endanger people's rights can be legitimate (in a state which relies on the consent of its citizens to govern).

[1] To be fair, however, the German language does have words for many things one wouldn't really anticipate, including (until recently) the word Rindfleischetikettierungsüberwachungsaufgabenübertragungsgesetz (which was the name of a law changing something about the monitoring of how beef is labelled).

[2] For more about this see IBM and the Holocaust, Edwin Black

[3] https://www.theguardian.com/uk-news/2017/aug/19/home-office-secret-emails-data-homeless-eu-nationals

[4] https://www.independent.co.uk/news/uk/home-news/home-office-nhs-data-sharing-immigration-enforcement-a8761396.html

[5] See Cathy O'Neil's "Weapons of Math Destruction" (I can't remember the exact chapter/page, unfortunately).

hegel

Involvement in a system of opinion and prejudice on the authority of others and involvement in it from personal conviction differ only in respect of the vanity that goes with the latter policy.

The Phenomenology of Spirit

testing macros with side-effects using `trybuild`

Note: this is probably only of interest if you write (or want to write) Rust procedural macros.

I was recently writing a test for a procedural macro which looked like this:

use std::fs::read_to_string;

#[test]
fn test_classes_to_file() {
    let t = trybuild::TestCases::new();
    t.pass("tests/write_file/pass.rs");
    let file = read_to_string(&format!(
        "{}/styles.css",
        std::env::var("CARGO_MANIFEST_DIR").unwrap()
    ))
    .unwrap();
    assert!(file.contains("{font-size:24px;font-family:sans-serif;}"));
}

It compiles a file which invokes a procedural macro. The procedural macro in question writes some data (some auto-generated CSS) into a file, and I wanted to make sure that it writes the correct CSS into that file. Unfortunately, in its current state the test fails (but not because the macro is performing incorrectly!).

The way trybuild works, tests are queued up by calls like pass and only actually run when the TestCases value is dropped. I might question this API design were it not for the fact that most macros are just simple(ish) mappings of code from "A" to "B" – which is a good way to write macros!

Anyway, in this case, the trick is to ensure that TestCases is dropped before checking that the file has been written to:

use std::fs::read_to_string;

#[test]
fn test_classes_to_file() {
    let t = trybuild::TestCases::new();
    t.pass("tests/write_file/pass.rs");
    std::mem::drop(t);
    let file = read_to_string(&format!(
        "{}/styles.css",
        std::env::var("CARGO_MANIFEST_DIR").unwrap()
    ))
    .unwrap();
    assert!(file.contains("{font-size:24px;font-family:sans-serif;}"));
}
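The run-on-drop behaviour can be illustrated with a self-contained stand-in. The TestCases below is a hypothetical sketch of the pattern, not trybuild's real API (for one thing, the real pass takes &self rather than &mut self): work queued via pass only executes in the Drop implementation, so any side effects are only observable after the value is dropped.

```rust
use std::cell::RefCell;
use std::rc::Rc;

// Hypothetical stand-in for trybuild::TestCases (not the real API):
// paths queued via `pass` only "run" when the value is dropped.
struct TestCases {
    queued: Vec<String>,
    ran: Rc<RefCell<Vec<String>>>,
}

impl TestCases {
    fn new(ran: Rc<RefCell<Vec<String>>>) -> Self {
        TestCases { queued: Vec::new(), ran }
    }

    fn pass(&mut self, path: &str) {
        // Queue the test; nothing happens yet.
        self.queued.push(path.to_string());
    }
}

impl Drop for TestCases {
    fn drop(&mut self) {
        // This is where the queued tests finally execute.
        for path in self.queued.drain(..) {
            self.ran.borrow_mut().push(path);
        }
    }
}

fn main() {
    let ran = Rc::new(RefCell::new(Vec::new()));

    let mut t = TestCases::new(Rc::clone(&ran));
    t.pass("tests/write_file/pass.rs");
    assert!(ran.borrow().is_empty()); // nothing has run yet
    std::mem::drop(t); // the queued tests run here

    // Only after the drop can we observe the side effects.
    assert_eq!(ran.borrow().len(), 1);
    assert_eq!(ran.borrow()[0], "tests/write_file/pass.rs");
}
```

An explicit std::mem::drop(t) works, but so does simply ending the scope that owns t (e.g. wrapping the trybuild calls in a block) before doing the file assertions.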