"Homogenization algorithms" refers to processes or methods that make disparate elements more similar or uniform. Such techniques bring different elements closer together, or remove variations among them, to produce a standardized and consistent result.
Full definition
The step-change homogenization algorithm, which had been developed to remove non-climatic biases such as siting biases, was shown to be seriously problematic.
Their homogenization algorithm is woefully inappropriate.
Similarly, when NCDC's pairwise homogenization algorithm is run without the TOBs adjustment being applied first, the end result is very similar to what you get when you explicitly correct for TOBs, as discussed in Williams et al (2012).
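To make the idea concrete, here is a minimal, illustrative sketch of a step-change homogenization in Python. It is not NCDC's pairwise algorithm or any agency's actual method: it simply finds the single breakpoint that best splits a series into two segments with different means, then shifts the earlier segment so the whole series shares one baseline. The function names and the example data are hypothetical.

```python
def find_breakpoint(series):
    """Return the index that maximizes the absolute difference between
    the means of the two segments (a crude changepoint detector)."""
    best_idx, best_gap = 1, 0.0
    for i in range(1, len(series)):
        left_mean = sum(series[:i]) / i
        right_mean = sum(series[i:]) / (len(series) - i)
        gap = abs(right_mean - left_mean)
        if gap > best_gap:
            best_idx, best_gap = i, gap
    return best_idx

def homogenize_step(series):
    """Remove the detected step by adding the mean offset between the
    two segments to every value in the earlier segment."""
    i = find_breakpoint(series)
    left_mean = sum(series[:i]) / i
    right_mean = sum(series[i:]) / (len(series) - i)
    offset = right_mean - left_mean
    return [x + offset for x in series[:i]] + list(series[i:])

# Hypothetical example: a series with an artificial +2.0 jump,
# such as might be introduced by a station move or instrument change.
raw = [10.1, 9.9, 10.0, 10.2, 12.1, 12.0, 11.9, 12.2]
adjusted = homogenize_step(raw)
```

Real homogenization methods are far more involved: they compare each station against neighboring reference series, handle multiple breakpoints, and distinguish genuine climate signals from non-climatic shifts, which is exactly why a naive single-series adjustment like this one can go wrong.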