I have read "Algebraic classifiers: a generic approach to fast cross-validation, online training, and parallel training" and was amazed by the performance of the derived algorithms. However, beyond Naive Bayes (and GBMs) there do not seem to be many algorithms adapted to the framework.
Are there any other papers that have adapted different classifiers (e.g. SVMs, random forests) to it?
I read a bit of the article you mentioned; to me it looks like a construction in the spirit of algebraic statistics. You may want to have a look at:
Cencov, Nikolai Nikolaevich. Statistical decision rules and optimal inference. No. 53. American Mathematical Soc., 2000.
This book is somewhat dated; its original printing is from around the 1980s, and few people are interested in its "categorical applications" nowadays. Still, almost all modern algebraic studies in statistics can be traced back to this title.
Another very readable introduction used in the paper you mentioned is:
Drton, Mathias, Bernd Sturmfels, and Seth Sullivant. Lectures on algebraic statistics. Vol. 39. Springer Science & Business Media, 2008.
The paper you mention in your question applies a monoid-theoretic construction to the classification problem, which looks interesting. I hope these references help.
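To make the monoid idea concrete, here is a minimal sketch (class and function names are my own, not from the paper) of why Naive Bayes fits the framework: its sufficient statistics are counts, and merging the counts from two data shards is an associative operation with the empty model as identity. Training shards in parallel and merging gives the same model as training on all the data at once.

```python
from collections import Counter

class NBStats:
    """Count-based sufficient statistics for categorical Naive Bayes.

    Instances form a monoid under `merge`: the operation is associative
    and a freshly constructed (empty) NBStats is the identity element.
    """
    def __init__(self):
        self.class_counts = Counter()   # label -> number of examples
        self.feature_counts = {}        # label -> Counter of (feature index, value)

    def observe(self, x, y):
        """Online training: fold in a single labeled example."""
        self.class_counts[y] += 1
        fc = self.feature_counts.setdefault(y, Counter())
        for i, v in enumerate(x):
            fc[(i, v)] += 1

    def merge(self, other):
        """Monoid operation: combine statistics from two data shards."""
        out = NBStats()
        out.class_counts = self.class_counts + other.class_counts
        for y in set(self.feature_counts) | set(other.feature_counts):
            out.feature_counts[y] = (self.feature_counts.get(y, Counter())
                                     + other.feature_counts.get(y, Counter()))
        return out

def train(data):
    s = NBStats()
    for x, y in data:
        s.observe(x, y)
    return s

data = [((0, 1), "a"), ((1, 1), "b"), ((0, 0), "a"), ((1, 0), "b")]
whole = train(data)                               # train on everything
merged = train(data[:2]).merge(train(data[2:]))   # train shards, then merge
assert whole.class_counts == merged.class_counts
assert whole.feature_counts == merged.feature_counts
```

Classifiers like SVMs are harder to fit into this mold precisely because their trained state is not a simple mergeable summary of the data, which is presumably why so few algorithms have been adapted so far.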