This is the review code for the NeurIPS'24 submission:

(*) "Hyperbolic Embeddings of Supervised Models"

Code provided without any warranty, use it at your own risk.

=========================== DO NOT DISTRIBUTE ======================================

====================================================================================
Brief description: 

this code learns boosted combinations of decision trees to optimize the
log-/logistic-loss (*). A GUI embeds the resulting models in the Poincaré model
of hyperbolic geometry, following the description in the submission.
The program can simulate noise in the training data (symmetric label noise,
see the bottom) and, if several algorithms are run simultaneously, automatically
compares their outputs 2-by-2 using a Student paired t-test (not
necessarily useful for visualization).
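As a rough illustration of the 2-by-2 comparison mentioned above, here is a minimal paired Student t-test sketch in Java. The class and method names are illustrative only and do not correspond to the actual code; the per-fold scores in main are made-up numbers:

```java
// Illustrative sketch (not the submission's code): paired Student t-test
// on per-fold scores of two algorithms compared 2-by-2.
public class PairedTTest {

    // Returns the t statistic for paired samples a and b (same length n);
    // under the null hypothesis it follows a t distribution with n-1 dof.
    public static double tStatistic(double[] a, double[] b) {
        int n = a.length;
        double mean = 0.0;
        for (int i = 0; i < n; i++) mean += a[i] - b[i];
        mean /= n;
        double var = 0.0;
        for (int i = 0; i < n; i++) {
            double d = (a[i] - b[i]) - mean;
            var += d * d;
        }
        var /= (n - 1); // unbiased sample variance of the differences
        return mean / Math.sqrt(var / n);
    }

    public static void main(String[] args) {
        // Made-up per-fold accuracies for two algorithms
        double[] accA = {0.80, 0.82, 0.79, 0.81, 0.83};
        double[] accB = {0.79, 0.80, 0.78, 0.81, 0.80};
        System.out.printf("t = %.4f%n", tStatistic(accA, accB));
    }
}
```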

Training uses X-fold stratified cross-validation (X tunable; 10 by default).
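For readers unfamiliar with stratified CV, the sketch below shows the idea: examples of each class are spread round-robin across the folds so that class proportions are preserved in every fold. This is an illustration only; class and method names are hypothetical and the actual implementation may differ:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Random;

// Illustrative sketch (not the submission's code) of stratified fold
// assignment for X-fold CV (X = 10 by default, per the README).
public class StratifiedFolds {

    // Returns fold[i] in {0, ..., numFolds-1} for each example i, with
    // class proportions approximately preserved in every fold.
    public static int[] assignFolds(int[] labels, int numFolds, long seed) {
        Map<Integer, List<Integer>> byClass = new HashMap<>();
        for (int i = 0; i < labels.length; i++)
            byClass.computeIfAbsent(labels[i], k -> new ArrayList<>()).add(i);
        int[] fold = new int[labels.length];
        Random rng = new Random(seed);
        for (List<Integer> idx : byClass.values()) {
            Collections.shuffle(idx, rng);        // randomize within each class
            for (int j = 0; j < idx.size(); j++)
                fold[idx.get(j)] = j % numFolds;  // round-robin across folds
        }
        return fold;
    }
}
```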

The program can also save the models learned and a number of useful statistics.

The Poincaré embedding / t-self is computed on the last sequence of model 
training; if several runs are made, only the last one is displayed.

MDTs are computed for trees learned with @LogLoss; the last one(s) are plotted
(all trees are plottable, not just those in the final model but also those among
the CV splits; see the online help at runtime)

====================================================================================
Java bits: 

* most variables that could be interesting to change (e.g. to speed up
processing, polish the graphical output, etc.) are in Misc.java (interface Debuggable)

====================================================================================
HowTo:

compile e.g. via ./compile.sh

run via e.g.: java -Xmx10000m Experiments -R resource_abalone.txt 

(example resource file on UCI Abalone in /Datasets; see the format of the .features file)

resource_abalone.txt contains all parameters to be used (see example file)

the file must contain lines like:

@ALGORITHM,@LogLoss,10,200

@ALGORITHM = general tag, keep it
@LogLoss = loss, the one used in the submission, keep it
10,200 = number of trees, maximum size of the trees
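Putting this together, a minimal resource file fragment might look like the one below. The @ALGORITHM line is the example above; the @ETA_NOISE value (a tag described further down) is purely illustrative, and the actual resource_abalone.txt in the repository is the authoritative reference for the full format:

```
@ALGORITHM,@LogLoss,10,200
@ETA_NOISE,0.1
```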

At display time, a list of keys appears in the shell to manipulate the display;
the details of the MDT displayed in the Poincaré disk are printed on the 
command line. Note that the program can automatically crop and save the GUI display.

At runtime, the program displays the MDTs in the shell and can also save the
boosted models.

====================================================================================
Additional bits, *not covered in the paper*: 

in the resource file, changing the value of @ETA_NOISE changes the amount of
symmetric label noise in training, à la Long and Servedio -- not analyzed in the
submission, but useful to test model resistance against noise -- enjoy.
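For concreteness, symmetric label noise amounts to the following: each binary training label is flipped independently with probability eta (the @ETA_NOISE value). This is an illustrative sketch only, with hypothetical names, not the code's actual noise routine:

```java
import java.util.Random;

// Illustrative sketch (not the submission's code) of symmetric label
// noise a la Long and Servedio: each label in {-1, +1} is flipped
// independently with probability eta.
public class LabelNoise {

    public static int[] flipSymmetric(int[] labels, double eta, long seed) {
        Random rng = new Random(seed);
        int[] noisy = new int[labels.length];
        for (int i = 0; i < labels.length; i++)
            // nextDouble() is in [0, 1), so eta = 0 flips nothing
            // and eta = 1 flips everything
            noisy[i] = (rng.nextDouble() < eta) ? -labels[i] : labels[i];
        return noisy;
    }
}
```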

====================================================================================
Any questions? We will be happy to answer them via OpenReview

