CheckersGuy wrote:

Oh, I thought there were better index functions. Better = less memory

I am using this for checkers as well and initial results were quite promising. May I ask which optimization algorithm you use for the logistic regression? I tried stochastic gradient descent, which works but is still rather slow. The Gauss-Newton method is next on my list, if I can find a good matrix library that supports sparse matrices.

I wouldn't bother with stochastic methods; the objective being minimized is a simple convex function, so stochastic methods just add complexity in this case.
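For reference, the convex objective in question is the standard logistic log-loss (my notation; not necessarily the exact form used in any particular engine):

```latex
L(w) = -\frac{1}{N}\sum_{i=1}^{N}\Bigl[\, y_i \log \sigma(w^\top x_i)
       + (1-y_i)\log\bigl(1-\sigma(w^\top x_i)\bigr) \Bigr],
\qquad \sigma(z) = \frac{1}{1+e^{-z}},
```

where $x_i$ is the feature vector of position $i$ and $y_i \in \{0,1\}$ its game result. Since $L$ is convex in $w$, any local minimum is the global one, so plain deterministic descent methods converge reliably.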

I am using a multithreaded conjugate gradient algorithm for optimization. It works fairly well, but plain gradient descent works too and is much easier to program.
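As a sketch of the simpler option: full-batch gradient descent on the logistic log-loss. This is a minimal illustration under my own assumptions (feature matrix `X`, game results `y` in {0, 1}, fixed learning rate), not the actual tuner code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, iters=1000):
    """Full-batch gradient descent on the mean log-loss (convex)."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(iters):
        p = sigmoid(X @ w)           # predicted win probability per position
        grad = X.T @ (p - y) / n     # gradient of the mean log-loss
        w -= lr * grad
    return w

# Toy usage: column 0 is a constant (bias) feature, column 1 a single eval term.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w = fit_logistic(X, y)
```

Because the loss is convex, no restarts or stochastic shuffling are needed; the main cost per iteration is the two matrix products, which is also where multithreading (or a conjugate gradient step) pays off.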

For a small number of weights and data points dragon runs pretty quickly. As the number of weights/data points increases, they no longer fit into cache and the algorithm slows down.

As for the number of training positions: I think 5-10 million is a good starting point to experiment with. As you grow the training set, performance keeps improving each time you double the number of positions.

Michel