A new version of nnetsauce (v0.3.1)
A new version (v0.3.1) of nnetsauce
is now available: the stable version on PyPI, and a development version on GitHub. Notable changes in this new version are:
- The inclusion of an upper bound on Adaboost's error rate: crucial, because the error rate at each iteration has to be at least as good as random guessing (see the short note after this list).
- New quasi-randomized network models for regression and classification, with two shrinkage parameters (for model regularization).
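As a brief illustration of the condition behind that Adaboost bound (this is the standard requirement for multiclass Adaboost-style boosting, stated here for illustration; it is not necessarily the exact bound implemented in nnetsauce): with K classes, the weak learner's error rate at each iteration must stay below the random-guessing level,

```latex
% SAMME-style condition (illustrative assumption): the error rate \epsilon_m
% of the weak learner at iteration m, with K classes, must satisfy
\epsilon_m < 1 - \frac{1}{K}
```

For binary classification (K = 2) this reduces to the familiar requirement that each weak learner do better than a coin flip, \epsilon_m < 1/2.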
The full list of changes can always be found here on GitHub, and a notebook describing some of the new models (for classification) on 4 datasets is available here (with a snippet below on a wine classification dataset).
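Below is a minimal sketch along those lines (not the original notebook snippet): it assumes one of the new quasi-randomized network classifiers is exposed as ns.Ridge2Classifier with a scikit-learn-like fit/score API, and that the two shrinkage parameters are named lambda1 and lambda2; the linked notebook has the exact class names and settings.

```python
# Hedged sketch: Ridge2Classifier, lambda1/lambda2 and n_hidden_features are
# assumptions about nnetsauce's API -- refer to the notebook linked above.
import nnetsauce as ns
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split

# wine classification dataset (3 classes, 13 features)
X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=123
)

# quasi-randomized network classifier with two shrinkage parameters
clf = ns.Ridge2Classifier(n_hidden_features=5, lambda1=0.1, lambda2=0.1)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))  # accuracy on the held-out set
```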
Contributions/remarks are welcome as usual; you can submit a pull request on GitHub.
Note: I am currently looking for a gig. You can hire me on Malt or send me an email: thierry dot moudiki at pm dot me. I can do descriptive statistics, data preparation, feature engineering, model calibration, training and validation, and model outputs’ interpretation. I am fluent in Python, R, SQL, Microsoft Excel, Visual Basic (among others) and French. My résumé? Here!