All entries.
Showing items 21-40 of 676 (page 2 of 34).

JMLR JKernelMachines 3.0

by dpicard - May 4, 2016, 17:53:28 CET - 94083 views, 19075 downloads

Rating: 4.5/5 (based on 4 votes)

About: A machine learning library in Java for easy development of new kernels and kernel algorithms.

Changes:

Version 3.0

/!\ Warning: this version is incompatible with code written against previous versions.

  • Changed license to BSD 3-Clause
  • Changed package name to net.jkernelmachines
  • Changed to the Maven build system (available through Maven Central)
  • Added online training interfaces to allow continuous online learning
  • Added a new budget-oriented kernel classifier
  • Added new kernels and processing, especially for strings

JMLR libDAI 0.3.2

by jorism - July 17, 2015, 15:59:55 CET - 92854 views, 18118 downloads

Rating: 5/5 (based on 1 vote)

About: libDAI provides free & open source implementations of various (approximate) inference methods for graphical models with discrete variables, including Bayesian networks and Markov Random Fields.
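
To make the setting concrete, the sketch below computes a single-variable marginal in a tiny discrete pairwise model by brute-force enumeration; this is the quantity that libDAI's algorithms (junction tree, belief propagation, GLC, ...) compute or approximate without enumerating every configuration. It is a conceptual numpy illustration, not libDAI's C++ or bindings API.

    # Conceptual sketch (plain numpy, not libDAI's API): marginal of one variable
    # in a small discrete pairwise model, computed by brute force.
    import itertools
    import numpy as np

    # Three binary variables in a chain A - B - C with pairwise "agreement" factors.
    agree = np.array([[2.0, 1.0],
                      [1.0, 2.0]])  # factor value for a pair of states

    def joint_unnormalised(a, b, c):
        return agree[a, b] * agree[b, c]

    # P(A): sum the unnormalised joint over all 2^3 configurations, then normalise.
    p_a = np.zeros(2)
    for a, b, c in itertools.product(range(2), repeat=3):
        p_a[a] += joint_unnormalised(a, b, c)
    p_a /= p_a.sum()
    print("P(A) =", p_a)  # symmetric model, so [0.5, 0.5]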

Changes:

Release 0.3.2 fixes various bugs and adds GLC (Generalized Loop Corrections) written by Siamak Ravanbakhsh.


Somoclu 1.7.5

by peterwittek - March 1, 2018, 23:30:34 CET - 90515 views, 16563 downloads

About: Somoclu is a massively parallel implementation of self-organizing maps. It relies on OpenMP for multicore execution and MPI for distributing the workload, and it can be accelerated by CUDA on a GPU cluster. A sparse kernel is also included, which is useful for training maps on vector spaces generated in text mining processes. Apart from a command line interface, Python, Julia, R, and MATLAB are supported.
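
A minimal usage sketch of the Python interface is given below. The Somoclu(n_columns, n_rows) constructor, train() method and codebook attribute follow the documented Python API, but treat the exact keyword arguments as version-dependent assumptions.

    # Minimal sketch of Somoclu's Python interface; keyword arguments beyond the
    # map size are assumptions and may differ between versions.
    import numpy as np
    import somoclu

    data = np.random.rand(1000, 10).astype(np.float32)  # 1000 samples, 10 dimensions
    som = somoclu.Somoclu(n_columns=30, n_rows=20)       # a 30x20 map
    som.train(data, epochs=10)                           # OpenMP/CUDA used internally
    print(som.codebook.shape)                            # (n_rows, n_columns, dim)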

Changes:
  • New: A Makefile for mingw to build on Windows.
  • Changed: PR #94 added a much more efficient sparse kernel.
  • Changed: boilerplate code for Julia greatly improved.
  • Changed: Code cleanup, pre-processor macros simplified.
  • Changed: Adapted to Seaborn API changes in plotting heatmaps.

JMLR GPstuff 4.7

by avehtari - June 9, 2016, 17:45:15 CET - 87538 views, 20059 downloads

Rating: 5/5 (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

Changes:

2016-06-09 Version 4.7

Development and release branches available at https://github.com/gpstuff-dev/gpstuff

New features

  • Simple Bayesian Optimization demo

Improvements

  • Improved use of PSIS
  • More options added to gp_monotonic
  • Monotonicity now works for additive covariance functions with selected variables
  • Possibility to use the gpcf_squared.m covariance function with derivative observations/monotonicity
  • Default behaviour made more robust by changing the default jitter from 1e-9 to 1e-6
  • LA-LOO uses the cavity method as the default (see Vehtari et al. (2016). Bayesian leave-one-out cross-validation approximations for Gaussian latent variable models. JMLR, accepted for publication)
  • The selected variables option now works better with monotonicity

Bugfixes

  • small error in derivative observation computation fixed
  • several minor bug fixes

JMLR MultiBoost 1.2.02

by busarobi - March 31, 2014, 16:13:04 CET - 86259 views, 14120 downloads

About: MultiBoost is a multi-purpose boosting package implemented in C++. It is based on the multi-class/multi-task AdaBoost.MH algorithm [Schapire-Singer, 1999]. Basic base learners (stumps, trees, products, Haar filters for image processing) can be easily complemented by new data representations and the corresponding base learners, without interfering with the main boosting engine.
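
To make the boosting loop concrete, here is a deliberately simplified binary AdaBoost with decision stumps in Python. It illustrates the example-reweighting idea behind AdaBoost.MH but is not MultiBoost's C++ implementation and omits the multi-class/multi-task machinery.

    # Simplified binary AdaBoost with decision stumps (conceptual sketch only).
    import numpy as np

    def stump_predict(X, feat, thresh, polarity):
        return np.where(polarity * (X[:, feat] - thresh) > 0, 1, -1)

    def fit_stump(X, y, w):
        # Exhaustively pick the weighted-error-minimising stump.
        best, best_err = None, np.inf
        for feat in range(X.shape[1]):
            for thresh in np.unique(X[:, feat]):
                for pol in (1, -1):
                    err = np.sum(w * (stump_predict(X, feat, thresh, pol) != y))
                    if err < best_err:
                        best, best_err = (feat, thresh, pol), err
        return best, best_err

    def adaboost(X, y, rounds=20):
        n = len(y)
        w = np.full(n, 1.0 / n)             # uniform example weights
        ensemble = []
        for _ in range(rounds):
            (feat, thresh, pol), err = fit_stump(X, y, w)
            alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
            pred = stump_predict(X, feat, thresh, pol)
            w *= np.exp(-alpha * y * pred)  # upweight misclassified examples
            w /= w.sum()
            ensemble.append((alpha, feat, thresh, pol))
        return ensemble

    def predict(ensemble, X):
        score = sum(a * stump_predict(X, f, t, p) for a, f, t, p in ensemble)
        return np.sign(score)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    y = np.sign(X[:, 0] + 0.3 * X[:, 1])
    model = adaboost(X, y)
    print("train accuracy:", (predict(model, X) == y).mean())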

Changes:

Major changes :

  • The “early stopping” feature can now be based on any metric output via the --outputinfo command line argument.

  • Early stopping now works with --slowresume command line argument.

Minor fixes:

  • More informative output when testing.

  • Fixed various compilation glitches with recent clang (OS X/Linux).


JMLR GPML Gaussian Processes for Machine Learning Toolbox 4.1

by hn - November 27, 2017, 19:26:13 CET - 84970 views, 18856 downloads

Rating: 5/5 (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave/Matlab implementation of inference and prediction with Gaussian process models. The toolbox offers exact inference, approximate inference for non-Gaussian likelihoods (Laplace's Method, Expectation Propagation, Variational Bayes), as well as approximations for large datasets (FITC, VFE, KISS-GP). A wide range of covariance, likelihood, mean and hyperprior functions makes it possible to create very complex GP models.
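
For reference, the exact-inference case corresponds to the standard GP regression equations (generic notation, not GPML's own; K is the training covariance matrix, \sigma_n^2 the noise variance, y the targets, and k_* the covariances between a test input x_* and the training inputs):

    \bar{f}_* = k_*^\top (K + \sigma_n^2 I)^{-1} \mathbf{y},
    \qquad
    \mathbb{V}[f_*] = k(x_*, x_*) - k_*^\top (K + \sigma_n^2 I)^{-1} k_*.

The approximate inference modes (Laplace, EP, VB) handle non-Gaussian likelihoods for which no such closed form exists, while FITC, VFE and KISS-GP approximate the expensive (K + \sigma_n^2 I)^{-1} solves for large datasets.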

Changes:

Logdet-estimation functionality for grid-based approximate covariances

  • Lanczos subspace estimation

  • Chebyshev polynomial expansion

More generic infEP functionality

  • dense computations and sparse approximations using the same code

  • covering KL inference as a special case of EP

New infKL function contributed by Emtiyaz Khan and Wu Lin

  • Conjugate-Computation Variational Inference algorithm

  • much more scalable than previous versions

Time-series covariance functions on the positive real line

  • covW (i-times integrated) Wiener process covariance

  • covOU (i-times integrated) Ornstein-Uhlenbeck process covariance (contributed by Juan Pablo Carbajal)

  • covULL underdamped linear Langevin process covariance (contributed by Robert MacKay)

  • covFBM Fractional Brownian motion covariance

New covariance functions

  • covWarp implements k(w(x),w(z)) where w is a "warping" function

  • covMatern has been extended to also accept non-integer distance parameters


r-cran-e1071 1.6-8

by r-cran-robot - January 1, 2018, 00:00:07 CET - 82712 views, 18931 downloads

Rating: 4.5/5 (based on 1 vote)

About: Misc Functions of the Department of Statistics, Probability Theory Group (Formerly: E1071), TU Wien

Changes:

Fetched by r-cran-robot on 2018-01-01 00:00:07.696284


ADAMS 17.12.0

by fracpete - December 20, 2017, 09:38:32 CET - 82571 views, 16437 downloads

About: The Advanced Data mining And Machine learning System (ADAMS) is a flexible workflow engine aimed at quickly building and maintaining data-driven, reactive workflows, easily integrated into business processes.

Changes:

Some highlights:

  • Code base was moved to GitHub
  • Nearly 90 new actors, 25 new conversions
  • Much improved deeplearning4j module
  • Experimental support for Microsoft's CNTK deep learning framework
  • rsync module
  • MEKA webservice module
  • Improved support for image annotations
  • Improved LaTeX support
  • WebSocket support

r-cran-party 1.0-6

by r-cran-robot - January 9, 2013, 00:00:00 CET - 81368 views, 20092 downloads

About: A Laboratory for Recursive Partytioning

Changes:

Fetched by r-cran-robot on 2013-04-01 00:00:06.775432


JMLR Waffles 2014-07-05

by mgashler - July 20, 2014, 04:53:54 CET - 80684 views, 18756 downloads

About: Script-friendly command-line tools for machine learning and data mining tasks. (The command-line tools wrap functionality from a public domain C++ class library.)

Changes:

Added support for CUDA GPU-parallelized neural network layers, and several other new features. Full list of changes at http://waffles.sourceforge.net/docs/changelog.html


BRML toolbox 070711

by DavidBarber - July 17, 2011, 19:30:15 CET - 76727 views, 6963 downloads

About: Bayesian Reasoning and Machine Learning toolbox

Changes:

Fixed some small bugs and updated some demos.


Accord.NET Framework 3.8.0

by cesarsouza - October 23, 2017, 20:50:27 CET - 76403 views, 15215 downloads

About: The Accord.NET Framework is a .NET machine learning framework combined with audio and image processing libraries, completely written in C#. It is a complete framework for building production-grade computer vision, computer audition, signal processing and statistics applications, even for commercial use. A comprehensive set of sample applications provides a fast start, and extensive online documentation helps fill in the details.

Changes:

For a complete list of changes, please see the full release notes on the release page:

https://github.com/accord-net/framework/releases/tag/v3.8.0


r-cran-Boruta 6.0.0

by r-cran-robot - September 1, 2018, 00:00:04 CET - 74575 views, 17250 downloads

About: Wrapper Algorithm for All Relevant Feature Selection

Changes:

Fetched by r-cran-robot on 2018-09-01 00:00:04.516878


FEAST 2.0.0

by apocock - January 8, 2017, 00:49:19 CET - 74097 views, 12473 downloads

Rating: 5/5 (based on 2 votes)

About: FEAST provides implementations of common mutual-information-based filter feature selection algorithms (MIM, MIFS, mRMR, CMIM, ICAP, JMI, DISR, FCBF, etc.), and an implementation of RELIEF. Written for C/C++ & Matlab.
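
As a conceptual illustration (not FEAST's C or Matlab API), the sketch below ranks categorical features with the simplest of these criteria, MIM: mutual information between each feature and the class label.

    # Conceptual MIM-style filter feature selection on categorical data
    # (illustrative numpy code, not FEAST's implementation or API).
    import numpy as np

    def mutual_information(x, y):
        """Mutual information between two integer-coded vectors, in nats."""
        joint = np.zeros((x.max() + 1, y.max() + 1))
        for xi, yi in zip(x, y):
            joint[xi, yi] += 1
        joint /= joint.sum()
        px = joint.sum(axis=1, keepdims=True)
        py = joint.sum(axis=0, keepdims=True)
        nz = joint > 0
        return float(np.sum(joint[nz] * np.log(joint[nz] / (px @ py)[nz])))

    def mim(features, labels, k):
        """Indices and scores of the k features with highest MI with the labels."""
        scores = np.array([mutual_information(f, labels) for f in features.T])
        order = np.argsort(scores)[::-1][:k]
        return order, scores[order]

    # Toy data: 100 samples, 5 categorical features, feature 0 copies the label.
    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, 100)
    X = rng.integers(0, 3, (100, 5))
    X[:, 0] = y
    print(mim(X, y, 2))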

Changes:

Major refactoring of FEAST to improve speed and portability.

  • FEAST now clones the input data if it's floating point and discretises it to unsigned ints once in a single pass. This improves the speed by about 30%.
  • FEAST now has unsigned int entry points which avoid this discretisation and are much faster if the data is already categorical.
  • Added weighted feature selection algorithms to FEAST which can be used for cost-sensitive feature selection.
  • Added a Java API using JNI.
  • FEAST now returns the internal score for each feature according to the criterion. Available in all three APIs.
  • Rearranged the repository to make it easier to work with. Header files are now in `include`, source in `src`, the MATLAB API is in `matlab/` and the Java API is in `java/`.
  • FEAST now compiles cleanly using `-std=c89 -Wall -Werror`.

Milk 0.5

by luispedro - November 7, 2012, 13:08:28 CET - 74083 views, 19362 downloads

Rating: 3/5 (based on 2 votes)

About: Python Machine Learning Toolkit

Changes:

Added LASSO (using coordinate descent optimization). Made SVM classification (learning and applying) much faster: 2.5x speedup on the yeast UCI dataset.
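
For orientation, here is a short usage sketch of milk's learner/model interface on synthetic data; defaultclassifier(), train() and apply() follow the documented API, while the note on the pipeline internals is approximate.

    # Short usage sketch of milk's learner/model interface (synthetic data).
    import numpy as np
    import milk

    features = np.random.rand(200, 10)
    labels = (features[:, 0] > 0.5).astype(int)

    learner = milk.defaultclassifier()       # ready-made pipeline (SVM-based)
    model = learner.train(features, labels)  # learning
    print(model.apply(features[0]))          # applying: predict a single example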


mldata-utils 0.5.0

by sonne - April 8, 2011, 10:02:44 CET - 73753 views, 15931 downloads

About: Tools to convert datasets between various formats, plus performance measures and API functions to communicate with mldata.org.

Changes:
  • Changed the task file format so that data splits can have a variable number of items and can be put into up to 256 categories (training/validation/test/not used/...)
  • Various bugfixes.

KeBABS 1.5.4

by UBod - July 28, 2017, 09:55:04 CET - 72731 views, 14366 downloads

Rating: 0/5 (based on 1 vote)

About: Kernel-Based Analysis of Biological Sequences

Changes:
  • Now imports the apcluster package to avoid method clashes
  • Improved and completed the change history in inst/NEWS and the package vignette

Cognitive Foundry 3.4.2

by Baz - October 30, 2015, 06:53:03 CET - 71332 views, 14237 downloads

About: The Cognitive Foundry is a modular Java software library of machine learning components and algorithms designed for research and applications.

Changes:
  • General:
    • Upgraded MTJ to 1.0.3.
  • Common:
    • Added package for hash function computation including Eva, FNV-1a, MD5, Murmur2, Prime, SHA1, SHA2
    • Added callback-based forEach implementations to Vector and InfiniteVector, which can be faster for iterating through some vector types.
    • Optimized DenseVector by removing a layer of indirection.
    • Added method to compute set of percentiles in UnivariateStatisticsUtil and fixed issue with percentile interpolation.
    • Added utility class for enumerating combinations.
    • Adjusted ScalarMap implementation hierarchy.
    • Added method for copying a map to VectorFactory and moved createVectorCapacity up from SparseVectorFactory.
    • Added method for creating square identity matrix to MatrixFactory.
    • Added Random implementation that uses a cached set of values.
  • Learning:
    • Implemented feature hashing.
    • Added factory for random forests.
    • Implemented uniform distribution over integer values.
    • Added Chi-squared similarity.
    • Added KL divergence.
    • Added general conditional probability distribution.
    • Added interfaces for Regression, UnivariateRegression, and MultivariateRegression.
    • Fixed null pointer exception that can happen in K-means with an empty cluster.
    • Fixed name of maxClusters property on AgglomerativeClusterer (was called maxMinDistance).
  • Text:
    • Improvements to LDA Gibbs sampler.

BayesOpt, a Bayesian Optimization toolbox 0.8.2

by rmcantin - December 9, 2015, 04:53:31 CET - 68451 views, 13439 downloads

About: BayesOpt is an efficient C++ implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design and stochastic bandits. In the literature it is also called Sequential Kriging Optimization (SKO) or Efficient Global Optimization (EGO). There are also interfaces for C, Matlab/Octave and Python.
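
The numpy sketch below shows the core loop such a toolbox runs, a Gaussian-process surrogate with an expected-improvement acquisition. It is a conceptual illustration with a fixed kernel and a grid search over a 1-D domain, not BayesOpt's C++/Python API.

    # Conceptual Bayesian optimization loop: GP surrogate + expected improvement.
    import numpy as np
    from scipy.stats import norm

    def rbf(a, b, ls=0.2):
        d = np.subtract.outer(a.ravel(), b.ravel()) ** 2
        return np.exp(-0.5 * d / ls ** 2)

    def gp_posterior(X, y, Xs, noise=1e-6):
        K = rbf(X, X) + noise * np.eye(len(X))
        Ks, Kss = rbf(X, Xs), rbf(Xs, Xs)
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        v = np.linalg.solve(L, Ks)
        mu = Ks.T @ alpha
        var = np.diag(Kss) - np.sum(v ** 2, axis=0)
        return mu, np.sqrt(np.maximum(var, 1e-12))

    def expected_improvement(mu, sigma, best):
        z = (best - mu) / sigma              # minimisation: improvement below best
        return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

    f = lambda x: np.sin(3 * x) + x ** 2 - 0.7 * x   # toy 1-D objective on [0, 1]
    X = np.array([0.1, 0.5, 0.9])
    y = f(X)                                          # initial design
    grid = np.linspace(0, 1, 200)
    for _ in range(20):                               # sequential design loop
        mu, sigma = gp_posterior(X, y, grid)
        x_next = grid[np.argmax(expected_improvement(mu, sigma, y.min()))]
        X, y = np.append(X, x_next), np.append(y, f(x_next))
    print("best x:", X[np.argmin(y)], "f(x):", y.min())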

Changes:

  • Fixed bug in save/restore.
  • Fixed bug in initial design.


hca 0.63

by wbuntine - April 26, 2016, 15:35:03 CET - 66110 views, 8966 downloads

About: Multi-core non-parametric and bursty topic models (HDP-LDA, DCMLDA, and other variants of LDA) implemented in C using efficient Gibbs sampling, with hyperparameter sampling and other flexible controls.
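
For readers new to the model family, the sketch below is a compact collapsed Gibbs sampler for plain LDA in Python. It shows the basic sampling machinery that hca implements in optimized multi-core C and extends with HDP/bursty variants and hyperparameter sampling; it is purely illustrative, not hca's code or command-line interface.

    # Compact collapsed Gibbs sampler for vanilla LDA (illustrative only).
    import numpy as np

    def lda_gibbs(docs, K, V, alpha=0.1, beta=0.01, iters=200, seed=0):
        rng = np.random.default_rng(seed)
        ndk = np.zeros((len(docs), K))   # document-topic counts
        nkw = np.zeros((K, V))           # topic-word counts
        nk = np.zeros(K)                 # topic totals
        z = []
        for d, doc in enumerate(docs):   # random initial topic assignments
            zd = rng.integers(0, K, len(doc))
            z.append(zd)
            for w, k in zip(doc, zd):
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
        for _ in range(iters):
            for d, doc in enumerate(docs):
                for i, w in enumerate(doc):
                    k = z[d][i]          # remove the current assignment
                    ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                    p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
                    k = rng.choice(K, p=p / p.sum())   # resample the topic
                    z[d][i] = k
                    ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
        return (nkw + beta) / (nk[:, None] + V * beta)  # topic-word distributions

    docs = [[0, 1, 2, 1], [2, 3, 3, 4], [0, 1, 4, 4]]   # toy corpus, vocabulary V=5
    print(lda_gibbs(docs, K=2, V=5).round(2))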

Changes:

Corrected the new normalised Gamma model for topics so it works with multicore. Improvements to documentation. Added an asymptotic version of the generalised Stirling numbers so it no longer fails when they run out of bounds on bigger data.

