project outcomes
----------------
radx: forward automatic differentiation in R
tada: templated automatic differentiation in C++
development summary
-------------------
During the summer of 2010, under the Google Summer of Code program,
I was assigned the project of implementing an engine for Automatic
Differentiation in R. The implementation involved building a fully
functional system for computing numerical derivatives (specifically
of the first and second orders: Jacobian and Hessian matrices)
in an efficient manner. Numerical derivatives are extremely useful in
optimization. For example, Newton’s method for unconstrained optimization
requires both the Hessian and the gradient of the objective function
and an efficient method to solve the resulting linear system Hx = -g,
usually a preconditioned conjugate gradient method.
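
To make the optimization connection concrete, here is a minimal R sketch of one Newton step; the direct solve() stands in for the preconditioned conjugate gradient method, and the gradient and Hessian are hand-coded where an AD engine such as radx would normally supply them:

    # One Newton step for unconstrained minimization: solve H dx = -g.
    newton_step <- function(x, g, H) {
      dx <- solve(H(x), -g(x))   # direct solve of the linear system
      x + dx
    }

    # Example: f(x) = x1^2 + 2*x2^2, hand-coded gradient and Hessian.
    g <- function(x) c(2 * x[1], 4 * x[2])
    H <- function(x) diag(c(2, 4))
    newton_step(c(1, 1), g, H)   # a quadratic is solved in one step: (0, 0)
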
Before the mid-term deadline, I had completed most of the features
that are part of radx as of now. This includes univariate and multivariate
differentiation of vector functions. Computation of first and higher
order pure and partial derivatives are carried out through univariate
Taylor propagation and exact interpolation. Please see the documentation
for references and technical material. Also included are a handful of
demos that illustrate what radx can do. Functionally, radx has fulfilled
most of the goals that it set out to achieve, except for the support
for reverse mode automatic differentiation.
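
To make the mechanics concrete, here is a toy R sketch of univariate Taylor propagation (illustrative only, not radx's actual interface): each value carries its truncated Taylor coefficients, and elementary operations propagate them, with multiplication given by the truncated Cauchy product.

    # Coefficients of u(t) and v(t) up to degree d-1; their product
    # w(t) = u(t) * v(t), truncated, is the Cauchy product.
    taylor_mul <- function(u, v) {
      d <- length(u)
      w <- numeric(d)
      for (k in 1:d)
        w[k] <- sum(u[1:k] * v[k:1])
      w
    }

    # Propagate x = 3 + t (value 3, seed 1) through f(x) = x * x * x.
    x  <- c(3, 1, 0, 0)   # 3 + 1*t + 0*t^2 + 0*t^3
    fx <- taylor_mul(taylor_mul(x, x), x)
    fx[2]                 # f'(3)  = 27; the k-th derivative is factorial(k) * fx[k + 1]
    2 * fx[3]             # f''(3) = 18
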
Post mid-term, according to the timeline I had set for myself, I
was to build reverse mode differentiation in radx. While easy on
paper, this involved building a computational tracer
within R. As I had no previous experience with building such tracers,
I chose to instead use my time to build a more powerful backend for
the automatic differentiation engine using C++ for performance and
flexibility. Technically, while this counts as missing functionality,
it is not as serious as one might think, as derivatives can still
be computed using the existing forward mode engine within radx. The
difficulty arises only when we require the computation of gradients
of functions with many independent variables. Unfortunately,
most real-world problems involve such functions, and
computing their sensitivities to the independent variables accurately
and quickly is important. Although this hasn’t been done thus far, it
is expected that tada will be fully interfaced with radx so that radx
will use the computational kernels inside tada to compute derivatives.
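
To see why many independent variables are a problem, note that one forward pass propagates a single seed direction and therefore yields a single directional derivative, so a gradient in n variables costs n forward passes; reverse mode recovers all n sensitivities from one backward sweep over a recorded trace. A tiny first-order dual-number sketch in R (hypothetical, not the radx interface) makes the n-pass cost visible:

    # First-order dual numbers via S3 operator overloading.
    dual <- function(val, dot) structure(list(val = val, dot = dot), class = "dual")
    "*.dual" <- function(a, b) dual(a$val * b$val, a$val * b$dot + a$dot * b$val)
    "+.dual" <- function(a, b) dual(a$val + b$val, a$dot + b$dot)

    f <- function(x1, x2) x1 * x2 + x1 * x1   # f = x1*x2 + x1^2 at (2, 5)

    grad <- sapply(1:2, function(i) {         # one full forward pass per variable
      seed <- function(j) dual(c(2, 5)[j], as.numeric(i == j))
      f(seed(1), seed(2))$dot
    })
    grad                                      # c(9, 2): x2 + 2*x1 and x1
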
tada is now a fast and extensible AD engine written in C++ that can
compute derivatives using a wide range of scalar types. Beyond speed
and correctness, tada has been built with an emphasis on
loose coupling between the derivative algorithms and the actual data
types used to compute the derivatives. This has enabled
computation with scalar types such as arbitrary-precision
floats, complex numbers, and even rational numbers for exact results
(provided operations involve only non-transcendental functions). Support
for interval arithmetic is an important missing feature, and its inclusion
is on the cards. The correctness of every propagation step in the univariate
Taylor propagation algorithm has been verified by cross-checking results
against the symbolic differentiation program SymPy, in unit tests written
with CppUnit.
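
The same kind of check can be sketched in R, with the built-in symbolic differentiator D() standing in for SymPy and a central finite difference standing in for the Taylor-propagated derivative (the names below are illustrative, not tada's actual test code):

    # Cross-check: a numerically computed derivative vs. a symbolic one.
    expr <- quote(x^3 * exp(x))
    x0   <- 1.3
    h    <- 1e-7

    symbolic  <- eval(D(expr, "x"), list(x = x0))
    numerical <- (eval(expr, list(x = x0 + h)) -
                  eval(expr, list(x = x0 - h))) / (2 * h)
    stopifnot(abs(symbolic - numerical) < 1e-5)   # agree to tolerance
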
tada is expected to be actively developed even past the end of the GSoC
program to continue adding features, building on its strong support for
varied data types. The development summary and future plans for tada can be
found in the TODO file in the repository.
people
------
mentor: Prof. John Nash
student: Chidambaram Annamalai
follow
------
You can follow development of the projects here:
radx: http://github.com/quantumelixir/radx
tada: http://github.com/quantumelixir/tada