I sent these questions in an email to Dr. Yang, but I thought I would post them here too.
I have read the FastAD paper and am very interested in using FastAD for my algorithm.
I was wondering: does FastAD work with Rcpp? If so, how can I install it? I think it should be possible, but I just wanted to check (I'm new to C++).
I have used the autodiff library (https://autodiff.github.io/); however, I have found it to be not much faster than numerical differentiation for my application. Have you used it before? I noticed that the paper doesn't benchmark against it.
Also, is it possible to compute a gradient with respect to a std::vector filled with Eigen matrices (or any other 3D or higher-dimensional structure)? Or will all the parameters need to be packed into a single vector or matrix and then reformatted back into the container needed by the rest of the model afterwards?
Finally, is it possible to use FastAD within an ordinary function (rather than inside int main() etc.)? I'm new to C++ and have just been using functions for everything (I am also calling it from R via Rcpp).
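For illustration, here is a minimal sketch touching both of the last two questions: FastAD expressions are ordinary C++ objects, so they can live inside any function (no `int main()` required), and parameters held in a `std::vector` of Eigen matrices can be flattened into one vector-valued variable, with the adjoints reshaped back afterwards. The function name is made up, the objective is deliberately trivial (the sum of all entries) to keep the focus on the flatten/unflatten bookkeeping, and the FastAD calls (`ad::Var<double, ad::vec>`, `ad::bind`, `ad::autodiff`, `ad::sum`, `get()`, `get_adj()`) follow the names in the library's README; treat the exact signatures as assumptions rather than verified API.

```cpp
#include <fastad>        // FastAD is header-only
#include <Eigen/Dense>
#include <vector>

// Hypothetical sketch: gradient of f = sum of all entries of all matrices,
// written as a plain function rather than inside main().
std::vector<Eigen::MatrixXd>
grad_of_total_sum(const std::vector<Eigen::MatrixXd>& mats)
{
    // Flatten every matrix (column-major) into one vector-valued AD variable.
    Eigen::Index n = 0;
    for (const auto& m : mats) n += m.size();
    ad::Var<double, ad::vec> x(n);
    Eigen::Index off = 0;
    for (const auto& m : mats) {
        // get() is assumed to expose the stored values as an Eigen object.
        x.get().segment(off, m.size()) =
            Eigen::Map<const Eigen::VectorXd>(m.data(), m.size());
        off += m.size();
    }

    // Build the expression once, then run one forward + one reverse pass.
    ad::Var<double> y;
    auto expr = ad::bind(y = ad::sum(x));
    ad::autodiff(expr);

    // Unflatten the adjoints back into the original matrix shapes
    // (all ones here, since f is a plain sum).
    std::vector<Eigen::MatrixXd> grad;
    off = 0;
    for (const auto& m : mats) {
        Eigen::MatrixXd g(m.rows(), m.cols());
        for (Eigen::Index j = 0; j < m.size(); ++j)
            g.data()[j] = x.get_adj(off + j, 0);  // per-element adjoint (assumed accessor)
        grad.push_back(std::move(g));
        off += m.size();
    }
    return grad;
}
```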
Please see https://github.com/eddelbuettel/rcppfastad -- in response to your StackOverflow question. It is a (truly minimal) package: we can do this because FastAD is nicely self-contained; with Eigen supplied via RcppEigen, we just add the FastAD headers. The package has one simple example, Black-Scholes; as a proof of concept, I have extended it to compute an additional derivative, 'vega', as well.
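For concreteness, a hypothetical `Rcpp::sourceCpp()` snippet (not code taken from the package) showing the shape such a call can take. It assumes the FastAD headers are visible on the include path, e.g. via rcppfastad's LinkingTo, and uses the `ad::Var` / `ad::bind` / `ad::autodiff` / `get_adj` names from FastAD's README:

```cpp
// Hypothetical example, not from rcppfastad itself.
// Assumes rcppfastad exposes the FastAD headers to client code.
// [[Rcpp::depends(RcppEigen, rcppfastad)]]
#include <RcppEigen.h>
#include <fastad>

// Value and gradient of f(x, y) = x * y + sin(x), computed inside a plain
// exported function -- no main() involved.
// [[Rcpp::export]]
Rcpp::List grad_xy(double xv, double yv) {
    ad::Var<double> x(xv), y(yv), z;
    auto expr = ad::bind(z = x * y + ad::sin(x));
    ad::autodiff(expr);                       // one forward + one reverse pass
    return Rcpp::List::create(
        Rcpp::Named("value") = z.get(),
        Rcpp::Named("dfdx")  = x.get_adj(0, 0),   // = y + cos(x)
        Rcpp::Named("dfdy")  = y.get_adj(0, 0));  // = x
}
```

From R, `Rcpp::sourceCpp("grad_xy.cpp")` followed by `grad_xy(1, 2)` should give value 2 + sin(1), dfdx 2 + cos(1), and dfdy 1.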