-
Just as an FYI, we're constraining GPs with virtual observations in a downstream project here: https://github.com/facebookresearch/aepsych/blob/main/aepsych/models/derivative_gp.py. It is specific to variational GPs and derivative pseudo-observations, and is used for monotonicity constraints (https://github.com/facebookresearch/aepsych/blob/main/aepsych/models/monotonic_rejection_gp.py), so it's not as general as what's in the Agrell et al. paper, but it might be a good example to work from. If this is of interest in other domains, we could also look into upstreaming our models from AEPsych into GPyTorch so that you can extend them; not sure what the core devs think.
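To illustrate the pseudo-observation idea in its simplest form, here is a minimal numpy sketch under my own assumptions (not the AEPsych implementation, which is variational): condition an RBF-kernel GP jointly on function values and on virtual first-derivative observations, so the derivative values enter the joint Gaussian like any other observation.

```python
import numpy as np

# Minimal sketch (not the AEPsych code): condition a 1D RBF-kernel GP on
# function observations plus virtual first-derivative observations.
# For k(x, y) = exp(-(x - y)^2 / (2 ell^2)) with r = x - y:
#   Cov(f(x),  f(y))  = k
#   Cov(f(x),  f'(y)) = dk/dy    = k * r / ell^2
#   Cov(f'(x), f'(y)) = d2k/dxdy = k * (ell^2 - r^2) / ell^4

def k_ff(x, y, ell=1.0):
    r = x[:, None] - y[None, :]
    return np.exp(-r**2 / (2 * ell**2))

def k_fd(x, y, ell=1.0):  # Cov(f(x), f'(y))
    r = x[:, None] - y[None, :]
    return np.exp(-r**2 / (2 * ell**2)) * r / ell**2

def k_dd(x, y, ell=1.0):  # Cov(f'(x), f'(y))
    r = x[:, None] - y[None, :]
    return np.exp(-r**2 / (2 * ell**2)) * (ell**2 - r**2) / ell**4

def predict_with_derivative_obs(X, y, Xv, dv, Xs, ell=1.0, noise=1e-6):
    """Posterior mean of f at Xs, given data (X, y) and virtual
    derivative observations f'(Xv) = dv (equality pseudo-observations)."""
    Kfd = k_fd(X, Xv, ell)
    # Joint covariance over [f(X), f'(Xv)], with a small jitter for stability.
    K = np.block([
        [k_ff(X, X, ell) + noise * np.eye(len(X)), Kfd],
        [Kfd.T, k_dd(Xv, Xv, ell) + noise * np.eye(len(Xv))],
    ])
    obs = np.concatenate([y, dv])
    # Cross-covariance between test points and the joint observation vector.
    Ks = np.hstack([k_ff(Xs, X, ell), k_fd(Xs, Xv, ell)])
    return Ks @ np.linalg.solve(K, obs)
```

The monotonicity case replaces the equality pseudo-observations with an inequality f'(Xv) >= 0, which is where truncation or rejection sampling comes in.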
-
Hi, I'm working on shape constraints for Gaussian processes, following the implementation of Agrell (2019). Toy examples are already running with a baseline numpy implementation (below is an example that enforces convexity at the virtual observation points, using an RBF kernel).
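To make the convexity case concrete, here is a minimal numpy sketch of the relevant building block (my own illustration, not the original example; the kernel-derivative formulas are standard for the RBF kernel): the posterior of f'' at the virtual points, which is the quantity a convexity constraint f''(x_v) >= 0 acts on.

```python
import numpy as np

# Sketch: posterior of f'' at virtual points for a 1D GP with RBF kernel
# k(x, y) = exp(-(x - y)^2 / (2 ell^2)), r = x - y:
#   Cov(f(x),   f''(y)) = d2k/dy2    = k * (r^2 - ell^2) / ell^4
#   Cov(f''(x), f''(y)) = d4k/dx2dy2 = k * (r^4 - 6 r^2 ell^2 + 3 ell^4) / ell^8

def rbf(x, y, ell=1.0):
    r = x[:, None] - y[None, :]
    return np.exp(-r**2 / (2 * ell**2))

def rbf_f_d2(x, y, ell=1.0):  # Cov(f(x), f''(y))
    r = x[:, None] - y[None, :]
    return np.exp(-r**2 / (2 * ell**2)) * (r**2 - ell**2) / ell**4

def rbf_d2_d2(x, y, ell=1.0):  # Cov(f''(x), f''(y))
    r = x[:, None] - y[None, :]
    return np.exp(-r**2 / (2 * ell**2)) * (r**4 - 6 * r**2 * ell**2 + 3 * ell**4) / ell**8

def second_derivative_posterior(X, y, Xv, ell=1.0, noise=1e-4):
    """Posterior mean/cov of f''(Xv) given data (X, y). A convexity
    constraint conditions on this quantity being nonnegative
    (a truncated Gaussian), as in Agrell (2019)."""
    Kff = rbf(X, X, ell) + noise * np.eye(len(X))
    # For a stationary kernel, Cov(f''(x), f(y)) has the same formula
    # as Cov(f(x), f''(y)), since the second derivative in r is even.
    Kvf = rbf_f_d2(Xv, X, ell)
    Kvv = rbf_d2_d2(Xv, Xv, ell)
    mean = Kvf @ np.linalg.solve(Kff, y)
    cov = Kvv - Kvf @ np.linalg.solve(Kff, Kvf.T)
    return mean, cov
```

On convex training data the posterior second derivative at interior virtual points comes out positive, so the truncation barely changes the answer; the constraint matters when the data alone would allow non-convex sample paths.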
Now I want to port it to GPyTorch to make it more efficient (especially in higher dimensions). From my understanding, I have to write a custom prediction strategy (equations (5) and (6)); is that right? My approach would be to use ExactGP as the base class and plug in the custom prediction strategy there. Or do you have another idea for incorporating this into GPyTorch?
Furthermore, would there be interest in incorporating this into GPyTorch, covering other shape constraints as well, such as box constraints, constraints on the first derivative (e.g. to enforce monotonicity), or constraints on the second derivative (e.g. to enforce concavity/convexity)? If so, I would try to implement it in a more general fashion for further use.