
Revert back to allowing inducing point method to be optional #486

Closed
wants to merge 10 commits

Conversation

JasonKChow
Contributor

Summary: Inducing point methods can now be initialized within models again, since they know their own dimensionality. This lets us make the inducing point args completely optional again and reorder them into a more sensible order.

Differential Revision: D67226225
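The change can be sketched in miniature. This is a hypothetical sketch; the class and argument names are illustrative stand-ins, not aepsych's actual API:

```python
# Hypothetical sketch: once an allocator knows its own dimensionality, a
# model can build a default one internally, so inducing-point arguments
# become optional keywords again. Names here are illustrative.

class KMeansAllocator:
    """Toy stand-in for an inducing point allocator."""
    def __init__(self, dim):
        self.dim = dim  # the allocator carries its own dimensionality

class GPModel:
    def __init__(self, dim, inducing_point_method=None, inducing_size=100):
        # Optional again: fall back to a default allocator built from dim.
        if inducing_point_method is None:
            inducing_point_method = KMeansAllocator(dim=dim)
        self.inducing_point_method = inducing_point_method
        self.inducing_size = inducing_size

model = GPModel(dim=2)  # no inducing point args needed
```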

Summary:
The test_finish_criteria test uses the MonotonicRejectionGP alongside many mocks. This is brittle: it fails when very subtle things change in the model, and it is nearly impossible to debug and maintain.

The min_post_range finish condition actually depends on the model's values, so we move it out of that test into its own test that fits a real model (GPClassificationModel), making it easier to test.

Differential Revision: D67066880
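The idea behind the min_post_range criterion can be sketched as follows (simplified and hypothetical; the real check evaluates a fitted model's posterior over a grid):

```python
def min_post_range_met(posterior_probs, min_post_range):
    """Finish when the spread of posterior predictions exceeds a
    threshold, i.e. the model differentiates parts of the space."""
    spread = max(posterior_probs) - min(posterior_probs)
    return spread >= min_post_range

# An unfit model predicts ~0.5 everywhere, so the criterion is not met;
# a fitted model whose predictions span a wide range meets it.
print(min_post_range_met([0.49, 0.50, 0.51], 0.3))  # False
print(min_post_range_met([0.05, 0.50, 0.95], 0.3))  # True
```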
@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Dec 14, 2024
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D67226225

JasonKChow added a commit to JasonKChow/aepsych that referenced this pull request Dec 14, 2024
…kresearch#486)

Summary:
Pull Request resolved: facebookresearch#486

Inducing point methods can now be initialized within models again with them knowing their own dimensionality. So we allow the inducing point args to be completely optional again. This also lets us reorder the args to be in a more sensible order.

Differential Revision: D67226225
yalsaffar and others added 4 commits December 16, 2024 11:20
…ncy (facebookresearch#475)

Summary:
Pull Request resolved: facebookresearch#475

#### PR Description:
This PR builds upon the changes introduced in facebookresearch#467 by fully removing the usage of lower and upper bounds within all models.

## Key updates include:

- **Removal of Bounds and Dimension from Models:**
  All `lb`, `ub`, and `dim` parameters have been fully removed from all models. The responsibility for computing inducing points is now entirely managed by allocators, ensuring a cleaner and more modular design.

- **Dependency Shift to Allocators:**
  All model dependencies on dimensions (e.g., for computing the `covar_module`) are now managed through the `dim` property of the allocator object. This centralizes the handling of bounds and dimensions in the allocator system.

- **Updated Allocators Functionality:**
  Allocators like `AutoAllocator` and `KMeansAllocator` now optionally take bounds. Bounds are essential in scenarios where no input data is provided, as they ensure fallback to the `DummyAllocator` (e.g., during initialization from configuration).

- **Tests:**
  - All tests have been updated to make use of the allocator object as a requirement.
  - Most tests pass bounds to the allocator, while others initialize bounds via configuration.
  - Tests now fully align with the updated model-allocator interaction and function as expected.

Let me know if any changes are needed! :)

Pull Request resolved: facebookresearch#468

Differential Revision: D66968212
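The dependency shift described above can be sketched as follows (illustrative names; the real allocators and models are more involved):

```python
class Allocator:
    """Toy allocator: dimensionality comes from dim or from bounds."""
    def __init__(self, dim=None, bounds=None):
        if dim is None and bounds is None:
            raise ValueError("allocator needs dim or bounds")
        self.bounds = bounds
        self._dim = dim if dim is not None else len(bounds[0])

    @property
    def dim(self):
        return self._dim

class Model:
    def __init__(self, inducing_point_method):
        # Models no longer store lb/ub/dim; e.g. a covar_module's
        # ard_num_dims would be read from the allocator instead.
        self.dim = inducing_point_method.dim

model = Model(Allocator(bounds=([0.0, 0.0, 0.0], [1.0, 1.0, 1.0])))
```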
Summary:
Move the inducing point allocator classes into their own directory under models and split each allocator into its own file.

Updated imports to match the refactor and cleaned up the surrounding imports.

Differential Revision: D67044580
…facebookresearch#478)

Summary:
Pull Request resolved: facebookresearch#478

Inducing point allocator classes had extra methods and attributes that were unnecessary or misnamed. These have been cleaned up in the BaseAllocator class, and all of its children are updated to match.

The dummy allocator is no longer needed; instead, allocators can make dummy points based on their own dimensionality. The last_allocator_used attribute is set to None to signify that no actual allocator was used in creating the last set of points.

This requires every allocator to know its dimensionality, at minimum because it is needed for dummy points when there are no inputs.

All models that use an allocator now initialize the allocator externally, since there is no longer a trivial default for allocators.

Differential Revision: D67059839
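The dummy-point fallback can be sketched like this (an illustrative toy, not the real BaseAllocator):

```python
import random

class BaseAllocator:
    """Toy sketch: every allocator knows its dimensionality and falls
    back to dummy points when there are no inputs yet."""
    def __init__(self, dim):
        self.dim = dim
        self.last_allocator_used = None

    def allocate_inducing_points(self, inputs=None, num_points=10):
        if inputs is None:
            # No data yet: emit random dummy points of the right
            # dimensionality; leave last_allocator_used as None to
            # signal that no actual allocator ran.
            self.last_allocator_used = None
            return [[random.random() for _ in range(self.dim)]
                    for _ in range(num_points)]
        self.last_allocator_used = type(self)  # a real allocation ran
        return inputs[:num_points]
```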
Summary:
The init of GPClassificationModel was saving extra attributes unnecessarily and had become largely disorganized through the refactors.

We reorganize GPClassificationModel's init and those of its children.

Differential Revision: D67104228
Summary:
The unnecessary utility function is replaced by calling the Allocator class method directly.

This diff removes the use of strings to select allocators.

Differential Revision: D67068021
…eir own functions

Summary: Methods like get_min, inv_query, and dim_grid have been completely removed from models and turned into utility functions. Strategies that used to call these methods now call the utility functions instead. The functions are better encapsulated so that they work exactly the same way whether they are called on their own or from within a strategy.

Differential Revision: D67118446
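The encapsulation change can be sketched as follows (simplified, hypothetical names; the real utilities take aepsych models and tensors):

```python
def get_min(model, grid):
    """Utility version: return (min value, argmin) of model predictions
    over a grid. Behaves the same standalone or inside a strategy."""
    return min(((model.predict(x), x) for x in grid), key=lambda p: p[0])

class Strategy:
    def __init__(self, model):
        self.model = model

    def get_min(self, grid):
        # The strategy simply delegates to the shared utility function.
        return get_min(self.model, grid)
```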
…ts don't match their dim

Summary:
It's not guaranteed that the inputs will match the dimensionality of the inducing point allocator. For allocators that directly act on the inputs, this needs to be handled. This usually happens when the inputs are augmented with extra indices (e.g., MonotonicRejectionGP).

The KMeans allocator needs to slice the extra columns off the inputs.

GreedyVarianceAllocator needs to keep them for evaluation with the kernel, then slice them off the results so the inducing points have the right shape.

Differential Revision: D67158938
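The slicing step can be sketched like this (an illustrative helper, not the actual allocator code):

```python
def trim_to_dim(points, dim):
    """Slice off trailing augmentation columns so each row matches dim."""
    return [row[:dim] for row in points]

# e.g. 2-d inputs augmented with an extra index column (3 columns total):
augmented = [[0.1, 0.2, 0.0], [0.3, 0.4, 1.0]]
print(trim_to_dim(augmented, dim=2))  # [[0.1, 0.2], [0.3, 0.4]]
```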
…acebookresearch#483)

Summary:
Pull Request resolved: facebookresearch#483

The default allocator is the AutoAllocator, which we've changed to use the GreedyVarianceAllocator instead of KMeans.

This is probably at least as good as before while needing less computation.

A side effect is that we now mostly avoid the magic number 100 that slows down certain CPU array acceleration libraries, so all the 99s that were set as defaults have been changed back to the more reasonable 100.

Differential Revision: D67120468

Reviewed By: crasanders
@facebook-github-bot
Contributor

This pull request has been merged in a077874.

JasonKChow added a commit that referenced this pull request Dec 18, 2024
Summary:
Pull Request resolved: #486

Inducing point methods can now be initialized within models again, since they know their own dimensionality. This lets us make the inducing point args completely optional again and reorder them into a more sensible order.

Reviewed By: crasanders

Differential Revision: D67226225

fbshipit-source-id: b23f94d16aa3c52ddf4002157dc73cf78cea9e5c