
🧹 Cleanup Neural Network task-dependent logic #95

Open · wants to merge 4 commits into base: main
Conversation

@lindapaiste (Contributor) commented Mar 17, 2024

Reboot of the changes in this commit: ml5js/ml5-library@f8e4b8f
from ml5js/ml5-library#1410

I branched this on top of the previous two PRs, expecting them to be merged first. The changes are in 51846a3

The idea is essentially that all logic which depends on the selected task (classification, regression, or imageClassification) is declared in one place, in a new getTask file. This makes it much easier to see how each case is handled, and easier to add more task presets in the future.

The rest of the notes are copied from the previous PR:


Introduces a level of abstraction for the "task" of the neural network:

getTask.ts

  • Define an interface NNTask which specifies what information a task needs to provide:
    • Its default layers.
    • Its compile options.
    • Any overrides to the NN default options.
    • How to create sample data to use when the neuroEvolution option is set.
    • Its name (not currently used, possible 🪓)
  • Create three concrete implementations of that interface for the three tasks: classification, regression, and imageClassification.
    • imageClassification uses some of the same functions as classification.
    • getSampleData is only possible in a limited set of circumstances (see issue #1409), so I'm throwing a lot of specific errors when those circumstances aren't met.
  • Export a function getTask to convert a string task name into a NNTask object.
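To make the shape of this abstraction concrete, here is a minimal sketch of what such an NNTask interface and getTask lookup could look like. All names, signatures, and layer/compile configs below are assumptions for illustration, not the actual ml5.js code; plain objects stand in for tfjs types.

```typescript
// Hypothetical sketch of the NNTask abstraction described above.
interface NNTask {
  name: string;
  createLayers(inputUnits: number, hiddenUnits: number, outputUnits: number): object[];
  getCompileOptions(learningRate: number): { loss: string; optimizer: unknown };
}

const regressionTask: NNTask = {
  name: "regression",
  createLayers: (i, h, o) => [
    { units: h, inputShape: [i], activation: "relu" },
    { units: o, activation: "sigmoid" },
  ],
  getCompileOptions: (lr) => ({ loss: "meanSquaredError", optimizer: { kind: "sgd", lr } }),
};

const classificationTask: NNTask = {
  name: "classification",
  createLayers: (i, h, o) => [
    { units: h, inputShape: [i], activation: "relu" },
    { units: o, activation: "softmax" },
  ],
  getCompileOptions: (lr) => ({ loss: "categoricalCrossentropy", optimizer: { kind: "sgd", lr } }),
};

// Per the notes above, imageClassification can reuse some of
// classification's functions; a spread makes that sharing explicit.
const imageClassificationTask: NNTask = {
  ...classificationTask,
  name: "imageClassification",
};

// Convert a string task name into an NNTask object.
function getTask(name: string): NNTask {
  const tasks: Record<string, NNTask> = {
    regression: regressionTask,
    classification: classificationTask,
    imageclassification: imageClassificationTask,
  };
  const task = tasks[name.toLowerCase()];
  if (!task) throw new Error(`Unknown task: ${name}`);
  return task;
}
```

Centralizing the lookup this way means adding a future preset is just one more entry in the record.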

NeuralNetwork/index.ts

  • The constructor sets an instance property this.task with the task object based on options.task. It defaults to regression, which seems to match the previous default layers.
  • Remove the large blocks of code in addDefaultLayers() and compile() that were moved into the task objects. Now we call a function on this.task:
    • const layers = this.task.createLayers(inputUnits, hiddenUnits, outputUnits);
    • const options = this.task.getCompileOptions(this.options.learningRate);
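The delegation described above can be sketched as follows. This is a self-contained illustration of the pattern, not the real ml5.js class; the TaskSketch shape is abbreviated and every name is an assumption.

```typescript
// Abbreviated task interface for this sketch only.
interface TaskSketch {
  createLayers(inputUnits: number, hiddenUnits: number, outputUnits: number): object[];
  getCompileOptions(learningRate: number): { loss: string };
}

class NeuralNetworkSketch {
  private task: TaskSketch;
  private options: { learningRate: number };

  constructor(task: TaskSketch, options: { learningRate: number }) {
    // Instead of branching on a task string throughout the class,
    // the task object is stored once and consulted later.
    this.task = task;
    this.options = options;
  }

  addDefaultLayers(inputUnits: number, hiddenUnits: number, outputUnits: number): object[] {
    // All task-specific layer logic now lives behind this one call.
    return this.task.createLayers(inputUnits, hiddenUnits, outputUnits);
  }

  compile(): { loss: string } {
    return this.task.getCompileOptions(this.options.learningRate);
  }
}
```

The class no longer needs to know which task it is running; it only needs the task object to satisfy the interface.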

NeuralNetwork/NeuralNetwork.ts

  • I'm like 99% sure that using optimizer.call(this, learningRate) to set the thisArg on the optimizer doesn't actually make a difference? (Please correct me if I am wrong). I removed setOptimizerFunction(), which, despite the name, just creates an optimizer and doesn't set anything. The instantiation of the optimizer is handled in the getCompileOptions() function of the task, which gets the learningRate as an argument.
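The reasoning above can be checked in isolation: if an optimizer factory never references `this` in its body, the receiver passed via `.call()` cannot affect the result. `makeOptimizer` below is a hypothetical stand-in for a factory like tf.train.sgd, not the tfjs API.

```typescript
// Hypothetical optimizer factory; note the body never uses `this`.
function makeOptimizer(learningRate: number) {
  return { kind: "sgd", learningRate };
}

// Setting a thisArg via .call() is ignored by a function that doesn't read `this`.
const viaCall = makeOptimizer.call({ unrelated: "thisArg" }, 0.01);
const direct = makeOptimizer(0.01);
// Both calls produce structurally identical optimizers.
```

So dropping the `.call(this, …)` form in favor of a plain call should be behavior-preserving, provided the factory really is `this`-free.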

Comment on lines +543 to +547
    if (this.options.layers && this.options.layers.length > 2) {
      this.createNetworkLayers(this.options.layers);
    } else {
      this.addDefaultLayers();
    }
Contributor Author:
I removed the modification of this.options.layers because it felt weird/sloppy/wrong. Now I'm slightly concerned that this change could create problems with the .copy() method, which uses this.options to create the cloned neural network. Let's put this PR on hold for right now until I have a chance to look into it more.

Member:
Thanks for keeping a close eye on this @lindapaiste! The copy() method is critical for the neuroevolution examples I developed for the Nature of Code book.
