
Suggestion of init method #9

Open
NicolasDenoyelle opened this issue Dec 7, 2016 · 9 comments

Comments

@NicolasDenoyelle

Dear Christoph,
I would like to suggest a feature for your great package that is sorely missing from the R version.
It would be nice to offer an initialization method that copies the weights of an already trained network, which would allow a network to be trained iteratively on new examples.
Or even better: an input parameter in the high-level functions (mlp, elman, ...) to pass an already existing network to be updated.
Please let me know if I am simply missing this functionality; I didn't find a way to achieve it while browsing the R documentation of RSNNS.

Best Regards,
Nicolas

@cbergmeir
Owner

Hi,

Yes, true, it would be a nice feature. You can already accomplish this with the low-level API, though I admit it doesn't have the comfort of the high-level API. Including it in the high-level API is possible but will take a bit of work.
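For example, a rough sketch along the lines of the irisSnnsR demo could look like this (untested here; the network size, learning parameters, and the inputs/targets matrices are placeholders you would replace with your own):

library(RSNNS)

# build a network with the low-level API
snnsObject <- SnnsRObjectFactory()
snnsObject$setLearnFunc('Quickprop')
snnsObject$setUpdateFunc('Topological_Order')
snnsObject$setUnitDefaults(1, 0, 1, 0, 1, 'Act_Logistic', 'Out_Identity')
snnsObject$createNet(c(4, 5, 3), fullyConnectedFeedForward = TRUE)

patset <- snnsObject$createPatSet(inputs, targets)
snnsObject$setCurrPatSet(patset$set_no)

# initialize the weights only once ...
snnsObject$initializeNet(c(-0.3, 0.3))

# ... then call learnAllPatterns() as often as you like; the weights stay
# in the SNNS kernel between calls, so each call continues the training
parameters <- c(0.1, 2.5, 0.0001, 0.1, 0.8)
for(i in 1:100) {
  res <- snnsObject$learnAllPatterns(parameters)
}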

@CarstenLange

Hi Christoph, I have the same problem. Could you guide me on how to initialize an MLP network with given weights and activation bias parameters using the low-level API? I could not find it in the RSNNS docs, nor in the SNNS 4.1 documentation; likely I missed it. I am new to RSNNS (I used Siemens' SENN before). I really like your R implementation, as it gives the maximum degree of freedom.
The background for my question is that, in an R loop, I would like to stop the training (e.g., every iteration), manipulate the training data, and continue. But whenever I continue, the weights get re-initialized.

@CarstenLange

Hi Christoph, I experimented a little and was able to manipulate the weights of my NN model (CarstensModel) as follows. Not sure if this is what you meant by the low-level API:

CarstensModel$unitDefinitions$unitBias <- 0.5
CarstensModel$fullWeightMatrix[,3] <- 0

What I can't resolve is that when I run the model with mlp(...), the weights are initialized randomly. How can I avoid that?

@CarstenLange

Sorry, I was wrong; that did not work either.

@cbergmeir
Owner

Hi, did you check the demos included in the R package? Have a look at the irisSnnsR demo; towards the end you'll find the training loop. My understanding is that within this loop you want to load new data, right? You should be able to do that with

patset <- snnsObject$createPatSet(inputs, outputs)
snnsObject$setCurrPatSet(patset$set_no)

...

The fullWeightMatrix and unitBias parameters are only read from the SNNS kernel, not written back, so modifying them should not have any effect.
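Within that loop, continuing with manipulated data each iteration might then look roughly as follows (a sketch only; newInputs, newTargets, parameters, and maxit are placeholders for your own data and settings):

for(i in 1:maxit) {

  # rebuild the training data for this iteration (your own manipulation goes here)
  patset <- snnsObject$createPatSet(newInputs, newTargets)
  snnsObject$setCurrPatSet(patset$set_no)

  # continue training: the weights in the SNNS kernel carry over between calls
  # and are only reset when initializeNet() is called again
  res <- snnsObject$learnAllPatterns(parameters)
}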

@CarstenLange

CarstenLange commented Sep 7, 2017 via email

@cbergmeir
Owner

Ok, I hope it will work.

Best regards :)
Christoph

@CarstenLange

It worked well. Thank you so much!

One problem I still could not figure out is how to save a weight matrix (including the biases of the hidden neurons). You mentioned it would be possible in the low-level API, but I could not find any command for it in the RSNNS or the SNNS documentation. I tried the following: I created a new SNNS object (mySnnsObject <- SnnsRObjectFactory()) and tried to copy the complete SNNS object (mySnnsObject = SnnsObject). However, this only sets a pointer to the new object (i.e., when I change SnnsObject, then mySnnsObject changes as well). Any suggestions? Thank you so much in advance.

@cbergmeir
Owner

cbergmeir commented Sep 21, 2017

I'm not so sure what you want to do...
Maybe snnsObject$serializeNet("RSNNS_untitled") or snnsObject$extractNetInfo() does what you need. To deep-copy an SnnsR object, I just adapted the constructor to allow for this. So, either get the newest version from GitHub, or simply use the following function:

SnnsRObjectFactory <- function(x = NULL) {

  snnsObject <- new("SnnsR")

  snnsObject@variables <- new.env()

  if(is.null(x)) {
    # default: create a fresh SNNS kernel instance
    snnsObject@variables$snnsCLibPointer <- .Call("SnnsCLib__new", PACKAGE="RSNNS")
    snnsObject@variables$serialization <- ""
  } else {
    # deep copy: keep a serialization of x's net instead of sharing its kernel pointer
    snnsObject@variables$snnsCLibPointer <- new("externalptr")
    serNet <- x$serializeNet("RSNNS_untitled")
    snnsObject@variables$serialization <- serNet$serialization
  }

  snnsObject
}

Now, you can deep-copy snnsObject as follows:

newSnnsObject <- SnnsRObjectFactory(snnsObject)

To verify the result:

snnsObject$extractNetInfo()
newSnnsObject$extractNetInfo()
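If the goal is simply to persist the weights and biases of the trained net, something along these lines should also work (untested sketch; the file names are placeholders, and it assumes extractNetInfo() returns the same fullWeightMatrix and unitDefinitions fields you see on the high-level models):

# save the net in the SNNS .net format, as in the irisSnnsR demo
snnsObject$saveNet("myTrainedNet.net", "myTrainedNet")

# or pull weights and biases into plain R objects and store them yourself
netInfo <- snnsObject$extractNetInfo()
weights <- netInfo$fullWeightMatrix          # connection weight matrix
biases  <- netInfo$unitDefinitions$unitBias  # bias of each unit, incl. hidden units
save(weights, biases, file = "myWeights.RData")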
