Gneural Network - Tasks: task #14202, Implement Dropout regularization for nnet

 
 


task #14202: Implement Dropout regularization for nnet.

Submitter:  Ray Dillinger <rayd>
Submitted:  Sun 30 Oct 2016 05:05:19 PM UTC
   
 
Should Start On:  Sun 30 Oct 2016 07:00:00 AM UTC
Should be Finished on:  Fri 30 Dec 2016 08:00:00 AM UTC
Category:  None
Priority:  5 - Normal
Status:  None
Privacy:  Public
Assigned to:  None
Percent Complete:  0%
Open/Closed:  Open
Effort:  0.00

Sun 30 Oct 2016 05:05:19 PM UTC, original submission:  

Under dropout regularization, during training a randomly selected fraction of the nodes simply does not activate: their outputs are zeroed on each training pass. To keep the expected signal unchanged, the surviving activations are then rescaled, either by dividing them by the keep fraction during training (inverted dropout) or by multiplying all activations by the keep fraction (the complement of the drop fraction) at inference time.

This is a very effective way to prevent overfitting, even when the subsequent layer has more weights than the previous one.
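As a rough illustration only (not Gneural Network code), a per-layer training pass could apply inverted dropout along the lines below; the function name apply_dropout, the fixed test values, and the use of rand() are assumptions made for this sketch.

#include <stdio.h>
#include <stdlib.h>

/* Inverted-dropout sketch: zero each activation with probability
 * (1 - keep_prob) and divide the survivors by keep_prob, so the
 * expected activation is unchanged and no rescaling is needed at
 * inference time. */
void apply_dropout(double *activations, int n, double keep_prob)
{
    for (int i = 0; i < n; i++) {
        double u = (double)rand() / ((double)RAND_MAX + 1.0);
        if (u < keep_prob)
            activations[i] /= keep_prob;   /* node kept: rescale */
        else
            activations[i] = 0.0;          /* node dropped */
    }
}

int main(void)
{
    double a[4] = { 0.5, -1.2, 0.8, 0.3 };
    srand(1);                    /* fixed seed for a reproducible demo */
    apply_dropout(a, 4, 0.5);    /* drop roughly half the nodes */
    for (int i = 0; i < 4; i++)
        printf("%g\n", a[i]);
    return 0;
}

With the rescaling folded into training like this, inference simply skips the dropout step (equivalently, keep_prob = 1.0).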

Ray Dillinger <rayd>
Group Member

 


 

Depends on the following items: None found

Items that depend on this one: None found

 

Carbon-Copy List
  • -email is unavailable- added by rayd (Submitted the item)

     

Follows 1 latest change.

Date        Changed by  Updated Field          Previous Value  =>  Replaced by
2016-10-30  rayd        Should be Finished on  2016-10-30      =>  2016-12-30
