
# Training dataset, validation dataset, testing dataset

On Programmer » Matlab

2,484 words, 1 comment; published: Thu, 08 May 2008 00:52:00 GMT

First, I divert the dataset into a training dataset and a test dataset. Then I use a three-fold cross validation scheme to divert the training dataset into a training dataset and a validation dataset. During training, I use early stopping based on the validation dataset. Then I calculate the average MSE and choose the structure of the neural network. But I do not know how to determine the weights of the neural network and obtain the final neural network model that I can evaluate with the test dataset.

Please help me!
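For reference, the partitioning described above (hold out a test set first, then split the remaining training data into folds, each serving once as a validation set) might be sketched as follows. This is an illustrative Python/NumPy sketch, not the original MATLAB code; the dataset size and split ratios are made-up assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300                                  # illustrative dataset size
idx = rng.permutation(n)

# 1) Hold out a test set (here 20%) before any model selection.
n_test = n // 5
test_idx, train_idx = idx[:n_test], idx[n_test:]

# 2) Split the remaining training indices into three folds; each fold
#    serves once as the validation set for early stopping, while the
#    other two folds are used for training.
folds = np.array_split(train_idx, 3)
for k in range(3):
    val_idx = folds[k]
    fit_idx = np.concatenate([folds[j] for j in range(3) if j != k])
    # ... train with early stopping monitored on val_idx ...
```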

*http://matlab.todaysummary.com/q_matlab_60400.html*

- xinglifan.matlab.todaysummary.com.gmail.com wrote:
> First,i divert

Replace the term "divert" with "partition" (my preference) or "split".

> dataset into training dataset and test dataset. Then

> i use three-fold cross validation way to divert training dataset

> into training dataset and validation dataset.

What you are describing below is not called 3-fold XVAL. The proper terms are "Early Stopping" and "Stopped Training". See the comp.ai.neural-nets FAQ, which also explains both f-fold and leave-v-out cross-validation.
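Stopped training is simple to sketch: track validation error each epoch, keep the weights from the epoch where it was lowest, and stop once it has failed to improve for a while. A minimal Python illustration with a toy validation curve (the curve values and the `patience` rule are illustrative assumptions, not from the original post):

```python
# Minimal stopped-training loop: remember the epoch with the lowest
# validation error seen so far, and stop after `patience` epochs
# without improvement.  In a real design, the network weights would
# be saved at the best epoch.
def stopped_training(val_errors, patience=5):
    best_err, best_epoch, wait = float("inf"), -1, 0
    for epoch, err in enumerate(val_errors):
        if err < best_err:
            best_err, best_epoch, wait = err, epoch, 0  # save weights here
        else:
            wait += 1
            if wait >= patience:
                break
    return best_epoch, best_err

# Typical U-shaped validation curve: improves, then overfits.
curve = [1.0, 0.6, 0.4, 0.35, 0.34, 0.36, 0.40, 0.45, 0.5, 0.6, 0.7]
best_epoch, best_err = stopped_training(curve, patience=5)
```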

For more details on cross-validation, search Google Groups for "greg-heath XVAL" and "greg-heath cross-validation".

> use early stopping by validation dataset. Then i calculate average MSE

Delete the adjective "average"; the "M" in MSE already implies averaging over the individual input-vector squared errors. Use the adjective "average" when you are averaging over the MSEs of different designs (e.g., in 10-fold XVAL).
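The distinction can be made concrete with a small sketch (all numbers here are made up for illustration): MSE is already a mean within one design, while "average MSE" is a mean taken across designs, such as across folds.

```python
import numpy as np

# Within one design: MSE is already a mean over the squared errors
# of the individual input vectors -- no extra "average" needed.
targets = np.array([1.0, 2.0, 3.0, 4.0])
outputs = np.array([1.1, 1.9, 3.2, 3.8])
mse = np.mean((targets - outputs) ** 2)

# Across designs: averaging the per-fold MSEs of a cross-validation
# is where the phrase "average MSE" is appropriate.
fold_mses = [0.031, 0.027, 0.040]        # one MSE per fold (made up)
average_mse = np.mean(fold_mses)
```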

> and choose the structure of neural network.

Do you mean number of hidden nodes, weights, or both?

> But i do not know how to

> determine weights of neural network and get final neural network

> model which i can use test dataset to evaluate the neural network

> model.

> Please help me!

Make multiple runs over (say) 10 to 30 different weight initializations and choose the best design based on validation-set error. The test set is used for the final evaluation once the best design is chosen.
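That selection procedure might be sketched as below. This is a toy Python stand-in, not MATLAB: `train_once` is a hypothetical placeholder that here just produces a random initialization and a fake validation error, so only the restart-and-select logic is real.

```python
import numpy as np

# Hypothetical stand-in for "train a net from one random weight
# initialization and return (model, validation error)".  Here the
# "model" is the random init itself and the "validation error" is a
# deterministic function of it, so the sketch is self-contained.
def train_once(seed):
    init = np.random.default_rng(seed).normal(size=4)   # random weights
    val_err = float(np.sum(init ** 2))                  # pretend val MSE
    return init, val_err

# Multiple restarts (10-30 in practice); keep the design with the
# lowest validation-set error.
results = [train_once(s) for s in range(20)]
best_model, best_val_err = min(results, key=lambda r: r[1])

# The held-out test set is consulted only once, for this final model.
```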

If the test set results are unsatisfactory, the data set should be repartitioned for a new design, in order to make sure that the new test set is independent of the new design.

Hope this helps.

Greg

#1; Thu, 08 May 2008 00:53:00 GMT
