
Please, how can I reference your posts in case I use them for research?

Hi Aminu, I would cite it like a regular web page – http://blog.apastyle.org/apastyle/2010/11/how-to-cite-something-you-found-on-a-website-in-apa-style.html

I often provide citations from the primary literature, so perhaps see those if you need a citation for the theory behind a specific post.

I plan on submitting the neural network stuff in a package, which can be cited when available.

Thanks for your interest!

Hi. I'm doing research on forecasting using neural networks. I read that the data should be separated into training and testing sets. My question is how to separate the data into testing and training? I'm totally lost here. Where can I look if you have a suggestion for a good reference? Thank you.


I’m not an expert, but I don’t believe there are any explicit rules defining the best method for splitting the data… the important part is that you always evaluate the model using an independent (testing) dataset. Neural networks are really good at overfitting and modelling noise that might be a specific characteristic of the training dataset. I’ve often split the data into groups using an arbitrary ratio, e.g., 3:1, such that 75% of the observations form the training set and 25% the testing set. The split has to be random so that the training and testing datasets are similar. There are many ways to approach the issue… a quick literature search should provide some clues.
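As a sketch of the random 3:1 split described above, using only base R and made-up data (the dataset and ratio are just for illustration):

```r
# Toy data standing in for whatever observations you have
set.seed(123)  # for a reproducible split
n <- 200
dat <- data.frame(x = rnorm(n), y = rnorm(n))

# Randomly sample 75% of the row indices for training
train_idx <- sample(seq_len(n), size = floor(0.75 * n))

trainset <- dat[train_idx, ]   # 75% of observations
testset  <- dat[-train_idx, ]  # remaining 25%, held out for evaluation
```

Because the indices are sampled at random, the two subsets should have similar characteristics, and the testing set stays independent of model fitting.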

Hi. I saw you are developing a package for mining data from Strava with R. I’m also working on functions to get data from the Strava API. I have no experience making libraries or packages, and I’m just a beginner programming in R and working with APIs. So there might be way better ways to accomplish the tasks, but so far I have implemented functions for authentication and getting most of the resources from the API with the httr library. If you want to check it out: https://github.com/ptdrow/Rtrava

Hi Pedro, I knew it was only a matter of time before someone else started working on this! This helps immensely as I haven’t had any success accessing the API… My plan was to develop two sets of functions, one that scrapes the data w/o using authentication and a second that accesses the API. I will try some of your functions to see if I can access the API w/ my token. Maybe we can co-author a package if this works out?

That would be great. My plan is to study mobility patterns of urban cyclists and maybe create advanced features for Strava users. I’m learning R and data science through the Johns Hopkins University Data Science specialization on Coursera, and so far it has been a great introduction to R and getting data from the web (including APIs). Please send me an email with your feedback on the code and info on how we could collaborate if you want.

P.S. I just updated the code to add comments on the input variables for the functions.

Will do, I plan on taking a look this evening!

Hey, I’m currently trying to use my neural network. I have trained and tested it, and now I am using a readline function to ask the user for input. My question is how I would use these variables or answers as input for the neural network and see what the neural network comes up with. I’m currently using the neuralnet package. Could you tell me how I would do this? Thanks

Hi Nick,

This seems like a strange way to get predictions from a neural network. I don’t think there is a predict method for neuralnet models. Try using the nnet (nnet function) or the RSNNS (mlp function) packages. Then you can use the predict method for the model, i.e., predict(my_model, newdata = newdata), where my_model is your trained/tested model and newdata is a data frame of user-supplied explanatory variables to use for prediction. Hope that helps.
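A minimal sketch of the suggested nnet approach (the data, formula, and network size below are made up for illustration):

```r
library(nnet)  # ships with R as a recommended package

# Toy training data: a binary response driven by two predictors
set.seed(1)
dat <- data.frame(x1 = runif(50), x2 = runif(50))
dat$y <- as.numeric(dat$x1 + dat$x2 + rnorm(50, sd = 0.1) > 1)

# Fit a small single-hidden-layer network
my_model <- nnet(y ~ x1 + x2, data = dat, size = 3, trace = FALSE)

# New observations, e.g., values collected from the user with readline()
newdata <- data.frame(x1 = 0.8, x2 = 0.6)
p <- predict(my_model, newdata = newdata)  # predicted value for the new inputs
```

Note that readline() returns character strings, so user-entered values have to be converted with as.numeric() before going into the prediction data frame.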

-Marcus

Here is my current code.

## neural network

setwd("C:/Users/Nick/Desktop/canmam")
set.seed(1234)

library("neuralnet")
library("nnet")
library("MASS")

dataset <- read.csv("C:/Users/Nick/Desktop/canmam/mamm.csv")
trainset <- dataset[1:150, ]
testset <- dataset[151:200, ]

creditnet <- neuralnet(status ~ x1 + x2 + x3 + x4 + x5, trainset, hidden = 8, lifesign = "minimal", linear.output = FALSE, threshold = 0.1)

temp_test <- subset(testset, select = c("x1", "x2", "x3", "x4", "x5"))
creditnet.results <- compute(creditnet, temp_test)

results <- data.frame(actual = testset$status, prediction = creditnet.results$net.result)
results$prediction <- round(results$prediction)
results[1:50, ]  # testset has 50 rows (151:200)

# readline returns character strings, so convert to numeric
x1 <- as.numeric(readline("enter the BI-RADS: "))
x2 <- as.numeric(readline("enter the age: "))
x3 <- as.numeric(readline("enter the shape: "))
x4 <- as.numeric(readline("enter the margin: "))
x5 <- as.numeric(readline("enter the density: "))

df1 <- data.frame(x1, x2, x3, x4, x5)
user.results <- compute(creditnet, df1)

#prdct <- predict(creditnet, df1)

I tried your method but I keep getting an error. Could you help me out? I really need to finish this program.

I have a backpropagation algorithm of my own. The main result is a weight matrix. If I call your function with this matrix as the first parameter, what would I obtain?

Hi Rafael,

Most of the functions in the NeuralNetTools package have methods for numeric inputs (e.g., a vector of weight values from a model). The lekprofile function is the only one that does not, since it requires predictions from a fitted model in R. See the examples in the help files for the plotnet and garson functions for using numeric inputs. The input weights must be a numeric vector with a specific order. For example…

The weight vector shows the weights for each hidden node in sequence, starting with the bias input for each node, then the weights for each output node in sequence, starting with the bias input for each output node. There is an example in the blog post here that illustrates the order.
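To make the ordering concrete, here is a small base-R sketch that builds a weight vector for a hypothetical 2-2-1 network (two inputs, two hidden nodes, one output); the weight values themselves are arbitrary:

```r
# Hypothetical network structure: c(inputs, hidden, output)
struct <- c(2, 2, 1)

# Hidden-node weights in sequence, each starting with that node's bias:
hid1 <- c(-1.1,  0.5,  2.0)  # bias, input 1, input 2 -> hidden node 1
hid2 <- c( 0.3, -0.2,  1.4)  # bias, input 1, input 2 -> hidden node 2

# Output-node weights, starting with the output node's bias:
out1 <- c( 0.1,  1.2, -0.9)  # bias, hidden 1, hidden 2 -> output

# Concatenate in that order to get the full weight vector
wts <- c(hid1, hid2, out1)

# Expected length: hidden * (inputs + 1) + output * (hidden + 1)
expected <- struct[2] * (struct[1] + 1) + struct[3] * (struct[2] + 1)
```

With NeuralNetTools this vector could then be passed to the numeric methods along with the structure, e.g., plotnet(wts, struct = struct).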

Hope that helps.

-Marcus

Hi “beckmw”… I want to mention your post on average dissertation length in my ABD Survival Guide newsletter as a fun statistic. How would you like me to list your name and any other identifying characteristics? Is it Marcus Beck at U Michigan? Are you a doctoral student at this time? Please email me your answer. Thanks! Gayle

Hi Gayle, thanks for the support. I did my doctorate at University of Minnesota and I’m currently a post-doc with the USEPA.

Hi, one question…. What is the activation function of nnet?

Thanks for your attention

It’s logistic by default, linear if linout = TRUE.

Hi, do you have the R codes of other of the pseudocodes in the Ecological Detective book? I would be really interested in comparing my results with someone’s else and in understanding how to solve those where I am getting stuck… thanks! Paolo

Hi Paolo, I used to have some of it but unfortunately it’s buried on an old computer. Let that be a lesson in reproducibility! I know you can find some of it with some digging online. Best of luck.