Parameter Optimization and Evaluation of Models in R: KS Curves, ROC Curves, Deep Learning Training, Cross-Validation, and Grid Search

Contents

I. Model Performance Evaluation

1. Evaluating Numeric Predictions

2. Evaluating Probability Predictions

II. Model Parameter Optimization

1. Training, Validation, and Test Sets

2. k-Fold Cross-Validation

3. Grid Search


I. Model Performance Evaluation

1. Evaluating Numeric Predictions

### Evaluating numeric predictions ####
# Load the package; install it online first if it is not available
if(!require(mlbench)) install.packages("mlbench")
library(mlbench)
data("BostonHousing")

# Partition the data
library(caret)
library(ggplot2)
library(lattice)
index <- createDataPartition(BostonHousing$medv, p = 0.75, list = FALSE)
train <- BostonHousing[index, ]
test <- BostonHousing[-index, ]

# Fit a model on the training set and predict on the test set
set.seed(1234)
fit <- lm(medv ~ ., data = train)
pred <- predict(fit, newdata = test)

# Custom function that computes evaluation metrics for numeric predictions
numericIndex <- function(obs, pred){
  # Mean absolute error (MAE)
  MAE <- mean(abs(obs - pred))
  # Mean squared error (MSE)
  MSE <- mean((obs - pred)^2)
  # Root mean squared error (RMSE)
  RMSE <- sqrt(mean((obs - pred)^2))
  # Normalized mean squared error (NMSE)
  NMSE <- sum((obs - pred)^2) / sum((obs - mean(obs))^2)
  # Coefficient of determination (R-squared)
  Rsquared <- cor(pred, obs)^2
  # Return the metrics as a named vector
  return(c('MAE' = MAE, 'MSE' = MSE, 'RMSE' = RMSE, 'NMSE' = NMSE, 'Rsquared' = Rsquared))
}
# Compute the metric values
numericIndex(test$medv, pred)

# Using the caret package
library(caret)
postResample(pred, test$medv)
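As a cross-check on the hand-rolled metrics, the Metrics package exposes the same quantities as one-line functions. The sketch below is only illustrative and assumes the Metrics package and its mae/mse/rmse functions (actual values first, predictions second) are available in your environment:

# A minimal cross-check with the Metrics package (assumed to be installed)
if(!require(Metrics)) install.packages("Metrics")
library(Metrics)
mae(test$medv, pred)   # mean absolute error
mse(test$medv, pred)   # mean squared error
rmse(test$medv, pred)  # root mean squared error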

2. Evaluating Probability Predictions

### Confusion matrix ####
install.packages("DAAG")
library(DAAG)
data(anesthetic)
anes1 <- glm(factor(nomove) ~ conc, family = binomial(link = 'logit'), data = anesthetic)
# Get predictions from the model
pre <- predict(anes1, type = 'response')  # predicted probability that a sample belongs to class 1
# Use 0.5 as the cutoff
result <- ifelse(pre > 0.5, 1, 0)
# Build the confusion matrix
confusion <- table(actual = anesthetic$nomove, predict = result)
confusion
# Compute the metrics (class 1 is the positive class, class 0 the negative class)
(TP <- confusion[4])
(TN <- confusion[1])
(FP <- confusion[3])
(FN <- confusion[2])
(Accuracy <- (TN + TP) / sum(confusion))  # accuracy
(Precision <- TP / (TP + FP))             # precision
(Recall <- TP / (TP + FN))                # sensitivity / recall
(F1 <- 2 * TP / (2 * TP + FP + FN))       # F1-score
(FPR <- FP / (TN + FP))                   # false positive rate
# Using the confusionMatrix function
library(caret)
confusionMatrix(data = factor(result),                  # predicted labels
                reference = factor(anesthetic$nomove),  # actual labels
                positive = '1',                         # treat class 1 as the positive class
                mode = "prec_recall")                   # report precision and recall
### ROC curve ####
# Build a result data frame
result <- data.frame(pre_prob = pre, true_label = anesthetic$nomove)
result <- result[order(result$pre_prob, decreasing = TRUE), ]  # sort by predicted probability, descending
result$cumsum <- cumsum(rep(1, nrow(result)))                  # cumulative number of samples
result$poscumsum <- cumsum(result$true_label)                  # cumulative number of positive samples
result$tpr <- round(result$poscumsum / sum(result$true_label == 1), 3)                    # true positive rate
result$fpr <- round((result$cumsum - result$poscumsum) / sum(result$true_label == 0), 3)  # false positive rate
result$lift <- round((result$poscumsum / result$cumsum) / (sum(result$true_label == 1) / nrow(result)), 2)  # lift
head(result)
tail(result)

# Plot the ROC curve
library(ggplot2)
if(!require(ROCR)) install.packages("ROCR")
library(ROCR)
ggplot(result) +
  geom_line(aes(x = fpr, y = tpr), color = "red1", size = 1.2) +
  geom_segment(aes(x = 0, y = 0, xend = 1, yend = 1), color = "grey", lty = 2, size = 1.2) +
  annotate("text", x = 0.5, y = 1.05,
           label = paste('AUC:', round(ROCR::performance(prediction(result$pre_prob, result$true_label), 'auc')@y.values[[1]], 3)),
           size = 6, alpha = 0.8) +
  scale_x_continuous(breaks = seq(0, 1, .2)) +
  scale_y_continuous(breaks = seq(0, 1, .2)) +
  xlab("False Positive Rate") + ylab("True Positive Rate") +
  ggtitle(label = "ROC - Chart") +
  theme_bw() +
  theme(plot.title = element_text(colour = "gray24", size = 12, face = "bold"),
        plot.background = element_rect(fill = "gray90"),
        axis.title = element_text(size = 10),
        axis.text = element_text(colour = "gray35"))

# Plot the ROC curve with the ROCR package
library(ROCR)
pred1 <- prediction(pre, anesthetic$nomove)
# False positive rate on the x axis, true positive rate on the y axis
perf <- performance(pred1, 'tpr', 'fpr')
# Draw the ROC curve
plot(perf, main = "ROC curve drawn with the ROCR package")
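The pROC package is another common route to the same curve and AUC value. This is just a sketch, assuming pROC's roc()/auc() interface behaves as described in its documentation:

# A minimal sketch with the pROC package (assumed to be installed)
if(!require(pROC)) install.packages("pROC")
library(pROC)
roc_obj <- roc(response = anesthetic$nomove, predictor = pre)  # build the ROC object
auc(roc_obj)                                                   # area under the curve
plot(roc_obj, main = "ROC curve drawn with the pROC package")  # base-graphics ROC plot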

# Compute the AUC
auc.adj <- performance(pred1, 'auc')
auc <- auc.adj@y.values[[1]]
auc

# Plot the KS curve
ggplot(result) +
  geom_line(aes((1:nrow(result)) / nrow(result), tpr), colour = "red2", size = 1.2) +
  geom_line(aes((1:nrow(result)) / nrow(result), fpr), colour = "blue3", size = 1.2) +
  annotate("text", x = 0.5, y = 1.05,
           label = paste("KS=", round(max(result$tpr - result$fpr), 4),
                         "at Pop=", round(which.max(result$tpr - result$fpr) / nrow(result), 4)),
           size = 6, alpha = 0.8) +
  scale_x_continuous(breaks = seq(0, 1, .2)) +
  scale_y_continuous(breaks = seq(0, 1, .2)) +
  xlab("Total Population Rate") + ylab("TP/FP Rate") +
  ggtitle(label = "KS - Chart") +
  theme_bw() +
  theme(plot.title = element_text(colour = "gray24", size = 12, face = "bold"),
        plot.background = element_rect(fill = "gray90"),
        axis.title = element_text(size = 10),
        axis.text = element_text(colour = "gray35"))
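Because the KS statistic is defined as the maximum vertical gap between the cumulative TPR and FPR curves, it can also be read directly off the result data frame built above; a small sketch of that calculation:

# KS statistic computed directly from the result data frame
ks_value <- max(result$tpr - result$fpr)                          # maximum gap between the two curves
ks_position <- which.max(result$tpr - result$fpr) / nrow(result)  # population fraction where the gap peaks
ks_value; ks_position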

# Plot the cumulative lift chart
ggplot(result) +
  geom_line(aes(x = (1:nrow(result)) / nrow(result), y = lift), color = "red3", size = 1.2) +
  scale_x_continuous(breaks = seq(0, 1, .2)) +
  xlab("Total Population Rate") + ylab("Lift value") +
  ggtitle(label = "LIFT - Chart") +
  theme_bw() +
  theme(plot.title = element_text(colour = "gray24", size = 12, face = "bold"),
        plot.background = element_rect(fill = "gray90"),
        axis.title = element_text(size = 10),
        axis.text = element_text(colour = "gray35"))

# Source the helper script that wraps the plotting functions
source('自定义绘制各种曲线函数.R')
# Load the ROCR.simple data set
library(ROCR)
data(ROCR.simple)
# Draw the curves
pc <- plotCurve(pre_prob = ROCR.simple$predictions, true_label = ROCR.simple$labels)
# View the curves side by side
library(gridExtra)
grid.arrange(pc$roc_curve, pc$ks_curve, pc$lift_curve, ncol = 3)

II. Model Parameter Optimization

1. Training, Validation, and Test Sets


### Training, validation, and test sets ####
# Note: the code below requires the tensorflow and keras packages to be installed first
devtools::install_github("rstudio/tensorflow")
library(tensorflow)
install_tensorflow()
library(keras)

# Load the MNIST data set
c(c(x_train, y_train), c(x_test, y_test)) %<-% dataset_mnist()
# Check the dimensions of the data
cat('x_train shape:', dim(x_train))
cat('y_train shape:', dim(y_train))
cat('x_test shape:', dim(x_test))
cat('y_test shape:', dim(y_test))

# Visualize some of the digit images
par(mfrow = c(3, 3))
for(i in 1:9){
  plot(as.raster(x_train[i, , ], max = 255))
  title(main = paste0('Digit label: ', y_train[i]))
}

par(mfrow = c(1,1))


# Preprocess the data
x_train <- array_reshape(x_train, c(nrow(x_train), 784))
x_test <- array_reshape(x_test, c(nrow(x_test), 784))
x_train <- x_train / 255
x_test <- x_test / 255
y_train <- to_categorical(y_train, 10)
y_test <- to_categorical(y_test, 10)

# Build the network architecture
model <- keras_model_sequential()
model %>%
  layer_dense(units = 256, activation = 'relu', input_shape = c(784)) %>%
  layer_dense(units = 128, activation = 'relu') %>%
  layer_dense(units = 10, activation = 'softmax')
summary(model)

# Compile and train the deep learning model
model %>%
  compile(loss = 'categorical_crossentropy',
          optimizer = optimizer_rmsprop(),
          metrics = c('accuracy'))
history <- model %>% fit(
  x_train, y_train,
  epochs = 10, batch_size = 128,
  validation_split = 0.2
)

Epoch 1/10
375/375 ━━━━━━━━━━━━━━━━━━━━ 2s 4ms/step - accuracy: 0.8538 - loss: 0.5004 - val_accuracy: 0.9590 - val_loss: 0.1390
Epoch 2/10
375/375 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.9610 - loss: 0.1292 - val_accuracy: 0.9680 - val_loss: 0.1072
Epoch 3/10
375/375 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.9766 - loss: 0.0772 - val_accuracy: 0.9735 - val_loss: 0.0908
Epoch 4/10
375/375 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.9828 - loss: 0.0562 - val_accuracy: 0.9747 - val_loss: 0.0845
Epoch 5/10
375/375 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.9882 - loss: 0.0379 - val_accuracy: 0.9728 - val_loss: 0.0921
Epoch 6/10
375/375 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.9902 - loss: 0.0307 - val_accuracy: 0.9768 - val_loss: 0.0857
Epoch 7/10
375/375 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.9932 - loss: 0.0225 - val_accuracy: 0.9763 - val_loss: 0.0927
Epoch 8/10
375/375 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.9943 - loss: 0.0184 - val_accuracy: 0.9790 - val_loss: 0.0842
Epoch 9/10
375/375 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.9962 - loss: 0.0125 - val_accuracy: 0.9783 - val_loss: 0.0885
Epoch 10/10
375/375 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.9965 - loss: 0.0108 - val_accuracy: 0.9798 - val_loss: 0.0888

plot(history)

# Evaluate the model on the test set
DNN_score <- model %>% evaluate(x_test, y_test)
DNN_score$accuracy  # test-set accuracy (older keras versions may name this element 'acc')
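To inspect individual predictions rather than the aggregate accuracy, the predicted class of each test digit can be recovered from the probability matrix. The sketch below assumes predict() on the keras model returns one row of class probabilities per sample:

# Sketch: class predictions from the probability matrix (one row of probabilities per sample assumed)
probs <- model %>% predict(x_test)
pred_class <- max.col(probs) - 1        # column with the highest probability, shifted to labels 0-9
true_class <- max.col(y_test) - 1       # recover the labels from the one-hot matrix
mean(pred_class == true_class)          # test-set accuracy computed by hand
head(which(pred_class != true_class))   # indices of the first few misclassified digits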

2. k-Fold Cross-Validation

### 10-fold cross-validation ####
# Read the car data set
car <- read.table("../data/car.data", sep = ",")
# Rename the variables
colnames(car) <- c("buy", "main", "doors", "capacity", "lug_boot", "safety", "accept")

# Build 10-fold cross-validation by hand
# First construct the fold indices
library(caret)
ind <- createFolds(car$accept, k = 10, list = FALSE, returnTrain = FALSE)

# Run the 10-fold cross-validation; only the average misclassification rates
# on the training and test folds are reported here
E0 <- rep(0, 10); E1 <- E0
car$accept <- as.factor(car$accept)
library(C50)
for(i in 1:10){
  n0 <- nrow(car) - nrow(car[ind == i, ]); n1 <- nrow(car[ind == i, ])
  a <- C5.0(accept ~ ., car[!ind == i, ])
  E0[i] <- sum(car[!ind == i, 'accept'] != predict(a, car[!ind == i, ])) / n0  # training-fold error rate
  E1[i] <- sum(car[ind == i, 'accept'] != predict(a, car[ind == i, ])) / n1    # test-fold error rate
}
(1 - mean(E0)); (1 - mean(E1))  # average training-fold and test-fold accuracy
# Cross-validation with the trainControl function from the caret package
library(caret)
library(ROCR)
control <- trainControl(method = "repeatedcv", number = 10, repeats = 3)
model <- train(accept ~ ., data = car, method = "rpart", trControl = control)
model

plot(model)
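Beyond the summary that train() prints, the per-fold resampling results can be examined directly. This is a sketch that assumes the usual caret accessors ($resample, $results) and the confusionMatrix() method for train objects:

# Inspect the resampling results of the caret model (accessor names assumed)
head(model$resample)    # accuracy and kappa for each fold and repeat
model$results           # performance for each candidate tuning value
confusionMatrix(model)  # resampled (averaged) confusion matrix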

3. Grid Search

### Grid search ####
install.packages("gbm")
set.seed(1234)
library(caret)
library(gbm)
fitControl <- trainControl(method = 'repeatedcv', number = 10, repeats = 5)
# Define the grid of candidate parameter values
gbmGrid <- expand.grid(interaction.depth = c(3, 5, 9),
                       n.trees = (1:20) * 5,
                       shrinkage = 0.1,
                       n.minobsinnode = 20)
nrow(gbmGrid)
# Train the model and find the best parameter combination
gbmfit <- train(accept ~ ., data = car,
                method = 'gbm',
                trControl = fitControl,
                tuneGrid = gbmGrid,
                metric = 'Accuracy')
gbmfit$bestTune  # best parameter combination found by the grid search

plot(gbmfit)
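After the grid search has settled on the best parameter combination, caret refits a final model that can be used through predict(). The sketch below reuses the same car data purely for illustration; in practice a held-out test set should be used:

# Illustrative only: predict with the final tuned model (a separate test set is preferable)
pred_gbm <- predict(gbmfit, newdata = car)
confusionMatrix(pred_gbm, car$accept)  # compare tuned-model predictions with the actual labels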

 
