Chapter 8: Neural Networks
Part III: Advanced Data Mining Techniques

Contents
- What & Why ANN (8.1 Feedforward Neural Networks)
- How ANN works: the working principle (8.2.1 Supervised Learning)
- The most popular ANN: the backpropagation network (8.5.1 The Backpropagation Algorithm: An Example)
1. What & Why ANN: Artificial Neural Networks (ANN)

- An ANN is an information processing technology that emulates a biological neural network:
  - Neuron vs. node (transformation)
  - Dendrite vs. input
  - Axon vs. output
  - Synapse vs. weight
- Started in the 1970s; became very popular in the 1990s because of the advancement of computer technology.

What is ANN: Basics

- Types of ANN are distinguished by network structure (e.g., Figures 17.9 & 17.10; Turban, 2000, 5th ed., p. 663):
  - Number of hidden layers
  - Number of hidden nodes
  - Feedforward vs. feedback (the latter for time-dependent problems)
  - Links between nodes (present or absent)
- The ultimate objective of training: obtain a set of weights that predicts the instances in the training data as correctly as possible.
- Backpropagation is one type of ANN, usable for classification and estimation:
  - Multi-layer: input layer, hidden layer(s), output layer
  - Fully connected
  - Feedforward
  - Error back-propagation
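To make the "fully connected feedforward" structure concrete, here is a minimal sketch of a forward pass through one hidden layer. It is not from the slides; the layer sizes, random weight range, and sigmoid transfer function are illustrative assumptions.

```python
import numpy as np

# Minimal fully connected feedforward pass: 3 inputs -> 2 hidden
# nodes -> 1 output node, with sigmoid transfer at each layer.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
W_hidden = rng.uniform(-0.5, 0.5, size=(3, 2))  # input-to-hidden weights
W_output = rng.uniform(-0.5, 0.5, size=(2, 1))  # hidden-to-output weights

x = np.array([0.2, 0.7, 0.1])        # one input instance
hidden = sigmoid(x @ W_hidden)       # weighted sum + transfer at hidden layer
output = sigmoid(hidden @ W_output)  # weighted sum + transfer at output layer
print(output)
```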
2. How ANN works: the working principle (I)

- Step 1: Collect data.
- Step 2: Separate the data into training and test sets, for network training and validation respectively.
- Step 3: Select the network structure, the learning algorithm, and the parameters:
  - Set the initial weights, either by rules or randomly.
  - Set the rate of learning (the pace at which weights are adjusted).
  - Select a learning algorithm (more than a hundred learning algorithms are available for various situations and configurations).

2. ANN working principle (II)

- Step 4: Train the network:
  - Compute the outputs.
  - Compare the outputs with the desired targets; the difference between them is called the delta.
  - Adjust the weights and repeat the process to minimize the delta. The objective of training is to minimize the delta (error); the final result of training is a set of weights.
- Step 5: Test the network: use the test set, comparing test results to historical results, to find out the accuracy of the network.
- Step 6: Deploy the developed network application if the test accuracy is acceptable.

2. ANN working principle (III): Example

Example 1: the OR operation, with two input elements, X1 and X2:

Case | X1 | X2 | Desired result
-----|----|----|---------------
  1  |  0 |  0 | 0
  2  |  0 |  1 | 1 (positive)
  3  |  1 |  0 | 1 (positive)
  4  |  1 |  1 | 1 (positive)
2. ANN working principle (IV): Example

- Network structure: one layer (see the neuron diagram below).
- Learning algorithm (a worked code sketch follows this slide):
  - Weighted sum (summation function): Y1 = Σ Xi·Wi
  - Transformation (transfer) function: if Y1 is less than the threshold, Y = 0; otherwise Y = 1
  - Delta = Z − Y (desired output minus actual output)
  - Wi(final) = Wi(initial) + Alpha · Delta · Xi
- Initial parameters:
  - Rate of learning: alpha = 0.2
  - Threshold = 0.5
  - Initial weights: 0.1, 0.3
- Notes: weights are initially random; the value of the learning rate, alpha, is set low at first.

Processing Information in an Artificial Neuron
[Figure: inputs x1, x2 with weights w1j, w2j feed neuron j, which computes the summation Σ wij·xi and applies the transfer function to produce output Yj.]
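The sketch below runs the slide's learning rule on the OR data with the slide's own parameters (alpha = 0.2, threshold = 0.5, initial weights 0.1 and 0.3). Only the epoch cap of 10 is an added assumption.

```python
# One-node learning on the OR data, using the slide's update rule.
cases = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w = [0.1, 0.3]
alpha, threshold = 0.2, 0.5

for epoch in range(10):
    total_error = 0
    for (x1, x2), z in cases:
        y1 = x1 * w[0] + x2 * w[1]       # weighted sum: Y1 = sum(Xi * Wi)
        y = 1 if y1 >= threshold else 0  # transfer function
        delta = z - y                    # Delta = Z - Y
        w[0] += alpha * delta * x1       # Wi <- Wi + alpha * Delta * Xi
        w[1] += alpha * delta * x2
        total_error += abs(delta)
    if total_error == 0:                 # stop once every case is correct
        break

print(f"learned weights after {epoch + 1} epochs: {w}")
```

Tracing the loop by hand, the weights settle at 0.5 and 0.5 within three epochs, at which point every case clears (or falls below) the 0.5 threshold correctly.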
3. Back-propagation Network

- Network topology:
  - Multi-layer: input layer, hidden layer(s), output layer
  - Fully connected
  - Feedforward
  - Error back-propagation
- Initialize the weights with random values.

[Figure: input nodes receive the input vector xi; weighted links wij feed the hidden nodes, whose outputs feed the output nodes, producing the output vector.]

3. Back-propagation Network: for each node

1. Compute the net input to the unit using the summation function.
2. Compute the output value using the activation function (i.e., the sigmoid function).
3. Compute the error.
4. Update the weights (and the bias) based on the error.
5. Terminating conditions (any of the following):
   - All weight changes Δwij in the previous epoch were so small as to be below some specified threshold.
   - The percentage of samples misclassified in the previous epoch is below some threshold.
   - A pre-specified number of epochs has expired.

Backpropagation Error: Output Layer / Backpropagation Error: Hidden Layer / The Delta Rule / Root Mean Squared Error
[Four formula slides; the equation images did not survive extraction. Standard forms are reconstructed below.]
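As a hedged reconstruction of those four formula slides, these are the standard forms for a sigmoid network, written with the learning rate r and target Tk used in the homework below; the deck's exact notation may differ.

```latex
f(x) = \frac{1}{1 + e^{-x}}                      % sigmoid activation
\delta_k = (T_k - O_k)\,O_k(1 - O_k)             % error at output node k
\delta_j = O_j(1 - O_j)\sum_k \delta_k w_{jk}    % error propagated to hidden node j
\Delta w_{ij} = r\,\delta_j\,O_i                 % the delta rule, r = learning rate
\mathrm{RMSE} = \sqrt{\tfrac{1}{n}\sum_{i=1}^{n}(T_i - O_i)^2}  % root mean squared error
```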
3. Back-propagation (cont.)

- Ways to increase network accuracy and training speed:
  - Network topology:
    - Number of nodes in the input layer
    - Number of hidden layers (usually one, no more than two)
    - Number of nodes in each hidden layer
    - Number of nodes in the output layer
  - Change the initial weights, the learning parameter, or the terminating condition.
- Training process (sketched in code after the summary below):
  - Feed the training instances.
  - Determine the output error.
  - Update the weights.
  - Repeat until the terminating condition is met.

Supervised Learning with Feed-Forward Networks
[Figure slide: backpropagation learning.]

Summary: decisions the builder must make

- Network topology: number of hidden layers, number of nodes in each layer, and feedback
- Learning algorithm
- Parameters: initial weights, learning rate
- Size of the training and test data

The structure and parameters determine the length of training time and the accuracy of the network.
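The sketch below ties the training process to the reconstructed formulas: feed instances, compute the output and hidden errors, apply the delta rule, and repeat until a terminating condition is met. The 2-2-1 topology, learning rate of 0.5, RMSE threshold, and epoch cap are illustrative assumptions; the OR data is reused from the earlier example.

```python
import numpy as np

# A minimal backpropagation loop: 2-2-1 fully connected feedforward
# network with sigmoid activations, trained on the deck's OR data.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [1]], dtype=float)

rng = np.random.default_rng(1)
W1, b1 = rng.uniform(-0.5, 0.5, (2, 2)), np.zeros(2)  # input -> hidden
W2, b2 = rng.uniform(-0.5, 0.5, (2, 1)), np.zeros(1)  # hidden -> output
r = 0.5                                               # learning rate

for epoch in range(5000):
    for x, t in zip(X, T):
        # Forward pass: summation function, then sigmoid activation.
        h = sigmoid(x @ W1 + b1)
        o = sigmoid(h @ W2 + b2)
        # Backward pass: output-layer then hidden-layer errors.
        delta_o = (t - o) * o * (1 - o)
        delta_h = h * (1 - h) * (W2 @ delta_o)
        # Delta rule: adjust weights (and biases) by r * delta * input.
        W2 += r * np.outer(h, delta_o); b2 += r * delta_o
        W1 += r * np.outer(x, delta_h); b1 += r * delta_h
    rmse = np.sqrt(np.mean((T - sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)) ** 2))
    if rmse < 0.05:                    # terminating condition on the error
        break

print(f"stopped after {epoch + 1} epochs, RMSE = {rmse:.3f}")
```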
Neural Network Input Format (normalization: categorical to numerical)

- All inputs and outputs must be numerical and lie between 0 and 1.
- Categorical attributes, e.g., an attribute with 4 possible values:
  - Ordinal: set to 0, 0.33, 0.66, 1
  - Nominal: set to [0,0], [0,1], [1,0], [1,1]
- Numerical attributes: scale into [0, 1]; this is the inverse of the output conversion below, i.e., X' = (X − Min) / (Max − Min).

Neural Network Output Format

- Categorical attributes (numerical back to categorical): e.g., with classes coded 0 and 1, an output such as 0.45 is assigned to the nearer class value.
- Numerical attributes ([0, 1] back to the original scale): Min + X · (Max − Min)
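The sketch below implements these conversions. The function names, category labels, and example ranges are illustrative placeholders, not from the slides.

```python
# Input/output conversions for a neural network in [0, 1].

def normalize(x, lo, hi):
    """Numerical attribute into [0, 1]: (X - Min) / (Max - Min)."""
    return (x - lo) / (hi - lo)

def denormalize(y, lo, hi):
    """A [0, 1] network output back to scale: Min + X * (Max - Min)."""
    return lo + y * (hi - lo)

# Ordinal attribute with 4 values -> evenly spaced codes in [0, 1].
ordinal_codes = {"low": 0.0, "medium": 0.33, "high": 0.66, "extreme": 1.0}

# Nominal attribute with 4 values -> two binary input nodes.
nominal_codes = {"red": (0, 0), "green": (0, 1), "blue": (1, 0), "gray": (1, 1)}

print(normalize(75, 0, 100))      # 0.75
print(denormalize(0.75, 0, 100))  # 75.0
```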
Homework

P264, Computational Questions, No. 2: with r = 0.5 and Tk = 0.65, adjust all the weights for one epoch.

Case Study: Bankruptcy Prediction with Neural Networks

- Structure: three-layer network, back-propagation
- Training data: a small set of well-known financial ratios
- Data available on bankruptcy outcomes
- Supervised network

Architecture of the Bankruptcy Prediction Neural Network
[Figure: the five inputs X1-X5 feed a hidden layer and a single output node, whose value is 0 for bankrupt and 1 for not bankrupt.]

Bankruptcy Prediction: network architecture

- Five input nodes (sketched in code at the end of this case study):
  - X1: working capital / total assets
  - X2: retained earnings / total assets
  - X3: earnings before interest and taxes / total assets
  - X4: market value of equity / total debt
  - X5: sales / total assets
- Single output node: the final classification for each firm, bankruptcy or non-bankruptcy
- Development tool: NeuroShell

Development

- Three-layer network with back-error propagation (Turban, Figure 17.12, p. 669)
- Continuous-valued input
- Single output node: 0 = bankrupt, 1 = not bankrupt (non-bankruptcy)

Training

- Data set: 129 firms
- Training set: 74 firms (38 bankrupt, 36 not)

Testing

- Test set: 55 firms (27 bankrupt, 28 non-bankrupt)
- The neural network correctly predicted 81.5 percent of the bankrupt cases and 82.1 percent of the non-bankrupt cases.
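For completeness, the five input ratios as a feature vector; the balance-sheet field names are hypothetical placeholders, not taken from the case study.

```python
# Build the case study's five-ratio input vector for one firm.
def bankruptcy_features(firm: dict) -> list[float]:
    return [
        firm["working_capital"] / firm["total_assets"],    # X1
        firm["retained_earnings"] / firm["total_assets"],  # X2
        firm["ebit"] / firm["total_assets"],               # X3
        firm["equity_market_value"] / firm["total_debt"],  # X4
        firm["sales"] / firm["total_assets"],              # X5
    ]
```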