(7) label node N with test_attribute;
(8) for each known value v of test_attribute // partition samples
(9) grow a branch from node N for the condition test_attribute = v;
(10) let Sv be the set of samples for which test_attribute = v; // one partition block
(11) if Sv is empty then
(12) attach a leaf labeled with the most common class in samples;
(13) else attach the node returned by Decision_Tree(Sv, attribute_list - test_attribute).

(2) Preprocessing of the experimental data
Age: under 30 coded "1"; 30 to 50 coded "2"; over 50 coded "3".
Sex: FEMALE coded "1"; MALE coded "2".
Region: INNERCITY coded "1"; TOWN coded "2"; RURAL coded "3"; SUBURBAN coded "4".
Income: 5,000-20,000 coded "1"; 20,000-40,000 coded "2"; over 40,000 coded "3".
Married, Children, Car, Mortgage, Pep: each of these five attributes is coded "1" for "yes" and "2" for "no".

age sex region income married children car mortgage pep
1 2 1 1 2 1 1 2 2
1 2 1 1 2 2 2 2 1
2 1 4 1 2 1 2 2 1
2 1 1 1 1 2 2 2 2
1 2 1 1 1 2 2 2 2
1 2 1 1 2 1 2 1 1
2 1 2 1 1 2 1 1 2
2 1 1 1 2 1 1 2 1
2 1 3 1 2 2 1 2 1
2 1 2 2 2 1 2 2 2
2 2 1 2 2 2 2 1 1
2 1 2 2 1 1 2 1 1
2 2 1 2 1 2 2 1 2
1 1 1 2 1 2 2 2 1
3 2 1 2 1 1 1 2 2
1 1 1 2 1 1 1 2 1
1 1 3 2 2 2 1 2 1
3 1 2 2 1 2 2 2 1
3 2 3 3 1 1 1 2 1
3 2 2 3 1 2 1 1 2
3 1 3 3 1 1 2 2 1
3 2 1 3 1 2 1 2 2
3 2 1 3 1 1 1 1 1
3 1 1 3 1 2 1 1 2
3 1 3 3 1 2 2 2 2
3 2 4 3 1 2 2 1 1
3 1 3 3 2 2 1 1 2

(3) MATLAB statement:
[Tree, RulesMatrix] = DecisionTree(DataSet, AttributName);

VI. Experimental results
Program:
function [Tree, RulesMatrix] = DecisionTree(DataSet, AttributName)
%Input: the training set as discretized integers, e.g. record 1: 1 1 3 2 1;
%leading columns are the attribute values; the last column is the class label
if nargin < 1
    error('Please input the DataSet!');
end
if nargin < 2
    AttributName = [];
end
Attributs = 1:(size(DataSet, 2) - 1);
Tree = CreatTree(DataSet, Attributs);       %build the tree recursively
disp('The Decision Tree:');
showTree(Tree, 0, 0, 1, [], AttributName);  %print the tree with indentation
RulesMatrix = getRule(Tree);                %one rule per root-to-leaf path

function Tree = CreatTree(DataSet, Attributs)   %recursive tree construction
[S, ValRecords] = ComputEntropy(DataSet, 0);    %entropy of the whole sample set
%find the most common class, used for majority voting
mostlabelnum = 0;
mostlabel = 0;
for i = 1:length(ValRecords)
    if length(ValRecords(i).matrix) > mostlabelnum
        mostlabelnum = length(ValRecords(i).matrix);
        mostlabel = i;
    end
end
if S == 0 || isempty(Attributs)   %pure node or no attributes left: make a leaf
    Tree.Attribut = mostlabel;
    Tree.Child = [];
    return;
end
for i = 1:length(Attributs)       %information gain of each remaining attribute
    [Sa(i), AtrributMatric(i).val] = ComputEntropy(DataSet, i);
    Gains(i) = S - Sa(i);
end
[maxval, maxindex] = max(Gains);  %the test attribute is the one with maximal gain
Tree.Attribut = Attributs(maxindex);
Attributs2 = [Attributs(1:maxindex-1) Attributs(maxindex+1:length(Attributs))];
for j = 1:length(AtrributMatric(maxindex).val)   %split the samples by attribute value
    idx = AtrributMatric(maxindex).val(j).matrix;
    DataSet2 = [DataSet(idx, 1:maxindex-1) DataSet(idx, maxindex+1:size(DataSet, 2))];
    if size(DataSet2, 1) == 0     %empty partition: leaf labeled with the majority class
        Tree.Child(j).root.Attribut = mostlabel;
        Tree.Child(j).root.Child = [];
    else
        Tree.Child(j).root = CreatTree(DataSet2, Attributs2);
    end
end

function [Entropy, RecordVal] = ComputEntropy(DataSet, attribut)  %information entropy
if attribut == 0                  %attribut = 0: entropy of the class column
    clnum = 0;
    for i = 1:size(DataSet, 1)
        c = DataSet(i, size(DataSet, 2));
        if c > clnum              %avoid out-of-range indexing
            classnum(c) = 0;
            clnum = c;
            RecordVal(c).matrix = [];
        end
        classnum(c) = classnum(c) + 1;
        RecordVal(c).matrix = [RecordVal(c).matrix i];
    end
    Entropy = 0;
    for j = 1:length(classnum)
        P = classnum(j) / size(DataSet, 1);
        if P ~= 0
            Entropy = Entropy + (-P) * log2(P);
        end
    end
else                              %conditional entropy of splitting on attribut
    valnum = 0;
    for i = 1:size(DataSet, 1)
        v = DataSet(i, attribut);
        c = DataSet(i, size(DataSet, 2));
        if v > valnum             %avoid out-of-range indexing
            clnum(v) = 0;
            valnum = v;
            Valueexamnum(v) = 0;
            RecordVal(v).matrix = [];  %keep record indices so the set can be split by value later
        end
        if c > clnum(v)           %avoid out-of-range indexing
            Value(v).classnum(c) = 0;
            clnum(v) = c;
        end
        Value(v).classnum(c) = Value(v).classnum(c) + 1;
        Valueexamnum(v) = Valueexamnum(v) + 1;
        RecordVal(v).matrix = [RecordVal(v).matrix i];
    end
    Entropy = 0;
    for j = 1:valnum
        Entropys = 0;
        for k = 1:length(Value(j).classnum)
            P = Value(j).classnum(k) / Valueexamnum(j);
            if P ~= 0
                Entropys = Entropys + (-P) * log2(P);
            end
        end
        Entropy = Entropy + (Valueexamnum(j) / size(DataSet, 1)) * Entropys;
    end
end

function showTree(Tree, level, value, branch, AttributValue, AttributName)
blank = [];
for i = 1:level-1
    if branch(i) == 1
        blank = [blank '    |'];
    else
        blank = [blank '     '];
    end
end
if level == 0
    blank = [blank '(The Root):'];
elseif isempty(AttributValue)
    blank = [blank '|_' int2str(value) '_'];
else
    blank = [blank '|_' AttributValue{value} '_'];
end
if length(Tree.Child) ~= 0        %internal node: show the test attribute
    if isempty(AttributName)
        disp([blank 'Attribut ' int2str(Tree.Attribut)]);
    else
        disp([blank AttributName{Tree.Attribut}]);
    end
    for j = 1:length(Tree.Child)-1
        showTree(Tree.Child(j).root, level+1, j, [branch 1], AttributValue, AttributName);
    end
    showTree(Tree.Child(length(Tree.Child)).root, level+1, length(Tree.Child), ...
        [branch(1:length(branch)-1) 0 1], AttributValue, AttributName);
else                              %leaf: show the class label
    disp([blank 'leaf ' int2str(Tree.Attribut)]);
end

function Rules = getRule(Tree)    %collect "attribut=value ... class" strings per leaf
if length(Tree.Child) ~= 0
    Rules = {};
    for j = 1:length(Tree.Child)
        content = getRule(Tree.Child(j).root);
        for k = 1:length(content)
            content{k} = [num2str(Tree.Attribut) '=' num2str(j) ' ' content{k}];
        end
        Rules = [Rules; content];
    end
else
    Rules = {num2str(Tree.Attribut)};
end
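The recursive structure of CreatTree can also be summarized compactly. The Python sketch below is illustrative only (the function `id3` and the four toy rows are chosen here for demonstration, not taken from the experiment): pick the attribute with maximal information gain, split the records by its values, and recurse, returning the majority class at pure nodes or when the attributes are exhausted.

```python
import math
from collections import Counter

def entropy(labels):
    """H(S) = -sum p_i * log2(p_i) over the class labels."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def id3(rows, attrs):
    """Recursive ID3: returns a class label (leaf) or a pair
    (best_attr, {value: subtree}) chosen by maximum information gain."""
    labels = [r[-1] for r in rows]
    majority = Counter(labels).most_common(1)[0][0]
    if entropy(labels) == 0 or not attrs:   # pure node or attributes exhausted
        return majority
    def gain(a):
        parts = Counter(r[a] for r in rows)
        cond = sum(cnt / len(rows) *
                   entropy([r[-1] for r in rows if r[a] == v])
                   for v, cnt in parts.items())
        return entropy(labels) - cond
    best = max(attrs, key=gain)
    rest = [a for a in attrs if a != best]
    # branch on every value seen for the test attribute (so no subset is empty;
    # the MATLAB program instead labels empty partitions with the majority class)
    children = {v: id3([r for r in rows if r[best] == v], rest)
                for v in set(r[best] for r in rows)}
    return (best, children)

# Toy records in the same 1/2 encoding; attribute 0 alone separates the classes
rows = [(1, 1, 1), (1, 2, 1), (2, 1, 2), (2, 2, 2)]
tree = id3(rows, [0, 1])
print(tree)   # the root splits on attribute 0
```

Running `id3` on the full 27-record table above reproduces the kind of tree the MATLAB program prints with showTree, with pep as the class column.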