《中国邮电高校学报(英文)》论文投稿模板.docx (paper submission template for The Journal of China Universities of Posts and Telecommunications (English))
Noisy speech emotion recognition using sample reconstruction and multiple-kernel learning

Jiang Xiaoqing 1,2, Xia Kewen 1 (✉), Lin Yongliang 3, Bai Jianchuan 1

1. School of Electronics and Information Engineering, Hebei University of Technology, Tianjin 300401, China
2. School of Information Science and Engineering, University of Jinan, Jinan 250022, China
3. Information Center, Tianjin Chengjian University, Tianjin 300384, China

Abstract  Speech emotion recognition (SER) in noisy environments is a vital issue in artificial intelligence (AI). In this paper, the reconstruction of speech samples removes the added noise. Acoustic features extracted from the reconstructed samples are selected to build an optimal feature subset with better emotional recognizability. A multiple-kernel (MK) support vector machine (SVM) classifier solved by semi-definite programming (SDP) is adopted in the SER procedure. The proposed method is demonstrated on the Berlin Database of Emotional Speech. Recognition accuracies of the original, noisy, and reconstructed samples classified by both single-kernel (SK) and MK classifiers are compared and analyzed. The experimental results show that the proposed method is effective and robust when noise exists.

Keywords  speech emotion recognition, compressed sensing, feature selection, multiple-kernel learning
IntroductionComPIenKiHiIriIyexistsbetweenhuman'sUIYeC1.iVityand1.ogica1.IhinkingkW)emotiona1.infbrma1.inissignificanttoUixkrstandtherea1.meaninginhuman'sspeech.SERisanimportantresearchfie1.dintherea1.izationofA1.11.Noiseexistingintheenvironmentandsigna1.processingsystemsinf1.uencestherecognitionaccuraciesand1.imitsthepractica1.app1.icationsofSER.suchasinte1.1.igentcustomersen*icesystemsandadjuvanttherapysystemsforautism,whereaccuraterecognitionofen)t>onsisneededtomakeaPrOperresponse.Inthispaper,noisySERisstudiedusing(hecombinationofsamp1.ereeouirUC1.ionbasedoncompressedsensing(CS)theoryandmu1.tip1.eke11>e1.1.earning(MK1.).InSER1.woessentia1.aspectsinf1.uencingthePCrfOfmansoftheemotionrecognitionsystem<reOP1.inIa1.fMUreseiandC1.YeuiWerecognitionc1.assifier.Theprecisionandinherentpropertiesofspeechfemurcsin11ectheemotiona1.rccognizabi1.ityofthefeatureset.NoisehasnegativeimpactontheextractionofacousticRoCmZX21>W2OI6C<xespriirJutk1.r:XuiKCQCr1.Emai1.:kw*i心MbinVdUeDOI:10.1016SI(K588851.17*H*features,andattcmp<s(ocopewith(henoiseinSERstartedfrom2006(2.Schu1.1.crc1.a1.SdCaedfeaturesubsetfroma4kfeaturesettorecognizecinotionsfromnoisyspeechsamp1.es3J.Youcta1.proposedenhanced1.ipschitzembeddingtoreducetheinf1.uenceofnoise4.Techniquessuchasswitching1.ineardynamicmode1.sandtwo-stageWienerfi1.teringetc.werea1.soproposedtohand1.enoisyspeechforc1.assification(5.CStheoryp11>posedbyDonohoe(a1.providespromisingmehcKtoOOiSyspeechprocessing6-7.Sparsercpfsenati>ninCStheoryhasbeenusedinnonaran>e(ricc1.assifier.Zhaoe(a1.adopted1.heenhancedsparseNPZMm1.i1.1.iOac1.assifierIode<1.with1.herx*>us1.SER.Addi1.iuna1.1.y.asIhcderivedCocffiden1.sofnisearer>sparseinanytransferd(xnain.itisimpossib1.eIor<xons(ruc1.1.henoiM:frommeasurements.Sosparsesigna1.scontaminatedbynoisecanbereconstructedwithhighqua1.ity9.Inthispaper.CStheoryisuti1.izedin(hedenoisingofnoisyspeechsamp1.esthroughsamp1.ercconstn1.ion.Acousticfeaturesofthereconstructedsamp1.esarcextractedandse1.ectedaccordingtothecomp1.ementaryinforma
tioninOrdCrtoconstituterobustandoptima1.featuresubset.SVMisoneofthemosteffectivemethodsinpatternrecognitionprob1.ems.SVrMisakerne1.methodof11uiing(hemaximummafinhyperp1.anein(hefeatureSPaCeanditse1o11nancedependson(hekerne1.fuionstrong1.y.SoiisnecessaryIoovercome1.hekemddependCIKyindesigning1.heef1.c1.ivec1.assifierwithSVM.Inordertoirnpx>ve(heI1.cxibi1.i1.yofkerne1.fur(i>.MK1.isProPOMrdanddeveked(bina(k>nofdif1.crcntkerne1.s.1.anCkriCtcta1.PrOPOSCdMK1.withatransductionsettingfor1.earningakerne1.matrixfromdata.Themethodaimedattheop<ibinationofpredefinedbasekerne1.stogenerateagoodtargetkerne1.(1.Jincta1.Pft)POScdfeaturefusionmethodbasedonMK1.toimprovetheto<a1.SERperformanceofc1.eansamp1.es.TheWdghISofdi11crcnkerne1.scorrespondingtotheghba1.and1.oca1.featuresaregivenhughagfidSearChnh<xi(111.Inhipaper.MKfusionstrategyofHmckiietisadoptedtoimx>ve(heSVMn>de1.inabinaryInjeMructuredmuki-c1.assc1.assifier,and(hefusionCOefnCien1.qOfdifferentkerne1.sareso1.vedbyIhcSDP(ofindUPIima1.WeighISofmuI1.ip1.ckerne1.s.The3gofthep;iprarcstrc1.uredasIhcfo1.1.owings:Sect2reviews1.hebasicideaofCSinspe<xhsigna1.processingandana1.yzestheperformanceofnoisysamp1.ereconstruction.Sect.3introducesMK1.so1.vedbySDRAcousticfeaturesandfeaturese1.ectionarcpresentedinSect.4.ThCpcribrmanceeva1.uationofSERandexperimenta1.resu1.tsarci1.1.ustratedandana1.yzedinSect.5.Fina1.1.ySect.6devotestotheconc1.usions.2 
2 CS and sample reconstruction of noisy speech

CS combines sampling and compression into one step using the minimum number of measurements with maximum information. CS aims to recover a sparse signal with far less than the Nyquist-Shannon sampling rate, and the reconstruction can be exact under key concepts such as sparsity and the restricted isometry property (RIP) [7,12].

Let $x = [x(1), x(2), \ldots, x(N)]^{\mathrm{T}}$ be the signal in $N$-dimensional space, where $N$ is the number of samples. $x$ can be represented by the linear combination of $N$-dimensional orthogonal basis vectors $\psi_n$, $n = 1, 2, \ldots, N$. Thus $x$ can be represented as:

$$x = \sum_{n=1}^{N} a_n \psi_n = \Psi a \qquad (1)$$

In Eq. (1), $\Psi$ is the orthogonal basis matrix, also named the representation matrix, and $a = \Psi^{\mathrm{T}} x$ is the projection coefficient vector. It can be said that $x$ and $a$ are equivalent representations of the same signal, with $x$ in the time domain and $a$ in the $\Psi$ domain. When the signal $x$ has only $k$ non-zero coefficients and $k \ll N$, $a$ is the sparse representation of $x$, and $x$ can be considered $k$-sparse with the sparse representation of Eq. (1).

In CS theory, the sensing process can be represented as:

$$y = \Phi x \qquad (2)$$

In Eq. (2), $\Phi$ is the $M \times N$ measurement matrix, and $y \in \mathbb{R}^{M}$ is the measurement vector of $M$ dimensions. Compression is realized because the dimension of the measurements $y$ is far less than the dimension of the signal $x$. With Eq. (1), Eq. (2) can be rewritten as:

$$y = \Phi \Psi a = \Theta a \qquad (3)$$

where $\Theta = \Phi \Psi$ is the $M \times N$-dimensional reconstruction matrix and $a$ is the $k$-sparse vector representing the projection coefficients of $x$ in the $\Psi$ domain. Reconstruction algorithms in CS try to solve Eq. (3), which is an underdetermined equation without a determinate solution. When the signal is sparse and satisfies the RIP condition, a sparse approximate solution to Eq. (3) can be obtained by minimizing the $\ell_1$-norm. The RIP of the matrix $\Theta$ is defined on its isometry constant $\delta_k \in (0, 1)$: for a $k$-sparse signal $x$ it satisfies

$$(1 - \delta_k) \|x\|_2^2 \le \|\Theta x\|_2^2 \le (1 + \delta_k) \|x\|_2^2 \qquad (4)$$

It can be loosely said that a matrix $\Theta$ obeys the RIP of order $k$ if $\delta_k$ is not too close to one. The RIP ensures that all subsets of $k$ columns taken from $\Theta$ are nearly orthogonal. An equivalent condition of the RIP is the incoherence between the measurement matrix $\Phi$ and the representation matrix $\Psi$. A variety of reconstruction methods, such as greedy algorithms and convex optimization, can be used in the solving process of Eq. (3) [13-21].

When CS theory is applied to speech signal processing, the prerequisite is to achieve the sparse representation of speech signals using a proper orthogonal basis. The excitation of voiced and unvoiced speech is quasi-periodic vibration of the vocal cords and random noise, respectively, so voiced speech carries most of the energy of the sample and concentrates in the lower frequency section. One of the most important spectral characteristics of the discrete cosine transformation (DCT) is the strong energy concentration in the low-frequency coefficients, which makes it suitable to analyze the sparsity of speech signals. The $n$-th DCT coefficient $a(n)$ of a speech frame $x$ with $N$ samples can be calculated by:

$$a(n) = h(n) \sum_{m=1}^{N} x(m) \cos \frac{\pi (2m - 1)(n - 1)}{2N}, \quad n = 1, 2, \ldots, N \qquad (5)$$

$$h(n) = \begin{cases} \sqrt{1/N}, & n = 1 \\ \sqrt{2/N}, & 2 \le n \le N \end{cases}$$

where $x(m)$ denotes the $m$-th sample of the speech frame. Examples of a clean voiced frame and an unvoiced frame, as well as their DCT coefficients, are plotted in Fig. 1. Obviously, only a few DCT coefficients have large amplitude while the rest are close to zero, and the sparsity is more obvious in the voiced frame. Therefore the DCT coefficients of voiced speech signals can be considered approximately $k$-sparse.

[Fig. 1 The sparsity of voiced and unvoiced frames: (a) DCT coefficients of an unvoiced frame; (b) DCT coefficients of a voiced frame]

According to CS theory, voiced signals contaminated by noise can be reconstructed with high quality. Fig. 2 plots the denoising performance of sample reconstruction. A random Gaussian matrix is used as the measurement matrix, and the compressive sampling matching pursuit (CoSaMP) [14], orthogonal matching pursuit (OMP) [16], basis pursuit (BP) [17], and polytope faces pursuit (PFP) [21] algorithms are adopted in the reconstruction of noisy samples. The noisy voiced frame is produced by adding 20 dB Gaussian white noise to the clean frame. It is clear that the reconstructed samples approximate the quality of the clean waveform: the amplitude and the period of the clean frame are preserved in the reconstructed frames, and the reconstruction waveforms of BP, OMP, and PFP almost coincide. The high quality of the reconstructed voiced speech ensures the precision of the feature extraction in SER. Both Fig. 1 and Fig. 2 demonstrate that sample reconstruction is feasible in noisy SER.
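The pipeline of this section (a frame that is sparse in the DCT basis, measured by a random Gaussian matrix and recovered by a greedy pursuit) can be sketched as follows. This is an illustrative sketch, not the paper's code: the frame is synthetic and exactly $k$-sparse, plain OMP stands in for the four pursuit algorithms compared in Fig. 2, and the frame length, sparsity, and measurement count are arbitrary choices.

```python
import numpy as np

def dct_basis(N):
    """Orthonormal DCT basis matrix Psi of Eq. (5): column n is the n-th basis vector."""
    n = np.arange(1, N + 1)
    m = np.arange(1, N + 1)
    h = np.where(n == 1, np.sqrt(1.0 / N), np.sqrt(2.0 / N))
    return h * np.cos(np.pi * np.outer(2 * m - 1, n - 1) / (2 * N))

def omp(Theta, y, k):
    """Orthogonal matching pursuit: greedy k-sparse solution of y = Theta a (Eq. (3))."""
    support, residual = [], y.copy()
    a_hat = np.zeros(Theta.shape[1])
    for _ in range(k):
        # pick the column most correlated with the current residual
        support.append(int(np.argmax(np.abs(Theta.T @ residual))))
        # least-squares fit of y on the selected columns
        coef, *_ = np.linalg.lstsq(Theta[:, support], y, rcond=None)
        residual = y - Theta[:, support] @ coef
    a_hat[support] = coef
    return a_hat

rng = np.random.default_rng(0)
N, M, k = 64, 32, 3
Psi = dct_basis(N)
a = np.zeros(N)
a[[2, 5, 9]] = [2.0, -1.5, 1.0]                 # k-sparse DCT coefficients of a "voiced" frame
x = Psi @ a                                      # time-domain frame, Eq. (1)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # random Gaussian measurement matrix
y = Phi @ x                                      # compressed measurements, Eq. (2)
a_hat = omp(Phi @ Psi, y, k)                     # greedy solution of Eq. (3)
x_hat = Psi @ a_hat                              # reconstructed frame
print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```

In the paper the measured frames are noisy and the non-sparsity of noise in any basis is what gets suppressed; here the frame is noiseless, so recovery is exact up to numerical error whenever OMP selects the right support.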
[Fig. 2 Reconstructed samples of a noisy voiced frame, comparing the clean frame with the BP, PFP, OMP, and CoSaMP reconstructions]

3 MK SVM classifier

3.1 SK SVM

The SVM, based on the theory of structural risk minimization, is a classifier proposed for the binary classification problem. Given $l$ training patterns $(x_i, y_i)$, $x_i$ is the input vector of the $i$-th pattern and $y_i$ is the class label of $x_i$. Then, in the feature space induced by the mapping function $\phi$, we can find a hyperplane with the maximum margin to classify the two classes with the discriminant function:

$$f(x) = w \cdot \phi(x) + b \qquad (6)$$

where $w$ and $b$ are the weight vector and the offset, which can be computed by solving a quadratic optimization problem:

$$\min_{w,b} \ \frac{1}{2}\|w\|^2 \quad \text{s.t.} \ y_i \left( w \cdot \phi(x_i) + b \right) \ge 1, \ i = 1, 2, \ldots, l \qquad (7)$$

To make the method more flexible and robust, a hyperplane can be constructed by relaxing the constraints in Eq. (7), which leads to the following soft-margin formulation with the introduction of slack variables $\xi_i$ to account for misclassifications. The objective function and constraints can be formulated as:

$$\min_{w,b,\xi} \ \frac{1}{2}\|w\|^2 + C \sum_{i=1}^{l} \xi_i \quad \text{s.t.} \ y_i \left( w \cdot \phi(x_i) + b \right) \ge 1 - \xi_i, \ \xi_i \ge 0, \ i = 1, 2, \ldots, l \qquad (8)$$

where $l$ is the number of training patterns, $C$ is a parameter which gives a tradeoff between the maximum margin and the classification error, and $\phi$ is a mapping from the input space to the feature space. Eq. (8) can be solved by introducing Lagrange multipliers:

$$L(w, b, \xi, \alpha, \beta) = \frac{1}{2}\|w\|^2 + C \sum_{i=1}^{l} \xi_i - \sum_{i=1}^{l} \alpha_i \left[ y_i \left( w \cdot \phi(x_i) + b \right) - 1 + \xi_i \right] - \sum_{i=1}^{l} \beta_i \xi_i \qquad (9)$$

where $\alpha_i \ge 0$ and $\beta_i \ge 0$, $i = 1, 2, \ldots, l$, are the Lagrange multipliers. By setting the partial derivatives of $L$ to zero and substituting the results into Eq. (9), $w$ and $\xi$ can be eliminated and Eq. (9) can be transformed into the following Wolfe dual form:

$$\max_{\alpha} \ \sum_{i=1}^{l} \alpha_i - \frac{1}{2} \sum_{i=1}^{l} \sum_{j=1}^{l} y_i y_j \alpha_i \alpha_j k(x_i, x_j) \quad \text{s.t.} \ C \ge \alpha_i \ge 0, \ \sum_{i=1}^{l} y_i \alpha_i = 0, \ i = 1, 2, \ldots, l \qquad (10)$$

where $k(x_i, x_j) = \phi(x_i) \cdot \phi(x_j)$ is a kernel function. Eq. (10) can be rewritten in matrix form as:

$$\max_{\alpha} \ 2\alpha^{\mathrm{T}} e - \alpha^{\mathrm{T}} G(K) \alpha \quad \text{s.t.} \ \alpha \ge 0, \ Ce - \alpha \ge 0, \ \alpha^{\mathrm{T}} y = 0 \qquad (11)$$

where $e = [1, 1, \ldots, 1]^{\mathrm{T}}$, $G(K) = \mathrm{diag}(y)\, K\, \mathrm{diag}(y)$, $\mathrm{diag}(y)$ is the diagonal matrix with diagonal $y$, and $K$ is the kernel matrix with $K_{ij} = k(x_i, x_j)$, $i = 1, 2, \ldots, l$, $j = 1, 2, \ldots, l$.

3.2 MK SVM

The performance of an SK method depends heavily on the choice of the kernel. Kernel fusion has been proposed to deal with this problem by learning a kernel machine with MKs [10,22]. One of the effective kernel fusion strategies is a weighted combination of multiple kernels. The combined kernel function is $k(x_i, x_j) = \sum_{m=1}^{M} \mu_m k_m(x_i, x_j)$, where $\phi(x) = [\phi_1(x), \phi_2(x), \ldots, \phi_M(x)]$ and $M$ is the number of kernel functions to be combined. The corresponding kernel matrix can be written as:

$$K = \sum_{m=1}^{M} \mu_m K_m \qquad (12)$$

where $K_m$ is the kernel matrix constructed from $\phi_m$ and $\mu_m (\ge 0)$ is the relating weight. Lanckriet et al. proposed an MKL method with a transduction setting to obtain $\mu_m$. According to Eq. (11), training the SVM for a given kernel involves yielding the optimal value $\omega(K) = \max_{\alpha} 2\alpha^{\mathrm{T}} e - \alpha^{\mathrm{T}} G(K) \alpha$, which is obviously a function of the particular choice of the kernel matrix. So finding the kernel matrix can be considered as an optimization problem, that is, to find $K$ in some convex subset of positive semi-definite matrices while keeping the trace of $K$ constant:

$$\min_{K} \ \omega(K) \quad \text{s.t.} \ \mathrm{tr}\, K = c \qquad (13)$$

The kernel matrix $K$ in Eq. (13) can be found by solving the following convex optimization problem:

$$\begin{aligned} \min_{K, t, \lambda, \nu} \ & t \\ \text{s.t.} \ & K \succeq 0, \quad \mathrm{tr}\, K = c, \\ & \begin{bmatrix} G(K) & e + \nu - \lambda y \\ (e + \nu - \lambda y)^{\mathrm{T}} & t - 2C\nu^{\mathrm{T}} e \end{bmatrix} \succeq 0, \quad \nu \ge 0 \end{aligned} \qquad (14)$$

In Eq. (14), $t \in \mathbb{R}$, $\lambda \in \mathbb{R}$, $\nu \in \mathbb{R}^{l}$, and $K \succeq 0$ means that $K$ is a positive semi-definite matrix, so the above optimization problem is an SDP. Notice that $\nu \ge 0$ means $\mathrm{diag}(\nu) \succeq 0$ and is thus a linear matrix inequality (LMI); similarly for $K \succeq 0$. The detailed proof of the above equations can be found in Ref. [10]. In MKL, the combined kernel matrix $K = \sum_{m=1}^{M} \mu_m K_m$ is a linear combination of fixed kernel matrices computed over the total set of $l$ training and testing patterns. By adding this additional constraint, Eq. (14) can be represented as Eq. (15), from which $\mu_m$ can be solved:

$$\begin{aligned} \min_{\mu, t, \lambda, \nu} \ & t \\ \text{s.t.} \ & \sum_{m=1}^{M} \mu_m K_m \succeq 0, \quad \mathrm{tr}\Big(\sum_{m=1}^{M} \mu_m K_m\Big) = c, \\ & \begin{bmatrix} G\big(\sum_{m=1}^{M} \mu_m K_m\big) & e + \nu - \lambda y \\ (e + \nu - \lambda y)^{\mathrm{T}} & t - 2C\nu^{\mathrm{T}} e \end{bmatrix} \succeq 0, \quad \nu \ge 0 \end{aligned} \qquad (15)$$

A binary tree structure, illustrated in Fig. 3, is adopted in the structure design of the multi-class classifier to recognize five emotions: happy, angry, fear, neutral, and sad. This structure is different from the traditional one-to-one, one-to-rest, or hierarchical SVM structures [23-25]. In the binary tree structure, the first classifying node (Model 1) is improved by MK SVM to recognize the most confusable emotion, while the deeper classifying nodes (Model 2 to Model 4) still retain SK SVM. Taking the Berlin Database of Emotional Speech [26] for example, happy is the main factor influencing the overall performance of the classifier because of its lowest recognition accuracy [24,27-28]. Thus Model 1 can be used to recognize happy from the other emotions when the Berlin Database of Emotional Speech is studied. This arrangement can reduce the error accumulation caused by the most confusable emotion and avoid the computing complexity of solving the SDP in every model.

[Fig. 3 Structure of the multi-class MK classifier with binary tree]
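The weighted combination of Eq. (12) is easy to check numerically: any combination of valid kernel matrices with nonnegative weights $\mu_m$ stays symmetric positive semi-definite, which is what lets $K$ enter the SDP of Eqs. (14)-(15). A minimal sketch follows; the two base kernels, the toy data, and the hand-picked weights are illustrative assumptions (in the paper the weights are solved for via SDP, which would require a convex solver and is not shown):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))       # 20 toy patterns with 5 features

def linear_kernel(X):
    return X @ X.T

def rbf_kernel(X, gamma=0.5):
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)   # pairwise squared distances
    return np.exp(-gamma * d2)

# Eq. (12): K = sum_m mu_m K_m with nonnegative weights
mu = np.array([0.3, 0.7])
K = mu[0] * linear_kernel(X) + mu[1] * rbf_kernel(X)

# Enforce the trace constraint tr K = c used in Eqs. (13)-(15)
c = 20.0
K = c * K / np.trace(K)

print("min eigenvalue:", np.linalg.eigvalsh(K).min())  # nonnegative up to round-off
```

Because both base Gram matrices are positive semi-definite and the weights are nonnegative, the smallest eigenvalue of the combination never drops below zero except for floating-point round-off, so the LMI constraints of the SDP remain well-posed.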
4 Acoustic features and feature selection

Feature selection is necessary in building an optimal feature subset with emotional recognizability. Double input symmetrical relevance (DISR) is an information-theoretic selection criterion that depends on the utilization of symmetrical relevance to consider the complementarity between two input features. The main advantage of the DISR criterion is that the selected complementary variable has a much higher probability of relevance on all of the double inputs in the subset [29].

Speech features usually used in SER are the prosodic features, voice quality features, and spectral features. Pitch, energy, duration, formants, mel-frequency cepstrum coefficients (MFCC), and their statistical parameters are extracted in this paper. The total dimension of the feature vector is 45. Table 1 lists the acoustic features adopted in the following experiments.

Table 1 Acoustic features

Type          | Feature  | Statistical parameters
Prosodic      | Pitch    | Maximum, minimum, range, mean, standard deviation, first quartile, median, third quartile, inter-quartile range
Prosodic      | Energy   | Maximum, minimum, range, mean, standard deviation, first quartile, median, third quartile, inter-quartile range
Prosodic      | Duration | Total frames, voiced frames, unvoiced frames, ratio of voiced frames versus unvoiced frames, ratio of voiced frames versus total frames, ratio of unvoiced frames versus total frames
Voice quality | Formants | The first formant F1: mean, standard deviation, median; the second formant F2: mean, standard deviation, median; the third formant F3: mean, standard deviation, median
Spectral      | MFCC     | 12 means of MFCC

5 Experimental results and analysis

In this paper, the Berlin Database of Emotional Speech is selected to test the proposed method from the aspect of s…
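The DISR criterion above can be sketched with discrete toy features. The entropy and mutual-information helpers, the greedy loop, and the toy data are all illustrative assumptions (real SER features are continuous and would need discretization first); only the criterion itself, the symmetrical relevance of a double input $(x_i, x_j)$ with the label $y$ taken as $I((x_i, x_j); y) / H(x_i, x_j, y)$, follows the cited reference [29].

```python
import math
from collections import Counter

def entropy(*cols):
    """Joint Shannon entropy (in bits) of one or more discrete sequences."""
    joint = list(zip(*cols))
    n = len(joint)
    return -sum(c / n * math.log2(c / n) for c in Counter(joint).values())

def mutual_info(x, y):
    return entropy(x) + entropy(y) - entropy(x, y)

def symmetrical_relevance(xi, xj, y):
    """SR of the double input (xi, xj) with the target y: I((xi,xj); y) / H(xi, xj, y)."""
    pair = list(zip(xi, xj))
    return mutual_info(pair, y) / entropy(xi, xj, y)

def disr_select(features, y, k):
    """Greedy DISR: seed with the max-MI feature, then repeatedly add the feature
    maximizing the summed symmetrical relevance with every already-selected feature."""
    remaining = set(range(len(features)))
    selected = [max(remaining, key=lambda f: mutual_info(features[f], y))]
    remaining.discard(selected[0])
    while len(selected) < k and remaining:
        best = max(remaining, key=lambda f: sum(
            symmetrical_relevance(features[f], features[s], y) for s in selected))
        selected.append(best)
        remaining.discard(best)
    return selected

# Toy data: feature 0 copies the label; features 1 and 2 are uninformative on their own
y  = [0, 0, 1, 1] * 4
f0 = list(y)
f1 = [0, 1, 0, 1] * 4
f2 = [0, 1, 1, 0] * 4
selected = disr_select([f0, f1, f2], y, k=2)
print(selected)
```

The label-copying feature carries one full bit of mutual information with the label and is therefore seeded first; the second pick is then decided by the summed symmetrical relevance of each candidate paired with it.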
