龙星计划课程:信息检索 (Dragon Star Program Course: Information Retrieval)
Course Overview & Background
Dragon Star Lecture at Beijing University, June 21-30, 2008 (© 2008 ChengXiang Zhai)

Slide 1: 龙星计划课程:信息检索 Course Overview & Background
  • ChengXiang Zhai (翟成祥)
  • Department of Computer Science
  • Graduate School of Library & Information Science
  • Institute for Genomic Biology, Statistics
  • University of Illinois, Urbana-Champaign
  • http://www-faculty.cs.uiuc.edu/czhai, czhai@cs.uiuc.edu

Slide 2: Outline
  • Course overview
  • Essential background
    - Probability & statistics
    - Basic concepts in information theory
    - Natural language processing

Slide 3: Course Overview

Slide 4: Course Objectives
  • Introduce the field of information retrieval (IR)
    - Foundation: basic concepts, principles, methods, etc.
    - Trends: frontier topics
  • Prepare students to do research in IR and/or related fields
    - Research methodology (general and IR-specific)
    - Research proposal writing
    - Research project (to be finished after the lecture period)

Slide 5: Prerequisites
  • Proficiency in programming (C++ is needed for the assignments)
  • Knowledge of basic probability & statistics (necessary for understanding the algorithms deeply)
  • Big plus: knowledge of related areas
    - Machine learning
    - Natural language processing
    - Data mining

Slide 6: Course Management
  • Teaching staff
    - Instructor: ChengXiang Zhai (UIUC)
    - Teaching assistants: Hongfei Yan (Peking Univ), Bo Peng (Peking Univ)
  • Course website and group discussion forum (addresses truncated in the source)
  • First post questions on the group discussion forum; if questions are unanswered, bring them to the office hours (first office hour: June 23, 2:30-4:30pm)

Slide 7: Format & Requirements
  • Lecture-based
    - Morning lectures: foundation & trends
    - Afternoon lectures: IR research methodology
    - Readings are usually available online
  • 2 assignments (based on morning lectures)
    - Coding (C++), experimenting with data, analyzing results, open explorations (5 hours each)
  • Final exam (based on morning lectures): 1:30-4:30pm, June 30; practice questions will be available

Slide 8: Format & Requirements (cont.)
  • Course project (Mini-TREC)
    - Work in teams
    - Phase I: create test collections (3 hours, done within the lecture period)
    - Phase II: develop algorithms and submit results (done in the summer)
  • Research project proposal (based on afternoon lectures)
    - Work in teams
    - 2-page outline done within the lecture period
    - Full proposal (5 pages) due later

Slide 9: Coverage of Topics: IR vs. TIM
  • (Diagram) Text Information Management (TIM), Information Retrieval (IR), multimedia, etc.
  • IR and TIM will be used interchangeably

Slide 10: What is Text Info. Management?
  • TIM is concerned with technologies for managing and exploiting text information effectively and efficiently
  • Importance of managing text information
    - The most natural way of encoding knowledge: think about scientific literature
    - The most common type of information: how much textual information do you produce and consume every day?
    - The most basic form of information: it can be used to describe other media of information
    - The most useful form of information!
Slide 11: Text Management Applications
  • (Diagram) Access: select information; Mining: create knowledge; Organization: add structure/annotations

Slide 12: Examples of Text Management Applications
  • Search
    - Web search engines (Google, Yahoo, ...)
    - Library systems
  • Recommendation
    - News filter
    - Literature/movie recommender
  • Categorization
    - Automatically sorting emails
  • Mining/Extraction
    - Discovering major complaints from email in customer service
    - Business intelligence
    - Bioinformatics
  • Many others

Slide 13: Elements of Text Info Management Technologies
  • (Diagram) Natural language content analysis of text feeds search, filtering, categorization, summarization, clustering, extraction, mining, and visualization, supporting retrieval applications (information access), mining applications (knowledge acquisition), and information organization

Slide 14: Text Management and Other Areas
  • (Diagram) TM algorithms and TM applications connect the user with text, drawing on storage/compression, probabilistic inference and machine learning, natural language processing, human-computer interaction, software engineering, and the Web, that is, on computer science and information science

Slide 15: Related Areas
  • (Diagram) Information retrieval and its neighboring areas: databases; library & information science; machine learning, pattern recognition, data mining; natural language processing; statistics and optimization; software engineering and computer systems; applications (Web, bioinformatics)
  • Dimensions: models, algorithms, applications, systems

Slide 16: Publications/Societies (incomplete)
  • Information retrieval: ACM SIGIR, ACM CIKM, TREC
  • Databases: ACM SIGMOD, VLDB, PODS, ICDE
  • Information science: ASIS, JCDL
  • Learning/mining: ICML, NIPS, UAI, AAAI, ACM SIGKDD
  • NLP: ACL, COLING, EMNLP, ANLP, HLT
  • Applications: WWW, RECOMB, PSB, ISMB
  • Software/systems: SOSP, OSDI
  • Statistics

Slides 17-18: Schedule (available at the course website; URL and schedule table not preserved in this copy)

Slide 19: Essential Background 1: Probability & Statistics

Slide 20: Prob/Statistics & Text Management
  • Probability & statistics provide a principled way to quantify the uncertainties associated with natural language
  • They allow us to answer questions like:
    - Given that we observe "baseball" three times and "game" once in a news article, how likely is it about "sports"? (text categorization, information retrieval)
    - Given that a user is interested in sports news, how likely would the user use "baseball" in a query? (information retrieval)

Slide 21: Basic Concepts in Probability
  • Random experiment: an experiment with an uncertain outcome (e.g., tossing a coin, picking a word from text)
  • Sample space: all possible outcomes; e.g., tossing 2 fair coins, S = {HH, HT, TH, TT}
  • Event: E ⊆ S; E happens iff the outcome is in E, e.g., E = {HH} (all heads), E = {HH, TT} (same face); impossible event (∅), certain event (S)
  • Probability of an event: 0 ≤ P(E) ≤ 1, with P(S) = 1 (the outcome is always in S); P(A ∪ B) = P(A) + P(B) if A ∩ B = ∅ (e.g., A = same face, B = different face)
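Worked example (not part of the original slides): a minimal Python sketch that enumerates the two-coin sample space from slide 21 and checks the probability axioms for the "same face" / "different face" events; all names are illustrative.

```python
from itertools import product

# Sample space for tossing 2 fair coins: S = {HH, HT, TH, TT}
S = [''.join(outcome) for outcome in product('HT', repeat=2)]

def prob(event):
    """P(E) under a uniform distribution over the sample space."""
    return len(event) / len(S)

all_heads = {s for s in S if s == 'HH'}        # E = {HH}
same_face = {s for s in S if s[0] == s[1]}     # E = {HH, TT}
diff_face = {s for s in S if s[0] != s[1]}     # E = {HT, TH}

print(prob(all_heads))   # 0.25
print(prob(same_face))   # 0.5
print(prob(set(S)))      # 1.0 -> certain event, P(S) = 1
print(prob(set()))       # 0.0 -> impossible event

# A = "same face" and B = "different face" are disjoint,
# so additivity holds: P(A ∪ B) = P(A) + P(B).
assert prob(same_face | diff_face) == prob(same_face) + prob(diff_face)
```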
Slide 22: Basic Concepts of Probability (cont.)
  • Conditional probability: P(B|A) = P(A ∩ B) / P(A)
    - P(A ∩ B) = P(A)P(B|A) = P(B)P(A|B)
    - So, P(A|B) = P(B|A)P(A) / P(B) (Bayes' rule)
    - For independent events, P(A ∩ B) = P(A)P(B), so P(A|B) = P(A)
  • Total probability: if A1, ..., An form a partition of S, then
    - P(B) = P(B ∩ S) = P(B ∩ A1) + ... + P(B ∩ An) (why?)
    - So, P(Ai|B) = P(B|Ai)P(Ai) / P(B) = P(B|Ai)P(Ai) / [P(B|A1)P(A1) + ... + P(B|An)P(An)]
    - This allows us to compute P(Ai|B) based on P(B|Ai)

Slide 23: Interpretation of Bayes' Rule
  • Hypothesis space: H = {H1, ..., Hn}; evidence: E
  • P(Hi|E) = P(E|Hi)P(Hi) / P(E): the posterior probability of Hi, computed from the prior probability of Hi and the likelihood of the data/evidence if Hi is true
  • If we want to pick the most likely hypothesis H*, we can drop P(E)

Slide 24: Random Variable
  • X: S → ℝ (a "measure" of the outcome), e.g., the number of heads, or "all same face?"
  • Events can be defined according to X: E(X = a) = {si | X(si) = a}; E(X ≥ a) = {si | X(si) ≥ a}
  • So, probabilities can be defined on X: P(X = a) = P(E(X = a)); P(X ≥ a) = P(E(X ≥ a))
  • Discrete vs. continuous random variables (think of "partitioning the sample space")

Slide 25: An Example: Doc Classification
  • Sample space S = {x1, ..., xn}; for 3 topics and four words, n = ?

        Topic      the  computer  game  baseball
    X1  sport       1      0       1       1
    X2  sport       1      1       1       1
    X3  computer    1      1       0       0
    X4  computer    1      1       1       0
    X5  other       0      0       1       1

  • Events: E_sport = {xi | topic(xi) = "sport"}; E_baseball = {xi | baseball(xi) = 1}; E_baseball,¬computer = {xi | baseball(xi) = 1 & computer(xi) = 0}
  • Conditional probabilities: P(E_sport | E_baseball), P(E_baseball | E_sport), P(E_sport | E_baseball,¬computer), ...
  • Thinking in terms of random variables: Topic T ∈ {"sport", "computer", "other"}, "baseball" B ∈ {0, 1}; P(T = "sport" | B = 1), P(B = 1 | T = "sport"), ...
  • An inference problem: suppose we observe that "baseball" is mentioned; how likely is the topic "sport"? P(T = "sport" | B = 1) ∝ P(B = 1 | T = "sport") P(T = "sport") (a worked version appears after slide 30 below)
  • But P(B = 1 | T = "sport") = ? And P(T = "sport") = ?

Slide 26: Getting to Statistics ...
  • P(B = 1 | T = "sport") = ? (parameter estimation)
    - If we see the results of a huge number of random experiments, the observed relative frequency gives a good estimate
    - But what if we only see a small sample (e.g., 2)? Is this estimate still reliable?
  • In general, statistics has to do with drawing conclusions about the whole population based on observations of a sample (data)

Slide 27: Parameter Estimation
  • General setting:
    - Given a (hypothesized & probabilistic) model that governs the random experiment
    - The model gives a probability of any data p(D|θ) that depends on the parameter θ
    - Now, given actual sample data X = {x1, ..., xn}, what can we say about the value of θ?
  • Intuitively, take your best guess of θ; "best" means "best explaining/fitting the data"
  • Generally an optimization problem

Slide 28: Maximum Likelihood vs. Bayesian
  • Maximum likelihood estimation: "best" means "the data likelihood reaches its maximum"; problem: small samples
  • Bayesian estimation: "best" means being consistent with our "prior" knowledge while explaining the data well; problem: how to define the prior?

Slide 29: Illustration of Bayesian Estimation
  • Posterior: p(θ|X) ∝ p(X|θ) p(θ)

Slide 30: Maximum Likelihood Estimate
  • Data: a document d with counts c(w1), ..., c(wN) and length |d|
  • Model: multinomial distribution M with parameters {p(wi)}
  • Likelihood: p(d|M)
  • Maximum likelihood estimator: M̂ = argmax_M p(d|M)
  • We'll tune p(wi) to maximize l(d|M): use the Lagrange multiplier approach, set the partial derivatives to zero, and obtain the ML estimate p(wi) = c(wi) / |d|
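Worked example (not part of the original slides): the inference problem on slide 25 and the relative-frequency (maximum likelihood) estimates of slides 26 and 30 can be computed directly from the five-document table; the Python below is an illustrative sketch with the document list hard-coded from that table.

```python
# Toy collection from the doc-classification slide (slide 25):
# each document has a topic and 0/1 indicators for four words.
docs = [
    ("sport",    {"the": 1, "computer": 0, "game": 1, "baseball": 1}),  # X1
    ("sport",    {"the": 1, "computer": 1, "game": 1, "baseball": 1}),  # X2
    ("computer", {"the": 1, "computer": 1, "game": 0, "baseball": 0}),  # X3
    ("computer", {"the": 1, "computer": 1, "game": 1, "baseball": 0}),  # X4
    ("other",    {"the": 0, "computer": 0, "game": 1, "baseball": 1}),  # X5
]
n = len(docs)

# Relative-frequency (ML) estimates, i.e. counts divided by totals.
n_sport = sum(1 for t, _ in docs if t == "sport")
p_sport = n_sport / n                                          # P(T="sport")
p_b1_given_sport = sum(1 for t, w in docs
                       if t == "sport" and w["baseball"] == 1) / n_sport  # P(B=1|T="sport")
p_b1 = sum(1 for _, w in docs if w["baseball"] == 1) / n       # P(B=1)

# Bayes' rule: P(T="sport" | B=1) = P(B=1 | T="sport") P(T="sport") / P(B=1)
p_sport_given_b1 = p_b1_given_sport * p_sport / p_b1
print(p_sport_given_b1)   # 2/3: two of the three documents mentioning "baseball" are about sport

# Sanity check by counting directly over the event E_sport ∩ E_baseball.
direct = (sum(1 for t, w in docs if t == "sport" and w["baseball"] == 1)
          / sum(1 for _, w in docs if w["baseball"] == 1))
assert abs(p_sport_given_b1 - direct) < 1e-12
```

The same count-over-total idea is exactly what the closed-form multinomial ML estimate p(wi) = c(wi)/|d| on slide 30 expresses.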
Slide 31: What You Should Know
  • Probability concepts: sample space, event, random variable, conditional probability, multinomial distribution, etc.
  • Bayes' formula and its interpretation
  • Statistics: know how to compute the maximum likelihood estimate

Slide 32: Essential Background 2: Basic Concepts in Information Theory

Slide 33: Information Theory
  • Developed by Shannon in the 1940s
  • Maximizing the amount of information that can be transmitted over an imperfect communication channel
  • Data compression (entropy)
  • Transmission rate (channel capacity)

Slide 34: Basic Concepts in Information Theory
  • Entropy: measuring the uncertainty of a random variable
  • Kullback-Leibler divergence: comparing two distributions
  • Mutual information: measuring the correlation of two random variables

Slide 35: Entropy: Motivation
  • Feature selection: if we use only a few words to classify docs, what kind of words should we use? P(Topic | "computer" = 1) vs. P(Topic | "the" = 1): which is more random?
  • Text compression: some documents (less random) can be compressed more than others (more random); can we quantify the "compressibility"?
  • In general, given a random variable X following distribution p(X):
    - How do we measure the "randomness" of X?
    - How do we design optimal coding for X?

Slide 36: Entropy: Definition
  • Entropy H(X) = -Σ_x p(x) log2 p(x) measures the uncertainty/randomness of random variable X
  • Example: (plot) H(X) of a coin flip as a function of P(Head), peaking at 1.0 bit for a fair coin

Slide 37: Entropy: Properties
  • Minimum value of H(X): 0; what kind of X has the minimum entropy?
  • Maximum value of H(X): log M, where M is the number of possible values for X; what kind of X has the maximum entropy?
  • Related to coding

Slide 38: Interpretations of H(X)
  • Measures the "amount of information" in X
    - Think of each value of X as a "message"
    - Think of X as a random experiment (20 questions)
  • Minimum average number of bits to compress values of X
    - The more random X is, the harder it is to compress
  • A fair coin has the maximum information and is hardest to compress; a biased coin has some information and can be compressed to less than 1 bit on average; a completely biased coin has no information and needs only 0 bits

Slide 39: Conditional Entropy
  • The conditional entropy of a random variable Y given another X, H(Y|X) = -Σ_{x,y} p(x,y) log2 p(y|x), expresses how much extra information one still needs to supply on average to communicate Y, given that the other party knows X
  • H(Topic | "computer") vs. H(Topic | "the")?

Slide 40: Cross Entropy H(p, q)
  • What if we encode X with a code optimized for a wrong distribution q? Expected # of bits = H(p, q) = -Σ_x p(x) log2 q(x)
  • Intuitively H(p, q) ≥ H(p), and this can be shown mathematically

Slide 41: Kullback-Leibler Divergence D(p||q)
  • What if we encode X with a code optimized for a wrong distribution q? How many bits would we waste? D(p||q) = Σ_x p(x) log2 [p(x)/q(x)] = H(p, q) - H(p)
  • Properties: D(p||q) ≥ 0; D(p||q) ≠ D(q||p); D(p||q) = 0 iff p = q
  • KL-divergence ("relative entropy") is often used to measure the distance between two distributions
  • Interpretation: fix p; then D(p||q) and H(p, q) vary in the same way. If p is an empirical distribution, minimizing D(p||q) or H(p, q) is equivalent to maximizing likelihood
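Worked example (not part of the original slides): a small Python sketch of the three quantities just defined, using base-2 logarithms so the results are in bits; the two distributions p and q are made up for illustration, and the final check verifies the identity H(p, q) = H(p) + D(p||q) linking slides 40 and 41.

```python
import math

def entropy(p):
    """H(p) = -sum_x p(x) log2 p(x); terms with p(x) = 0 contribute 0."""
    return -sum(px * math.log2(px) for px in p if px > 0)

def cross_entropy(p, q):
    """H(p, q) = -sum_x p(x) log2 q(x): expected code length when samples
    from p are encoded with a code optimized for q."""
    return -sum(px * math.log2(qx) for px, qx in zip(p, q) if px > 0)

def kl_divergence(p, q):
    """D(p||q) = sum_x p(x) log2 (p(x)/q(x)): bits wasted on average."""
    return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

p = [0.5, 0.25, 0.25]   # "true" distribution
q = [0.8, 0.10, 0.10]   # wrong distribution used to build the code

print(entropy(p))            # 1.5 bits
print(cross_entropy(p, q))   # ~1.82 bits
print(kl_divergence(p, q))   # ~0.32 bits, always >= 0, and 0 iff p == q
print(kl_divergence(q, p))   # ~0.28 bits: D(p||q) != D(q||p) in general

# Cross entropy decomposes into entropy plus the KL "waste": H(p,q) = H(p) + D(p||q)
assert abs(cross_entropy(p, q) - (entropy(p) + kl_divergence(p, q))) < 1e-12
```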
Slide 42: Cross Entropy, KL-Divergence, and Likelihood
  • Likelihood and log-likelihood of the data (equations not preserved in this copy)
  • Criterion for selecting a good model: perplexity(p)

Slide 43: Mutual Information I(X;Y)
  • Comparing two distributions: p(x,y) vs. p(x)p(y); I(X;Y) = Σ_{x,y} p(x,y) log2 [p(x,y) / (p(x)p(y))]
  • Properties: I(X;Y) ≥ 0; I(X;Y) = I(Y;X); I(X;Y) = 0 iff X and Y are independent
  • Interpretations:
    - Measures how much the uncertainty of X is reduced given information about Y
    - Measures the correlation between X and Y
    - Related to the "channel capacity" in information theory
  • Examples: I(Topic; "computer") vs. I(Topic; "the")? I("computer"; "program") vs. I("computer"; "baseball")? (a numerical sketch follows after slide 44)

Slide 44: What You Should Know
  • Information theory concepts: entropy, cross entropy, KL-divergence, mutual information (the source text is cut off at this point)
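Worked example (not part of the original slides): a Python sketch of mutual information for a topic variable and a single word indicator, in the spirit of the I(Topic; "computer") vs. I(Topic; "the") question on slide 43; the two joint distributions are invented for illustration.

```python
import math
from collections import defaultdict

def mutual_information(joint):
    """I(X;Y) = sum_{x,y} p(x,y) log2 [p(x,y) / (p(x) p(y))].

    `joint` maps (x, y) pairs to probabilities that sum to 1; this is the
    KL-divergence between p(x,y) and the product of its marginals.
    """
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Invented joint distribution of Topic and the indicator of the word "computer":
# the word is strongly associated with the "computer" topic.
p_topic_computer = {
    ("sport", 0): 0.35, ("sport", 1): 0.05,
    ("computer", 0): 0.05, ("computer", 1): 0.35,
    ("other", 0): 0.10, ("other", 1): 0.10,
}

# Invented joint distribution of Topic and the word "the": Topic and the word
# are independent here, so the mutual information should be (numerically) 0.
p_topic_the = {
    ("sport", 0): 0.04, ("sport", 1): 0.36,
    ("computer", 0): 0.04, ("computer", 1): 0.36,
    ("other", 0): 0.02, ("other", 1): 0.18,
}

print(mutual_information(p_topic_computer))  # ~0.37 bits: an informative feature
print(mutual_information(p_topic_the))       # ~0 bits: "the" says nothing about the topic
```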
