An Exploration of Internet Video Quality Metrics (graduation thesis foreign-literature translation)

A Quest for an Internet Video Quality-of-Experience Metric

Athula Balachandran, Vyas Sekar, Aditya Akella, Srinivasan Seshan, Ion Stoica, Hui Zhang
Carnegie Mellon University, University of Wisconsin Madison, Stony Brook University, University of California Berkeley

ABSTRACT

An imminent challenge that content providers, CDNs, third-party analytics and optimization services, and video player designers in the Internet video ecosystem face is the lack of a single "gold standard" to evaluate different competing solutions. Existing techniques that describe the quality of the encoded signal, or controlled studies that measure opinion scores, do not translate directly into user experience at scale. Recent work shows that measurable performance metrics such as buffering, startup time, bitrate, and number of bitrate switches impact user experience. However, converting these observations into a quantitative quality-of-experience metric turns out to be challenging, since these metrics are interrelated in complex and sometimes counter-intuitive ways, and their relationship to user experience can be unpredictable. To further complicate things, many confounding factors are introduced by the nature of the content itself (e.g., user interest, genre). We believe that the issue of interdependency can be addressed by casting this as a machine learning problem: building a suitable predictive model from empirical observations. We also show that setting up the problem based on domain-specific and measurement-driven insights can minimize the impact of the various confounding factors and improve prediction performance.

Categories and Subject Descriptors

C.4 Performance of Systems: measurement techniques, performance attributes

General Terms

Human Factors, Measurement, Performance
1. INTRODUCTION

With the decreasing cost of content delivery and the growing success of subscription- and ad-based business models (e.g., [2]), video traffic over the Internet is predicted to increase in the years to come, possibly even surpassing television-based viewership in the future [3]. An imminent challenge that all players in the Internet video ecosystem (content providers, content delivery networks, analytics services, video player designers, and users) face is the lack of a standardized approach to measure the Quality-of-Experience (QoE) that different solutions provide. With the "coming of age" of this technology and the establishment of industry standards groups (e.g., [13]), such a measure will become a fundamental requirement to promote further innovation by allowing us to objectively compare different competing designs [11, 17].

The notion of QoE applies to many forms of media and has a rich history in the multimedia community (e.g., [9, 10, 14, 15]). However, Internet video introduces new effects in terms of measuring both quality and experience:

Measuring quality: Internet video is delivered using HTTP-based commodity technology over a largely unreliable network via existing CDN infrastructures. Consequently, the traditional encoding-related measures of quality like Peak Signal-to-Noise Ratio are replaced by a suite of quality metrics that capture several effects introduced by the delivery mechanism: buffering, bitrate delivered, frame rendering rate, bitrate switching, and startup delay [6, 33].

Measuring experience: In the context of advertisement- and subscription-supported services, the perceptual opinion of a user in a controlled study does not necessarily translate into objective measures of engagement that impact providers' business objectives.
Typical measures of engagement used today to approximate these business objectives are in-the-wild measurements of user behavior; e.g., the fraction of a particular video played and the number of visits to the provider [6, 33].

To obtain a robust QoE measure, we ideally need a unified and quantitative understanding of how low-level quality metrics impact measures of experience. By unified, we mean that we want to see how the set of quality metrics taken together impacts engagement, as opposed to each metric in isolation. This is especially relevant since there are natural tradeoffs between the metrics; e.g., a lower bitrate can ensure less buffering but reduces the user experience. Similarly, by quantitative, we want to go beyond a simple correlational understanding of "metric M impacts engagement" to a stronger statement of the form "changing metric M from x to x′ changes engagement from y to y′".

Unfortunately, the state of the art in our understanding of video QoE is limited to a simple qualitative understanding of how individual metrics impact engagement [19]. This leads to severe shortcomings for every component of the video ecosystem. For example, adaptive video players today resort to ad hoc tradeoffs between bitrate, startup delay, and buffering [16, 20, 32]. Similarly, frameworks for multi-CDN optimization use primitive QoE metrics that only capture buffering effects, without accounting for the impact of bitrate or bitrate switching [28, 29]. Finally, content providers do not have systematic ways to evaluate the cost-performance tradeoffs that different CDNs or multi-CDN optimizations offer [1].

We observe that there are three key factors that make it challenging to obtain a unified and quantitative understanding of Internet video QoE:

Complex relationships: The relationships between the quality metrics and the effective user experience can be quite complex and even counter-intuitive.
For example, while one would naturally expect a higher video bitrate to lead to better user experience, we observe a non-monotonic relationship between the two.

Metric dependencies: The metrics themselves have subtle interdependencies and implicit tradeoffs. For example, although switching bitrates to adapt to bandwidth conditions can reduce buffering, we observe that high rates of switching can annoy users.

Impact of content: There are many confounding factors introduced by the nature of the content itself. For example, different genres of content such as live and video-on-demand (VOD) show very different viewing patterns. Similarly, users' interest in the content also affects their tolerance non-trivially.

Our goal in this paper is to identify a feasible roadmap toward developing a robust, unified, and quantitative QoE metric that can address these challenges. We have two intuitive reasons to be hopeful. The challenges raised by complex relationships and subtle interdependencies can be addressed by casting QoE inference as a machine learning problem of building an appropriate model that can predict user engagement (e.g., play time) as a function of the various quality metrics. The second issue of content-induced effects can be addressed by using domain-specific and measurement-driven insights to carefully set up the learning tasks.

Our preliminary results give us reason to be optimistic. For example, a decision-tree-based classifier can provide close to 50% accuracy in predicting engagement. Carefully setting up the inputs and features for the learning process could lead to as much as a 25% gain in the accuracy of the prediction model.

Figure 1: Overview of the Internet video ecosystem; a robust QoE metric is critical for every component in this ecosystem.

The rest of this paper is organized as follows. Section 2 describes how a standardized QoE metric would impact the different players in the video ecosystem.
Section 3 describes the main challenges in developing a QoE metric. Section 4 makes the case for a predictive model for developing a QoE metric. In Section 5, we present some preliminary results before discussing various challenges in Section 6. We conclude in Section 7.

2. USE CASES FOR VIDEO QOE

We begin with a brief overview of the Internet video ecosystem today and argue why there is an immediate need for a standardized QoE metric and how it impacts the different players in the video ecosystem (Figure 1).

Content providers like HBO, ABC, and Netflix would like to maximize their revenues from subscription- and ad-based business models while trying to minimize their distribution costs. To this end, content providers have business arrangements with CDNs (e.g., Akamai, Limelight) and also with third-party analytics (e.g., Ooyala [8]) and optimization services (e.g., Conviva [5]). A robust QoE metric enables content providers to objectively evaluate the cost-performance tradeoffs offered by the CDNs and the value that such third-party services offer.

Content distribution networks need to allocate their distribution resources (e.g., server and bandwidth capacity) across user populations. They need standard metrics to demonstrate superior cost-performance tradeoffs. CDNs also need such metrics to guide the design of their delivery infrastructures, minimizing delivery costs while maximizing performance [24].

Recent studies have argued the case for cross-CDN optimization [28, 29], and there are already commercial services (e.g., Conviva [5]) that provide these capabilities. These services need standard measures to demonstrate quantifiable value to the content providers.
An open challenge that such optimization frameworks face is the choice of a suitable quality metric to optimize [29].

Similarly, third-party video analytics services need concrete ways to translate their insights about user demographics and user behaviors into quantitative engagement effects.

Video player designers have to make conscious tradeoffs in their bitrate adaptation algorithms. For example, moving to a higher bitrate may offer better engagement but increases the risk of buffering, which is known to annoy users. Similarly, user studies suggest that users cannot tolerate too-frequent bitrate switches, as they impact their perceptual experience [18]. The lack of a systematic understanding of video QoE forces player designers to use ad hoc adaptation strategies without a clear optimization goal [16, 20, 32].

Ultimately, the success of this ecosystem depends on the users' experience. Increasingly, the same content (TV shows, movies) is available from multiple providers (e.g., Apple iTunes, Amazon, Hulu Plus, Google). Beyond issues of content availability, users would prefer services that give them a better cost-experience tradeoff. Another issue relates to the recent introduction of ISP bandwidth quotas [4]; an understanding of video QoE enables users and delivery providers to better customize the experience under such constraints.

There appears to be rough consensus among the leading industry players on two accounts. First, there is implicit agreement on the set of quality metrics and the measures of engagement [6, 19, 33]. Second, there is a growing realization in this community of the need for a data-driven approach using measurements in the wild, as opposed to traditional methods of using controlled user studies. The key challenge that remains is providing a unified and quantitative understanding of the relationship between the quality metrics and the engagement measures. As we will show in the next section, this turns out to be non-trivial.
3. CHALLENGES IN MEASURING QOE

In this section, we use real-world measurements of client viewing sessions from two large content providers, one serving TV episodes and the other providing live sports events, to highlight challenges in QoE measurement. We use industry-standard video quality metrics for our study [6]. For concreteness, we focus on play time as the measure of user engagement in this section. A subset of the observations we present have also appeared in other measurement studies (e.g., [12, 19, 21-23, 26, 31]), albeit in other contexts. Our specific contribution lies in highlighting the challenges these raise for developing a unified QoE metric.

3.1 Complex relationships

The relationships between the different quality metrics and user engagement are complex. These were extensively studied in [19], and we reconfirm some of those observations.

Counter-intuitive effects: Although intuitively higher rendering quality (frames per second) should lead to higher user engagement, Dobrian et al. noticed several instances where lower rendering quality led to long play times, especially in the case of live videos [19]. This is because of an optimization by the video player that reduces CPU consumption by lowering the frame rate when the video is played in the background. Users may run live videos in the background while focusing on other work, but would be compelled to close the player if CPU usage were high.

Non-monotonic effects: Although a higher average bitrate should result in higher user engagement, prior work has observed a non-monotonic relationship between the two [19]. This is because CDNs serve content at specific bitrates, and average bitrate values that fall in between these standard bitrates correspond to clients that had to switch bitrates during the session. These clients likely experienced higher buffering and hence lower engagement.

Threshold effects: We also observed that the rate of switching has a threshold effect on user engagement.
Rates of up to 0.5 switches/minute do not have any effect on user engagement. However, at higher rates, users seem to quit early, as shown in Figure 2. Our observation corroborates the user studies in [18].

3.2 Interaction between metrics

Naturally, the various quality metrics are not independent of each other. The interactions and implicit tradeoffs between these metrics also need careful examination. Next, we highlight some of these interesting tradeoffs:

Switching vs. buffering: An efficient bitrate switching algorithm should proactively switch bitrates to avoid buffering events within a session and also let the user experience the best possible video quality given the current network conditions at the client. However, as shown in Figure 2, high rates of switching annoy users, leading them to quit early.

Join time vs. bitrate: Although a higher bitrate implies higher quality, it also implies a higher join time, since it takes longer for the player's video buffer to fill up sufficiently to start rendering video frames.

Figure 3: User viewing pattern of live and VOD videos
Figure 4: User interest induced by regional games

3.3 Externalities

There are many confounding external factors that affect user engagement and are not captured by the quality metrics. For instance, user attributes like bandwidth at the client and its variability, and content attributes like genre, popularity, and age of the content have effects on user engagement.

Genre of the content: We observed that live and VOD video sessions experience similar quality; e.g., Figure 3b shows the CDF of buffering ratio (fraction of session time spent in buffering) for live and VOD sessions. However, as shown in Figure 3a, the distribution of the fraction of video viewed by users over severa
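The buffering ratio and switch rate used throughout this section are straightforward to compute from per-session player logs. The sketch below is illustrative only: the event format (state name, start second, end second) is a hypothetical example, not the providers' actual telemetry.

```python
# Minimal sketch of computing two session-quality metrics from player logs.
# The (state, start, end) event tuples are a hypothetical log format chosen
# for illustration; real players emit richer telemetry.

def buffering_ratio(events):
    """Fraction of total session time spent in the 'buffering' state."""
    total = sum(end - start for _, start, end in events)
    buffering = sum(end - start
                    for state, start, end in events if state == "buffering")
    return buffering / total if total else 0.0

def switch_rate(num_switches, session_seconds):
    """Bitrate switches per minute over the whole session."""
    return num_switches / (session_seconds / 60.0)

# A 300-second session with one 6-second stall and 3 bitrate switches.
events = [("playing", 0, 120), ("buffering", 120, 126), ("playing", 126, 300)]
print(buffering_ratio(events))   # 6 / 300 = 0.02
print(switch_rate(3, 300))       # 0.6 switches/minute
```

Note that under the threshold effect reported above, this example session's switch rate of 0.6 switches/minute would already be past the 0.5 switches/minute point where engagement starts to drop.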
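The paper's proposal to cast QoE inference as a machine learning problem can be illustrated with a toy sketch. Everything here is an assumption for demonstration: the feature set, the synthetic session data, the engagement bins, and the use of scikit-learn's DecisionTreeClassifier stand in for the authors' real dataset and model.

```python
# Toy sketch: predict a coarse engagement class (binned play time) from
# per-session quality metrics with a decision tree. The synthetic data and
# the engagement model below are illustrative assumptions, not the paper's.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 5000

# Hypothetical per-session quality metrics.
buffering_ratio = rng.beta(1, 20, n)                  # fraction of session time
avg_bitrate = rng.choice([400, 800, 1600, 3000], n)   # kbps
join_time = rng.gamma(2.0, 1.5, n)                    # seconds
switch_rate = rng.exponential(0.3, n)                 # switches per minute

# Assumed engagement model: play time shrinks with buffering, and with
# switching only beyond a 0.5 switches/minute threshold.
play_time = 60 * np.exp(-8 * buffering_ratio
                        - 0.5 * np.maximum(switch_rate - 0.5, 0))
play_time *= (avg_bitrate / 3000) ** 0.2
engagement_class = np.digitize(play_time, [15, 30, 45])   # 4 coarse bins

X = np.column_stack([buffering_ratio, avg_bitrate, join_time, switch_rate])
X_tr, X_te, y_tr, y_te = train_test_split(X, engagement_class, random_state=0)

clf = DecisionTreeClassifier(max_depth=5).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

On clean synthetic data like this, even a shallow tree recovers the bins easily; the paper's point is that on in-the-wild data, with its confounds, careful feature and input setup is what pushes a classifier like this toward the reported accuracy levels.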
