{"id":15161,"date":"2025-12-29T10:18:19","date_gmt":"2025-12-29T17:18:19","guid":{"rendered":"https:\/\/mattfife.com\/?p=15161"},"modified":"2025-09-13T10:27:16","modified_gmt":"2025-09-13T17:27:16","slug":"tool-for-measuring-ai-enhanced-gpu-image-quality","status":"publish","type":"post","link":"https:\/\/mattfife.com\/?p=15161","title":{"rendered":"Tool for measuring AI enhanced GPU image quality"},"content":{"rendered":"\n<p>Engineers at Intel have released an open-source tool that attempts to quantify the image-quality issues introduced by the growing number of upscalers, frame generators, and other AI rendering techniques. Ironically, the tool itself is an AI model trained on large datasets. <a href=\"https:\/\/arxiv.org\/abs\/2506.11546\" data-type=\"link\" data-id=\"https:\/\/arxiv.org\/abs\/2506.11546\">Their paper describing the methodology is available here<\/a>.<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-large is-resized\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"640\" height=\"579\" data-attachment-id=\"15163\" data-permalink=\"https:\/\/mattfife.com\/?attachment_id=15163\" data-orig-file=\"https:\/\/i0.wp.com\/mattfife.com\/wp-content\/themes\/mattTheme\/headerimgs\/2025\/09\/x11-1.png?fit=1661%2C1502&amp;ssl=1\" data-orig-size=\"1661,1502\" data-comments-opened=\"1\" data-image-meta=\"{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}\" data-image-title=\"x11\" data-image-description=\"\" data-image-caption=\"\" data-medium-file=\"https:\/\/i0.wp.com\/mattfife.com\/wp-content\/themes\/mattTheme\/headerimgs\/2025\/09\/x11-1.png?fit=300%2C271&amp;ssl=1\" 
data-large-file=\"https:\/\/i0.wp.com\/mattfife.com\/wp-content\/themes\/mattTheme\/headerimgs\/2025\/09\/x11-1.png?fit=640%2C579&amp;ssl=1\" src=\"https:\/\/i0.wp.com\/mattfife.com\/wp-content\/themes\/mattTheme\/headerimgs\/2025\/09\/x11-1.png?resize=640%2C579&#038;ssl=1\" alt=\"\" class=\"wp-image-15163\" style=\"width:682px;height:auto\" srcset=\"https:\/\/i0.wp.com\/mattfife.com\/wp-content\/themes\/mattTheme\/headerimgs\/2025\/09\/x11-1.png?resize=1024%2C926&amp;ssl=1 1024w, https:\/\/i0.wp.com\/mattfife.com\/wp-content\/themes\/mattTheme\/headerimgs\/2025\/09\/x11-1.png?resize=300%2C271&amp;ssl=1 300w, https:\/\/i0.wp.com\/mattfife.com\/wp-content\/themes\/mattTheme\/headerimgs\/2025\/09\/x11-1.png?resize=768%2C694&amp;ssl=1 768w, https:\/\/i0.wp.com\/mattfife.com\/wp-content\/themes\/mattTheme\/headerimgs\/2025\/09\/x11-1.png?resize=1536%2C1389&amp;ssl=1 1536w, https:\/\/i0.wp.com\/mattfife.com\/wp-content\/themes\/mattTheme\/headerimgs\/2025\/09\/x11-1.png?resize=299%2C270&amp;ssl=1 299w, https:\/\/i0.wp.com\/mattfife.com\/wp-content\/themes\/mattTheme\/headerimgs\/2025\/09\/x11-1.png?w=1661&amp;ssl=1 1661w, https:\/\/i0.wp.com\/mattfife.com\/wp-content\/themes\/mattTheme\/headerimgs\/2025\/09\/x11-1.png?w=1280&amp;ssl=1 1280w\" sizes=\"auto, (max-width: 640px) 100vw, 640px\" \/><\/figure>\n<\/div>\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>CGVQM is a video quality metric that predicts perceptual differences between pairs of videos.<br>Like PSNR and SSIM, it compares a\u00a0ground-truth reference\u00a0to a\u00a0distorted\u00a0version (e.g. blurry, noisy, aliased).<\/p>\n\n\n\n<p>What sets CGVQM apart is that it is the first metric\u00a0calibrated for distortions from advanced rendering techniques, accounting for both\u00a0spatial\u00a0and\u00a0temporal\u00a0artifacts. 
<\/p>\n<\/blockquote>\n\n\n\n<p><a href=\"https:\/\/github.com\/IntelLabs\/cgvqm\" data-type=\"link\" data-id=\"https:\/\/github.com\/IntelLabs\/cgvqm\">CGVQM is available for free on GitHub<\/a>. It uses PyTorch and is optimized for CUDA GPUs, though it also runs on CPUs.<\/p>\n\n\n\n<p>Other links:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><a href=\"https:\/\/www.tomshardware.com\/video-games\/pc-gaming\/intel-releases-new-tool-to-measure-gaming-image-quality-in-real-time-ai-tool-measures-impact-of-upscalers-frame-gen-others-computer-graphics-video-quality-metric-now-available-on-github#xenforo-comments-3883158\">Tom&#8217;s Hardware: Intel releases new tool to measure gaming image quality in real time<\/a><\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>Engineers at Intel have released an open-source tool that attempts to quantify the image-quality issues introduced by the growing number of upscalers, frame generators, and other AI rendering techniques. Ironically, the tool itself is an AI model trained on large datasets. Their paper describing the methodology is available here. CGVQM is a video quality metric that predicts perceptual differences between pairs of videos. Like PSNR and SSIM, it compares a\u00a0ground-truth reference\u00a0to a\u00a0distorted\u00a0version (e.g. blurry, noisy, aliased). 
What sets CGVQM apart is that it is the first metric\u00a0calibrated&#8230;<\/p>\n<p class=\"read-more\"><a class=\"btn btn-default\" href=\"https:\/\/mattfife.com\/?p=15161\"> Read More<span class=\"screen-reader-text\">  Read More<\/span><\/a><\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[28,9,5],"tags":[],"class_list":["post-15161","post","type-post","status-publish","format-standard","hentry","category-ai","category-cool","category-technical"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p4WECr-3Wx","jetpack-related-posts":[],"_links":{"self":[{"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/posts\/15161","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/mattfife.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=15161"}],"version-history":[{"count":3,"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/posts\/15161\/revisions"}],"predecessor-version":[{"id":15166,"href":"https:\/\/mattfife.com\/index.php?res
t_route=\/wp\/v2\/posts\/15161\/revisions\/15166"}],"wp:attachment":[{"href":"https:\/\/mattfife.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=15161"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/mattfife.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=15161"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/mattfife.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=15161"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}