{"id":9798,"date":"2023-11-16T12:35:47","date_gmt":"2023-11-16T19:35:47","guid":{"rendered":"https:\/\/mattfife.com\/?p=9798"},"modified":"2023-11-16T15:36:06","modified_gmt":"2023-11-16T22:36:06","slug":"photorealistic-rendering-of-gta-v-via-ai","status":"publish","type":"post","link":"https:\/\/mattfife.com\/?p=9798","title":{"rendered":"Photorealistic rendering of GTA V &#8211; via AI"},"content":{"rendered":"\n<p>Old games often suffer from the limited graphics capabilities of the time they were made, while developing new games costs a fortune due to the need to author high-quality models and textures. What if you could solve BOTH problems &#8211; with the same solution? <a href=\"https:\/\/intel-isl.github.io\/PhotorealismEnhancement\/\">A machine learning project<\/a>\u00a0from Intel Labs in 2021 called \u201cEnhancing Photorealism Enhancement\u201d could push rendering toward photorealism far more quickly and easily.<\/p>\n\n\n\n<p>The researchers studied how to use a convolutional network to re-render the scene. 
Below you can see an example of how they used the <a href=\"https:\/\/www.cityscapes-dataset.com\/\" data-type=\"link\" data-id=\"https:\/\/www.cityscapes-dataset.com\/\">CityScapes dataset<\/a> to produce far more realistic output for the game &#8211; all in real time.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<span class=\"embed-youtube\" style=\"text-align:center; display: block;\"><iframe loading=\"lazy\" class=\"youtube-player\" width=\"640\" height=\"360\" src=\"https:\/\/www.youtube.com\/embed\/P1IcaBn3ej0?version=3&#038;rel=1&#038;showsearch=0&#038;showinfo=1&#038;iv_load_policy=1&#038;fs=1&#038;hl=en-US&#038;autohide=2&#038;wmode=transparent\" allowfullscreen=\"true\" style=\"border:0;\" sandbox=\"allow-scripts allow-same-origin allow-popups allow-presentation allow-popups-to-escape-sandbox\"><\/iframe><\/span>\n<\/div><\/figure>\n\n\n\n<p>You can read how the image enhancement actually works\u00a0<a href=\"http:\/\/vladlen.info\/papers\/EPE.pdf\">in their paper (PDF)<\/a>. It includes a wealth of detail on how their method works and how it improves on previous attempts, which suffered from issues with color, object hallucination, and temporal instability. 
They do this by exploiting the extra information available in rendered scenes &#8211; most notably the G-buffer &#8211; along with a specialized discriminator and a segmentation network.<\/p>\n\n\n\n<p>Sources:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><a href=\"https:\/\/www.theverge.com\/2021\/5\/12\/22432945\/intel-gta-v-realistic-machine-learning-cityscapes-dataset\">https:\/\/www.theverge.com\/2021\/5\/12\/22432945\/intel-gta-v-realistic-machine-learning-cityscapes-dataset<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/isl-org.github.io\/PhotorealismEnhancement\/\">https:\/\/isl-org.github.io\/PhotorealismEnhancement\/<\/a><\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>Old games often suffer from the limited graphics capabilities of the time they were made, while developing new games costs a fortune due to the need to author high-quality models and textures. What if you could solve BOTH problems &#8211; with the same solution? A machine learning project\u00a0from Intel Labs in 2021 called \u201cEnhancing Photorealism Enhancement\u201d could push rendering toward photorealism far more quickly and easily. The researchers studied how to use a convolutional network to re-render the scene. 
Below&#8230;<\/p>\n<p class=\"read-more\"><a class=\"btn btn-default\" href=\"https:\/\/mattfife.com\/?p=9798\"> Read More<span class=\"screen-reader-text\">  Read More<\/span><\/a><\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[28,23,9],"tags":[],"class_list":["post-9798","post","type-post","status-publish","format-standard","hentry","category-ai","category-automotive","category-cool"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p4WECr-2y2","jetpack-related-posts":[],"_links":{"self":[{"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/posts\/9798","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/mattfife.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=9798"}],"version-history":[{"count":5,"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/posts\/9798\/revisions"}],"predecessor-version":[{"id":9985,"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/posts\/9798\/revisions\/9985"}],"wp:attachment":[
{"href":"https:\/\/mattfife.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=9798"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/mattfife.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=9798"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/mattfife.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=9798"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}