{"id":9668,"date":"2023-10-26T11:00:22","date_gmt":"2023-10-26T18:00:22","guid":{"rendered":"https:\/\/mattfife.com\/?p=9668"},"modified":"2023-10-13T11:15:42","modified_gmt":"2023-10-13T18:15:42","slug":"zip-nerf-anti-aliased-grid-based-neural-radiance-fields","status":"publish","type":"post","link":"https:\/\/mattfife.com\/?p=9668","title":{"rendered":"Zip-NeRF: Anti-Aliased Grid-Based Neural Radiance Fields"},"content":{"rendered":"\n<p><a href=\"https:\/\/www.matthewtancik.com\/nerf\" data-type=\"link\" data-id=\"https:\/\/www.matthewtancik.com\/nerf\">Neural Radiance Fields<\/a> (NeRF) produce some pretty beautiful renderings. A little like photogrammetry, it builds a multi-dimensional volume of a scene from images captured at multiple viewpoints; when you want to render the scene from a particular angle, it shoots rays into the volume from the camera location and queries it along each ray to get the pixel color at each screen coordinate.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<span class=\"embed-youtube\" style=\"text-align:center; display: block;\"><iframe loading=\"lazy\" class=\"youtube-player\" width=\"640\" height=\"360\" src=\"https:\/\/www.youtube.com\/embed\/JuH79E8rdKc?version=3&#038;rel=1&#038;showsearch=0&#038;showinfo=1&#038;iv_load_policy=1&#038;fs=1&#038;hl=en-US&#038;autohide=2&#038;wmode=transparent\" allowfullscreen=\"true\" style=\"border:0;\" sandbox=\"allow-scripts allow-same-origin allow-popups allow-presentation allow-popups-to-escape-sandbox\"><\/iframe><\/span>\n<\/div><\/figure>\n\n\n\n<p>It does suffer from some shortcomings &#8211; it largely works well only on static scenes, it has trouble when portions of the scene are missing or occluded, and most notably it renders objects that lack fine detail or have the blobby geometry common to volumetric rendering techniques. 
<\/p>\n\n\n\n<p>But that doesn&#8217;t stop people from trying. <a href=\"https:\/\/jonbarron.info\/zipnerf\/\" data-type=\"link\" data-id=\"https:\/\/jonbarron.info\/zipnerf\/\">Zip-NeRF<\/a> is an example in which Google researchers demonstrate how ideas from rendering and signal processing yield lower error rates and dramatically faster training than previous techniques. <\/p>\n\n\n\n<p>It&#8217;s always interesting to see what new things people are trying out these days.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<span class=\"embed-youtube\" style=\"text-align:center; display: block;\"><iframe loading=\"lazy\" class=\"youtube-player\" width=\"640\" height=\"360\" src=\"https:\/\/www.youtube.com\/embed\/xrrhynRzC8k?version=3&#038;rel=1&#038;showsearch=0&#038;showinfo=1&#038;iv_load_policy=1&#038;fs=1&#038;hl=en-US&#038;autohide=2&#038;wmode=transparent\" allowfullscreen=\"true\" style=\"border:0;\" sandbox=\"allow-scripts allow-same-origin allow-popups allow-presentation allow-popups-to-escape-sandbox\"><\/iframe><\/span>\n<\/div><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>Neural Radiance Fields (NeRF) produce some pretty beautiful renderings. A little like photogrammetry, it builds a multi-dimensional volume of a scene from images captured at multiple viewpoints; when you want to render the scene from a particular angle, it shoots rays into the volume from the camera location and queries it along each ray to get the pixel color at each screen coordinate. 
It does suffer from some shortcomings &#8211; it largely works well only on static scenes, it has&#8230;<\/p>\n<p class=\"read-more\"><a class=\"btn btn-default\" href=\"https:\/\/mattfife.com\/?p=9668\"> Read More<span class=\"screen-reader-text\">  Read More<\/span><\/a><\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[9,5],"tags":[],"class_list":["post-9668","post","type-post","status-publish","format-standard","hentry","category-cool","category-technical"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p4WECr-2vW","jetpack-related-posts":[],"_links":{"self":[{"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/posts\/9668","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/mattfife.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=9668"}],"version-history":[{"count":1,"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/posts\/9668\/revisions"}],"predecessor-version":[{"id":9669,"href":"https:\/\/mattfife.com\/ind
ex.php?rest_route=\/wp\/v2\/posts\/9668\/revisions\/9669"}],"wp:attachment":[{"href":"https:\/\/mattfife.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=9668"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/mattfife.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=9668"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/mattfife.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=9668"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}