AI Portland

AI PDX has become a pretty popular group for folks working with AI systems in Portland. Check out their calendar for upcoming events.

Unusual rendering styles have become very popular lately. In this video, Acerola tries to write an ASCII-based 3D rendering shader. Very interesting.
GPU programming used to be just about rendering graphics. Now that GPUs have moved into Bitcoin mining and AI, eisfrosch goes over the current chaotic state of GPU programming environments.
Acerola has a bunch of great graphics videos. In this one, he talks about why PS1 graphics looked the way they did.
I learned that the PS1 actually had real-time camera-distance tessellation – something that wasn’t available on desktop GPUs until the introduction of tessellation shaders.
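Distance-based tessellation boils down to picking a subdivision level per geometry patch from its distance to the camera. A rough illustration of the idea (a hypothetical sketch with made-up parameters, not how the PS1 hardware actually computed it):

```python
def tess_level(distance_m, base_level=5, meters_per_step=2.0, min_level=0):
    """Pick a subdivision level for a geometry patch: start at base_level
    up close, and drop one level for every meters_per_step meters of
    camera distance, never going below min_level."""
    return max(min_level, base_level - int(distance_m // meters_per_step))

# Nearby patches get finely tessellated; distant ones stay coarse.
near = tess_level(0.5)   # full detail
far = tess_level(12.0)   # minimum detail
```

Modern tessellation shaders do essentially the same per-patch level selection, just on the GPU with smoother falloff curves.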
Gamelogic does a decent intro to a few simple edge detection shaders used in toon-style rendering.
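Those edge detectors mostly amount to convolving an image buffer (color, depth, or normals) with a small kernel and thresholding the gradient magnitude. A minimal pure-Python Sobel sketch of the same idea – a toon shader would run this per pixel on the GPU; this is an illustration, not the code from the video:

```python
# 3x3 Sobel kernels for horizontal and vertical gradients.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_edges(img, threshold=0.5):
    """Return a binary edge map: 1 where the gradient magnitude of the
    grayscale image (values 0.0-1.0) exceeds threshold."""
    h, w = len(img), len(img[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(SOBEL_X[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(SOBEL_Y[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            if (gx * gx + gy * gy) ** 0.5 > threshold:
                edges[y][x] = 1
    return edges

# A hard vertical edge: left half dark, right half bright.
img = [[0.0, 0.0, 0.0, 1.0, 1.0, 1.0] for _ in range(4)]
edge_map = sobel_edges(img)
```

Toon renderers typically run this over the depth and normal buffers rather than the final color image, so edges land on silhouettes and creases instead of texture detail.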
Eight years ago, Tenkai Games Dev Room made a cool ASCII NetHack-like prototype, and it has only gotten around 60k views. It’s amazing how things like this exist yet almost nobody has seen them.
The Quest 3 lets you scan a room and build up an internal 3D mesh that represents the world you are in. This can take anywhere from 20 seconds to several minutes, requires the user to walk around the area – and the resulting mesh can’t update dynamically when doors open or close.
The Depth API provides live depth frames out to 5 meters – but how do you use them to build up the environment in real time?
Julian Triveri’s multiplayer mixed reality Quest 3 game Lasertag does just this. It feeds the live frames into an open-source Unity implementation of marching cubes. Apple Vision Pro and Pico 4 Ultra already use this method – but have hardware-accelerated depth sensors to help. Quest 3 developers need to do this computation themselves.
See the code on GitHub.
https://www.uploadvr.com/developer-implemented-continuous-scene-meshing-quest-3-lasertag
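The first step in a pipeline like this is back-projecting each depth frame into 3D space and accumulating the points into a voxel grid that marching cubes can then mesh. A minimal sketch of that step, assuming a simple pinhole camera model (made-up intrinsics, not Triveri’s actual implementation):

```python
import math

def depth_to_voxels(depth, fov_deg=90.0, voxel_size=0.1, max_depth=5.0):
    """Back-project a depth image (meters, camera space) into the set of
    occupied voxel indices. Depth beyond max_depth (the Depth API's
    5-meter limit mentioned above) is discarded."""
    h, w = len(depth), len(depth[0])
    f = (w / 2) / math.tan(math.radians(fov_deg) / 2)  # focal length in pixels
    occupied = set()
    for v in range(h):
        for u in range(w):
            z = depth[v][u]
            if z <= 0 or z > max_depth:
                continue
            # Pinhole back-projection: pixel (u, v) -> camera-space point.
            x = (u - w / 2) * z / f
            y = (v - h / 2) * z / f
            occupied.add((math.floor(x / voxel_size),
                          math.floor(y / voxel_size),
                          math.floor(z / voxel_size)))
    return occupied

# A flat wall 2 meters in front of the camera lands in one voxel slab.
wall = [[2.0] * 8 for _ in range(8)]
voxels = depth_to_voxels(wall)
```

Marching cubes then walks this grid and emits triangles wherever occupied voxels border empty ones – doing that continuously as frames arrive is the expensive part Quest 3 developers have to shoulder themselves.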
Unity Research decided to find out how hard it really is to beat the modern anti-cheat systems in many FPS games. She does a deep dive into the history and current state of cheating.
Cheating in CS2
Kashif Hoda was waiting for a train near Harvard Square when a young man wearing glasses asked him for directions. A few minutes later, as Mr. Hoda’s train was pulling into the station, the young man, who was a junior at Harvard University named AnhPhu Nguyen, approached him again.
“Do you happen to be the person working on minority stuff for Muslims in India?” Mr. Nguyen asked.
Mr. Hoda was shocked. He worked in biotechnology, but had previously been a journalist and had written about marginalized communities in India.
AnhPhu Nguyen and Caine Ardayfio had created glasses that automatically identify people they look at. Nguyen and Ardayfio are both 21 and studying engineering at Harvard. They said in an interview that their system relied on technologies that are already widely available.
“All the tools were there,” Mr. Nguyen said. “We just had the idea to combine them together.” Nguyen posted a video of it working. Watching it is creepy to say the least. Imagine walking in public and anyone, at any time, can know exactly who you are and anything you’ve ever said or done.
Articles: