The Sony Trinitron KX-45ED1, aka the PVM-4300, is thought to be the largest CRT TV ever sold to consumers. It has a 43-inch visible diagonal on its 45-inch tube and weighs almost 440 lbs; the stand alone is over 170 lbs. It cost $40,000 USD in 1989 (about $100K today, adjusted for inflation).
Long thought gone, an extremely rare 43-inch Sony Trinitron KX-45ED1 was saved from an untimely end by Shank Mods. It was being kept on the second floor of an Osaka noodle shop called Chikuma Soba – a building due for demolition in just a few weeks.
It was moved from the soba shop, crated up, and shipped to the US. While it mostly worked, it did need servicing: the alignment was off, it had some tube cataracts, and the dynamic convergence amplifier circuit had failed. They fixed all three and now have a very nice display.
The video describes the incredible journey and is definitely worth a watch.
Andreas from Insomniac Games made an Amiga 500 demo in 2019 as part of his work with The Black Lotus demo group. He presented not only the Eon Amiga 500 demo, but also tons of great technical information about the 4 years it took to develop it.
Old demo scene programmers hold amazing amounts of wisdom. I've found these lessons hold when solving core pieces of logic (though they don't transfer as well to larger, complete-system development):
Work backwards from the desired outcome to discover your constraints. Don't just brute-force. Instead, ask: what must be in place to get peak performance from the key component you depend on (rendering, disk loads, etc.)? Then work from that constraint.
Do everything you can at compile time, not run time. Pre-compute as much as possible – even the construction of data structures in memory: run it once, then save and reload that blob automatically.
Over-generalizing early is a trap many devs fall into. Do the simplest thing that solves the problem in front of you, and trust that you can delete the code and try something else if it sucks. That's cheaper and faster than trying to anticipate everything ahead of time.
If you end up with a small runtime table/code that doesn’t require runtime checks because you can prove it can’t go wrong, you’re doing something right.
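The precompute-once, reload-the-blob idea translates directly to modern languages. Here's a minimal Python sketch (the cache file name and the sine-table contents are my own illustration, not from the talk):

```python
import math
import pickle
from pathlib import Path

TABLE_FILE = Path("sine_table.blob")  # hypothetical cache file

def build_table():
    # Expensive work done once "at build time": a 256-entry
    # fixed-point sine table, the classic demo-scene precompute.
    return [round(math.sin(2 * math.pi * i / 256) * 127) for i in range(256)]

def load_table():
    # First run builds and saves the blob; later runs just reload it.
    if TABLE_FILE.exists():
        return pickle.loads(TABLE_FILE.read_bytes())
    table = build_table()
    TABLE_FILE.write_bytes(pickle.dumps(table))
    return table

table = load_table()
print(table[64])  # sin(pi/2) scaled to 7-bit fixed point -> 127
```

At runtime nothing needs validating: the table has a fixed size and range by construction, which is exactly the "provably can't go wrong" property described above.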
Developing on the actual Amiga is super slow and limited, so they took an Amiga emulator and hacked it up to debug on instead. Using call traps to signal the emulator, they added memory protection, fast-forward, debug triggers, symbol loading, cycle-accurate profiling, single-stepping, high-resolution timers, etc. It also allows perfect input playback.
Modern threading and consumer/producer components (disk loading, data transfer, decompressors, etc.) often just throw things in buffers and YOLO. There's no clear backpressure to show you where you're wasting time or space. Running on this kind of hardware/simulator shows you how much time poorly designed, inefficient algorithms and constraints are wasting.
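One way to make that backpressure visible is a bounded queue between producer and consumer – when the consumer falls behind, the producer blocks instead of silently growing a buffer. A minimal Python sketch of the pattern (my own illustration, not from the talk):

```python
import queue
import threading

# maxsize=4 forces backpressure: put() blocks when the consumer lags,
# so a stall shows up as producer wait time instead of hidden memory growth.
buf = queue.Queue(maxsize=4)
results = []

def producer():
    for block in range(16):       # pretend these are disk blocks
        buf.put(block)            # blocks while the queue is full
    buf.put(None)                 # sentinel: no more data

def consumer():
    while True:
        block = buf.get()
        if block is None:
            break
        results.append(block * 2)  # pretend decompression work

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(len(results))  # 16
```

Instrumenting how long each side spends blocked tells you which stage of the pipeline is the real constraint.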
In a Hackaday article, Japhy Riddle tries to re-create the look of old CRT sub-pixels – the individual red, green, and blue phosphors that make up a single pixel. His approach is basically to fake it with Photoshop, but old systems like the Apple II, camera debayering, and even modern sub-pixel text anti-aliasing use related techniques.
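The core of the Photoshop trick amounts to expanding each source pixel into three stripes, each lit by only one phosphor color. A rough Python sketch of that idea (nested lists standing in for an image; no image library, and not how the article actually implements it):

```python
def fake_subpixels(image):
    # image: rows of (r, g, b) tuples. Each pixel becomes three
    # output pixels, each lit by a single channel, mimicking a
    # CRT's vertical red/green/blue phosphor stripe layout.
    out = []
    for row in image:
        new_row = []
        for r, g, b in row:
            new_row.extend([(r, 0, 0), (0, g, 0), (0, 0, b)])
        out.append(new_row)
    return out

white_pixel = [[(255, 255, 255)]]
print(fake_subpixels(white_pixel))
# -> [[(255, 0, 0), (0, 255, 0), (0, 0, 255)]]
```

Scaled up and blurred slightly, the stripes blend back into the original color at viewing distance – the same averaging that sub-pixel text anti-aliasing exploits in reverse.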
When teaching myself to program as a kid, my first language was BASIC, learned from type-in programs. After that, I made the very unorthodox choice to learn assembly. I wrote a small database, a TSR (terminate-and-stay-resident program), and a couple of other small creations.
Looks like GreatCorn did one better by writing his own game in x86 assembly.
RESound: Interactive Sound Rendering for Dynamic Virtual Environments
About 15 years ago, people noticed that rendering virtual scenes with ray tracing was a lot like how sound propagates through an environment. Light rays travel through open spaces, hit objects and then reflect, refract, and bend. Sound waves follow many of the same principles.
What if you use the same ray casting methods to simulate sound traveling through an environment? Instead of standard hacks on sound to make something sound like it’s in a tiled bathroom or a big orchestra hall, you could accurately simulate it – reducing artist time. Simply play the sound and let the algorithm figure out how it should sound.
I'm not sure what other research has happened since. It was too computationally expensive for real time back then, but it was a cool idea, and maybe we have the compute for it with today's GPUs.
Daniel Holden from Ubisoft gave this great talk at GDC 2018 on how data-driven analysis of their character animation control system turned into an AI system that vastly reduced the complexity and manpower involved in building it.
The world has gotten very familiar with retro hardware re-creations, game emulation, re-releases, speed runs, creating new games for old platforms, as well as new exploits, tools, and discoveries. The nitty-gritty work behind all of this, however, is a labor of love. For those that dig into the binary, there are tricky copyright concerns to manage, only scraps of information about old hardware and software, highly optimized and tricky code that is tough to read, and almost no financial gain – except for commercial re-releases.
Made Up of Wires walks us through a live bit of decompiling the PS1 classic Castlevania: Symphony of the Night to give you a taste of the work involved. Not really that different from any other reverse engineering, but surprisingly accessible, since these old games were relatively small and simple.
Ben Eater decided to build his own VGA video card. Well, technically it's more of a display adapter/controller, since the card doesn't do any rendering or accelerate image-buffer generation – but it's still a pretty fun watch.
This is pretty much how computer graphics started. Someone built a display controller. Then others added helper hardware to speed up buffer fills, then blitting, then rendering, AI upscaling/noise reduction, and now full-on AI rendering. What a wild technology ride – but it was this early stuff that really got me excited about technology. You could build all of this amazing stuff yourself.