Retro Game Mechanics Explained is a great YouTube series on retro game console programming. If you ever wanted to know how the cake is baked, this is the channel for you.
One of the best series so far covers how to program the SNES. Over 16 parts he walks through background effects, lag and blanking, DMA and HDMA, memory mapping, color math, hardware registers, background modes 0-6, and the infamous mode 7. It is one of the better explanations of mode 7 that I have seen (though folks with a more formal background in graphics might explain it with affine transforms alone).
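For the curious, the affine math behind mode 7 fits in a few lines. Here is a hedged Python sketch, not actual SNES code: the function name and parameters are illustrative, though the a/b/c/d matrix entries play the same role as the console's M7A-M7D registers.

```python
import math

def mode7_sample(x, y, angle, scale, cx, cy):
    """Map a screen pixel (x, y) to background-texture coordinates
    using a 2x2 rotation/scale matrix plus a translation (cx, cy).
    Mode 7 applies this kind of transform, with the matrix tweakable
    per scanline to produce the classic tilted-plane effect."""
    a = math.cos(angle) * scale   # these four entries correspond to
    b = -math.sin(angle) * scale  # the SNES M7A-M7D parameters
    c = math.sin(angle) * scale
    d = math.cos(angle) * scale
    u = a * x + b * y + cx
    v = c * x + d * y + cy
    return u, v
```

With angle and scale varied per scanline, the same math yields the perspective-floor look of F-Zero and Mario Kart.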
He also covers individual games and topics such as how the Atari 2600 "raced the beam", Atari's Quadrascan, Pokémon sprite decompression, the Pac-Man arcade's famous kill screen, Mario's wrong warp, and many other fun topics.
And just like that, programmers were replaced by machine learning and pressing tab.
GitHub Copilot is a development plugin that uses AI to auto-complete your code. The AI was trained on public GitHub projects. You start typing, press tab, and it suggests what it thinks you want next based on its guess at what you are building.
Nick Chapsas tries it on a number of programming tasks: basic data structures, creating an API, a calculator, and even a fully implemented FizzBuzz. It does *shockingly* well.
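For reference, the FizzBuzz that Copilot manages to complete looks something like this (a plain Python version written for illustration, not Copilot's actual output):

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings:
    multiples of 3 -> "Fizz", of 5 -> "Buzz", of both -> "FizzBuzz"."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out
```

Trivial by hand, but a nice smoke test for a tool that claims to infer intent from a function name alone.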
I think this is the obvious next step for the auto-completion we've had for years, and I bet it will almost certainly come to mainline development tools in the next 5 years. It does, however, raise some interesting legal questions if someone unknowingly auto-completes a blob of code from a GPL or closed-source project. This treads the fine line between auto-generated code and outright copying. My guess is that IP-violation code-scanning tools will become even more important for detecting these problems.
With Bitcoin hitting all-time highs and lows, it's interesting to hear self-described pundits go on and on about the promise of crypto-currency. Surprisingly, one thing you don't hear is that the life of these currencies might be quite limited now that quantum computers are becoming a reality.
Quantum computers are excellent at cracking certain mathematically hard problems – the kind that underlie almost all cryptography and blockchain algorithms. In October 2019, Google announced it had achieved quantum supremacy on one class of problems. So what does this mean for crypto-currencies?
Interestingly enough, the most vulnerable coins are those held in p2pk addresses. Coins in this address range were some of the earliest mined, and the ones still there are largely considered to belong to people who have long since lost their keys. This means anyone with a sufficiently large quantum computer could claim them – roughly 2 million bitcoins worth almost 70 BILLION dollars (at the current market price of $35,000).
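The arithmetic behind that headline number is simple enough to check (the price is obviously a moving target):

```python
vulnerable_coins = 2_000_000  # approx. BTC sitting in early p2pk addresses
price_usd = 35_000            # market price assumed in this post
total = vulnerable_coins * price_usd
print(f"${total:,}")          # $70,000,000,000
```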
Not only that: if 25% of a currency is vulnerable to being quietly captured by a single actor with a quantum computer, that represents a tremendous amount of power to manipulate the currency.
So, unused p2pkh coins are safe, right? Not really. The moment you want to transfer coins from such a "safe" address, you reveal the public key, making the address vulnerable. From that moment until your transaction is mined, an attacker with a quantum computer has a window of opportunity to steal your coins: first derive your private key from the now-public key, then initiate a competing transaction to their own address, offering a higher mining fee to win priority over the original.
Mining a transaction takes about 10 minutes, while current estimates suggest a quantum computer would need about 30 minutes to break a Bitcoin key. As long as that holds, your bitcoin transaction is …probably… safe. But it won't hold forever. It is a near certainty that quantum computing will make crypto-currencies worthless at some point – maybe even in our lifetime, at the rate quantum computing is advancing.
Computer scientists spend a lot of time thinking about the optimal way of doing things. This guy stacks up 79 different ways of sorting things from smallest to largest and compares the number of writes, comparisons, auxiliary array use, and more.
Build systems are certainly not the sexy part of software development. However, no part of the development process impacts your team as much as its build system. Build systems that perform poorly, regularly break, or generate inconsistent or confusing output files are one of the fastest ways to introduce bugs, slow releases, and bring a whole team to a screeching halt. That's why automated, reproducible builds are a keystone of agile development.
Out-of-source builds are one way to improve your build system. Out-of-source building is a practice that keeps the generated intermediate and binary files out of the source directories. Traditionally, most build systems generated object, binary, and intermediate files mixed right next to their source files. This leads to a confusing hierarchy that makes getting a consistent picture of your build and source nearly impossible on big projects.
It turns out CMake can help you create out-of-source builds with just a few little tricks. Unfortunately, there are few examples, and many of them are overly complex. So, to help folks get started, I wrote up a very simple sample. It's the perfect starting point for your next project, and it works with both Linux and Visual Studio builds.
Algorithm of the day: Rapidly exploring random trees (RRT) is an algorithm designed to efficiently search non-convex spaces by randomly building a space-filling tree. The tree is constructed incrementally from samples drawn randomly from the search space and is inherently biased to grow towards large unsearched areas of the problem. They easily handle problems with obstacles and differential constraints and have been widely used in autonomous robotic motion planning.
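The core RRT loop is short enough to sketch. Here is a minimal 2-D version in Python under some loud assumptions: no obstacles (a real planner would reject colliding edges), a simple goal bias, and names of my own choosing:

```python
import math
import random

def rrt(start, goal, steps=500, step_size=0.5, bounds=(0.0, 10.0), seed=1):
    """Grow a rapidly exploring random tree from start toward goal.
    Returns the tree as a dict mapping each node to its parent node.
    Obstacle and constraint checks are omitted for brevity."""
    random.seed(seed)
    tree = {start: None}
    for _ in range(steps):
        # Sample a random point, occasionally biasing toward the goal.
        if random.random() < 0.1:
            sample = goal
        else:
            sample = (random.uniform(*bounds), random.uniform(*bounds))
        # Find the nearest existing node to the sample.
        near = min(tree, key=lambda n: math.dist(n, sample))
        # Steer from that node toward the sample, at most step_size away.
        d = math.dist(near, sample)
        if d == 0:
            continue
        t = min(1.0, step_size / d)
        new = (near[0] + t * (sample[0] - near[0]),
               near[1] + t * (sample[1] - near[1]))
        tree[new] = near
        if math.dist(new, goal) < step_size:
            tree[goal] = new
            break
    return tree
```

Because samples are uniform over the space, the nearest-node step naturally pulls growth into large unexplored regions – the bias the description above mentions falls out of the sampling, for free.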
Traveling through hyperspace ain’t like dusting crops, boy! Without precise calculations we could fly right through a star or bounce too close to a supernova and that’d end your trip real quick, wouldn’t it?
Han Solo – Star Wars Episode IV: A New Hope
Moving to Vulkan and DirectX 12 isn't like going from DX9 to DX11, or OpenGL 3.0 to OpenGL 4.0. These new APIs hand the application quite a bit of work that used to be done by the graphics driver. This gives devs more control, but it also makes things a lot trickier.
Microsoft has produced a good set of videos teaching some of the unique and tricky parts of DirectX 12 to those with a graphics background. They cover a number of topics and usage patterns that aren't immediately apparent from reading the docs.
Presentation modes in Windows 10
This video has terrible audio quality, but it does a great job covering the various flip modes and the delays they introduce:
This is one of the big concepts that trips people up and causes a lot of confusion.