
Instant followers for your social media – under $10


One of the critical arguments during the purchase of Twitter was how many of its accounts were bots. Bots are probably far more rampant in social media and games than people think – or would like to think. With tools like ChatGPT, people are quickly realizing how easy it has been for bad actors to flood social media (from Reddit to Facebook) to disrupt elections and to foment rebellions and protests that destabilize governments, including in the US.

Smarter Every Day did an entire series on how easy it is to manipulate everything from Facebook to YouTube to Reddit. Reddit even has whole forums that are just bots talking to each other. Go read through one – it will shock you how easily bots create and steer conversations on these platforms.

How easy is it to buy followers and spread misinformation? What if I told you that you could get 1,000 followers for $20? If you have a modest-sized advertising budget (or a sliver of a country’s defense or espionage budget), you could spend $5,000 for 250,000 followers, or just $20,000 for a MILLION followers. This is true for Instagram, Facebook, Twitter, YouTube, and just about any other platform. Don’t believe me? Here’s just one example from BoosterGod:

So, maybe being an ‘influencer’ with a million subscribers isn’t quite as impressive as you’d think. That isn’t even a blip on most corporate advertising budgets – and nothing to a foreign power.

It might be worth an experiment if you have the cash. An influencer with 1 million subscribers can usually get advertising sponsorships and product advertising deals that far exceed $20,000 – so it might even pay for itself…

Two kinds of Randomness


Game development is now as much art as science – or rather, the art of science. Even something as simple as how and when to use randomness can profoundly impact how fun a game is. Enter the distinction between two different kinds of randomness: input and output randomness.

Input randomness is randomness that is resolved BEFORE a player makes their strategy and decisions. An example would be generating a random number of enemies before a fight starts. While the number is random, knowing how many will show up lets the player choose an appropriate strategy and feel more in control.

Output randomness is often a big contributor to the frustrating parts of gameplay. An example would be attacking an enemy, only to find your attack completely missed out of sheer bad luck or an unusually bad hit roll. This kind of behavior, while mathematically correct, often leaves players feeling like they were ‘robbed’ and that the game is cheating.

Games are increasingly using input randomness as a way to give players control. Even games that rely on output randomness often put their thumbs on the scales so that you do not lose as often as true randomness would dictate. In Civilization, if a unit with a 33% chance of hitting misses twice in a row, it is guaranteed to hit on the 3rd try – even though real randomness would not behave like that.
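
This kind of streak breaker is easy to sketch. Below is a minimal, hypothetical version in Python – the forced-hit-after-two-misses rule is borrowed from the Civilization example above, but the actual implementation in any shipping game is an assumption here:

```python
import random

def streak_breaker_roll(p, misses, max_misses, rng):
    """Thumb-on-the-scales roll: after max_misses consecutive misses,
    the next attack is forced to hit. Returns (hit, new_miss_count)."""
    if misses >= max_misses or rng.random() < p:
        return True, 0
    return False, misses + 1

def longest_miss_streak(n, p, use_breaker, seed=0):
    """Roll n attacks and report the longest run of consecutive misses."""
    rng = random.Random(seed)
    misses = longest = 0
    for _ in range(n):
        if use_breaker:
            hit, misses = streak_breaker_roll(p, misses, 2, rng)
        else:
            hit = rng.random() < p          # plain output randomness
            misses = 0 if hit else misses + 1
        longest = max(longest, misses)
    return longest

# With a 33% hit chance, true randomness produces long miss streaks,
# while the streak breaker caps them at 2.
print(longest_miss_streak(10_000, 0.33, use_breaker=False))
print(longest_miss_streak(10_000, 0.33, use_breaker=True))   # never above 2
```

Over 10,000 rolls, the unmodified version reliably produces much longer runs of misses – mathematically fair, but exactly the kind of result that makes players feel cheated.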

Anyway, this is a great video about the different kinds of randomness.

Using a Neural Net as compression for character animation


This was published in 2018, but it’s a fascinating dual-purpose use of neural nets. There was a rapidly growing problem with character animation: it has become highly complex as it has become more realistic. The problem compounds when you want to support things like crouching and aiming at the same time, or crouching and walking across uneven terrain while looking left or right. You can imagine all the different combinations of motion that must be described and handled. All of this was taking artists massively more time to develop; even worse, it was taking up more and more storage space on disk and especially in memory.

Daniel Holden of Ubisoft wondered if he could use a neural net not only to fold the combinatorial explosion of motions into a single network but also to exploit the inherent ability of neural nets to compress data. It turns out he could – and he presents what he found in this excellent presentation.
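
The storage win is easy to see with back-of-envelope arithmetic. The numbers below are illustrative assumptions, not Ubisoft’s actual figures: storing every pose of every clip explicitly is compared against storing one small two-layer decoder network plus a low-dimensional latent code per frame.

```python
# Back-of-envelope storage comparison: raw pose table vs. small decoder
# network. All sizes and dimensions are illustrative assumptions.
BYTES_PER_FLOAT = 4

def raw_animation_bytes(clips, frames_per_clip, bones, floats_per_bone):
    """Storing every pose explicitly: one transform per bone per frame."""
    return clips * frames_per_clip * bones * floats_per_bone * BYTES_PER_FLOAT

def decoder_bytes(latent_dim, hidden, pose_dim):
    """A small 2-layer decoder mapping a latent code to a full pose
    (weights plus biases for both layers)."""
    return (latent_dim * hidden + hidden
            + hidden * pose_dim + pose_dim) * BYTES_PER_FLOAT

bones, floats_per_bone = 100, 12            # e.g. a 3x4 transform per bone
pose_dim = bones * floats_per_bone          # 1,200 floats per frame

raw = raw_animation_bytes(clips=500, frames_per_clip=300,
                          bones=bones, floats_per_bone=floats_per_bone)
net = decoder_bytes(latent_dim=32, hidden=512, pose_dim=pose_dim)
codes = 500 * 300 * 32 * BYTES_PER_FLOAT    # one 32-float code per frame

print(f"raw poses:       {raw / 1e6:.1f} MB")       # 720.0 MB
print(f"decoder + codes: {(net + codes) / 1e6:.1f} MB")
```

Even in this toy accounting the network-plus-codes representation is over an order of magnitude smaller, because thousands of highly correlated poses share one decoder.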

Links:

8″ Floppy drive


8″ floppy drives were the earliest form of floppy drive, connected to early minicomputers. By the era of personal computers, 8″ floppy drives had been replaced with 5.25″ drives. But those 5.25″ (and later) drives were still often based on the Shugart interface.
Adrian’s Digital Basement shows how he hooked up an 8″ floppy drive from a TRS-80 Model II to a 386SX computer – and got it to boot! This is definitely something I want to try some day.

Link:

Really useful caustics


With the right set of curvatures, it’s possible to make a clear object project an image that isn’t visible until light shines through it. Science educator Steve Mould explains the optical and mathematical properties of these uniquely engineered lenses. It turns out the problem is closely related to moving a minimal amount of dirt to build a structure, a question studied extensively by mathematicians under the name optimal transport. These transport theory problems have a number of solutions and applications.
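
In one dimension the optimal transport problem has a famously simple solution: sort both piles of dirt and match them in order. Here is a minimal sketch – a hypothetical illustration of the dirt-moving idea, not the computation Mould or Rayform actually perform:

```python
def emd_1d(source, target):
    """1-D optimal transport cost ('earth mover's distance') between two
    equal-size sets of unit piles: the optimal plan matches the sorted
    positions in order, so the cost is the sum of pairwise distances."""
    assert len(source) == len(target)
    return sum(abs(s - t) for s, t in zip(sorted(source), sorted(target)))

# Moving three unit piles of dirt from positions [0, 1, 2] to [2, 3, 4]
# costs 2 units of work per pile:
print(emd_1d([0, 1, 2], [2, 3, 4]))  # 6
```

The caustic-design problem is the much harder two-dimensional version, where light intensity plays the role of dirt, but the same transport-theory machinery applies.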

A similar effect can be created with mirrors and reflected light. Rayform specializes in the technique for a wide variety of luxury and architectural items.

Reading 3.5″, 5.25″, and 8″ floppy disks with Raspberry Pi


It looks like people have been using Raspberry Pis to connect to and read floppy disks. Below are some of the links. Note that it appears they are only reading data, not writing. See my other posts for both reading and writing to 5.25″ and other floppy drive interfaces.

Links:

Wavelet introduction


Artem Kirsanov normally talks about neuroscience, but in the process he also made a great introductory video on wavelets. Wavelets let you find structures that are present in a complex signal but often hidden behind noise. Because wavelets can decompose a signal in both the time and frequency domains simultaneously, they are tremendously valuable tools.
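
The simplest wavelet – the Haar wavelet – already shows the idea: pairwise averages capture the low-frequency content while pairwise difference (‘detail’) coefficients capture the high-frequency content without losing track of where in time it occurred. A minimal sketch:

```python
def haar_step(signal):
    """One level of the Haar wavelet transform: split an even-length
    signal into pairwise averages (low-frequency part) and pairwise
    half-differences (high-frequency detail coefficients)."""
    averages = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    details  = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return averages, details

# A flat signal with one sharp jump: the detail coefficients pinpoint
# WHERE the jump happens, which a plain Fourier spectrum cannot do.
avg, det = haar_step([4, 4, 4, 9, 9, 9, 9, 9])
print(avg)  # [4.0, 6.5, 9.0, 9.0]
print(det)  # [0.0, -2.5, 0.0, 0.0]
```

Recursing the same step on the averages gives the full multi-level decomposition, trading time resolution for frequency resolution at each level.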