Do your kids like bedtime stories? Stefano Mazzocchi has put Fably, the AI storytelling companion, on a Raspberry Pi Zero.
Just push the button, tell it what kind of story you want, and enjoy the results. So much for bedtime stories with the kids.
This project is open source and you can build one for yourself or run it from your laptop. The project is located here: https://stefanom.github.io/fably/
Abandoned Films is back with another trippy, AI-generated movie trailer. This time, they took the 1997 sci-fi classic The Fifth Element and applied a 1950s big-screen aesthetic.
While definitely not perfect, these AI-generated trailers are a fascinating way to explore artistic concepts.
Carnegie Mellon researchers have developed a real-time human teleoperation system. Using a simple camera, it can read the actions of a person and translate them into real-time, full-body control of a robot.
Individuals can now seamlessly teleoperate full-sized humanoids to execute a myriad of actions. According to the researchers, the tasks range from simple ones like picking and placing objects to dynamic movements like walking, kicking, and even boxing.
There are many possibilities for this kind of remotely operated humanoid robotic system. Remotely controlled humanoid robots could save countless lives operating in dangerous environments.
They could shut down equipment after dangerous chemical or industrial accidents, search unstable buildings for survivors after earthquakes, perform dangerous police or urban warfare operations without loss of life, and stop terrorists by defusing bombs. They could also carry out repairs, shutdowns, and cleanup in highly irradiated areas like Chernobyl or Fukushima after nuclear accidents. In the future, we may never again need the horrors of Chernobyl’s biorobots to deal with such disasters.
Google researchers have published a new artificial intelligence model that can take a text prompt, sketch, or idea and turn it into a virtual world you can interact with and play.
Named Genie, the virtual world model was trained on gameplay and other videos found online and is currently a research preview. The games it produces are 2D platformer-style games.
Genie can be prompted with images it has never seen before, such as real world photographs or sketches, enabling people to interact with their imagined virtual worlds, essentially acting as a foundation world model. This is possible despite training without any action labels. Instead, Genie is trained from a large dataset of publicly available Internet videos. We focus on videos of 2D platformer games and robotics but our method is general and should work for any type of domain, and is scalable to ever larger Internet datasets.
Given a single static image and a speech audio clip, VASA-1 is capable of producing lip movements that are synchronized with the audio and capture a large spectrum of facial nuances and natural head motions.
Getting worried you’ll be replaced by AI yet? If this gets perfected (it’s not perfect yet, but the results get better and better each year), then you can pretty much get rid of any ‘talking head’ jobs.
This could also be used to fool people on conference calls, where lower video quality would make any minor glitches unnoticeable or easy to dismiss as streaming artifacts.
Just slap the CEO’s face into this, set up a conference call with finance via some very easy phishing, and approve that $1m transfer to your Swiss bank account.
A household robot can learn how to do almost any chore in about 20 minutes when taught by a human using an iPhone camera and a grabber.
Mahi Shafiullah at New York University and his colleagues created a way to teach robots by demonstrating a task with the grabber, using the attached iPhone to record the training data.
Fast-forward just eight years, and now there is the real thing.
The US military has tested an AI-controlled F-16 named the X-62A in a dogfight with an actual test pilot. The Variable Stability In-flight Simulator Test Aircraft, or “VISTA” for short, is essentially a modified F-16 fighter jet controlled by AI that has previously conducted multiple test flights to demonstrate the capabilities of its artificial pilot (via The Telegraph).
In a press release, the USAF Test Pilot School and DARPA revealed that they initially tested various defensive maneuvers with the AI-controlled jet to establish in-flight safety before engaging in simulated air-to-air combat with another F-16 in the skies above Edwards Air Force Base in California last year.
It’s not clear whether the tested dogfights were limited purely to aerial combat maneuvers with simulated weapons fire, or how the AI fared against live pilots. However, a previous AI developed by Heron Systems as part of a DARPA tournament was able to defeat a human pilot in five out of five tested scenarios.