Libratus, a new AI developed by Carnegie Mellon University, won the “Brains Vs. Artificial Intelligence” tournament against four poker pros, finishing $1,766,250 ahead in chips over 120,000 hands. The margin was large enough to count as a statistically significant win: the researchers can be at least 99.7 percent sure that the AI’s victory was not due to chance.
The four human poker pros who participated in the tournament spent many extra hours each day trying to puzzle out Libratus. At the start they agreed on a collective plan: each would probe a different range of bet sizes, hunting for weaknesses in the AI’s strategy that they could exploit. Each night of the tournament, they gathered in their hotel rooms to analyze the day’s worth of plays and talk strategy.
The AI took an early lead and never lost it, though the margin see-sawed close to even mid-week and shrank to $50,000 on the sixth day. But on the seventh day, “the wheels came off.” By the end, Jimmy Chou had become convinced that Libratus tailored its strategy to each individual player. Dong Kim, who performed best of the four by losing only $85,649 in chips to Libratus, believed the humans were facing slightly different versions of the AI each day.
After Kim finished playing on the final day, he helped answer some questions for online viewers watching the poker tournament through the live-streaming service Twitch. He congratulated the Carnegie Mellon researchers on a “decisive victory.” But when asked about what went well for the poker pros, he hesitated: “I think what went well was… shit. It’s hard to say. We took such a beating.”
The victory suggests the AI has surpassed the best humans at strategic reasoning in “imperfect information” games such as poker. More than that, Libratus’s algorithms are general: given the “rules” of any imperfect-information game or scenario, they can work out a strategy on their own. The Carnegie Mellon team hopes its AI could, for example, design drugs to counter viruses that evolve resistance to certain treatments, or perform automated business negotiations. It could also power applications in cybersecurity, military robotic systems, or finance.
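To give a flavor of how such strategies are learned: Libratus builds on counterfactual regret minimization (CFR), whose core update rule is regret matching. The toy sketch below (all names and the opponent’s mix are illustrative, not from Libratus) uses regret matching to learn a counter-strategy to a fixed, biased rock-paper-scissors opponent; real poker solvers apply the same idea across a vast game tree.

```python
# Minimal regret-matching sketch -- the building block of CFR.
# Assumption: a fixed opponent who over-plays rock; Libratus itself
# solves a far larger game via self-play, not a fixed opponent.

ACTIONS = 3  # 0 = rock, 1 = paper, 2 = scissors

def payoff(a, b):
    """+1 if action a beats b, -1 if it loses, 0 on a tie."""
    if a == b:
        return 0
    return 1 if (a - b) % 3 == 1 else -1

def regret_matching(regrets):
    """Mix over actions in proportion to positive accumulated regret."""
    pos = [max(r, 0.0) for r in regrets]
    total = sum(pos)
    return [p / total for p in pos] if total > 0 else [1.0 / ACTIONS] * ACTIONS

def train(opponent, iterations=10000):
    regrets = [0.0] * ACTIONS
    strategy_sum = [0.0] * ACTIONS
    for _ in range(iterations):
        strat = regret_matching(regrets)
        for a in range(ACTIONS):
            strategy_sum[a] += strat[a]
        # Expected payoff of each action vs. the opponent's mixed strategy
        util = [sum(opponent[b] * payoff(a, b) for b in range(ACTIONS))
                for a in range(ACTIONS)]
        node_util = sum(strat[a] * util[a] for a in range(ACTIONS))
        for a in range(ACTIONS):
            regrets[a] += util[a] - node_util  # regret for not playing a
    total = sum(strategy_sum)
    return [s / total for s in strategy_sum]  # time-averaged strategy

# Opponent over-plays rock, so the learner converges to mostly paper.
avg = train(opponent=[0.5, 0.25, 0.25])
```

Against this opponent, paper has the highest expected payoff, so the averaged strategy ends up playing paper almost exclusively.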
Jan Krissler, known in hacker circles as Starbug, had already pulled off the high-profile stunt of cracking Apple’s TouchID sensor within 24 hours of the iPhone 5S release. This time, he used several easily taken close-range photos of German defense minister Ursula von der Leyen, including one gleaned from a press release issued by her own office and another he took himself from three meters away, to reverse-engineer her fingerprint and pass biometric scans.
The same conference also saw a demonstration of a “corneal keylogger”. The idea behind the attack is simple: a hacker may have access to a user’s phone camera, but nothing else. How do you go from there to stealing all their passwords?
One way, demonstrated on stage, is to read what they’re typing by analyzing photographs of the reflections in their eyes. Smartphone cameras, even front-facing ones, are now high-resolution enough that such an attack is possible.
“Biometrics are not secrets… Ideally, they’re unique to each individual, but that’s not the same thing as being a secret.”
PIX is a performance tuning and debugging tool for game developers that hadn’t seen a desktop release in years. It lived on across three generations of Xbox consoles, but the desktop got no love. No longer! Microsoft just announced that a PIX beta is now available for analyzing DirectX 12 games on Windows.
PIX on Windows provides five main modes of operation:
GPU captures for debugging and analyzing the performance of Direct3D 12 graphics rendering.
Timing captures for understanding the performance and threading of all CPU and GPU work carried out by your game.
Function Summary captures accumulate information about how long each function runs for and how often each is called.
Callgraph captures trace the execution of a single function.
Memory Allocation captures provide insight into the memory allocations made by your game.
Ever want to see what’s really going on inside a combustion engine, like the one in your car? Warped Perception built a custom transparent acrylic head for a four-stroke engine so we can see what internal combustion looks like as it happens. He fed the engine different types of fuel, then captured the results in slow motion.
Computer vision algorithms can track and discern individual objects with incredible precision. Check out this footage from Støj, which uses YOLO object detection to identify, isolate, and even censor objects and people in The Wolf of Wall Street trailer.
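Part of what makes detections like these look so clean is a standard post-processing step: detectors in the YOLO family emit many overlapping candidate boxes per object, and non-maximum suppression (NMS) keeps only the highest-scoring box in each overlapping cluster. A minimal pure-Python sketch (box format and thresholds are illustrative):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(boxes, scores, iou_threshold=0.5):
    """Greedy non-maximum suppression; returns indices of kept boxes."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        # Keep a box only if it doesn't heavily overlap an already-kept one
        if all(iou(boxes[i], boxes[j]) < iou_threshold for j in keep):
            keep.append(i)
    return keep

# Two near-duplicate detections of one object, plus one distinct detection:
boxes = [(10, 10, 50, 50), (12, 12, 52, 52), (100, 100, 140, 140)]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))  # -> [0, 2]
```

The second box overlaps the first by more than the threshold and is suppressed, leaving one box per object.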
John Edmark‘s sculptures wiggle and twist before your eyes. It’s hard to believe that what you’re seeing is a real physical object, but we assure you it is, with a bit of trick photography and some heady mathematics thrown in for good measure. Blooms 2, a year in the making, is the latest collection of wild strobe-animated sculptures. Each begins life as a computer program written in Python before being 3D printed and set in motion on a table, but the patterns you see are created, in a sense, by nature itself.
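The “nature” in question is phyllotaxis: Edmark’s blooms are built around the golden angle (about 137.5°), the same spiral spacing seen in sunflowers and pinecones. A short Python sketch of that point placement, in the spirit of (but not taken from) Edmark’s own programs:

```python
import math

# The golden angle, ~137.5 degrees: 2*pi divided by the golden ratio squared.
GOLDEN_ANGLE = math.pi * (3 - math.sqrt(5))

def phyllotaxis(n):
    """Place n points in a sunflower-like spiral: each successive point
    is rotated by the golden angle and pushed outward by sqrt(index)."""
    points = []
    for i in range(n):
        r = math.sqrt(i)            # sqrt spacing keeps point density uniform
        theta = i * GOLDEN_ANGLE    # constant rotation per point
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

pts = phyllotaxis(200)
```

Strobing (or rotating the camera shutter) at one golden-angle step per frame is what makes a static spiral like this appear to bloom and twist.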