Top 10 Airports in the US

How do domestic airports rank? This year, my Midwest home of Indianapolis gets some love as the #2 airport in the US.

  1. Savannah/Hilton Head International Airport
  2. Indianapolis International Airport
  3. Rhode Island T.F. Green International Airport
  4. Palm Beach International Airport
  5. Tampa International Airport
  6. Hartford Bradley International Airport 
  7. Minneapolis–St. Paul International Airport
  8. Long Beach Airport
  9. Portland (Oregon) International Airport
  10. Detroit Metropolitan Airport

Sadly, Portland falls to #9 after having led the list in years past, but things haven't been great at the Portland airport over the last few years. Protesters and counter-protesters assaulted each other a few times, and more recently a woman fired several shots at the airport.

360 VR experiences – black holes

ScienceClic English created this 360 experience of approaching and falling into a black hole. You can move your camera around and see in all directions. This is interesting because it’s a different take on some of the older ideas of what would happen if you fall into a black hole.

If you want to know what you're seeing, here's an explanation of what's going on in this video and what causes the different visual phenomena.

Reading 50 year old rope core memory

Mike Steward decided to recover the original Apollo guidance computer programs that landed man on the Moon in the 1960s. Unfortunately, some of them seem to have been lost to history.

It turns out, chunks of the original hardware still exist – such as the rope core memory modules that contained the programs. The next question is: how do you read those programs off 50-year-old rope core memory hardware? The video below shows how he did it!

He even wrote a web app that simulates how core memory works. We do a decent job recording history’s events, but I think it’s extremely cool that this kind of historical technical information is not being lost to the ages.
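His simulator is a web app, but the underlying trick is simple enough to sketch in a few lines. As a rough illustration (my own toy model, not his code, and greatly simplified from the real AGC hardware): each bit of a stored word is determined by whether that bit's sense wire is threaded through a given core or routed around it.

```python
# Toy model of rope core read-out (illustrative only, not the real AGC layout).
# Each core holds one word; a sense wire threaded through the core reads 1,
# a wire that bypasses it reads 0.

WORD_BITS = 16  # simplified; AGC words were actually 15 bits plus parity

def weave(words):
    """'Manufacture' the rope: for each word, record which sense wires
    (bit positions) are threaded through that word's core."""
    return [{bit for bit in range(WORD_BITS) if (word >> bit) & 1} for word in words]

def read(rope, address):
    """'Pulse' the core at `address`: threaded sense wires pick up the pulse (1),
    bypassing wires stay quiet (0). Reassemble the word from those pulses."""
    return sum(1 << bit for bit in rope[address])

program = [0o30001, 0o04025, 0o56666]  # arbitrary example words, written in octal
rope = weave(program)
assert [read(rope, a) for a in range(len(program))] == program
```

Because the bits are literally woven into the hardware at manufacture time, reading a module back decades later comes down to sensing those pulses without damaging anything – which is the problem the video walks through.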

You can also check out the many other videos about the Apollo guidance computer in the rest of his series – or a previous article with a super-awesome description by Robert Wills of how the Apollo computers work.

Bored Ape NFT party severely injures attendees

A number of people reported eye pain, vision problems, and sunburnt skin after attending ApeFest, a Bored Ape Yacht Club NFT event held November 3–5 in Hong Kong. One attendee posted that he woke up at 4am and couldn't see anymore; he was rushed to the hospital and diagnosed with UV eye damage, though he's hoping to make a full recovery. It turns out someone was almost certainly using full-spectrum UV-C (often called germicidal) lights instead of black lights.

The same thing happened at a Hypebeast event in Hong Kong in 2017, so this kind of harm isn't unique. As with the Jägermeister event that left a person in a coma: you'd better know your science.

BigClive on YouTube (who does amazing videos about extremely dangerous and counterfeit electronic devices you can buy and should be cautious of) recently uploaded a video pinpointing the likely culprit: in a photo of the toilet area, there are fluorescent tubes glowing the characteristic teal-blue of a mercury-vapor discharge, which emits quite a bit of UV-C (and ozone as well).

How much radiation?

I wrote a little while back about radiation from nuclear sources and how to detect it. But what about voluntarily getting radiation from medical procedures like X-rays, CT scans, and MRIs?

Annual background radiation

Did you know you're getting about 3.0 mSv of radiation every year if you live in the US? That breaks down to about 0.0082 mSv per day.

There's actually quite a bit of variation in the US depending on where you live. The types of rock in your area, your position relative to the planet's poles, your elevation, and a wide variety of other factors can affect your daily background radiation dose.

| Exposure | Approximate effective radiation dose | Comparable to natural background radiation for |
| --- | --- | --- |
| One day's background radiation | 0.0082 mSv | 1 day |
| A year of background radiation (US average) | 3.0 mSv | 1 year (365 days) |
| Cross-country flight from New York to Los Angeles | 0.04 mSv | 4.87 days |

X-rays

X-rays are something most people are familiar with. It turns out, however, that modern X-rays give you a very low dose. The lowest doses come from dental X-rays: a full panoramic dental X-ray is about 0.007 mSv (0.7 mrem). How little is that? Less than a single day's background radiation (about 0.0082 mSv in the US, depending on where you live).

Still, there does appear to be a link between dental X-rays and certain thyroid, laryngeal, parotid gland, and salivary gland cancers.

Moving up, a full chest X-ray gives you about a 0.1 mSv dose. That's roughly 10 days' worth of background radiation.

MRIs

MRIs use magnetic imaging, so they don't use ionizing radiation at all. They generate their images with a very powerful magnet and radio-frequency pulses.

Is it perfectly safe? Well, MRIs can be done with contrast to help identify certain issues, and those chemicals can carry risk. Chuck Norris (yes, the Chuck Norris) even filed a $10 million lawsuit after his wife experienced health issues from the contrast used in an MRI.

CT scans

CT scans are particularly troublesome because they give you a pretty substantial dose of radiation. How much? For comparison, a chest X-ray gives you the equivalent of about 10 days of natural background radiation (0.1 mSv) – a very low dose that's highly unlikely to cause permanent or long-term damage.

A chest CT scan, on the other hand, gives you about 2.6 YEARS of background radiation – the equivalent of 77 chest X-rays. See some examples below, or click through to see even more doses for different body parts:

If that weren't bad enough, CTs often also involve contrast chemicals that may carry their own risks.
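The arithmetic behind all of these comparisons is just dividing a procedure's effective dose by the background dose over some period. Here's a quick sketch using the US-average figures above (the 7.7 mSv chest-CT dose is my back-calculation from the "77 chest X-rays" comparison, not a number from any dosage chart):

```python
# Convert an effective dose (mSv) into "equivalent time spent absorbing
# normal background radiation", using the US-average figures quoted above.

ANNUAL_BACKGROUND_MSV = 3.0
DAILY_BACKGROUND_MSV = ANNUAL_BACKGROUND_MSV / 365  # ~0.0082 mSv/day

def background_days(dose_msv: float) -> float:
    return dose_msv / DAILY_BACKGROUND_MSV

def background_years(dose_msv: float) -> float:
    return dose_msv / ANNUAL_BACKGROUND_MSV

chest_xray = 0.1             # mSv
chest_ct = 77 * chest_xray   # 7.7 mSv, implied by the 77x comparison above

print(f"Chest X-ray ≈ {background_days(chest_xray):.0f} days of background")   # ~12 with these round numbers (commonly quoted as ~10)
print(f"Chest CT    ≈ {background_years(chest_ct):.1f} years of background")   # ~2.6
print(f"Chest CT    ≈ {chest_ct / chest_xray:.0f} chest X-rays")               # 77
```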

Beyond first class

First class has seen some pretty big improvements over the last few years, but that's small peanuts these days.

The new A380s offer suites – but even those, with their full-sized recliners and full beds in a private room, don't really touch what's available.

Etihad Airways started offering something they call The Residence on some of their highest-end flights: a private airport entrance with concierge service, plus a private three-room suite on the plane with a living room, double bedroom, and full bathroom with shower. Add large-screen TVs, cognac and turndown service, and private high-end meals, and you have a recipe for luxury. The price? $20,000 per ticket (vs. $5,000 for first class).

2023 Update: Unfortunately, it seems The Residence was largely discontinued when Etihad retired most of their A380s during COVID. Nonstop Dan shows why they ran into problems selling it – not because there weren't customers, but because people who can afford that kind of luxury usually find it's barely a better deal than chartering a whole private jet. A one-way private jet from Abu Dhabi to London costs about $40,000 – roughly the same as two Residence tickets. So if you're flying with at least one other person, the private jet lets you bring many more friends or business partners, gives you far more scheduling flexibility, and avoids big airports altogether.

Bonus points for mentioning that Abu Dhabi is a huge hub because it sits within about a six-hour flight of most of the world's population – from China to Europe.

Chihuahua or muffin

freeCodeCamp compares various AI-based image recognizers to see how well they can tell whether a picture is a chihuahua or a muffin. It's harder than you'd think, and the task has a history of being used to gauge the quality of a recognizer.

The author compares offerings from Amazon, Microsoft, IBM, Google, Cloudsight, and Clarifai, and discusses per-image cost as well as the quality of the tags and other considerations. Definitely worth a look if you're trying to choose an image classification system.

Final results are on Topbots.
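If you want to run this kind of bake-off yourself, the core of it is just sending the same labeled images to each recognizer and scoring whether the right class shows up in its top labels. Here's a minimal sketch – the `classify` function below is a placeholder of my own for whichever vendor SDK you wire in, not any specific API:

```python
from pathlib import Path

def classify(image_path: str) -> list[tuple[str, float]]:
    """Placeholder: call the vision API under test and return
    a list of (label, confidence) pairs, best first."""
    raise NotImplementedError("wire this up to the recognizer you're evaluating")

def score_recognizer(image_dir: str, top_k: int = 5) -> float:
    """Images named like 'chihuahua_01.jpg' or 'muffin_07.jpg'; a prediction
    counts as correct if the true class appears anywhere in the top-k labels."""
    images = sorted(Path(image_dir).glob("*.jpg"))
    correct = 0
    for img in images:
        truth = img.stem.split("_")[0]  # "chihuahua" or "muffin"
        labels = [label.lower() for label, _ in classify(str(img))[:top_k]]
        correct += any(truth in label for label in labels)
    return correct / len(images)

# Example: print(f"accuracy: {score_recognizer('chihuahua_vs_muffin'):.0%}")
```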

Photogrammetry/NeRF/Gaussian splatting compared

Matthew Brennan is not a computer scientist, but he takes 335 frames from a video and processes them three different ways to compare the results: he builds a 3D mesh with photogrammetry, then processes the frames into a NeRF, and finally into a Gaussian splat.

What’s cool is that he shows how each works and how to process the data yourself. He also gives you access to the data to try it yourself.
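To make the comparison concrete, here's a rough sketch of what each of the three outputs actually stores (the field names are mine, simplified for illustration): photogrammetry produces an explicit textured mesh, a NeRF is an implicit function you query along camera rays, and Gaussian splatting stores lots of explicit colored blobs.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Mesh:
    """Photogrammetry output: explicit surface geometry with a photo-derived texture."""
    vertices: np.ndarray      # (V, 3) points in space
    faces: np.ndarray         # (F, 3) indices into vertices
    uvs: np.ndarray           # (V, 2) texture coordinates

@dataclass
class NeRF:
    """Neural radiance field: an implicit scene. A trained network maps
    (position, view direction) -> (color, density); images are rendered by
    sampling and integrating along each camera ray."""
    network_weights: dict

@dataclass
class GaussianSplatScene:
    """3D Gaussian splatting: explicit point primitives rasterized directly."""
    means: np.ndarray         # (N, 3) centers of the Gaussians
    covariances: np.ndarray   # (N, 3, 3) size/orientation of each blob
    colors: np.ndarray        # (N, 3) base colors (real implementations use spherical harmonics)
    opacities: np.ndarray     # (N,)
```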

Here’s the software he uses:

How will we know when AI gains consciousness?

exurb1a hypothesizes about some of the very real horrors that current social media bots are capable of. Pro tip: get the HECK off social media and stop trusting anything you read there, because this stuff has already been happening on every social media, dating, review, and news feed app since even before the 2016 election.

His speculations? Perhaps AI personas will become so realistic and comforting that we'll stop interacting with each other – and spend our lives conversing with, and in relationships with, non-entities.

Or (as is already happening) governments, extremist groups, media outlets, and intelligence agencies weaponize AI to flood the internet with manipulated stories, data, and opinions. Finally (as if becoming unable to form real relationships and instead having them with AI isn't scary enough), he asks what happens if AI itself becomes conscious.

One of the main reasons this would be terrifying is that right now we have no way to ensure an AI is aligned to any set of values.

When an AI becomes able to mimic humans so well that it can convince anyone of anything, even talking to it becomes infinitely dangerous. We could have just created an almost infinitely hyper-intelligent demon, trickster, and sociopath.

See how deep the rabbit hole goes – the majority of the possible outcomes are not good.

Early AI was more like a therapist

ELIZA was an early 'AI' created by MIT scientist Joseph Weizenbaum between 1964 and 1967.

He created it to explore communication between humans and machines. ELIZA simulated conversation using very simple pattern matching and substitution, giving users an illusion of understanding – but it had no internal representation that could be considered real understanding of what either party was saying, something you can easily discover by playing with it for a few minutes.
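To get a feel for just how little machinery was behind that illusion, here's a toy ELIZA-style responder in a few lines of Python (a sketch in the spirit of Weizenbaum's pattern/substitution approach, not his actual script):

```python
import re
import random

# Flip first-person words so fragments of the input can be echoed back naturally.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you", "your": "my", "you": "I"}

# A few ELIZA-style rules: a pattern to match, and canned responses that
# substitute a reflected chunk of whatever the user typed.
RULES = [
    (r"i need (.*)", ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (r"i am (.*)",   ["Why do you think you are {0}?", "How long have you been {0}?"]),
    (r"my (.*)",     ["Tell me more about your {0}.", "Why does your {0} concern you?"]),
    (r"(.*)",        ["Please tell me more.", "How does that make you feel?"]),
]

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(user_input: str) -> str:
    for pattern, responses in RULES:
        match = re.match(pattern, user_input.lower())
        if match:
            return random.choice(responses).format(*[reflect(g) for g in match.groups()])
    return "Please go on."

print(respond("I need a vacation"))   # e.g. "Why do you need a vacation?"
print(respond("I am feeling tired"))  # e.g. "How long have you been feeling tired?"
```

That's essentially the whole trick: no model of the world, just pattern matching and reflected substitutions.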

Fast forward to 1991, and Creative Labs was having amazing success with their SoundBlaster add-on sound cards. On the driver disks that came with the SoundBlaster, there were programs showing off different capabilities. One of these capabilities was voice generation. To show off the ability to voice synthesize text, Creative Labs included a little program called Dr. Sbaitso (SoundBlaster Acting Intelligent Text-to-Speech Operator).

You interacted with it like a pseudo-therapist, and you can clearly see the connection to the pattern/substitution methods ELIZA used. I remember being wowed when I played with it for the first time and experimenting with it for hours. It quickly shows its limitations, but the speech synthesis was very good for the time.

It doesn't stand the test of time, but it's pretty neat, and you can even check it out here:

https://classicreload.com/dr-sbaitso.html#