
Wealth distribution in the United States


Forbes recently published the Forbes 400 List for 2024, listing the 400 richest people in the United States. This inspired me to make a histogram to show the distribution of wealth in the United States. It turns out that if you put Elon Musk on the graph, almost the entire US population is crammed into a vertical bar, one pixel wide. Each pixel is $500 million wide, illustrating that $500 million essentially rounds to zero from the perspective of the wealthiest Americans.

[Graph showing the wealth distribution in the United States]

The histogram above shows the wealth distribution in red. Note that the visible red line is one pixel wide at the left and disappears everywhere else—this is the important point: essentially the entire US population is in that first bar. The graph is drawn at a scale of 1 pixel = $500 million on the X axis and 1 pixel = 1 million people on the Y axis. Away from the origin, the red line is invisible—a tiny fraction of a pixel tall, since so few people have more than $500 million.

Since the median US household wealth is about $190,000, half the population would be crammed into a microscopic red line 1/2500 of a pixel wide using the scale above. (The line would be much narrower than the wavelength of light so it would be literally invisible). The very rich are so rich that you could take someone with a thousand times the median amount of money, and they would still have almost nothing compared to the richest Americans. If you increased their money by a factor of a thousand yet again, you'd be at Bezos' level, but still well short of Elon Musk.
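
To make that scale concrete, here is a back-of-envelope check of the arithmetic as a Python sketch (the physical pixel size is my assumption, not from the post):

```python
# Back-of-envelope check of the histogram's scale:
# 1 pixel = $500 million on the X axis.
PIXEL_DOLLARS = 500_000_000
median_household_wealth = 190_000          # approx. 2022 Federal Reserve figure

frac = median_household_wealth / PIXEL_DOLLARS
print(f"median wealth spans 1/{1 / frac:.0f} of a pixel")   # -> 1/2632, ~1/2500

# Assuming a typical screen pixel is ~0.3 mm across, that sliver is
# about 100 nanometers wide: narrower than the wavelength of visible light.
print(f"{0.3e-3 * frac * 1e9:.0f} nm")                      # -> ~114 nm
```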

Another way to visualize the extreme distribution of wealth in the US is to imagine everyone in the US standing up while someone counts off millions of dollars, once per second. When your net worth is reached, you sit down. At the first count of $1 million, most people sit down, with 22 million people left standing. As the count continues—$2 million, $3 million, $4 million—more people sit down. After 6 seconds, everyone except the "1%" has taken their seat. As the counting approaches the 17-minute mark, only billionaires are left standing, but there are still days of counting ahead. Bill Gates sits down after a bit over one day, leaving 8 people, but the process is nowhere near the end. After about two days and 20 hours of counting, Elon Musk finally sits down.
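
As a sanity check on those times, here is a small sketch (the net worths are rough Forbes 400 2024 estimates, and the "1%" threshold is the value implied by the six-second figure above):

```python
# The "count off $1 million per second" thought experiment.
from datetime import timedelta

net_worths = {
    "the '1%' threshold": 6e6,      # implied by the six-second figure above
    "a billionaire":      1e9,
    "Bill Gates":         107e9,    # rough Forbes 400 2024 estimate
    "Elon Musk":          244e9,    # rough Forbes 400 2024 estimate
}

for name, worth in net_worths.items():
    seconds = worth / 1_000_000     # $1 million counted per second
    print(f"{name:>19} sits down after {timedelta(seconds=round(seconds))}")
# -> about 17 minutes for the first billionaires, a bit over a day for
#    Gates, and about two days and 20 hours for Musk.
```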

Sources

The main source of data is the Forbes 400 List for 2024. Forbes claims there are 813 billionaires in the US. Median wealth data is from the Federal Reserve; note that it is from 2022 and measures households rather than individuals. The current US population estimate is from Worldometer. I estimated the population with wealth above $500 million by extrapolating from 2019 data.

I made a similar graph in 2013; see that post for comparison.

Disclaimers: Wealth data has a lot of sources of error including people vs households, what gets counted, and changing time periods, but I've tried to make this graph as accurate as possible. I'm not making any prescriptive judgements here, just presenting the data. Obviously, if you want to see the details of the curve, a logarithmic scale makes more sense, but I want to show the "true" shape of the curve. I should also mention that wealth and income are very different things; this post looks strictly at wealth.


Lee Pace and Josh Brolin Will Make Glen Powell’s Life Miserable in Edgar Wright’s The Running Man


Would you run all that fast if Lee Pace were chasing you?

Published on October 18, 2024

Lee Pace in Foundation (Screenshot: Apple TV+)

Edgar Wright’s take on The Running Man, the novel by Stephen King writing as Richard Bachman, has been in the works since the long-ago year of 2021. But in recent months the adaptation has been moving forward with a certain swiftness, casting Glen Powell as the titular Man and later adding Daniel Ezra (A Discovery of Witches) and Katy O’Brian (The Mandalorian) to the cast.

But no reality-game-show-type story is complete without a ruthless villain or two, and now we know who will play those roles in this film: two Marvel bad guys. Thanos himself, Josh Brolin, is set to play the “ruthless producer” of the murderous game show. And Lee Pace—who once glowered frightfully as Ronan the Accuser—will play a character The Hollywood Reporter describes as “the brutal chief hunter for the network airing the game shows and tasked by the producer with tracking down Powell’s character.”

Pace has a history of morally complex characters, from the depressed, storytelling stuntman in the just-rereleased The Fall to cranky, elk-riding elf king Thranduil to… well, okay, Foundation’s clone emperor (pictured above) isn’t morally complex so much as morally bankrupt. (He did go for that soul-searching walk in the desert, at any rate.) This is an interesting choice for Pace, though, as he had a sort of similar henchman role as Ronan. Or maybe there will be more to his hunter than the plot summary suggests.

The book The Running Man takes place in a terrible dystopia (the year 2025, natch) in which people compete in a deadly reality show in order to earn money. Ben Richards, the main character, wants to save his gravely ill daughter, so while his wife turns to prostitution, he signs up for the show, which means running from hunters who are dead set on killing him. A not-terribly-true-to-the-book movie adaptation was released in the ’80s and starred Arnold Schwarzenegger.

The new version, from Shaun of the Dead director Wright, is due in theaters on November 21st, 2025.



Feds open their 14th Tesla safety investigation, this time for FSD


Today, federal safety investigators opened a new investigation aimed at Tesla's electric vehicles. This is now the 14th investigation by the National Highway Traffic Safety Administration and one of several currently open. This time, it's the automaker's highly controversial FSD feature that's in the crosshairs—NHTSA says it now has four reports of Teslas using FSD and then crashing after the camera-only system encountered fog, sun glare, or airborne dust.

Of the four crashes that sparked this investigation, one caused the death of a pedestrian when a Model Y crashed into them in Rimrock, Arizona, in November 2023.

NHTSA has a standing general order that requires it to be told if a car crashes while operating under partial or full automation. Fully automated or autonomous cars are the ones that might be termed "actually self-driving," such as the Waymos and Zooxes that clutter up the streets of San Francisco. Festooned with dozens of exterior sensors, these four-wheel testbeds drive around—mostly empty of passengers—gathering data to train themselves with later, with no human supervision. (This is also known as SAE level 4 automation.)

But the systems that come in cars that you or I could buy are far less sophisticated. Sometimes called "level 2+," these systems (which include Tesla Autopilot, Tesla FSD, GM's Super Cruise, BMW Highway Assistant, and Ford BlueCruise, among others) are partially automated, not autonomous. They will steer, accelerate, and brake for the driver, and they may even change lanes without explicit instruction, but the human behind the wheel is always meant to be in charge, even if the car is operating in a hands-free mode.

(Yes, there is also a level 3, but so far it is only available on a small number of Mercedes-Benz vehicles and just in California and Nevada.)

The investigation seeks to determine FSD's ability to "detect and respond appropriately to reduced roadway visibility conditions." Unlike almost every other system deployed on the road, Tesla relies on cameras alone; rather than a stereoscopic setup, it uses wide, main, and narrow-angle forward-looking cameras. And hundreds of thousands of older Teslas have less capable hardware yet are still able to run FSD.

NHTSA will also determine whether there are any other low-visibility crashes similar to the four it already knows about, as well as any updates or tweaks Tesla has made to the system, in particular "the timing, purpose, and capabilities of any such updates, as well as Tesla’s assessment of their safety impact."

This one could be costly

The stakes are high for Tesla. If NHTSA determines that the company's camera-only strategy isn't capable of delivering on the promises repeatedly made by Tesla CEO Elon Musk, it can force the automaker to issue a recall. This could involve having to retrofit the cars with new hardware at great expense or require disabling FSD, which would deprive Tesla of a critical revenue stream and perhaps even force its investors to come to terms with reality.

The company's valuation has long since entered meme-stock status, decoupled from the fundamentals of the underlying business of making and selling EVs. But after last week's "emperor's new clothes" robotaxi reveal, there are signs that the market is finally starting to pay at least a little attention.


Silicon Valley is sacrificing the climate for AI


The generative AI hype cycle has been revelatory for many reasons. The tech industry’s dependence on boom and bust cycles to drive investment based on inflated promises was put on full display. Some of the obscure beliefs held by tech billionaires had a light shone on them, particularly how determined they are to believe computers will one day be able to replicate human thought patterns. But it also showed how willing they are to sacrifice the rest of us on the altar of their technological ambitions.

Earlier this month, we saw yet another example of that. On October 1, the Special Competitive Studies Project held its inaugural AI+Energy Summit. The think tank is funded by former Google CEO Eric Schmidt to promote his goal of getting the US military to aggressively adopt new technologies like artificial intelligence (AI). On stage at the conference, Schmidt was asked whether his ambitions for AI adoption could be squared with the need to meet our climate targets. His answer was shocking, but also confirmation of a pattern we’ve been seeing from Silicon Valley leaders.

After calling AI a “universal technology” that can be put to virtually any use, he asserted that “we’re not going to hit the climate goals anyway,” effectively arguing the tech industry should be off the hook for its rising emissions. Instead, he said we face two scenarios: one where we breach emissions targets without AI and another where we emit even more, but at least we have that supposed “alien intelligence” to help us.

“I’d rather bet on AI solving the problem than constraining it and having the problem,” he concluded, never accepting that a third scenario, one where we take the climate crisis seriously without generative AI, could even be considered. His comments are an example of the tech industry’s evolving stance on climate change: either new technologies will solve it, or the planet will be left to warm, regardless of the consequences.


Cheap AI “video scraping” can now extract data from any screen recording


Recently, AI researcher Simon Willison wanted to add up his charges from using a cloud service, but the payment values and dates he needed were scattered among a dozen separate emails. Inputting them manually would have been tedious, so he turned to a technique he calls "video scraping," which involves feeding a screen recording video into an AI model, similar to ChatGPT, for data extraction purposes.

What he discovered seems simple on its surface, but the quality of the result has deeper implications for the future of AI assistants, which may soon be able to see and interact with what we're doing on our computer screens.

"The other day I found myself needing to add up some numeric values that were scattered across twelve different emails," Willison wrote in a detailed post on his blog. He recorded a 35-second video scrolling through the relevant emails, then fed that video into Google's AI Studio tool, which allows people to experiment with several versions of Google's Gemini 1.5 Pro and Gemini 1.5 Flash AI models.

Willison then asked Gemini to pull the price data from the video and arrange it into a special data format called JSON (JavaScript Object Notation) that included dates and dollar amounts. The AI model successfully extracted the data, which Willison then formatted as a CSV (comma-separated values) table for spreadsheet use. After double-checking for errors as part of his experiment, the accuracy of the results—and what the video analysis cost to run—surprised him.

A screenshot of Simon Willison using Google Gemini to extract data from a screen capture video.
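
For readers who want to try the same thing programmatically rather than through the AI Studio web interface, a minimal sketch with the google-generativeai Python SDK might look like this (the file name and prompt are illustrative, not Willison's exact ones):

```python
# Minimal video-scraping sketch using the google-generativeai SDK.
import time
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")          # placeholder

# Upload the screen recording and wait for server-side processing.
video = genai.upload_file(path="email-scroll.mp4")
while video.state.name == "PROCESSING":
    time.sleep(2)
    video = genai.get_file(video.name)

model = genai.GenerativeModel("gemini-1.5-flash-002")
response = model.generate_content([
    video,
    "Extract every payment shown in this video as a JSON array of "
    '{"date": "YYYY-MM-DD", "amount_usd": number} objects.',
])
print(response.text)   # JSON that can then be converted to CSV
```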

"The cost [of running the video model] is so low that I had to re-run my calculations three times to make sure I hadn’t made a mistake," he wrote. Willison says the entire video analysis process ostensibly cost less than one-tenth of a cent, using just 11,018 tokens on the Gemini 1.5 Flash 002 model. In the end, he actually paid nothing because Google AI Studio is currently free for some types of use.
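
The arithmetic behind that figure is simple enough to sketch; the per-token price below is an assumption based on Gemini 1.5 Flash's published pricing around that time, and prices change often:

```python
# Back-of-envelope cost of the video analysis.
tokens = 11_018                    # from Willison's experiment
price_per_million = 0.075          # USD per 1M input tokens (assumed)

cost = tokens / 1_000_000 * price_per_million
print(f"${cost:.6f}")              # -> $0.000826, well under a tenth of a cent
```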

Video scraping is just one of many new tricks possible when the latest large language models (LLMs), such as Google's Gemini and GPT-4o, are actually "multimodal" models, allowing audio, video, image, and text input. These models translate any multimedia input into tokens (chunks of data), which they use to make predictions about which tokens should come next in a sequence.
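
Continuing the SDK sketch above (reusing the `video` handle and `genai` import), you can even ask the model how many tokens a given input turns into before running any prompt over it:

```python
# Count how many tokens the uploaded video becomes; no generation cost yet.
model = genai.GenerativeModel("gemini-1.5-flash-002")
print(model.count_tokens([video]))   # total_tokens on the order of Willison's 11,018
```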

A term like "token prediction model" (TPM) might be more accurate than "LLM" these days for AI models with multimodal inputs and outputs, but a generalized alternative term hasn't really taken off yet. But no matter what you call it, having an AI model that can take video inputs has interesting implications, both good and potentially bad.

Breaking down input barriers

Willison is far from the first person to feed video into AI models to achieve interesting results (more on that below, and here's a 2015 paper that uses the "video scraping" term), but as soon as Gemini launched its video input capability, he began to experiment with it in earnest.

In February, Willison demonstrated another early application of AI video scraping on his blog, where he took a seven-second video of the books on his bookshelves, then got Gemini 1.5 Pro to extract all of the book titles it saw in the video and put them in a structured, or organized, list.

Converting unstructured data into structured data is important to Willison, because he's also a data journalist. Willison has created tools for data journalists in the past, such as the Datasette project, which lets anyone publish data as an interactive website.

To every data journalist's frustration, some sources of data prove resistant to scraping (capturing data for analysis) due to how the data is formatted, stored, or presented. In these cases, Willison delights in the potential for AI video scraping because it bypasses these traditional barriers to data extraction.

"There's no level of website authentication or anti-scraping technology that can stop me from recording a video of my screen while I manually click around inside a web application," Willison noted on his blog. His method works for any visible on-screen content.

Video is the new text


The ease and effectiveness of Willison's technique reflect a noteworthy shift now underway in how some users will interact with token prediction models. Rather than requiring a user to manually paste or type in data in a chat dialog—or detail every scenario to a chatbot as text—some AI applications increasingly work with visual data captured directly on the screen. For example, if you're having trouble navigating a pizza website's terrible interface, an AI model could step in and perform the necessary mouse clicks to order the pizza for you.

In fact, video scraping is already on the radar of every major AI lab, although they are not likely to call it that at the moment. Instead, tech companies typically refer to these techniques as "video understanding" or simply "vision."

In May, OpenAI demonstrated a prototype version of its ChatGPT Mac App with an option that allowed ChatGPT to see and interact with what is on your screen, but that feature has not yet shipped. Microsoft demonstrated a similar "Copilot Vision" prototype concept earlier this month (based on OpenAI's technology) that will be able to "watch" your screen and help you extract data and interact with applications you're running.

Despite these research previews, OpenAI's ChatGPT and Anthropic's Claude have not yet implemented a public video input feature for their models, possibly because it is relatively computationally expensive for them to process the extra tokens from a "tokenized" video stream.

For the moment, Google is heavily subsidizing user AI costs with its war chest from Search revenue and a massive fleet of data centers (to be fair, OpenAI is subsidizing, too, but with investor dollars and help from Microsoft). But costs of AI compute in general are dropping by the day, which will open up new capabilities of the technology to a broader user base over time.

Countering privacy issues

As you might imagine, having an AI model see what you do on your computer screen can have downsides. For now, video scraping is great for Willison, who will undoubtedly use the captured data in positive and helpful ways. But it's also a preview of a capability that could later be used to invade privacy or autonomously spy on computer users on a scale that was once impossible.

A different form of video scraping caused a massive wave of controversy recently for that exact reason. Apps such as the third-party Rewind AI on the Mac and Microsoft's Recall, which is being built into Windows 11, operate by feeding on-screen video into an AI model that stores extracted data into a database for later AI recall. Unfortunately, that approach also introduces potential privacy issues because it records everything you do on your machine and puts it in a single place that could later be hacked.

To that point, although Willison's technique currently involves uploading a video of his data to Google for processing, he is pleased that he can still decide what the AI model sees and when.

"The great thing about this video scraping technique is that it works with anything that you can see on your screen... and it puts you in total control of what you end up exposing to the AI model," Willison explained in his blog post.

It's also possible in the future that a locally run open-weights AI model could pull off the same video analysis method without the need for a cloud connection at all. Microsoft Recall runs locally on supported devices, but it still demands a great deal of unearned trust. For now, Willison is perfectly content to selectively feed video data to AI models when the need arises.

"I expect I’ll be using this technique a whole lot more in the future," he wrote, and perhaps many others will, too, in different forms. If the past is any indication, Willison—who coined the term "prompt injection" in 2022—seems to always be a few steps ahead in exploring novel applications of AI tools. Right now, his attention is on the new implications of AI and video, and yours probably should be, too.


If You Bought A New Audi E-Tron Or Mercedes EQS: I’m So Sorry


Elise and I recently went on a hike with another couple, and on our way up the mountain I learned from the guy all about how his car had lost $50 grand in value over just a few years. I didn’t think much of it; lots of cars have seen heavy depreciation, especially lately, so I figured he was just exaggerating. But then, when I got home, I recalled the conversation and decided to look up his car: a 2019 Audi E-Tron. And my god was I shocked by what I saw.

Depreciation is a part of life if you buy a new car, with a few exceptions like the Jeep Wrangler, Toyota Tacoma, and pretty much anything bought just before COVID and sold during that car-market nightmare. Among the worst cars when it comes to depreciation are expensive European cars. Buy a new Mercedes S-Class, for example, and you can expect to lose many tens of thousands of dollars in a really short span.

Another segment of the car market that has seen humongous depreciation is electric cars. So what if you blend 1. Expensive German Car with 2. Electric Car? Well, you get depreciation To The Max.

[Image: Audi]

The headline of this article isn’t meant to be a joke, because people losing tens of thousands of dollars is no laughing matter. It can have a huge effect on someone’s livelihood if they end up buying a car whose value tanks just before they have to sell. On one hand, the cars that are depreciating worst are the ones purchased by folks who, at least in theory, can most afford it. On the other hand, I can see how this could blindside someone.

I mean, just look at the reviews of the Audi E-Tron when it came out in 2019; anybody would have thought they were buying a state-of-the-art machine. Motor Trend’s headline was “2019 Audi E-Tron Review: What a Way to Glide” and its subheading was “The EV wars are starting in earnest, and Audi has itself a real weapon.” Here’s how Motor Trend compared the E-Tron to others in its class, even describing it as “affordable”:

In the showroom wars, the e-tron’s primary enemies are the aforementioned I-Pace and Tesla’s Model X, as well as Mercedes’ upcoming EQC. A little smaller and pricier but quicker and more responsive, the Jag boasts an EPA range of 234 miles. The Model X is by far the costliest of the bunch—when you add desirable options it can soar well past $100K—but it’s also by far the quickest, as it can sprint from zero to 60 mph in 2.8 seconds with the extra-cost Ludicrous Mode. The Tesla also leads with a maximum claimed range of 325 miles. The e-tron, in contrast, is the most “normal” of the trio. Excepting the Mercedes, which starts at $68,895, it’s the most affordable with a base sticker of $75,795, offers a generous 57 cubic feet of cargo space with the rear seats folded down, and while it may not deliver the sizzling straight-line acceleration of the Model X or the halfback-like chassis moves of the I-Pace, it’s designed to charge quickly, glides over the road with unfailing refinement, and is built with battery longevity and unflagging performance as priorities.

CNET’s review was similarly glowing:

In that spirit, today I’d like to celebrate the $74,800 Audi E-Tron, a car I’ve been appreciating for nearly two months now. Audi’s first production electric car takes a subtly different but distinctive path to all-electric glory. There’s nothing ludicrous about this EV SUV and, frankly, I couldn’t care less what its Nurburgring lap time is. What I do know is that this is among the most comfortable, most soothing cars I’ve ever had the privilege of driving, and that makes it something special.

[…]
Configured this way, at $77,290 including destination, this is not a cheap car. But it offers luxury appointments on par with similarly priced premium machines, plus the added benefits of that smooth, quiet, maintenance-free EV lifestyle.

Reading these reviews, you might think the E-Tron, even at over $75 grand, was a good choice! It’s comfortable, efficient, luxurious, and it really doesn’t cost that much more than its competition.

At the time, this might have been right. But oh how things change quickly in the EV world. In fact, Car and Driver’s review was one that I think offered a bit of foreshadowing:

It’s hard to argue with this practicality or with Audi’s wholly rational approach to building an EV. The e-tron is a competent, well-engineered piece that makes few compromises compared to Audi’s gasoline-powered SUVs. But at this point, buying an electric car—especially one that starts at $75,795—is still a bold, somewhat irrational choice, a decision to go against the grain…. But we’re not far enough into the EV era to know what’s right and wrong….

The Tesla Model Y launched just a few months after the Audi E-Tron, and then as other competitors like the Kia EV6, Hyundai Ioniq 5, and a boatload more joined in on the fun, prices tanked. Some of this is a result of EVs being seen as appliances whose value is determined predominantly by a single attribute (range), some of it is a result of early adopters having already bought their EVs and skeptics hesitating to make the plunge given infrastructure issues, part of it is a result of political uncertainty/rebates, and part of it is a result of the crazy price-cuts from Tesla.

In any case, look at what a 2019 Audi E-Tron — whose MSRP was $75,795 for the Premium Plus model and $82,795 for the Prestige model — costs nowadays:

[Screenshots: Auto Trader listings of used 2019 E-Trons]

Those cars only have about 50,000 miles on them, meaning they’ve lost over a dollar a mile! My god, a dollar a mile. If I knew my car would lose a dollar of value every mile I drove it, I’d sell it immediately. Check out the value trend over time:

[Screenshot: 2019 E-Tron value trend over time]
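
The per-mile figure above is easy to verify; in this sketch the asking price is an assumption read off the listings, not an exact number:

```python
# Rough per-mile depreciation for a 2019 E-Tron Premium Plus.
msrp = 75_795          # 2019 MSRP from the article
asking = 24_000        # assumed current asking price from the listings above
miles = 50_000         # approximate odometer reading

print(f"${(msrp - asking) / miles:.2f} lost per mile")   # -> $1.04
```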

Especially in the last couple of years, after the post-pandemic price-jump, the market has not been friendly to the E-Tron:

[Screenshot: E-Tron price history. Image: KBB]

It’s worth pointing out that E-Trons and other EVs were eligible for the $7,500 EV rebate. Not to mention, I bet plenty of these were leased, and others were sold with money on the hood. But for those who bought them outright in 2019, even with the EV rebate: Yikes!

Here’s the gasoline-powered Audi Q7, for reference:

[Screenshot: Audi Q7 value trend]

Now, you might be thinking that lots of expensive German cars depreciate a lot, and that’s definitely true. A 2022 Mercedes S-Class started between $110,000 and $120,000, and look at how cheap they are now:

[Screenshots: used 2022 Mercedes S-Class listings]

That’s about 50 grand in just two years and 35,000 miles! Yikes! But even the mighty S-Class has nothing on the electric version of the S-Class, the EQS. That car started at $102,310 for the EQS 450+ and $119,110 for the EQS 580 4MATIC. (Less the EV rebate).

[Image: Mercedes]

Now let’s have a look at what these machines are trading for:

[Screenshot: Cargurus listing for a used EQS]

That’s among the cheapest ones I’ve seen. Only $39,000 for a car that started at over $100 grand just two years prior! Surely this thing is flying off the shelf, right?

[Screenshot: Cargurus listing details]

Apparently not! Here’s another EQS that lost 60 grand in 40,000 miles ($1.50 a mile):

[Screenshot: Cargurus listing for another EQS]

And here’s an EQS 580 for good measure (this one actually being sold by a Mercedes dealership). These started at over $119,000, so this is also a car that lost over 60 G’s:

[Screenshot: Cargurus listing for an EQS 580]

Just look at the pricing trends of the EQS:

[Screenshot: EQS pricing trend]


Here’s the S-Class, in case you’re curious (sorry about the scale from Cargurus):

[Screenshot: S-Class pricing trend]

As you can see, those expensive German cars are depreciating quickly, but the expensive electric German cars are losing value violently.

Yikes!

Thank goodness so many of these were leased or purchased with lots of money on the hood, and hopefully all of them got the $7,500 federal rebate, along with a potential $4,000 rebate for the buyer of the used car (though that only applies to sub-$25,000 vehicles).

All Images: Cargurus (Unless otherwise specified)

