3.06 | 08.07.19
This month Soonish geeks out on a new piece of computer hardware: Apple's Pro Display XDR. But the episode isn't a product review. It's an attempt to understand how innovations in the technology of image reproduction can alter the very way we see the world. Listen in your browser using the player above, or subscribe on Apple Podcasts or your favorite podcast app.
Apple used the opening keynote presentation at its annual World Wide Developers Conference in June to roll out the usual array of new hardware and software. But tucked about an hour into the June 3 event was one of the most consequential and underappreciated pieces of news. For the first time in more than three years, Apple will offer its own LCD computer monitor, the Pro Display XDR.
The last three decades have seen only a couple of big shifts in the way we view video images: the advent of digital high-definition TV in the late 1990s, and the introduction of so-called “Retina” screens, with pixels so invisibly small that they blend into a seamless whole, around 2010.
To me, the introduction of the Pro Display XDR feels like another one of those moments.
It’s a serious piece of gear, with 20 million pixels (which makes it a 6K screen, halfway between the emerging 4K and 8K standards) and a 1-million-to-1 contrast ratio. And it comes with a serious price tag: $4,999, without the separate $999 stand.
But for some buyers, that could actually be a bargain. Apple says its goal is to give creative professionals access to image quality previously available only on super-high-end “reference monitors” that cost north of $40,000. That means more of the creators who make the media we see every day will be shaping their own content on these displays.
At Apple’s invitation, I went to the WWDC keynote and got an up-close look at the Pro Display XDR. (Check out my photos from the event here.) The quality of the still and video images I saw was, indeed, stunning—comparable to or better than the same images on nearby reference monitors from other companies. It seems inevitable that the new Apple display will be coveted by professionals—and not just the intended customer base of filmmakers, videographers, photographers, and animators, but anyone who works intensively with visual data (think of radiologists, for example, or astronomers).
That’s because this new display is about better pixels, not just more pixels. To achieve extremes of performance, Apple placed an array of LEDs directly behind the screen (instead of around the edges, as in a typical LCD screen) and came up with new ways to control their brightness, minimize their heat output, and carry away excess heat. The ability to generate bright highlights and very dark blacks gives the screen a high contrast ratio, which makes it ideal for displaying HDR (high dynamic range) photos and video.
Engineers predict that many of these technologies will eventually filter down into consumer-level displays. Which is good, because the monitors and TVs we’ve all been using for the last five, 10, or 15 years are already looking pretty dim and fuzzy by comparison.
In a very different context—the age when talking pictures were still new—the German cultural critic Walter Benjamin observed that “The way in which human perception is organized—the medium in which it occurs—is conditioned not only by nature but by history.” I think that’s still true, and that we’ll be conditioned by the newest screens to expect all digital images to start looking more true to life.
At WWDC, I interviewed two of the key Apple employees behind the project: product marketing manager Colleen Novielli and display engineer Vincent Gu. Novielli says Apple felt previous display technology “really wasn't good enough for showing the beauty of the world that photographers are capturing, that videographers are capturing,” and that “once people have seen the Pro Display XDR…I think that bar is going to be set much higher.”
For perspective, I also spoke with Michael Isnardi, a distinguished computer scientist at SRI International (formerly Sarnoff Corporation) in Princeton, NJ. He’s a pioneer in the areas of high-definition TV, satellite TV, and video compression algorithms, and he says he agrees the Pro Display XDR will raise the bar in the video world, making our old screens look…old. But not to worry: Isnardi says “all of the things that are in the Apple Pro Display XDR that make it unique right now are going to eventually become standard features five to 10 years from now, in displays that are going to be at Best Buy.”
I’ll be the first in line.
Mentioned In This Episode
Apple Pro Display XDR product page
Michael Isnardi, Center for Vision Technologies, SRI International
Colleen Novielli, Apple Worldwide Product Marketing Manager, iMac and Pro Display XDR
Mingxia (Vincent) Gu, Apple Display Engineer
The Good Enough Revolution: When Cheap and Simple is Just Fine. Wired, August 24, 2009.
Why Was ‘The Long Night’ So Hard to Actually See? Here’s a Theory. Vulture, April 29, 2019.
The Work of Art in the Age of Mechanical Reproduction by Walter Benjamin, 1935
00:00 Hub & Spoke Sonic ID
00:08 Content Warning
00:24 Soonish Opening Theme
00:44 The Principle of Good Enough
01:46 The Ceiling and the Floor
02:22 A Very Deep Dive into Displays
02:59 WWDC 2019
04:02 Announcing the Pro Display XDR
05:51 Spoiled by the Garage Door Opener
07:13 Resets in Visual History
07:39 Color and Light and Pixels
10:51 The Future’s So Bright
14:11 Roy G. Biv
16:28 The Battle of Winterfell
19:32 Hollywood Is Leaving You Behind
21:23 Picture Optimization Mode
22:29 What Would Walter Benjamin Say?
24:38 A New Art Form
26:01 End Credits and Acknowledgements
26:44 Culture Hustlers
27:32 Thank You to Our Top Patreon Supporters
The Soonish opening theme is by Graham Gordon Ramsay.
If you like the show, please rate and review Soonish on Apple Podcasts / iTunes! The more ratings we get, the more people will find the show.
Listener support is the rocket fuel that keeps this whole ship going! You can pitch in with a per-episode donation at patreon.com/soonish.
We need your ideas to make the show better! Please take a few minutes to fill out our listener survey at soonishpodcast.org/survey.
Wade Roush: Before we start the show, I want to let you know that this episode contains details about the inner workings of computer displays and TV screens that some listeners may find extremely nerdy, as well as descriptions of violence against dragons and zombies. Okay, you’ve been warned.
You’re listening to Soonish. I’m Wade Roush.
I love trying out new technologies. But I’m not as much of a gadget geek as you might think. I kind of go by the principle of Good Enough.
My laptop is three years old and it’s at the point where its battery runs out really fast. But it’s good enough.
My TV is a high-definition model that I bought in 2007. It doesn’t have 4K resolution like the newest sets, but whatever. It’s good enough.
I never listen to my CDs anymore. I get all my music in compressed formats like MP3 or AAC. MP3 files don’t sound as clear as CDs, but they take up a lot less bandwidth and memory on my devices, so they’re good enough.
And as it turns out, about 10 years ago there was a little rash of stories in the media about the so-called Good Enough Revolution. Wired magazine ran a piece in 2009 that looked at the popularity of technologies like MP3 and concluded that people were increasingly okay with, quote, flexibility over high fidelity, convenience over features, quick and dirty over slow and polished, unquote.
And I think that’s still true. Up to a point. There’s an interesting thing about technology, though.
The ceiling is always getting higher, meaning there’s always going to be a new gadget that’s faster and smarter and brighter than the one you have.
But the floor keeps getting higher too.
I mean, I’m happy with my MP3s. But if you forced me to listen to everything on a scratchy old Victrola, I’d be pretty grumpy. Because I know what kind of sound quality is possible today, and I know that when I listen to an MP3, I’m not sacrificing all that much.
My point is, what’s acceptable to consumers at the low end is defined by what’s available at the high end.
On today’s show we’re going to take a very deep dive into the technology of displays, those glowing rectangles that we spend so much of our lives staring at.
I’m going to tell you about a trip I took to California to see a new display that absolutely raises the ceiling. And my bet is that it’ll also end up raising the floor, for all of us. And not just in our offices, but also in our living rooms. It could even change our expectations about what’s possible on a digital screen.
All I know is that by the time I got back from this trip, my old high-definition TV from 2007 was looking very … not good enough.
The story starts a couple of months ago, when I flew out to San Jose. That’s where Apple was holding its annual World Wide Developers Conference, or WWDC for short.
It’s one of Apple’s biggest media events of the year, and the company uses it to announce new products and share details about its latest operating system upgrades.
It’s also a chance for the Apple engineers who design Macs and iPhones and iPads to meet with all the people who write apps for those platforms.
Or at least, with 6,000 of them. That’s how many people fit into the main hall at the San Jose Convention Center for the big opening keynote, hosted by CEO Tim Cook.
This year Apple invited me to attend the keynote and meet afterward with one of the speakers. Her name is Colleen Novielli, and she was there to take the lid off a secret Apple project to build a new kind of display.
Here’s some tape from Colleen’s part of the keynote.
Colleen Novielli: Our goal was simple…make a display that delivers every feature pros have asked for… it’s a 32-inch LCD display with over 20 million pixels…an incredible 1 million to 1 contrast ratio….the images this display produces are truly stunning and with these capabilities we have gone way beyond high dynamic range. This is extreme dynamic range, or XDR. And so we call this display the Pro Display XDR.
Wade Roush: Now, you just heard a bunch of tech jargon like 6K and a million to 1 contrast ratio and high dynamic range. And don’t worry, I’m going to come back and explain each of those terms.
But long story short, Apple said it plans to start selling the Pro Display XDR in the fall of 2019, with a price tag of five thousand dollars. That’s obviously a lot more than you or I would spend on a desktop monitor. But as the name suggests, the Pro Display XDR really is meant for pros, as in movie producers, graphic designers, and photographers.
And believe it or not, for people in those businesses, five grand is a huge bargain. That’s because they’re used to buying so-called “reference monitors” that can cost thirty or forty thousand dollars.
After the keynote I got to see the Apple display up close, alongside several of those reference monitors. And honestly, the images on the Pro Display XDR were sharper, brighter, and more vivid. So it raises the ceiling on what’s possible if you’ve got $5,000 to spend. But it also raises the floor for the rest of us. Because once you get a chance to look at one of these new displays, it’s gonna be way harder to go back to your old one.
Michael Isnardi: It's like when you go from a manual garage door opener to an automatic one. You notice if you have to go backwards and lift up the garage door again.
Wade Roush: That’s Michael Isnardi. He’s a distinguished computer scientist at SRI International in Princeton, New Jersey.
Michael Isnardi: People are subconsciously seeing a better picture and hearing better sound and will generally notice if we take a step backward. So I think it is going to set a new normal or a new bar, expectation bar in your mind.
Wade Roush: Isnardi is one of the nation’s pioneering engineers in the areas of satellite TV, high-definition TV, and video compression. And he says in his experience, even the fanciest new display technologies trickle down to the mass market sooner or later.
Michael Isnardi: I would think that all of the things that are in the Apple Pro Display XDR that make it unique right now are going to eventually become standard features five to 10 years from now, in displays that are going to be at Best Buy.
Wade Roush: And that’s why I think this news from Apple is worth talking about. At WWDC I think I got a glimpse of the kind of displays we’ll all be using in the future. I think they could totally reset our expectations about the quality of the images that we see in our everyday lives.
We’ve lived through these resets before. Photography forced painters away from literal representations and toward impressionism and abstraction. Movies made still photography look static. Color film made the black-and-white past look antique. High-definition flat-panel displays made our old standard-definition CRT televisions look grainy. And it turns out there may be more revolutions to come.
So to understand how our glowing rectangles are evolving, you have to know exactly what makes one display better than the previous one. And you start by counting how many pixels it’s got, and how densely they’re packed.
If you’ve ever seen a painting by Georges Seurat, then you know about pointillism, where the painter puts down nothing but millions of tiny dots, and then your eyes do the rest of the work to blend the dots together into a unified picture.
That’s how an LCD screen works too, except that the pixels are smaller and there are more of them. A high-definition screen like my old 2007 job has one thousand nine hundred and twenty pixels horizontally and one thousand and eighty pixels vertically.
Hey Siri, what’s one thousand nine hundred and twenty times one thousand and eighty?
Siri: That would be two million seventy three thousand six hundred.
Wade Roush: Okay, so a high definition screen has roughly 2 million pixels altogether, or 2 megapixels. The more pixels a screen has, the better, up to a certain point. But what’s just as important is how closely those pixels are packed together.
When I get really close to my old HDTV I can see the individual pixels, because they’re spaced out at a roomy 70 pixels per inch. The screen of Apple’s first iPhone in 2007 had fewer pixels than my TV, but they were packed a lot closer, at about 160 pixels per inch.
And over the years Apple and its manufacturing partners figured out how to make LCD screens with more and more pixels at higher and higher density. Apple crossed an important threshold in 2010, with the iPhone 4. That was the first device to have what Apple called a Retina display. On a Retina screen the pixels are so small that from a normal viewing distance of about a foot or more, you can’t see them with the naked eye. At that point the image on a screen looks just as sharp and smooth as it would if it were printed on paper.
Okay, fast forward to 2019 and the new Pro Display XDR. It’s got 20 megapixels and they’re crammed in at 218 pixels to the inch. That’s 10 times more pixels than my TV has, at three times the density. Which is remarkable, because it makes the picture so sharp.
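For anyone who wants to check the arithmetic in this stretch of the episode, here’s a quick sketch in Python using the figures quoted above. Note that the 20-megapixel count is Apple’s rounded number, so the ratios come out approximate:

```python
# Pixel counts and densities quoted in the episode.
hd_pixels = 1920 * 1080        # a 1080p HDTV: 2,073,600 pixels
xdr_pixels = 20_000_000        # Pro Display XDR: "over 20 million pixels"

hd_ppi = 70                    # Wade's 2007 HDTV
xdr_ppi = 218                  # Pro Display XDR

print(hd_pixels)                      # 2073600
print(round(xdr_pixels / hd_pixels))  # 10 -- roughly ten times the pixels
print(round(xdr_ppi / hd_ppi))        # 3  -- roughly three times the density
```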
But a display’s raw pixel count is only part of the story. Today, electronics stores sell lots of 4K TVs, which have roughly 4,000 pixels horizontally. They’re even starting to sell 8K TVs. But Michael Isnardi says that beyond a certain point, adding more pixels is just overkill.
Michael Isnardi: I think we're eventually going to stop with the resolution increase. I think it's going to be a long time before we start seeing a lot of 8K televisions in the market. And I think we're going to sort of plateau at that point. Because once you're sitting, you know, eight to 10 feet from a television, even a 65 inch television, you know, you're not going to really notice the difference between 4K and 8K.
Wade Roush: The Pro Display XDR is a 6K screen, halfway between 4K and 8K. So yeah, it has plenty of pixels. But that’s just the start. You really don’t want the image on a screen to be drowned out by the ambient light in your office or your living room. So it helps if a display can make the bright parts of an image really bright.
Hardware makers quantify screen brightness using a unit called nits. And no, those don’t have anything to do with louse eggs. Nits is from the Latin word nitere, to shine. When it’s turned off a screen is black and emits zero nits.
If you’re old enough to remember TVs with those big cathode ray tubes, they had a peak brightness of about 100 nits. Apple’s new display can shine at 1,000 nits, continuously.
Now 1,000 nits isn’t so bright that you’ll have to wear shades, but it is about five times brighter than what we’re used to on our typical desktop monitors.
And that kind of brightness is hard to sustain, because all the light in an LCD screen comes from light-emitting diodes or LEDs, and those LEDs also emit heat. If you don’t find a way to regulate all that heat, the screen’ll just burn out.
Apple says it has two new ways to deal with that. First, they put a special computer chip into the screen called a timing controller. Its job is to analyze the incoming picture signal and turn the LEDs up in areas where more brightness is needed and turn them down or even turn them off in areas that should be darker.
Vincent Gu: This is a content based optical engine.
Wade Roush: That’s Vincent Gu. He’s a display engineer who led one of the teams that designed the Pro Display XDR.
Vincent Gu: So essentially we have an LED area behind the display, so anywhere we want the display to display black content, we simply turn the LED off. So essentially there's no light out. So that's why how we achieve the true black.
Wade Roush: That’s also how Apple keeps the display from generating excess heat. Second, there’s a new heat sink on the back of the display. It’s a lattice full of these intricately carved holes. It looks sort of like a cheese grater you’d find in the kitchen of an alien ship. And it’s designed to carry the heat away.
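What Vincent Gu calls a “content based optical engine” is essentially full-array local dimming. Here’s a minimal, purely illustrative sketch of the idea in Python. The zone logic below is my own simplification for the show notes, not Apple’s actual algorithm, which is vastly more sophisticated:

```python
# Toy full-array local dimming: each backlight zone's LED is driven only as
# hard as its brightest pixel requires, and switched off entirely when the
# zone is all black -- which is how the display achieves true black.
def zone_led_level(zone_pixels):
    """zone_pixels: target brightness values in [0.0, 1.0] for one zone."""
    peak = max(zone_pixels)
    return 0.0 if peak == 0.0 else peak  # LED fully off for an all-black zone

print(zone_led_level([0.0, 0.0, 0.0]))  # 0.0 -- LED off, no light leakage
print(zone_led_level([0.1, 0.9, 0.3]))  # 0.9 -- driven to the zone's peak
```

Turning zones off outright is also why less heat gets generated in dark regions of the image.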
I talked with Colleen Novielli the day after her WWDC keynote presentation, and here’s how she describes the whole brightness thing.
Colleen Novielli: Maintaining 1000 nits of full screen brightness is something that you know I said fairly casually on the stage yesterday. And definitely this is a huge engineering accomplishment. So the design of the efficient LEDs, the algorithm that's that's modulating and making sure that the light efficiency and the heat being generated is only when needed and then the heat sink working in the back to double the surface area and cool the LEDs, allowed us to have this incredible innovation of 1000 nits of a full screen brightness.
Wade Roush: Now when your screen can get both very bright and very dark, you naturally end up with what’s called a high contrast ratio. Which is exactly what it sounds like. It’s the ratio of the luminance of the brightest whites to the luminance of the darkest blacks.
A movie theater might have a contrast ratio of about 500 to 1. A typical desktop monitor has a ratio of 1,000 to 1. But Apple’s new display has a contrast ratio of a million to one. That’s like the brightness of a piece of white paper in sunlight compared to the same piece of paper in moonlight.
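Since a contrast ratio is just a quotient of luminances, these numbers are easy to sanity-check. A quick sketch in Python — the 0.001-nit black level in the second example is my assumption to make the million-to-one ratio work out, since the episode gives the ratio rather than the black level:

```python
# Contrast ratio = peak white luminance / black-level luminance, both in nits.
def contrast_ratio(white_nits, black_nits):
    return white_nits / black_nits

# A conventional 1,000-nit panel whose "black" still leaks 1 nit of light:
print(round(contrast_ratio(1000, 1.0)))    # 1000
# Push the black level down to an assumed 0.001 nits -- as switching LEDs
# off entirely allows -- and you get the million-to-one figure:
print(round(contrast_ratio(1000, 0.001)))  # 1000000
```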
A high contrast ratio is one of the elements of what’s called high dynamic range, or HDR. A lot of movies and TV shows these days are shot in HDR format, so they have darker blacks and brighter highlights. But HDR isn’t just about a high contrast ratio. It goes hand in hand with a wider color gamut. In other words, HDR screens have more colors.
No, I’m not saying they can show new colors that aren’t on the visible spectrum. We’re still stuck with good old Roy G Biv. In a wide color gamut video signal there’s just more information to specify the color of each pixel, and the colors reach across more of the palette of colors that humans can see in the real world. So greens can be greener, blues can be bluer, reds can be redder, and there can be more shades in between. My old HDTV can show about 16 million colors, but most HDR sets can show a billion colors.
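The episode doesn’t spell out where the “16 million” and “billion” figures come from, but they fall out of per-channel bit depth: standard sets use 8 bits per color channel, while HDR sets use 10. A quick check in Python:

```python
# Total displayable colors = (levels per channel) ** 3, for red, green, blue.
def total_colors(bits_per_channel):
    levels = 2 ** bits_per_channel  # 8 bits -> 256 levels; 10 bits -> 1024
    return levels ** 3

print(total_colors(8))   # 16777216   -- "about 16 million colors"
print(total_colors(10))  # 1073741824 -- "a billion colors"
```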
And video experts say that if you want an image that really pops, HDR and its wide color gamut are even more important than the pixel count.
Michael Isnardi: Wider color gamuts and HDR fall into the category of better pixels. Not more pixels, but making the pixels truer to life, making them pop.
Wade Roush: Now as you may have guessed, Apple’s display has both a wide color gamut and an extremely high contrast ratio and dynamic range, which is how they ended up calling it the Pro Display XDR, for extreme dynamic range.
And Colleen Novielli says that could be useful in all sorts of professions where visual data is important. I mean, not just in film production but in fields like radiology.
Colleen Novielli: You're able to see that brightness and contrast and all the details in between. That's really what … dynamic range is about. And sometimes when you're viewing a file like you know an X-ray or a CAT scan, those details in between are the things that are going to be able to save lives…this could apply to really any profession where you know I just want to see all the data that I have in this image, this photo, in this video. And so it’s going to be compelling across the board in that sense.
Wade Roush: Okay, so you can see why professionals would care about things like nits and heat sinks and color gamuts. But does all that stuff really matter to you, Mr. or Ms. Average Consumer? Isn’t the display or the TV you already have…good enough?
Well, no, it isn’t. And to show you why, all I have to do is take you back in time to the night of April 28, 2019, when 18 million people tuned into HBO to watch the Starks, the Targaryens, and the Dothraki take on the Army of the Dead at the Battle of Winterfell, and … nobody could see a damn thing.
Audio montage speaker 1: The episode is so dark. From the start of the episode it’s pitch black.
Audio montage speaker 2: So far this Game of Thrones episode sounds amazing. It’s too bad I can’t see a damn thing.
Audio montage speaker 3: A lot of people were complaining that it was poorly lit and too dark.
Audio montage speaker 4: It was kinda dark.
Audio montage speaker 5: I liked it though!
Wade Roush: The question everyone was asking the next morning was how HBO could spend 11 weeks and a reported twenty million dollars filming this decisive episode, and then make it so dark that most viewers couldn’t see it. The website Vulture called it “one of the most expensive missed opportunities in the history of television.”
Well, here’s the deal. It wasn’t just the producers’ fault. It was kinda our fault too.
Film and TV producers watch the final cuts of shows like Game of Thrones on reference monitors with a wide color gamut and a high contrast ratio. So to them the Battle of Winterfell looked really great.
Unfortunately they didn’t do enough to adjust for everyone else, meaning, people who don’t have HDR TVs or home theaters.
Here’s Michael Isnardi again.
Michael Isnardi: Do you know about Season 8 of Game of Thrones? People complained about how dark it was?
Wade Roush: Yeah, it was scandalous.
Michael Isnardi: So there is an example that if you were in a home theater environment with, you know, calibrated monitors, et cetera, you would have seen all of that detail. But the ordinary consumer in their living room, even at night, but with lots, plenty of, you know, lamps and stuff producing glare on the television set, they missed it. So it is an example in which there was a disconnect or maybe an assumption by the people who were producing that episode that people had, you know, the audience, the viewing audience, that a large percentage of them had HDR sets and were in a good controlled viewing environment. But that was not the case. And that's why that that's how the scandal erupted.
Wade Roush: So, to recap. The likely reason you couldn’t see anything during the Battle of Winterfell is that your TV couldn’t handle it. Or, you weren’t watching it under the right lighting conditions. Or both.
And the takeaway is that if you haven’t upgraded to an HDR TV, Hollywood is leaving you behind. These days the big studios and cable networks are using better technology to make their stuff. And to fully appreciate that stuff the rest of us need better technology too.
And that’s where Apple comes back into the picture. What the Pro Display XDR could do is take the picture quality that used to be available only on reference monitors costing forty or fifty thousand dollars, and put it into the hands of every videographer or photographer or editor with an extra five grand to spend. That wider group of creators shapes almost everything we see in the media. And Apple’s goal is to help them show the world more accurately.
Colleen Novielli: I think that when we looked across how people were able to perceive their work, the photographs they're taking, the videos that they're taking, we thought that HDR as it is today really wasn't good enough for showing the beauty of the world that photographers are capturing, that videographers are capturing. And we wanted to make sure that people were able to see as much brightness, as much contrast as they could. And I do I certainly think that now you know once people have kind of seen the Pro Display XDR and it's it's really much better reflecting the real world, I think that that bar is going to be set much higher now.
Wade Roush: The fact that something like Apple’s new Pro Display exists means that sooner or later, we’re going to want better monitors and TVs ourselves, so that we can actually see the things content creators are trying to show us.
One factor that’ll help is that the new technologies inside the Pro Display XDR will inevitably make their way into consumer displays and TVs. Isnardi says one of the first examples could be a technique called picture optimization mode. Apple’s display has sensors on the front and the back that can look at the room and assess the lighting conditions.
Michael Isnardi: And they are adjusting the contrast ratio and the color gamut to the type of illumination…whether there is incandescent or cold cathode or fluorescent lighting in the room, will adjust the colors so that what you see in that environment is closer to the intent of the original production. So I think TVs that automatically do picture optimization by analyzing the ambient lighting … that's something that we're going to see more of moving forward and that's going to trickle down into consumer models.
Wade Roush: Now, you can certainly look at everything I’ve been talking about as the workings of an unstoppable marketing machine. Makers of consumer products are always trying to convince us that there’s something wrong with the thing we have, and that we’ll be happier as soon we replace it with a better thing. That’s just the way our form of capitalism works. And Apple is the consummate consumer product company.
But there’s also a less jaded way to look at this. I think you can jump back about 80 years, to a time before video technology even existed, and find some insights in the work of the German cultural critic Walter Benjamin.
He died in Spain in 1940 while trying to flee the Nazis. But five years earlier he’d written his most famous essay. It’s called “The Work of Art in the Age of Mechanical Reproduction.” And the idea Benjamin was one of the first to capture is that technology gives us new ways to see.
To be specific, he said:
Walter Benjamin [computer voice]: The way in which human perception is organized—the medium in which it occurs—is conditioned not only by nature but by history.
Wade Roush: Now, Benjamin was particularly impressed by the new art of filmmaking. His heroes were directors like Charlie Chaplin who, in his eyes, were conjuring a new kind of emotional physics. They were using the camera to put audiences inside the action. And they were using techniques like slow motion for slowing down time, or time lapse for speeding it up, or jump cuts to teleport between locations. None of that had been imagined in visual storytelling before.
The way Benjamin saw it, painters like the Cubists and the Dadaists had been trying for decades to show a handful of art connoisseurs that the world is beautiful, yes, but also fragmented and a little absurd. But in movie theaters filmmakers could train millions of people to see things from that same modernist point of view.
And if you brought Benjamin forward to 2019, I think he’d see exactly what’s going on.
Walter Benjamin [computer voice]: “The history of every art form has critical periods in which the particular form strains after effects which can be easily achieved only with a changed technical standard—that is to say, in a new art form.”
Wade Roush: That’s a key idea from Benjamin’s essay, so let me play it again, a little slower.
Walter Benjamin [computer voice]: “The history of every art form has critical periods in which the particular form strains after effects which can be easily achieved only with a changed technical standard—that is to say, in a new art form.”
Wade Roush: One of these critical periods came in the late 1990s, with the switch to digital high-definition production technology in movies and television. And we got two new art forms out of that. In theaters we got a new generation of blockbuster special-effects movies, starting with the Star Wars prequels. And on TV screens we got an explosion in long-form TV drama, starting with The Sopranos.
What I’m arguing is that today, technical standards are changing once again.
I don’t know what Benjamin would make of Game of Thrones. He’d probably be amazed and appalled. But when you get screens with 4K and 8K resolution, high contrast ratios, HDR, and wider color gamuts, you get the opportunity to try new effects. And you could end up with…a new art form.
That’s the real meaning of what device makers like Apple are up to. And I’m sorry to tell you, but that’s why your old monitor and your old TV…just aren’t good enough.
Soonish is written and produced by me, Wade Roush.
Our theme is by Graham Gordon Ramsay.
All additional music is from the creative geniuses at Titlecard Music and Sound in Boston.
With a little boost from Stephen Sondheim, Timbuk 3, and They Might Be Giants.
To learn more about all of the people and ideas in this episode, you must go to our website, soonishpodcast.org.
And here’s a reminder for you. If you like Soonish, you’ll like my monthly column in Scientific American magazine.
You need a subscription to read it, but hey, it’s only thirty five bucks a year, which is a sweet deal considering the magazine’s extremely high IDI count. That’s ideas per inch, of course. Check it out at scientificamerican.com.
Soonish is a proud member of Hub & Spoke, a Boston-based collective of smart, idea-driven podcasts.
And this month I want to urge you to check out the latest episode of Culture Hustlers from Lucas Spivey.
Lucas asks why so many art schools fail to get students ready for life in the real world, where they’re also going to have to understand money and business.
Lucas Spivey: The faculty decide what they teach and what they don’t want to teach. And they want to teach techniques and concepts, but they definitely don’t want to talk about business, or money, or marketing, or legal, or intellectual property, or finance, or any of that. It’s a religious battle at this point. Religious. Teaching business to creators would be like losing your religion.
Wade Roush: Check out that episode now at culturehustlers.com or wherever you get your podcasts.
Speaking of money and business, this show would not be possible without contributions from our listeners.
Special thanks to my top supporters on Patreon, especially Kent Rasmussen, Celia Ramsay, and Paul and Patricia Roush, Jamie Roush, Lucia Prosperi, Victor McElheny, Andy Hrycyna, Steve Marantz, Elizabeth Blanch, Chuck and Gail Mandeveille, Ellen Leanse, Mark Pelofsky, and Graham Ramsay.
You can support the show too by signing up to make a per-episode donation at patreon.com/soonish.
And if you give $10 or more per episode I’ll send you the Soonish coffee mug. It’s got our logo on one side and our motto on the other. So along with your daily caffeine you’ll get a dose of informed optimism.
I’d like to give a shout-out to Kip Clark, Charles Gustine, Ellen Leanse, Mark Pelofsky, and Joel Roston for sending me notes on this episode.
Thank you for listening, and I’ll be back with a new episode….Soonish.