127 Comments
Dawson Eliasen:

If dreams are the cure to our overfitting on our individual learning, then works of art are the dreams of the culture. Part of the explanation for an overfitted culture is a deficiency of dreams (which is also caused by the overfitted culture; it's a feedback loop).

I often complain to family and friends that algorithmic recommendations on Spotify etc. have completely broken down because the algorithms are now learning from their own recommendations. I listen to artist A, so Spotify thinks that I will like artist B because lots of people that listen to artist A also listen to artist B. But at this point in the history of algorithmic content, people that listen to artist A listen to artist B *because Spotify suggested it*. This creates a downward spiral, maybe something like a model collapse of recommendation engines where you are just being recommended music which represents the average of whatever genre you're spending the most time in.
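The feedback loop described above can be sketched as a toy simulation. This is a deliberately crude assumption, not Spotify's actual system: tastes are one-dimensional numbers, and the "recommender" just serves everyone the average of recent listening, which listeners then drift toward before the next round of retraining.

```python
import random

random.seed(0)

# Toy model of a recommendation feedback loop: each listener has a
# taste score in [0, 1]; the recommender suggests the average taste
# of everyone's recent listening, and listeners drift a little toward
# what they are recommended before the recommender retrains.
listeners = [random.random() for _ in range(1000)]

def spread(tastes):
    """Mean absolute deviation: a rough measure of taste diversity."""
    mean = sum(tastes) / len(tastes)
    return sum(abs(t - mean) for t in tastes) / len(tastes)

before = spread(listeners)
for _ in range(50):  # 50 rounds of recommend-then-retrain
    recommendation = sum(listeners) / len(listeners)
    # Each listener moves 10% of the way toward the recommended average.
    listeners = [t + 0.1 * (recommendation - t) for t in listeners]
after = spread(listeners)

# Diversity collapses toward the mean by orders of magnitude.
print(f"taste spread before: {before:.3f}, after: {after:.4f}")
```

Under these assumptions no individual preference ever changes the recommendation much, yet the population homogenizes anyway: exactly the "average of whatever genre you're in" dynamic, without anyone wanting it.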

The reason this is so awful is that artists are now slaves to these recommendations. Human consciousness has been all but removed from the process by which artists "make it" (not entirely, of course; labels still pay to have their artists recommended by Spotify), and instead they are mostly doing whatever they can to be picked up by Spotify's recommendation algorithm or by TikTok creators who will make their song go viral through TikTok's algorithm. The result is that artists are encouraged to make average-sounding music, because that's what the algorithm recommends. Not because anyone wants that, just because of the dynamics of such a system.

Spotify is just the place where this is happening most acutely, but clearly it's happening with movies and books as well.

I have some hope that eventually the tide will turn and people will become so sick of the stagnancy that they will begin to actively seek out actual cultural dreams, and slowly those dreams will be encouraged more than ever before, once we have some awareness of their actual importance.

Erik Hoel:

>> The reason this is so awful is that artists are now slaves to these recommendations. Human consciousness has been all but removed from the process by which artists "make it" (not entirely, of course; labels still pay to have their artists recommended by Spotify), and instead they are mostly doing whatever they can to be picked up by Spotify's recommendation algorithm or by TikTok creators who will make their song go viral through TikTok's algorithm. The result is that artists are encouraged to make average-sounding music, because that's what the algorithm recommends. Not because anyone wants that, just because of the dynamics of such a system.

That's such a perfect example. And it literally fits the discriminator overfitting we see with GANs!

Sage M:

I've only just switched from Spotify to Qobuz, which pays artists much better. So far the algorithm is much looser and is introducing me to a ton of new music. Not much of it is gelling for me yet, but after the stagnation of the Spotify recommendations it's a nice problem to have.

Harold Kildow:

The machine ascendancy is advancing into the space left by spiritual despondency. Creativity is not an attribute of machine code or algorithms. The omnipresence of simulacra is the witness to the soullessness that results from displacing the Creator.

Tristan Naramore:

I would buy and read this book. This essay alone contains the seeds for a hundred new lines of inquiry and exploration.

Erik Hoel:

I was struggling to keep it short; there are a bunch of potential follow-ups... but one in particular is, I think, very interesting, and I'll write about it, probably in January. But it's a secret which one. It's probably guessable from this, but it takes a little leap.

Yuval Welis:

He can easily publish it on Amazon, or fund it through crowdfunding.

Tom Pendergast:

This may be the Substack essay of the year! Already planning a second run-through.

Christopher Shinn:

I am hardly an expert on culture in other countries, but I have noticed that there's a lot of thematic and formal diversity in foreign films that we just don't have in this country. Material forces beyond just the changes brought about by social media are shaping our culture here.

GShadiya:

Are you sure you aren’t just noticing that foreign films are different than

GShadiya:

Ah sorry sent too early. Different than American films? Or alternatively that you’re not seeing a selection bias, where only standout-creative foreign films are worth translating/exporting/showing, but local films are less original because they’re not under that filter?

Mark:

Just had to watch some 'popular' Filipino movies (on buses); indeed, mainly slop/copies. Meanwhile, those 'foreign movies' that make it to the festivals may often be 'special' by copying I. Bergman et al. Otoh, Tarantino 'quotes' other films a lot, and seems fun to me.

Megan Anne Agathon:

I’m glad you enjoyed my Plato piece! I also have been wondering about our obsession with spin-offs and remakes, but my observation was mainly about our obsession with nostalgia. I didn’t think to connect it to imitation in art. Lovely piece!

Machine Liberalism:

Love this! Too bad the "overfitting" book wasn't picked up, because the "overfitting" lens explains so much of our political stagnation.

I think we're saying very similar things... I just published a book (You Must Become an Algorithmic Problem, Bristol University Press). In it, I argue that engagement algorithms view human "outliers" as an optimization problem they cannot solve. To reduce the cost function, the system implicitly encourages us to develop a "Machine Habitus": we essentially overfit ourselves to the model to reduce the anxiety of choice. I call this the socio-technical contract. We trade our complexity for the safety of "fitting the model."

I was particularly struck by your framing of "Dreams" as a solution to the overfitting problem. In my final chapter, I argue for an AI that promotes "Synthetic Potentiality" and "Play." Basically, I say democracy needs "hallucination" (or what Vonnegut called foma) to survive optimization.

Here's a link to the book:

https://bristoluniversitypress.co.uk/you-must-become-an-algorithmic-problem

and an excerpt from the Stanford Social Innovation Review:

https://ssir.org/books/excerpts/entry/you-must-become-an-algorithmic-problem

Let me know if you (or any of your readers interested in the topic) are open to swapping notes. Keep up the great work!

Smurfolope:

Machine Habitus is a cool idea; do you think we also have a Social Habitus, where we overfit ourselves to fit the social model?

Machine Liberalism:

That's a cool way of talking about social norms and conformity :)

Andy Iverson:

This is a really interesting way to frame the problem. Really enjoyed reading this article.

I have been wondering if, because they are tired of the same algorithmically determined content all the time, people have gained more of an appetite for stuff that is very contrary to the algorithm. Things like being aggressively non-politically correct, throwing out Nazi salutes, etc. Although I guess that sort of content is just boosted by a different algorithm in a different bubble.

Performative Bafflement:

Really interesting read. I've thought and written about this from the other direction - basically the data science revolution that happened between 2010 and today drove all of this.

Around 2010, computers hit a point where we could start crunching Big Data and deploying algorithms that used that data to build better-than-ever models and optimizations. It was the Data Science revolution, and it significantly changed how big companies marketed things, segmented and understood their customers, did analytics, tracked business outcomes and KPIs, and created user interfaces.

It had precursors, of course. Computers were pretty good before then, and certain companies had been generating / storing Big Data well before then, but this is about when things took off collectively, the more powerful skills and algorithms were honed and deployed at scale, and every company that was big enough began doing it.

Basically, for about a decade, you could pull together a team of PhDs (or people unambiguously smart enough to get a STEM PhD), point them at some data, give them a business outcome or goal, and they could lift things by absurdly massive amounts, generally driving tens of millions of dollars in value per year with a team that cost only $1-2M. Bump conversion by 20-30%, drop costs by 20-50% by targeting things or using resources more intelligently, really dial in which factors were actually most important for driving various outcomes via modeling, segment customers in much more predictive ways, and so on. It was an arms race of sorts: business is a competitive landscape, and those deltas are too big to ignore.

Another, more telling way to look at this: ten thousand PhDs in every major company have now for years been coordinating and arbitraging against average people by creating and using vastly imbalanced world-models and optimization power.

They're creating and deploying incredibly fine-grained and insightful models of consumer behavior and motivations, which reach deep into our collective biology and neurology to identify, grasp, and yank on whatever hooks exist for altering people’s behavior at large. They’re discovering and creating biologically-grounded addictive superstimuli, in other words - Skinner boxes writ large.

One outcome of this was the fundamentally adversarial dynamics that we see more and more of in our digital and physical lives now: junk food, the "attention economy" that grew phone time from 2-3 hours per day in 2014 to 7-9 hours a day among Zennials today, online gambling, and more.

And apparently it had still more repercussions beyond adversarial systems - those feedback loops and models have boxed in our cultural production into the safest, "known to make money" zones, and this has led to the stagnation you're pointing to.

Fortunately, on this cultural front, we have options - there is such an immense reservoir of music, shows, and books that you could consume constantly for 12 hours a day for 100 years and literally never run out of any one. So all you need to do is find real-life tastemakers you respect to mine those content reservoirs for gold.

David Stafford:

Here's another idea: stagnation emerges from our socially-stratified society. The lack of interaction between the classes reduces the friction and ferment that creates culture.

Mark:

Lack of classes (same social media for all), lack of friction.

David Stafford:

I think that's what W. David Marx concluded at the end of Status and Culture.

Mark:

Made me re-read this fine review https://www.noahpinion.blog/p/book-review-status-and-culture-by - so: thanks! Otoh, does the internet mean a max-out of interaction between all (incl. classes!), or that old classes became culturally mostly insignificant (except if rich + some second-hand fame, e.g. Harry/Meghan), and that sheer money/followers/likes is what counts? And while one can redistribute some money, one cannot do the same with followers/likes. And should not try, I guess; that would overfit the worst way. Stratification by clicks? Anyways, I see magnitudes more interactions, and am rather doubtful about 'classes'.

Jordan Braunstein:

I recall several think pieces over the last decade declaring that in an algorithm-mediated world, people who wanted to cultivate good taste and seek out the creative frontier needed to rely on a new class of plugged-in connoisseurs and curators who had access to stuff that was truly innovative and untainted by optimization for some preexisting commercial incentive structure.

What happened to that? Where are the set of expert obsessives who have an encyclopedic knowledge of some niche or subgenre of art or media, and whose recommendations, because they're human and have Good Taste, we could take to the bank?

Surely these people exist, but they've been completely swamped by algorithmic recommendation engines that exploit the revealed preferences of even self-proclaimed novelty seekers for convenience and immediate gratification over the time-consuming trial-and-error process of manual discovery.

It seems that if you introduce a mechanised algorithm anywhere into the stream of human cultural evolution, you get simulacra and overfitting issues, because it creates a filter that contaminates everything downstream.

Where can we find a resource for "the best new X" that favors the novel and the avant-garde and isn't pre-contaminated?

James McDermott:

> Where are the set of expert obsessives

Substack, obvs

Cristhian Ucedo:

That's very interesting. Algorithms should weight tastes when recommending. For example, I like heavy metal, so apps should recommend new music to me that is being discovered by other people who also like heavy metal.

But hey, isn't that the other complaint about algorithms? The "echo chambers"?

Chris Dalla Riva:

Honored to be mentioned in The Intrinsic Perspective! Long time fan. Was shocked to see my name.

Leo:

I love this article. One strange thing, however... Conceivably, mode collapse makes it easier to be original and stand out. So why is this not happening?

Or is this indeed happening but whatever the original thing, it is being absorbed by the machine at an ever faster rate so you need more and more creativity and originality to stay ahead until humans just can’t?

That sounds sort of like The Nothing in Michael Ende's Neverending Story. Ahem, that's depressing.

Mark:

I think it's happening a lot. And it is mostly not (fully) absorbed by the machines/copy-cats. But, sure, the tastes of the many and the incentives create lots of slop. The tastes of some others, though, create some great Substack writing too.

cf. Luke:

This question gets into what exactly is being "modeled" by culture, and why an overfit cultural apparatus wouldn't generate its own solutions for fitness. If an overfit apparatus is "too successful", what enables it to keep succeeding (cf. the "revealed preference blackpill")?

I think the phenomenon of cultural overfit is historically novel - does that mean we're waiting for a similarly novel compensatory mechanism to emerge?

Yosef:

I think part of the problem is the loss of specificity. Car companies, apart from Subaru, aren't interested in making the best car for a very specific group of people, they want to get a slice of the generic crossover market, and they're just one ad campaign away from owning that market...

And anyway, there aren't actually that many car companies. Volkswagen, Seat, Skoda, Audi, Porsche, Bentley, and Lamborghini are all one company, which leads to a lot of parts sharing and co-development of vehicles, and sometimes tortured attempts at differentiation. Peugeot, Citroen, DS, Fiat, Vauxhall, and Opel are also one company. Kia and Hyundai are one company too, both aggressively attempting to capture US market share. They have a differentiating strategy, which is basically just buzzwords.

Cristhian Ucedo:

Nassim Taleb writes about the many problems of overfitting and lack of variance too. In his theoretical framework, an overfitted system is fragile, while your many-gardens approach would be robust or even antifragile, depending on the selection rules.

McCray:

I've experienced this in social circles in parts of the US. Some people only want friends who are overfitted to them. I then later see these friendships break apart because they were founded on barely anything, not even trust. I think this is a great way of describing echo chambers and extending them to all areas of culture.

Roderik:

I like this analogy and can see how it fits the dating practices I hear people describe around me. However, it also makes me think more meta: we are now overfitting the model in this article to the world.

Ann:

Thank you.

I've felt for some time that everything is just "same old, same old".

So it's not just that I'm 79 and have been everywhere and done everything.