13 comments

  • apsurd 1 hour ago
    Axios got traction because it heavily condensed news into more scannable content for the twitter, insta, Tok crowd.

    So AI is this on massive steroids. It is unsettling, but there seems to be a recurring need to point out that, across the board, many of the "it's because of AI" things were already happening. "Post truth" is the one I'm most interested in.

    AI condenses it all on a surreal and unsettling timeline. But humans are still humans.

    And to me, that means that I will continue to seek out and pay for good writing like The Atlantic. btw I've enjoyed listening to articles via their auto-generated NOA AI voice thing.

    Additionally, not all writing serves the same purpose. The article makes these sweeping claims about "all of writing". Gets clicks I guess, but to the point, most of why and what people read is toward some immediate and functional need. Like work, like some way to make money, indirectly. Some hack. Some fast-forwarding of "the point". No wonder AI is taking over that job.

    And then there's creative expression and connection. And yes I know AI is taking over all the creative industries too. What I'm saying is we've always been separating "the masses" from those that "appreciate real art".

    Same story.

    • ngriffiths 29 minutes ago
      > Additionally, not all writing serves the same purpose.

      I think this is a really important point. To add on: there is a lot of writing that is really good, but only in a way that a niche audience can appreciate. Today's AI can basically compete with the low-quality stuff that makes up most of social media; it can't really compete with higher-quality stuff targeted at a general audience, and it's still nowhere close to some more niche classics.

      An interesting thought experiment is whether it's possible that AI tools could write a novel that's better than War and Peace. A quick google shows a lot of (poorly written) articles about how "AI is just a machine, so it can never be creative," which strikes me as a weak argument way too focused on a physical detail instead of the result. War and Peace and/or other great novels are certainly in the training set of some or all models, and there is some real consensus about which ones are great, not just random subjective opinions.

      I kind of think... there is still something fundamental that would get in the way, but overcoming it some day seems totally achievable? I don't think it's impossible for an AI to be creative in a humanlike way; today's models just don't seem optimized for it, because they are completely optimized for the analytical mode of reading and writing, not the creative/immersive one.

      • lich_king 7 minutes ago
        > Today's AI can basically compete with the low quality stuff that makes up most of social media, it can't really compete with higher quality stuff

        But compete in what sense? It already wins on volume alone, because LLM writing is much cheaper than human writing. If you search for an explanation of a concept in science, engineering, philosophy, or art, the first result is an AI summary, probably followed by five AI-generated pages that crowded out the source material.

        If you get your news on HN, a significant proportion of stories that make it to the top are LLM-generated. If you open a newspaper... a lot of them are using LLMs too. LLM-generated books are ubiquitous on Amazon. So what kind of competition / victory are we talking about? The satisfaction of writing better for an audience of none?

    • meetingthrower 1 hour ago
      Same. New yorker is the other mag I subscribed to.

      Until 3 weeks ago I had a high-cortisol morning read: nyt, wsj, axios, politico. I went on a weeklong camping trip with no phone and haven't logged into those yet. It's fine.

      • jihadjihad 1 hour ago
        People think I'm nuts when I tell them I ditched subscriptions for those sites and only check them maybe once a week, if that.

        But what you said is 100% true, it's fine. When things in your life provide net negative value it's in your best interest to ditch them.

      • KittenInABox 1 hour ago
        I agree with this in general, but with caveats. For example, I think reading national-scale news every day sucks. But if you're of a specific demographic, it might be useful to keep up to date on nuanced issues: if you're a gun owner, you'll probably want to follow gun licensing in your area, and if you're a trans person, it's pretty important nowadays to be aware of laws being passed that dictate which bathroom you can legally use.
    • plastic-enjoyer 1 hour ago
      > "Post truth" is one I'm most interested in.

      I have this theory that the post-truth era began with the invention of the printing press and gained iteratively more traction with each revolution in information technology.

      • robot-wrangler 46 minutes ago
        Doesn't matter when post-truth started because it's now over, and it's more accurate to characterize this era as "post-rationality". Most people do seem to understand this, but we are in different stages of grief about it.
      • Finbel 1 hour ago
        So slightly before 1440 was peak Truth for humanity?
      • yannyu 1 hour ago
        I think you're right, but I also think it's worthwhile to look at Edward Bernays in the early 1900s and his specific influence on how companies and governments to this day deliberately shape public opinion in their favor. There's an argument that his work and the work of his contemporaries was a critical point in the flooding of the collective consciousness with what we would consider propaganda, misinformation, or covert advertising.
        • plastic-enjoyer 1 hour ago
          > There's an argument that his work and the work of his contemporaries was a critical point in the flooding of the collective consciousness with what we would consider propaganda

          I would rather say that Bernays was a keen observer who understood mass behavior and the potential of mass media like no one else in his time. Soren Kierkegaard wrote about the role of public opinion and mass media in the 19th century and had a rather pessimistic outlook on it. You have things like the Dreyfus Affair, where mass media already played a role in polarizing people and playing into popular resentments. There were signs that people were overwhelmed by mass media even before Bernays. I would say that Bernays observed these things and used those observations to develop systematic methods for influencing the masses. The problem was already there; Bernays just exploited it systematically.

  • ericdykstra 21 minutes ago
    I won't ever put my name on something written by an LLM, and I will blacklist any site or person I see doing it. If I want to read LLM output, I can prompt it myself; subjecting me to it and passing it off as your own is disrespectful.

    As the author says, there will certainly be a number of people who decide to play with LLM games or whatever, and content farms will get even more generic while having fewer writing errors, but I don't think that the age of communicating thought, person to person, through text is "over".

  • dtf 1 hour ago
    "Is Claude Code junk food, though? ... although I have barely written a line of code on my own, the cognitive work of learning the architecture — developing a new epistemological framework for “how developers think” — feels real."

    Might this also apply to learning about writing? If have barely written a line of prose on my own, but spent a year generating a large corpus of it aided by these fabulous machines, might I also come to understand "how writers think"?

    I love the later description of writing as a "special, irreplaceable form of thinking forged from solitary perception and [enormous amounts of] labor", where “style isn’t something you apply later; it’s embedded in your perception" (according to Amis). Could such a statement ever apply to something as crass as software development?

    • girvo 1 hour ago
      My current bugbear is how art is held up as creative work worthy of societal protection, with scorn for AI muscling in on it

      While the same people in the same comments say it’s fine to replace programming with it

      When pressed they talk about creativity, as if software development has none…

      • jarjoura 41 minutes ago
        I haven't heard writers take any kind of stance on software engineering, but Brandon Sanderson has very publicly renounced AI writing because it lacks the authentic journey of an author's own writing. Just as we would cringe at our first software projects, he cringes at his first published novel.

        I think that's a reasonable argument to make against generative art in any form.

        However, he does celebrate LLM advancements in health and accessibility, and I've seen most "AI haters" handwave away its use there. It's a weird dissonance to me too that its use is perfectly okay if it helps your grandparents live a longer, higher-quality life, but not okay if your grandparents use that longer life to write, with AI assistance, a novel that Brandon would want to read.

      • arctic-true 1 hour ago
        The easiest job to automate is someone else’s.
      • yason 1 hour ago
        Art has two facets. First is whether you like it. If you do, you don't need to care where it came from. Second is art as cultured and defined by the artistic elites. They don't care if art is liked or likable; they care about the pedigree, i.e. where it came from, and that it fits what they consider worthy art. Between these two is what I call filler art: stuff that's rather indifferent and not very notable, but often crosses some minimum bar so that it's accepted by, and maybe popular among, average people who aren't that seriously interested in art.

        In the first category, AI is no problem. If you enjoy what you see or hear, it doesn't make a difference which kind of artist, or AI, created it. In the second category, for the elite, AI art is no more acceptable than current popular art or, for that matter, anything at all that doesn't fit their own definition of real art. Makes no difference. Then the filler art... the bar there is not very high, but it will likely improve with AI. It's nothing that's been seriously invested in so far, and it's cheaper to let AI create it than poorly paid people.

      • SpaceManNabs 1 hour ago
        a lot of artists don't mind using AI for art outside their field

        I was in a fashion show in tokyo in 2024.

        i noticed their fashion was all human designed. but they had a lot of posters, video, and music that was AI generated.

        I point blank asked the curator why he used AI for some stuff but didn't enhance the fashion with AI. I was a bit naive because I was actually curious to see if AI wasn't ready for fashion or maybe they were going for an aesthetic. I genuinely was trying to learn and not point out a hypocrisy.

        he got mad and didn't answer. i guess it is because they didn't want to pay for everything else. big lesson learned in what to ask lol.

      • zozbot234 25 minutes ago
        Maybe that's because AI "art" looks just as cringe as written AI slop.
    • benbreen 1 hour ago
      Thank you, this sort of insight is exactly why I've felt such kinship with what software engineers like Karpathy and Simon Willison have been writing lately. It seems obvious to me that there is something special and irreplaceable about the thought processes that create good code.

      However, I think there is also something qualitatively different about how work is done in these two domains.

      Example: refactoring a codebase is not really analogous to revising a nonfiction book, even though they both involve rewriting of a sort. Even before AI, the former used far more tooling and automated processes. There is, e.g., no ESLint for prose which can tell you which sentences are going to fail to "compile" (i.e., fail to make sense to a reader).

      The special taste or skillset of a programmer seems to me to involve systems thinking and tool use in a different way than the special taste of a writer, which is more about transmuting personal life experiences and tacit knowledge into words, even if tools (word processor) and systems (editors, informants, primary sources) are used along the way.

      Sort of half formed ideas here but I find this a really rich vein of thought to work through. And one of the points of my post is that writing is about thinking in public and with a readership. Many thanks for helping me do that.

      I don't have a good answer to your question, but I do think it might be comparable, yes. If you had good taste about what to get Opus 4.6 to write, and kept iterating on it in a way that exposes the results to public view, I think you'd definitely develop a more fine grained sense of the epistemological perspective of a writer. But you wouldn't be one any more than I'm a software developer just because I've had Claude Code make a lot of GitHub commits lately (if anyone's interested: https://github.com/benjaminbreen).

    • randusername 21 minutes ago
      > Could such a statement ever apply to something as crass as software development?

      Absolutely. I think like a Python programmer, a very specific kind of Python programmer after a decade of hard lessons from misusing the freedom it gives you in just about every way possible.

      I carry that with me in how I approach C++ and other languages. And then I learned some hard lessons in C++ that informed my Python.

      The tools you have available definitely inform how you think. As your thinking evolves, so does your own style. It's not just the tool, mind, but also the kinds of things you use it for.

  • AstroBen 1 hour ago
    This type of cadence.

    You know the one.

    Choppy. Fast. Saying nothing at all.

    It's not just boring and disjointed. It's full-on slop via human-adjacent mimicry.

    Let’s get very clear, very grounded, and very unsentimental for a moment.

    The contrast to good writing is brutal, and not in a poetic way. In a teeth-on-edge, stomach-dropping way. The dissonance is violent.

    Here's the raw truth:

    It’s not wisdom. It’s not professional. It’s not even particularly original.

    You are very right to be angry. Brands picking soulless drivel over real human creatives.

    And now we finish with a pseudo-deep confirmation of your bias.

    ---

    Before long everyone will be used to it and it'll evoke the same eugh response

    Sometimes standing out or quality writing doesn't actually matter. Let AI do that part

    • vpribish 40 minutes ago
      well done. :)

      and at the same time the chop becomes long-form slop, stretching out a little seed of a human prompt into a sea of inane prose.

  • pawelduda 1 hour ago
    About the article referenced in the beginning: the sentiment presented in it honestly sounds like the AI version of cryptocurrency euphoria just as that bubble burst. "You are not ready for what's going to happen to the economy", "crypto will replace tradfi, experts agree". The article is sitting at almost 100M views after just a week and has strong FOMO vibes. To be honest, it's very conflicting for me, because I've been using AI and, compared to crypto, it doesn't just feel like magic, it also does magic. However, I can't help but think of this parallel and the possibility that the AI bubble could right now be starting to stall/regress. The only problem is that I just don't see how such a scenario would play out, given how good and useful these tools are.
  • ayoung5555 1 hour ago
    As much as the general public seems to be turning against AI, people only seem to care when they're aware it's AI. Those of us intentionally aware of it are better tuned to identify LLM-speak and generated slop.

    Most human writing isn't good. Take LinkedIn, for example. It didn't suddenly become bad because of LLM-slop posts - humans pioneered its now-ubiquitous style. And now even when something is human-written, we're already seeing humans absorb linguistic patterns common to LLM writing. That said, I'm confident slop from any platform with user-generated content will eventually fade away from my feeds because the algorithms will pick up on that as a signal. (edit to add from my feeds)

    What concerns me most is that there's absolutely no way this isn't detrimental to students. While AI can be a tool in STEM, I'm hearing from teachers among family and friends that everything students write is from an LLM.

    Leaning on AI to write code I'd otherwise write myself might be a slight net negative on my ability to write future code, but brains are elastic enough that I could close an n-month gap in n/2 months or something.

    From middle school to university, students are doing everything for the first time, and there's no recovering habits or memories that never formed in the first place. They made the ACT easier 2 years ago (reduced # of questions) and in the US the average score has set a new record low every year since then. Not only is there no clear path to improvement, there's an even clearer path to things getting worse.

    • unyttigfjelltol 8 minutes ago
      I spent several years trying to get ground truth out of digital medical records and I would draw this parallel to AI slop:

      With traditional medical records, you could see what the practitioner did and covered because only that was in the record.

      With computerized records, the intent, the thought process, and most of the signal you would use to validate internal consistency were hidden behind a wall of boilerplate and formality that armored the record against scrutiny.

      Bad writing on LinkedIn is self-evident. Everything about it stinks.

      AI slop is like a Trojan Horse for weak, undeveloped thoughts. They look finished, so they sneak into your field of view and consume whatever additional attention is required to finally realize that despite the slick packaging, this too is trash.

      So “AI slop,” in this worldview, is a complaint that historical signals of quality based purely on form are no longer useful gatekeepers for attention.

  • submeta 1 hour ago
    I wonder whether we will see a shift back toward human generated, organic content, writing that is not perfectly polished or exhaustively articulated. For an LLM, it is effortless to smooth every edge and fully flesh out every thought. For humans, it is not.

    After two years of reading increasing amounts of LLM-generated text, I find myself appreciating something different: concise, slightly rough writing that is not optimized to perfection, but clearly written by another human being.

  • 1necornbuilder 1 hour ago
    The "cognitive debt" framing resonates, but from an unexpected direction. I'm not a developer. I've never written a line of code. I built enterprise software, a live computer vision system monitoring industrial cranes, deployed on Google Cloud Run, generating six figures in contracts, entirely by chatting with Claude. No IDE, no terminal muscle memory to lose.

    For me, there is no cognitive debt in the code. There's no ground truth I'm losing touch with, because I never had it. The ground truth I bring is domain knowledge: fifteen years of understanding what an industrial operator actually needs to see on a screen at 3am. What Breen describes as "junk food", the dopamine hit of watching Claude build a new feature, is, for domain experts like me, the first time in history we could participate in building at all. The gap that existed wasn't "developer loses touch with code." It was "person closest to the problem could never build the solution."

    But his core point about writing holds, even here. The thinking that produces good software requirements, the careful articulation of what needs to be built and why, remains irreducibly human. My most important contributions to my own codebase aren't commits. They're the precise questions I ask. Maybe cognitive debt is domain-specific. Developers accumulate it. Domain experts spend it.

    • eaglelamp 7 minutes ago
      Is this your product?: https://cranesync.com/

      If so I hope your monitoring software is higher quality than your website.

    • piker 32 minutes ago
      You vibe-coded a computer-vision product that is to be used in monitoring industrial cranes? And people are using it?
      • throwaway0123_5 24 minutes ago
        The account is 47 minutes old and with the writing style plus the hefty dose of em dashes, I think they are an LLM.
    • sibeliuss 24 minutes ago
      Appreciate this take. It makes a lot of sense and can see this happening all over right now.
  • kittikitti 1 hour ago
    I think people hate AI generated writing more than they like human curated writing. At the same time, I find that people like AI content more than my writing. I write, comment, and blog in many different places and I notice that my AI generated content does much better in terms of engagement. I'm not a writer, I code, so it might be that my writing is not professional. Whereas my code-by-hand still edges out against AI.

    We need to value human content more. I find that many real people eventually get banned while the bots are always forced to follow rules. The Dead Internet hypothesis sounds more inevitable under these conditions.

    Indeed we all now have a neuron that fires every time we sense AI content. However, maybe we need to train another neuron that activates when content is genuine.

  • kittbuilds 1 hour ago
    [dead]
  • rnakle 52 minutes ago
    That is a shallow piece of the new genre: I am a concerned academic who nevertheless uses these new tools to create vibe coded slop and has to tell the world about it.

    Everything is inevitable but my own job is secure. Have I already told you how concerned I am?

    No novelty. No intellectual challenge. No spirit. Just AI advertisements! /s

  • jongjong 2 hours ago
    I agree with the assessment that pure writing (by a human) is over. Content is going to matter a lot more.

    It's going to be tough for fiction authors to break through. Sadly, I don't think the average consumer has sufficiently good taste to tell when something is genuinely novel. People often prefer the carefully formulated familiar garbage over the creative gems; this was true before AI and, IMO, will continue to be true after AI. This is not just about writing, it's about art in general.

    There will be a subset of people who can see through the form and see substance and those will be able to identify non-AI work but they will continue to be a minority. The masses will happily consume the slop. The masses have poor taste and they're more interested in "comfort food" ideas than actually novel ideas. Novelty just doesn't do it for them. Most people are not curious, new ideas don't interest them. These people will live and breathe AI slop and they will feel uncomfortable if presented with new material, even if wrapped in a layer of AI (e.g. human-written core ideas, rewritten by AI).

    I feel like that about most books, music and pop culture in general; it was slop and it will continue to be slop... It was the same basic ideas about elves, dragons, wizards, orcs, kings, queens, etc... Just reorganized and mashed with different overarching storylines "a difficult journey" or "epic battles" with different wording.

    Most people don't understand the difference between pure AI-generated content (seeded by a small human input) and human-generated content which was rewritten by AI (seeded by a large human input) because most people don't care about and never cared about substance. Their entire lives may be about form over substance.

    • Aldipower 1 hour ago
      Who or what is "the masses" actually?
      • altruios 51 minutes ago
        Reminded of this clip.

        https://www.youtube.com/watch?v=KHJbSvidohg

        But as much as it pains me to admit... the current state of America is the slopocalypse. A slopalanche. A slopnado. AI cats waking people up in the middle of the night, blasting down doors, glitching out. All produced by slop-slingers. It's rather bleak for long form attention content, human created or not.

        It's a war of/on attention. A war to secure your attention during the time that you would otherwise think for yourself. Keep off the short-form content, is my advice.

    • apsurd 2 hours ago
      What is the difference between writing and content?
      • RyanHamilton 2 hours ago
        I would guess he's looking to compare the equivalent of fast-food to fine-dining or nutritious eating.
  • davtyan1202 1 hour ago
    As we move further into a world where data exfiltration is becoming more sophisticated, local-first processing isn't just a luxury—it’s a necessity. Hardware is finally powerful enough to handle what used to require a massive backend infrastructure.
    • caseyohara 1 hour ago
      My spidey-sense: the "it isn't X, it's Y" construct and the dreaded em dash.