When Sloppy Writing Becomes an Asset
Reflections on AI and a life struggling with spelling and grammar
For as long as I can remember, I have struggled with writing. Specifically with spelling and grammar. At some point, I was tested for dyslexia. Turns out I could not use that as an excuse for my writing. My essays always looked like bloodbaths when I got them back from my teacher after corrections.
The thing is that despite my struggles with writing, I have always loved writing. There are lots of people who write much better than me, but who will only write a tiny fraction of what I write. Writing is simply not something they love.
With the rise of AI, I realize that my problem may, in fact, become an asset. I was reminded of this today while discussing with my wife how skilled students were reportedly being accused of using AI because their writing was so good.
For context: My wife is a very good writer. She doesn't make mistakes, unlike me. She can take a one-second glance at something I write, and she already spots like 10 different spelling and grammar mistakes. Just bloody unfair.
So here was the funny thing: my wife made the tongue-in-cheek comment that had we been students today, we would have been accused of using AI. It was partly a reflection on the fact that we were both good students and tend to write longer, more complex texts. But that remark amused me. I had to remind her that it might be the case for her, but not for me. Anyone reading any of my work would know right away that this guy could not possibly have used AI. No AI would make that many spelling and grammar mistakes.
In fact, many really good writers have started deliberately making their writing sloppier. These are people who know perfectly well how to write flawlessly. They do it to avoid looking like AI.
Somehow it feels like the greatest luck. For a change, my lifelong problem with spelling and grammar has kind of become an asset. Not only that, but my often meandering stream-of-consciousness style is also a benefit, as it is very far from how AI writes. AI writing tends to be very punchy: Bang, bang, bang. It hits you with the points as if it is punching a sandbag or something.
Oh, and its analogies are always very clever, unlike my sandbag analogy, which just isn't very well thought out. AI does not display doubt in its writing. It forges ahead with great confidence. That is why it can be so bewildering when AI is totally wrong about something. Because it made those claims with such rock-solid confidence, we are easily duped into believing them.
The question, of course, is whether it matters if AI wrote something or not. I mean, if the writing is good and enjoyable, does it matter? Yes, I have increasingly realized it matters a lot to me, and probably to you as well.
Allow me to clarify why. Often when I read something, I am not actually looking for whoever has the greatest insight or expertise on the matter. Rather, I read it specifically because of who the writer is. It can be a soldier in Ukraine talking about his experience of the war. Someone not at the front may very well write a much better piece about the front line because they are a more skilled writer. But it would lack authenticity.
Likewise, as a software developer, I will read about the experiences people had using a particular programming language or software tool at their company. There are probably plenty of people who could write an in-depth article about a given programming language, but it would not be an actual experience of solving a real-world problem at a real-world company.
It is the same reason I might read about someone's experience of a divorce, of going through depression, or of having lived a life of fame. None of these people are necessarily great writers, but they are channeling real-world experiences.
Recently, I read an analysis comparing the Rust programming language to the Go programming language. Don't worry if you are not a software developer; what these languages are about is irrelevant to the point I am making. The story went on about how software developers were realizing this thing and that thing about the Go programming language, and how it was driving them to Rust.
The problem is that this guy was not writing from experience of working in software. He was not relaying his own experiences, or what he had heard fellow developers say and do. No, the whole thing was AI-generated. In other words, an AI had simply synthesized an idea of what developers were "feeling." It wasn't real. It had been trained on what developers have written, for sure. But the sentiments of those developers could just as well have been about entirely different technologies. AI does not cleanly separate experiences like that.
Let me give an example from AI image generation. If you make characters with armor, it will typically mirror, to a large degree, how real clothes are draped on people. The AI has learned roughly what armor looks like, but it has mainly looked at how normal clothes fit on people. From this, it extrapolates how armor should look.
The result is that armor on women and men looks quite different. Because women tend to wear very different clothes from men, the armor the AI shapes for them tends to come out very different. This kind of peculiar behavior is very easy to spot in AI images, but something very similar happens in writing as well. AI will take patterns from one type of writing and apply them in another.
Think about it: The armor the AI creates for a person is not actually based on real armor it has seen. It is a combination of clothing and armor. The same applies when it writes about software developers using different technologies. It will take descriptions of how developers might have felt about an entirely different technology and apply them elsewhere. This is part of the problem, and the genius, of modern AI: it is able to extract insights from one area and reapply them in another. But in doing so, it is also in a way lying about reality.
That is what I realized about this programming article. It was not badly written. It had many decent points. But ultimately it was all a lie. It did not reflect actual developer experiences. Rather, it was a synthesis of what developers could plausibly have felt. It is a bit like somebody writing about the feeling of fighting Russians from a Ukrainian trench after reading about what WWI trench warfare was like for British soldiers. Imagine reading those accounts, then updating and modifying them based on the modern world we live in and some things you have read about Ukrainian society and the war there. Such an approach could probably produce something quite realistic in its depiction of Ukrainian trench warfare.
The problem is that it isn't real. It is not an actual experience being conveyed. It is an artificially synthesized experience. We humans want the real thing. We want the words, feelings, and thoughts of real people.
Imagine a world where every article you read, about any topic, is from the same AI. Whether you read about crime, love, war, technology, or politics, it is always an AI. Maybe it can provide different styles. How would you feel about that? Would you not prefer hearing all these experiences from real people who either lived them or have their own personal takes on them?
When you read my texts, you know there is a real person writing this. Somewhere in Oslo, Norway, there is a guy writing his thoughts based on the life he has lived and the experiences he has had. It could be a combination of what he lived through as well as what he has read. Every one of us is different like that and thus has a unique voice.
I admit I have used AI in some of my past articles because I was attracted to the allure of finally getting texts written in a far more professional manner than I manage myself. More to the point. More punchy, with no spelling or grammar errors. But in the end, reading other writers who use AI, I realized that my articles then end up sounding like everyone else's. I am also reminded that I actually want to hear their unique voice, however imperfect it might be.
I am not rejecting AI outright, but I am more conscious of not using it so much that my own uniqueness disappears. I want anyone reading me to know that these are my imperfect thoughts and reflections, not mass-produced "elegance" from AI.