Ghostwriting is not new. It’s been a legitimate professional practice for decades across publishing, speeches, memoirs, and corporate communications. The person whose name goes on the work didn’t write every word, and most of the time everyone involved understands that.
AI changes the economics, but does it change the ethics?
Here’s the specific scenario I’ve been thinking about. A freelance writer takes on a ghostwriting contract. They use AI to draft most of the content, refine it to match the client’s voice, and deliver it. The client publishes it under their own name. The client either doesn’t know or doesn’t ask how the content was produced.
Traditional ghostwriting: acceptable. AI-assisted ghostwriting with disclosure to the client: arguably acceptable. AI-assisted ghostwriting where the client has no idea the writer is using AI: where does that sit on the spectrum?
I’d push back on the easy answer that it’s fine as long as the output quality is there. The client is paying for a professional service and has some reasonable expectation about what that service involves. If the nature of the work has fundamentally changed, is there a disclosure obligation even if the contract doesn’t require it?
Genuinely uncertain on this one. Not trying to be preachy about it. Curious how others who do ghostwriting work think about this.
honestly i’ve thought about this a lot. my take is that ghostwriting has always been about selling an outcome, not a process. clients hire you for the result: content that sounds like them and does its job. how you get there is your craft.
that said, i do think there’s a version of this that crosses a line. if a client is paying premium rates specifically because they believe they’re getting original human creativity and craft, and you’re mostly just prompting and lightly editing, you’re misrepresenting what they’re buying.
the thing is though… most clients don’t ask. and when they do ask, i tell them. that’s where i’ve landed.
the framing of “disclosure obligation” assumes the client would care. some would. a lot wouldn’t. they want content that works, full stop.
i think the more useful question is: are you delivering what was promised? if yes, the method is your business. if the output is worse because you’re over-relying on AI and not catching its problems, that’s the actual breach of contract.
anyway. the line for me is output quality, not production method. that’s always been true.
In my experience, the framing matters enormously here. “Ghostwriting with AI assistance” is a service. “Charging senior rates for junior-effort work” is a different conversation.
The ethics question is secondary to the value question. If clients are paying for strategic thinking, editorial judgment, and brand voice expertise, those things should be present in the final product regardless of the tools used to produce the draft. The issue isn’t AI. The issue is whether the professional value being sold is actually being delivered.
What most freelancers miss is that clients aren’t buying hours or effort. They’re buying outcomes and expertise. The differentiation is in the execution, not the tool stack.
I’m not anti-technology here, but I do think “the client only cares about the outcome” can be a convenient way to avoid a harder question.
If a client hired a human researcher and writer partly because they value that human perspective and judgment, and what they’re getting is substantially different, they deserve to know. Not because of some abstract ethics principle, but because informed consent matters in professional relationships.
You don’t outsource thinking. That’s the part I keep coming back to. If the strategic thinking and editorial judgment are still genuinely yours, the AI is just a faster typewriter. But if those are being outsourced too, that’s a different arrangement than what was sold.
to be fair, this debate is happening in a context where AI use is becoming so normalized that “did you use AI” is starting to sound like “did you use spellcheck.”
the more useful framing might be: what are you actually being paid for? if it’s your taste, your strategic thinking, your understanding of the client’s audience — those things have to be in the work. AI can help you get there faster. it can’t replace those things.
the clients who will care are the ones who are also thinking about this. the ones who aren’t thinking about it probably just want the deliverable. honestly both are fine as long as you know which situation you’re in.