What Writing Teachers Know About AI That Tip Sheets Don't Teach
There is a version of AI training that treats prompting like a technical discipline. It shows up as tip sheets, copy-paste templates, and listicles built around specific commands: "add 'think step by step' for better reasoning," "use 'act as an expert' to sharpen tone," "append 'be concise' when outputs run long." The implicit promise is that better outputs come from better syntax. Memorize enough patterns and the tool performs. This framing is everywhere right now, and it is mostly wrong.
Prompting isn't a technical skill. It's a writing skill. And the distinction matters.
When a prompt produces weak output, the failure is almost always rhetorical. The writer didn't identify the audience clearly enough. The purpose was never pinned down. The instructions are vague. The context AI needed to respond usefully was left out, assumed, or buried. These are the same problems that have always produced weak emails, unclear memos, and reports that make readers work too hard to find the point.
Writing teachers encourage first-year writing students to think about who they're writing to and what they're trying to accomplish. The same questions apply to prompts. For a human reader, unclear audience and purpose produce weak writing. For AI, they produce something worse: output that is fluent and confident but wrong for the situation, in ways that aren't always obvious until you've already spent time cleaning it up. A prompt without a clear sense of audience is like a memo addressed "to whom it may concern." Technically a document, but not quality communication.
The good news is that weak prompts fail in recognizable ways — and there is an entire discipline, composition studies, built around understanding exactly how communication breaks down and why. The same patterns apply to prompts. When you know what to look for, diagnosis is fast.
If the output is too generic:
The prompt is missing an audience. AI wrote for everyone, which means it wrote for no one. Ask yourself: who specifically is reading this, what do they already know, and what do they need to do after reading it? Put those answers in the prompt.
If the output misses the point:
The purpose wasn't clear. AI filled the space with something plausible rather than something useful. Before you reprompt, finish this sentence in one clause: "After reading this, my reader needs to ___." That answer is your purpose. It belongs in the prompt.
If the output has the right content but the wrong tone:
The context is thin. AI doesn't know your relationship with this client, how your organization communicates, or what this particular reader tends to push back on. The more of that situational texture you provide, the less cleanup you do on the other end.
If the output is close but keeps drifting in the wrong direction:
You're missing constraints. Constraints aren't just guardrails; they're how you encode your knowledge of the reader into the prompt. "Do not use passive voice" and "this audience is skeptical of vendor promises" are both constraints. They do different things, but both tell AI something a strong writer would already know before sitting down to draft.
None of this requires learning a new technical skill.
It requires applying the same critical thinking to prompts that good writers apply to drafts: reading what you produced, asking why it isn't working, and revising toward something more specific. The writers on your team who do this instinctively with documents can learn to do it with prompts quickly. The ones who struggle with prompts are often the same ones who struggle with documents, for the same reasons.
AI didn't create a new problem. It just made an existing one harder to ignore.
Howard Workshop Group helps professional teams build AI prompting skills grounded in practical communication. If your team has been getting inconsistent results and wants to understand why, we'd love to talk—or hear what patterns you've been running into in the comments.
This post was drafted and revised using Claude as a writing tool. The ideas, structure, and voice are mine—the AI handled the keystrokes. Image generated with Grok.