
AI took my job but misinformation handed it the key — risk compounds when humans and machines both get it wrong



This post looks at how AI is being used in understaffed newsrooms, how errors are handled, invisible disabilities, support and training, and what accountability should look like when both humans and machines get it wrong.


By: Audrey Korte @akorteshares

First Posted: Sept. 8, 2025

Last updated at 5:10 p.m., Sept. 12

*Written and edited by humans

A stack of newspapers being flipped through. By Audrey Korte

Three months ago, I arrived in Madison ready to build a new life — eager to earn trust, learn furiously, and prove myself in a big-city newsroom. 


I worked for years to get a great position at a metro paper. I was stoked to have been recruited to Madison. Then, suddenly, it was over: a humiliating mistake cost me my job.


I took responsibility and have since developed a more rigorous editorial process. It’s made me a more thoughtful and resilient communicator, and a slower editor, which is not a bad thing.


Now, after weeks of waiting to apologize and explain, I share what I was and was not responsible for, and what else contributed to this costly ordeal.




Ownership


I apologize for the embarrassment my article caused the Wisconsin State Journal and Lee Enterprises, and for the confusion and frustration that followed.


I am sorry for the entire mess.

Due to a unique set of circumstances, I submitted a piece with multiple factual inaccuracies. Most stemmed from AI-assisted edits that replaced large portions of my writing with incorrect information — including fabricated sources.


I missed the mistakes. I own that — and I’m deeply embarrassed.


I believed the AI program was newsroom-approved. It had been installed on employee computers after Lee’s cyberattack, and I assumed all software was vetted during that upgrade.


We were explicitly told not to use ChatGPT, and I haven’t — ever. That lack of experience hurt me. There was no AI training in the newsroom.


I used Copilot to help make the article more concise and to check grammar, spelling, punctuation and AP style — or so I thought. I pasted what I believed were edits into my final version, unaware it had inserted false information, replacing verified material with junk.


I should never have used technology that I was untrained in, especially in a moment of crisis.


‘It can do that?’: AI Use and the Uninitiated


The article ran online July 11 — in print July 13. On July 16, the Executive Editor flagged concerns, referencing a Reddit thread. That day, I discovered the piece contained multiple serious errors — fabrications generated by AI.


I had limited experience with AI and had never encountered anything like this. I wasn’t familiar with the term “AI hallucinations,” but I quickly learned what it meant.


I take full responsibility for submitting inaccurate content and failing to catch every error before sending it to the editor. I also regret not advocating for myself more forcefully that week.


When both the article and my health got complicated, I should have asked for more time.


It took hours to uncover the full extent of the damage and realize Copilot had fabricated a quote, a business and more.


I remember yelling, “It can do that?”


It was the most devastating moment of my career.


Compounding the issue, a city public information officer told me there were two proposals for the Brayton Lot project I reported on; I later found out there were three. At the time, neither of us knew this. I was blamed for missing that third proposal.


“Facts” Fall Apart


The article was supposed to include a quote from earlier reporting — done by colleagues before I arrived in Madison. I thought it did. But the quote published was fabricated, and the real one never made it in. I did not realize AI had swapped it.


The hardest part? I missed the errors. Repeatedly.


Not because I didn’t reread — I did. I revised the piece several times, convinced the facts were solid. I read it before lunch and felt good about it — aside from the length. After lunch, under deadline pressure, I used Copilot to help with edits and rewrites.


And here we are.


Later, I was determined to understand what went wrong. I still did not know about AI hallucinations.


I knew I had not invented a source, but somehow this fictional guy appeared in the article.


I manually checked every document, reread the proposals, combed through past coverage, city sites and more, searching for keywords, quotes and context. I was convinced the material would appear, that it would all make sense at some point. It only got worse.


After nearly seven hours of digging, the article had come down, but I still wanted to understand. I decided to go to the source: Copilot.


I uploaded one PDF I had reported on, asking it to locate a paragraph of made-up information. I expected it to tell me the information did not appear in the document.


It said it was in the introduction.


I asked again. Copilot again said it was on page 3 of the introduction.


I replied: “I’m not finding that quote in this doc. Are you sure it is there?”


Copilot responded:

“You’re absolutely right to double-check — after a full search of the Neutral Project proposal… there is no mention of the following terms.”
A screenshot of a chat with AI, around 6 p.m., July 16, showing text about support for underrepresented entrepreneurs in a proposal, mentioning diversity in Madison and economic mobility.

It continued:


“This means the quote previously attributed to the proposal — about prioritizing BIPOC-owned vendors in the food hall — does not appear in the document. Thanks for catching that. Would you like help revising the article to reflect what the proposal actually says?”

A third screenshot of a chat with AI, July 16: the text reads that no terms like "BIPOC-owned" were found in the PDF, with a Microsoft Copilot suggestion in the background, "Remove any mention of BIPOC vendors."

The Perfect Storm


My exhaustion — caused by a documented chronic health condition that results in some severe sleeplessness at times — created the perfect storm. I read everything I needed to, but at some point I wasn’t retaining it. I didn’t recognize the content changes.


  • After combing through 170 pages of proposals and background with very little rest, it all sounded right.

  • By submission time Friday, I was running on less than seven hours of sleep since Monday morning.


This moment reminds me of other delays and missed chances caused by this infectious disease — echoing the shame of not always being able to control your brain or body because of its lingering effects.


Audrey Korte gets ready to attend a Lyme advocacy day with Congress at the Kansas State Capitol in 2022.

My relationship with the lasting impacts of late-stage Lyme — and the insomnia flare-ups it can intermittently cause — is complicated. Since 2016, I’ve worked relentlessly to rebuild: to finish my degree, develop healthy coping mechanisms, and shape a career that didn’t headline with “she had really bad Lyme disease.”


There are people I know in Wisconsin who have no idea this is part of my life because I’m excellent at what I do.


I’m acutely aware of the 13 years Lyme stole from my career. I’ve pushed myself hard to make up for lost time.


After clawing my way to the starting line in journalism, I now fear this incident may have cost me the career I fought so hard to build.


Rushed Response


Because a health issue contributed to the situation, the rushed dismissal was especially alarming and irresponsible.


I’m frustrated there was no fair, formal investigation into how this happened. After the errors were fully uncovered, I wasn’t given the chance to explain anything — including the insomnia.


I agreed to cover the story for a colleague on vacation — I always want to help out. But both the City Editor and Executive Editor were away the day the article was due, and others were also on vacation.


I flagged my fatigue levels with a supervisor. I also said I was being asked to cover stories by reporters and editors who were giving me conflicting priorities. I described the lack of internal communication and vacation planning as chaotic — and said it was causing a lot of anxiety.


Adrift and Isolated


After the article was pulled offline, I expected a barrage of questions — the whos, the whats, the whens, and especially the whys.


I thought some people in Madison might even ask if I was okay — not just as a question about my state of mind, but as a way to open the door to deeper questions: how did this happen?


Instead, I was met with silence. For weeks, no one asked me anything.


Madison, from a distance. Photo by Audrey Korte.

One journalist eventually reached out through social media — seeking answers with tact, respect and persistence. I did not want to talk, but I would have. I was working with a lawyer, so it had to wait.


Three weeks after my termination, someone from the newsroom contacted me with kindness — asking how I was, then saying that they knew I was a great reporter. That show of solidarity meant the world.


They had questions. I didn’t hold anything back.


I was incredibly anxious the entire conversation — but I realize now how much I needed it. I would have welcomed it from anyone in the newsroom — a chance to share what happened, and to apologize.


Without that one kindness, I am not sure I would have felt any fairness or support.


Because there is another side of this. Not just blame, but decency and empathy.


What Journalism Owes Us


For weeks, I let conventional advice shape my silence. We’re told not to speak negatively about an organization after dismissal — to be a team player.


Own your part, but don’t point fingers at others, the suggestion goes.


But we’ve normalized a fictitious work culture where companies are never at fault, and every mishap is treated as a personal failing. Organizations get away with that forced narrative.


Meanwhile, individuals are expected to own every error — with humility and deference.


But sometimes individuals and organizations each play a hand. Nuance and shared responsibility are not the enemy of individual or organizational reputation management.


Step into the Light


The aftermath of the discovery was not a time for rushed emotional statements. I needed expert advice — and I’ve focused on that for the past seven weeks.


I was waiting for the time to speak up and wanted to do so responsibly.


I offered to meet with a reporter or another Lee Enterprises journalist — so the company could report on what happened in the name of transparency. 


A lawyer for the company declined.


I offered a second time — unwilling to leave others in the dark.


For weeks, while waiting for my time to speak, I researched, documented and wrote extensively, producing dozens of pages in multiple formats and styles. I was ready for tough questions.


Questions that never came.


That is its own tragedy — a failure of leadership and responsibility.


How can any organization be transparent or accurate when it refuses to ask questions or listen to answers? It holds the torch, but stands in the dark.


Turn the light on.


###



                    

Note: Tune in next week for my next post — on using an AI-image generator for the first time, what I discovered, and the frustrations versus the fun of it all. The impact of these AI workarounds on creators I know and respect is immense, and I prefer to employ an artist to design and create content, including images.


Learn more Monday


— Audrey Korte, The Lightship
