Newsroom Now: Guidelines for covering and using AI

During this panel, we heard from a global investigative journalist, a strategic advisor and a vice president at a media company. We often blame the technologists who had to deal with the early days of AI, but we rarely talk about our own accountability for not having a say in how AI was handled initially. The vice president of Morgan Murphy Media noted that only a small amount of newsroom time goes to original content creation, while the majority goes to versioning material that is not necessarily the raw reporting itself. Decisions about how we use AI should not be driven by fear. We still need the human touch, not only to improve the system; in the end, laws are made by humans. This tension between input and output is something we will need to keep navigating as AI develops and grows in the newsroom.

Five Key Takeaways:

  • Social media created many new jobs, and the emergence of AI will open up new ones as well.
  • We need to be part of the greater information ecosystem. If we are not part of it, we lose our audience.
  • Use AI to help do your job better and to version your stories, but not to create original content.
  • We should not confuse prediction with truth or judgment.
  • Journalists need to be transparent about how they use AI tools and where they get their information.
