[Image: A laptop with ChatGPT open on the screen, displaying a prompt to generate a comprehensive generative AI policy for a college or university marketing team]

Creating a Generative AI Policy for Your Higher Ed Marketing Communications Team

Your marketing and communications team serves a unique purpose for your college or university. That means you need a generative AI policy tailored to your specific functions.

2023 will be known as the year that artificial intelligence (AI) entered the chat and dominated the conversation. Higher ed has been no exception — from the classroom to HR to admissions, university students, staff, and faculty have been figuring out how to wrangle the massive potential (and dilemmas) posed by generative AI technologies. 

If there’s one thing universities love, it’s a good policy, and it’s been heartening to see many institutions creating comprehensive policies addressing the use of AI tools and approaches. (In December 2023, EDUCAUSE published a good guide to creating an institutional policy regarding the use of generative AI). 

Some examples of good institutional AI policies include:

  • Harvard University — right off the bat, this document is framed as “initial” (since this is all very much a moving target) and is nicely organized in a readable, usable format
  • Boise State University and University of Wisconsin-Madison — these feel more “straight outta legal counsel” but helpfully link to other relevant policies at the university
  • Vanderbilt University — notably housed on the HR site, not the IT site, this guidance (not a policy) is concise and focused on shaping thoughtful, responsible usage by staff

While policies that broadly address copyright, data privacy, academic integrity, and other topics are critically important to create (and keep revising, as our understanding evolves), what about higher ed marketing and communications teams? 

Why Would I Need a Generative AI Policy Just for My Higher Ed Marcom Team?

Higher ed marcom teams serve a unique function for the institution — a function directly tied to advancing its business goals and building affinity and trust with key audiences and constituencies.

And as we know, there has been an explosion of chatter around how AI tools can make that function a ton easier — producing high-quality content with ease, creating personalized experiences that engage and convert at remarkable rates, making quick work of search engine optimization efforts, and more.

While there is significant potential, caution is also warranted. And broad generative AI policies may not provide actionable guidance for a higher ed marketing communications team. What uses are okay, and where should we steer clear? How does using these tools change our workflow? How do we account for the brand in AI-generated content? And how do we deal with bias? 

Examples of Standout Generative AI Policies in Higher Ed Marketing Communications

A survey of the landscape revealed precious few generative AI policies focused specifically on higher ed marketing communications. But we wanted to take this opportunity to give them a shout-out and highlight what they address that you should consider incorporating into a policy of your own.

University of Oregon

Since we are all still in learning mode with regard to AI, it’s good to frame these as “guiding principles.” The focus on citations is important, since understanding the source of content is even more important in the age of AI. Leading with the need for humans to review AI-generated outcomes is excellent, and we’re always glad to see a reminder to monitor AI outcomes for bias.

The list of “appropriate outcomes” is particularly helpful. Including examples such as “generate outlines or drafts as jumping-off points” and “optimize publicly available web content for SEO,” it’s realistic and provides a tangible guidepost for how best to use these technologies in our work.

Michigan State University

MSU also shared “explored use cases,” such as generating alt text for images and suggesting article headlines. This section implies that there was (or there is ongoing) comprehensive exploration of various use cases, some of which were deemed to be of higher risk and/or lower value than others. This is great — this kind of ongoing exploration should be part of the process for higher ed marcom teams. There will only be more tools and approaches coming down the pike, and our understanding of how best to use them must keep evolving.

The “Disclaimers” section is helpful, noting some of the training parameters and limitations of different tools, as well as legislative factors. People new to AI may not be familiar with these parameters, and they are important to keep in mind. Similarly, “Things to Keep in Mind” provides helpful guardrails against assuming AI is a marketing panacea, while reminding us that this is all still evolving.

The links to specific tools and resources are great — I’d be curious to know which the MSU team has evaluated or is actively using.

North Carolina State University Extension

Starting with an overview is helpful, since not everyone may be on the same page about what AI is or means. It is also good to clarify that this is a living document of guidelines, and not a formal policy — especially as some legal waters continue to be muddy, this distinction could prove prudent.

NC State also specifically addresses prompt writing, which is an essential skill for effective use of generative AI tools. It would not be surprising to see internal training on prompt writing pop up on campuses in the next year or two.

The page also notes that it was last updated in November 2023 — as we document policies and guidelines for our teams on a topic that is highly in flux, dating these documents is key.

NC State also explicitly encourages “thoughtful” engagement and experimentation with AI. As they should!

University of Utah

Utah’s AI guidelines include a lot of what some of these other universities do — background on AI, guiding principles for usage (notably phrased with nine “We believe” statements), examples of acceptable use (as well as examples of explicitly prohibited uses). 

But the most notable thing here is that the university has a Marketing and Communications AI Use Working Group. Encouraging experimentation is great, but convening a body of marcom professionals from across the university to share learnings and contribute to the ongoing evolution of these guidelines is next level. Well done. The communications team shared more context about their efforts on the university’s blog.

University of Wisconsin - Milwaukee

This comprehensive set of guidelines covers similar territory in terms of overview, guiding principles, acceptable and prohibited use, and warning against bias. But the best thing UW Milwaukee does here is outline a recommended production process for using AI tools to generate marketing content. 

The process includes creating a project brief on how AI will be used, approach for human review and revision of generated content, usage of approved style guides to ensure brand alignment, and more. 

As exploration shifts into practice, defining responsible processes for incorporating AI tools into your marketing project workflow will prove increasingly valuable.

Best Practices for Generative AI Policies in Higher Ed Marketing Communications

After reviewing these generative AI guidelines, the following elements stand out as must-haves for your marcom team’s policy:

  • Brief overview of AI — does not need to be exhaustive and can link to other authoritative sources, but touch on areas such as what it is, what it can and can’t do, definition of terms, current landscape (legal, etc.), outlining limitations or parameters in terms of LLM (large language model) training/datasets/etc.
  • Guiding principles/best practices/dos and don’ts on how to use generative AI tools
  • Examples of appropriate and prohibited usage
  • Guidance on how to properly cite AI-generated content in publications
  • Warnings about the proliferation of bias in AI-generated content and guidance on how to control for this
  • Data privacy guidance (regarding the sharing of confidential data and personal information — this could go beyond, say, students’ personal information to include survey responses or email replies with personally identifiable information!)
  • Links to recommended AI tools and resources (preferably vetted or regularly referenced by your own team)
  • Links to other relevant policies or resources at your university (bonus points for a link to an AI working group!)
  • Asserting the need for human review of AI-generated content and personal responsibility in generating such content
  • Specifically encouraging experimentation with AI tools (with guidance on how to effectively do so)
  • Introductory guidance on prompt writing
  • Recommended process for using AI tools in your project workflow
  • Disclaimers about AI being an evolving field and that the guidelines are a living document and not a formal policy (noting the date they were last updated)

Creating and maintaining an evolving set of guidelines for a practice that is very much in flux may run counter to higher ed’s predilection for fixed and vetted policies. But with the impact of generative AI on our work only poised to grow, it is imperative that higher ed adopt a nimble approach to guiding marketing and communications teams to leverage these technologies appropriately and effectively.

(For more coverage, see this Jan. 22, 2024 article from The Chronicle of Higher Education, “Your College’s New Marketing Campaign, With a Boost From AI.”)

A version of this article originally appeared on Inside Higher Ed’s Call to Action Blog on January 9, 2024.