From the Desk of the CISO: Not All AI Tools Are Created Equal 

Artificial intelligence tools have become a genuine part of how many of us work. Drafting communications, summarizing long documents, analyzing data, brainstorming — these tools are useful, and they’re not going anywhere. But as with any tool, they come with risk, especially when university data is involved. Today’s column is about a distinction I want everyone to understand: some AI tools are safe for university business, and others aren’t.

And that difference matters more than most people realize.

Here’s the line I want you to remember: if you’re logging into an AI tool with your own personal account — even if you signed up using your syr.edu email address — you are not using an approved university service, and university data does not belong there.

Let me explain why that distinction matters.

Syracuse University has negotiated enterprise agreements for several AI tools, including Claude and Microsoft Copilot for work (note: this is different from the free Copilot you might find on the web). Departments can also request access to enterprise ChatGPT. What makes these different isn’t just the brand name — it’s the contract behind them. Our enterprise agreements include data protection provisions that prohibit these vendors from using your inputs to train their models, require them to handle university data responsibly, and hold them to standards consistent with our obligations under regulations like FERPA. When you access these tools through Syracuse’s authentication system — meaning you’re logging in through our single sign-on — you’re working within that protected environment.

A personal account lives completely outside of that protection. It doesn’t matter if you used your syr.edu address to sign up, and it doesn’t matter if it’s the same brand as an approved tool. A personal ChatGPT, Claude, or Copilot account — none of those carry our enterprise protections. The terms of service you agreed to when you created that account are between you and that vendor, not between that vendor and Syracuse University. Your inputs may be used to train models. Your data may be retained in ways we can’t control or audit. And if something goes wrong, we have very little recourse.

So what does this mean practically? Before you use any AI tool for university work, ask yourself one question: did I get here through Syracuse’s login system? If the answer is yes, you’re likely in the right place. If you logged in with a personal password you created on your own, stop and find the approved version — or reach out to my team at infosec@syr.edu if you’re not sure one exists.

The productivity benefits of AI are real, and we want you to take advantage of them. We’ve worked hard to make sure approved tools are available. But those benefits come with a responsibility to use the right tool for the right job — and to keep university and student data where it belongs.

When in doubt, ask. We’re here to help.

Chris Croad is the Chief Information Security Officer for Syracuse University Information Technology Services.