Using Claude’s Microsoft 365 Connector

By Shannon Glennon, AI Technology Transformation Specialist, Syracuse University ITS 

By now you’ve likely used Claude to write and brainstorm, but it’s capable of a lot more when connected to the tools you use every day. By connecting Claude to your Microsoft 365 account, you can bring your actual work — your emails, calendar, files and more — directly into the conversation. Here’s why that matters, and how to get started. 

Why Connect Claude to Microsoft 365? 

Without a connector, Claude works in isolation. It can help you think, write and plan, but it doesn’t know anything about your work. Connecting Claude to Microsoft 365 changes that. It gives Claude real-time access to your Outlook calendar and email, OneDrive/SharePoint documents and Teams messages — so instead of describing your situation, you can just ask Claude to check. 

How to Set It Up 

  1. Open Claude.ai and log in 
  2. Click your profile/account avatar at the bottom left of the sidebar 
  3. Navigate to Settings 
  4. Select Connectors 
  5. Find Microsoft 365 in the list and click Connect 
  6. Sign in with your Syracuse University NetID credentials when prompted 
  7. Approve the requested permissions — Claude will only access what you authorize 

That’s it. Once connected, Claude can pull in relevant context from your M365 environment when you ask. 

What Can It Actually Do? 

Here are a few sample prompts you can try: 

  • “Summarize my unread emails from this week and flag anything urgent.” — Claude scans your Outlook inbox and gives you a prioritized digest. 
  • “What meetings do I have tomorrow, and can you draft a prep agenda for my 2pm?” — Claude checks your calendar and builds a ready-to-use agenda. 
  • “Find the budget proposal I shared on Teams last month and help me update the executive summary.” — Claude locates the document and helps you refine it in context. 
  • “Draft a response to the message from my department chair about the spring schedule.” — Claude locates the message and writes a professional reply in your voice. 
  • “Review my work from the past week and create a summary I can bring to my team meeting.” — Claude pulls from your emails, calendar, and shared documents to build a cohesive recap of your activity, decisions made, and next steps — ready to present as needed. 

Tips for Getting the Most Out of It 

Be specific about what you want Claude to look at. The more context you give, the better the output. “My email” is vague; “emails from the budget committee in the past two weeks” is actionable. 

Treat it like a smart research assistant, not a search engine. Claude can read, synthesize and respond, not just find. Ask it to do something with what it finds. 

Start with low-stakes tasks. Try it first to summarize, draft or prepare for a meeting. Once you get a feel for how it works, you’ll find your own high-value use cases. 

Always review what Claude produces. AI tools are powerful, but you bring the fact-checking. Read and edit before you send. 

From the Desk of the CISO: Not All AI Tools Are Created Equal 

Artificial intelligence tools have become a genuine part of how many of us work. Drafting communications, summarizing long documents, analyzing data, brainstorming — these tools are useful, and they’re not going anywhere. But as with any tool, they come with risk, and some AI tools carry far more risk than others. Today’s column is about helping you understand which tools are safe for university business and which aren’t.

And that difference matters more than most people realize.

Here’s the line I want you to remember: if you’re logging into an AI tool with your own personal account — even if you signed up using your syr.edu email address — you are not using an approved university service, and university data does not belong there.

Let me explain why that distinction matters.

Syracuse University has negotiated enterprise agreements for several AI tools, including Claude and Microsoft Copilot for work (note: this is different from the free Copilot you might find on the web). Departments can also request access to enterprise ChatGPT. What makes these different isn’t just the brand name — it’s the contract behind them. Our enterprise agreements include data protection provisions that prohibit these vendors from using your inputs to train their models, require them to handle university data responsibly, and hold them to standards consistent with our obligations under regulations like FERPA. When you access these tools through Syracuse’s authentication system — meaning you’re logging in through our single sign-on — you’re working within that protected environment.

A personal account lives completely outside of that protection. It doesn’t matter if you used your syr.edu address to sign up, and it doesn’t matter if it’s the same brand as an approved tool. A personal ChatGPT, Claude, or Copilot account — none of those carry our enterprise protections. The terms of service you agreed to when you created that account are between you and that vendor, not between that vendor and Syracuse University. Your inputs may be used to train models. Your data may be retained in ways we can’t control or audit. And if something goes wrong, we have very little recourse.

So what does this mean practically? Before you use any AI tool for university work, ask yourself one question: did I get here through Syracuse’s login system? If the answer is yes, you’re likely in the right place. If you logged in with a personal password you created on your own, stop and find the approved version — or reach out to my team at infosec@syr.edu if you’re not sure one exists.

The productivity benefits of AI are real, and we want you to take advantage of them. We’ve worked hard to make sure approved tools are available. But those benefits come with a responsibility to use the right tool for the right job — and to keep university and student data where it belongs.

When in doubt, ask. We’re here to help.

Chris Croad is the Chief Information Security Officer for Syracuse University Information Technology Services.

Device Theft and Loss: Why Encryption Is Your Best Defense 

Laptops, smartphones, and USB drives are convenient, but losing one can quickly become a serious security incident. The possibility that a device containing sensitive university or personal data will be lost or stolen is very real. The good news is that encryption can make the difference between a costly data breach and a manageable inconvenience.  

What’s at risk?  
An unencrypted lost device can expose student records, employee personal information, research data, financial records, and university login credentials. In many cases, a single stolen laptop could trigger a FERPA or HIPAA compliance incident requiring mandatory reporting and potentially affecting thousands of individuals.  

What is encryption?  
Encryption scrambles the data on your device so that it is completely unreadable to anyone who does not have the correct credentials. Even if a thief physically possesses your laptop or flash drive, they cannot access the files stored on it.  
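To make the idea concrete, here is a deliberately simplified sketch in Python. It is a toy, not a real cipher (real full-disk encryption uses vetted algorithms like AES via BitLocker or FileVault), but it shows the core property: the same data is gibberish without the right key and readable with it.

```python
import hashlib
from itertools import cycle

def toy_scramble(data: bytes, passphrase: str) -> bytes:
    """Toy illustration only: XOR the data with a keystream derived
    from the passphrase. Do NOT use this to protect real data."""
    key = hashlib.sha256(passphrase.encode()).digest()
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

secret = b"Student record: GPA 3.8"
scrambled = toy_scramble(secret, "correct-passphrase")

# Without the passphrase, the stored bytes are unreadable
assert scrambled != secret

# Applying the same operation with the right passphrase recovers the data
assert toy_scramble(scrambled, "correct-passphrase") == secret

# The wrong passphrase produces garbage, not the original
assert toy_scramble(scrambled, "wrong-guess") != secret
```

This is exactly the guarantee full-disk encryption gives you at the device level: a thief who has the hardware but not your credentials has only the scrambled bytes.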

What should you do? 

  • Laptops: University-managed devices are already encrypted; no action is needed on your part. If you use a personal laptop for university work, enable full-disk encryption: BitLocker on Windows or FileVault on Mac. Contact ITS if you need help.  
  • Cell Phones: Personal and university-managed phones can both be encrypted simply by setting a strong PIN or passphrase. Both iOS and Android encrypt data automatically once a screen lock is enabled.  
  • Removable Storage: Avoid storing sensitive university data on USB drives when possible. If you must, use an encrypted drive.  

Always report a lost or stolen device to ITS immediately. Prompt reporting allows us to remotely wipe university data before it can be accessed.  

FERPA: What Faculty, Staff and Students Need to Know 

The Family Educational Rights and Privacy Act (FERPA) is a federal law that protects the privacy of student education records. Whether you’re a faculty member, staff employee, or student, understanding FERPA is not just a legal obligation; it’s a matter of trust.  

For Faculty and Staff, FERPA means you may only access student records when there is a legitimate educational need to do so. Sharing a student’s grades, enrollment status, financial information, or academic performance with anyone (including parents, employers, or other faculty) without the student’s consent is a violation of federal law. Even a casual conversation in a public hallway about a student’s academic standing can constitute a FERPA breach. Always verify your authority to access or share student information before doing so.  

For Students, FERPA gives you the right to inspect and review your own education records, request corrections to records you believe are inaccurate, and control how your information is disclosed to third parties. You also have the right to file a complaint with the U.S. Department of Education if you believe your rights have been violated.  

Key Reminders: 

  • Do not post grades publicly using names or identifiable information.  
  • Verify identity before discussing student information.  
  • Be cautious when emailing sensitive data; use secure systems.  
  • When in doubt, consult your department or ITS before sharing information.  

Protecting student privacy is a shared responsibility. Understanding FERPA helps maintain trust, compliance, and the integrity of our academic community.  

What Is Doxxing and Why Should You Care?

You may have heard the term “doxxing” thrown around online, but it’s no longer just an internet culture buzzword. It’s a real threat that affects students, faculty, and staff at universities across the country, and understanding it could protect you from serious harm. 

What Is Doxxing? 

Doxxing (from “dropping documents”) is the act of researching and publicly exposing someone’s private information without their consent. This can include a home address, phone number, workplace, class schedule, family members’ names, or financial details. The goal is almost always to intimidate, harass, or harm the target, and the information is often posted publicly to encourage others to pile on. 

What makes doxxing especially unsettling is how little it requires. Much of the information used in doxxing attacks is already technically public, scattered across social media profiles, old forum posts, public records, and data broker websites. A motivated bad actor can piece together a surprisingly complete picture of your life in just a few hours. 

Why Is It Dangerous? 

Being doxxed can range from an unsettling experience to a genuinely serious safety concern. At minimum, victims often deal with unwanted contact, harassment, and a persistent feeling of vulnerability. In more serious cases it can affect someone’s professional reputation, physical safety, or mental wellbeing. 

University communities are especially vulnerable. Public-facing faculty, student activists, researchers working on controversial topics, and student journalists are among the more frequent targets. 

How to Protect Yourself 

You don’t need to disappear from the internet — but a few smart habits go a long way: 

  • Audit your social media. Review what’s publicly visible on your profiles. Your location, daily routine, and workplace are valuable to bad actors. Tighten your privacy settings. 
  • Search yourself. Google your name periodically, including variations with your city or phone number. See what comes up — then work to remove it. 
  • Opt out of data brokers. Sites like Spokeo, Whitepages, and BeenVerified sell your personal data. Most offer opt-out processes, and services like DeleteMe can automate removal. 
  • Separate your identities. Use different usernames across platforms, especially in communities where conflict is common. Avoid linking your real name to accounts where you engage in controversial discussions. 
  • Use your university address. For any public-facing communications, list your campus address or P.O. box rather than your home. 

If It Happens to You 

Document everything — screenshots, links, timestamps. Report it to the platform, to University Public Safety, and to local law enforcement if threats are involved. You are not alone, and you are not powerless.