
Teaching in Higher Ed


My Three Categories of Considerations for Using AI Tools

By Bonni Stachowiak | April 10, 2026

[Image: abstract image with glowing boxes]

This is the third post in a series about AI tools, broadly speaking, and Claude Cowork, specifically. The first post was about slowing down and giving yourself permission not to rush. The second addressed AI, privacy, and the risks worth understanding before diving in. This one is where I get specific about the framework I use when making decisions about my own use of AI, and why I advise you to proceed with caution, since you're not making decisions about your own risk profile alone.

I think of it as three categories of considerations: your employer, your own privacy, and other people's information. Working through each of these helped me figure out what I would and would not give Claude access to while experimenting with Claude Cowork in recent weeks. Your situation will be different from mine, but I hope walking through my reasoning gives you a useful starting point.

Your Employer

If you work for yourself, you can probably skip this section. But many of us have at least one employer, and there are some things worth thinking through carefully before using any AI tool on anything work-related. This varies depending on where you live, and in the United States, it varies by state. I will speak in generalities without claiming to be an attorney, or even playing one on TV.

Most employers have something in their employee handbook about how equipment and systems belong to the institution, and how anything done on them should be for professional and business-related purposes. You should be familiar with your employer's policies around AI use.

There are also specific regulations to consider depending on your field and industry. In higher education, FERPA protects student information. Anything involving student data needs to stay within systems that have appropriate privacy protections in place.

For me, this is actually fairly clean, because I have some big walls up already. I use my own device, not a university-issued one. My university runs on Office 365, and the AI capabilities within that ecosystem, including Copilot in Outlook and other Microsoft tools, fall under the same privacy protections that cover the rest of Office 365. So if I am going to use AI on anything related to my work, especially anything that touches student data, that stays within Microsoft's tools and the privacy and security infrastructure set up by my employer.

That means Claude Cowork is for personal and professional work that is entirely separate from my university role. Anything involving students, course materials tied to my institution, or files that live within university systems stays out of it entirely.

If you are thinking through your own employer situation, the questions worth asking are: Does your institution have an AI policy? If so, what does it say about tools you bring in on your own? What data would you potentially be sharing, and does any of it fall under regulatory protections? And which AI tools, if any, are already approved and in use within your institution's existing systems?

This is a little bit of a tangent, but since we're talking about AI and policies, this repository of syllabus policies for Generative AI from Lance Eaton is worth bookmarking, as it is the most comprehensive one I've seen.

Your Own Privacy Comfort Level

This is the category that requires the most personal reflection, because it depends entirely on what you value and what you are comfortable with. Most of us haven't done sufficient thinking about privacy in general, let alone when applied to AI. Three links from Civics of Technology well worth exploring are:

  1. Teaching Practical Privacy: Notes from a Librarian – guidance and resources on thinking through your digital security and digital privacy
  2. Nothing to Hide: Student Arguments – Assignment that helps facilitate deeper thinking for students and lifelong learners
  3. Privacy Week Events – Extensive reading list, resources, and video of their webinars

I have a few different mental buckets for my own information. At one end, there are things I am very protective of: personal journals and reflections, health information, financial data, anything that feels like something I would not want shared, or otherwise bought and sold. Another tangent here, because I just reminded myself of this famous speech from Lloyd (played by John Cusack) in the movie Say Anything. He talks about what he wants to do with his career, starting at the 1:00 mark.

Ok, back to the main topic, now that we've revisited not wanting to have anything to do with… well, hopefully you watched it… I digress…

I would not keep any of that sort of information in files or folders that Claude can access. If I were going to journal consistently (sigh…), it would happen in an app specifically designed for that purpose, with the kind of encryption I am comfortable with. That is its own walled garden. I subscribe to Day One and have for years, and feel good about their privacy picture for my use cases.

At the other end, there is information that is already publicly available. I have been podcasting for over twelve years (and was regularly on Dave's Coaching for Leaders podcast before that). I have shared a lot of things on both shows. Transcripts of those episodes exist on the web. If something is the kind of thing I might have said publicly, I am generally less cautious about it being in an AI-accessible space.

Most of what I work with in Claude falls somewhere between those two ends. Notes I took on a book I was reading. Research I compiled on a topic. Working drafts of writing. These feel like appropriate things to have within Claude's reach. I have not given Anthropic permission to train on my data. That is a setting users can configure, and I strongly recommend you check yours.

One thing I have found worth doing is thinking not just about which apps or tools I give access to, but what is actually inside them. A notes app might have grocery lists right alongside something much more sensitive. A calendar might have a podcast recording and a medical appointment on the same week. It is worth going folder by folder, and sometimes note by note, rather than just making a whole-app decision. My default is to block access and go slow, until I have had the chance to think things through and carefully research the implications.

Other People's Information

This is the category I feel most strongly about, and it is also the one that requires the most nuance.

The hard line for me is this: I am not comfortable giving any AI access to someone else's phone number, date of birth, or email address. Those things are off limits, and I would not give Claude access to my contacts app — just as I've never given access to that sort of data when social media companies and business applications have tried to get me to share it.

But some of what I keep in my notes relates to people who are already publicly findable, and I think it is worth explaining where I draw that line.

I host the Teaching in Higher Ed podcast, and I maintain records of the body of work we have built over the years on the website (which is openly licensed and free for anyone to use): episode numbers, guest names, topics we covered, key resources mentioned, and transcripts. Having those records accessible is simply part of how I do that work.

I take a similar approach with workshops I attend. My notes on an online workshop include the presenter's name, the topics covered, and key resources shared. The person's name is already online. Their slides are often publicly shared. Their name is associated with the event listing and its promotion. That is the kind of information I consider reasonable to keep in my Obsidian notes, which I have given Claude access to in some cases. The point is that I only keep within Claude's reach information that would otherwise be accessible via the open web.

If I attend a private or confidential conversation, one that is not recorded and where the content is not publicly available, those notes belong somewhere outside of any files or folders I have allowed Claude to access.

Revisit Your Permissions Over Time

One last thing worth saying: granting access is not a one-time decision. Most of us, once we have connected an app or given a tool permission, never go back to check on it. But your life changes, your files change, and the tools themselves change, including their privacy policies, sometimes significantly.

It is worth building a habit of auditing what you have connected, even briefly, every few months. Ask yourself: does this tool still need access to this? Has anything changed in what that folder or app contains? Have the terms of service changed in ways that would affect your decision?

The same thoughtfulness you bring to the initial decision is worth bringing back periodically.

Putting It Together

These three categories (your employer, your own privacy comfort level, and other people's information) have been the framework I have used to think through what I will and will not give Claude Cowork access to. My privacy risk profile is likely different from yours, and I encourage you to remember to go slow.

Photo credit: Fabio on Unsplash

 

Filed Under: Productivity

Bonni Stachowiak

Bonni Stachowiak is dean of teaching and learning and professor of business and management at Vanguard University. She hosts Teaching in Higher Ed, a weekly podcast on the art and science of teaching with over five million downloads. Bonni holds a doctorate in Organizational Leadership and speaks widely on teaching, curiosity, digital pedagogy, and leadership. She often joins her husband, Dave, on his Coaching for Leaders podcast.


CC BY-NC-SA 4.0 Teaching in Higher Ed