General Guidance for Integrated AI Tools
Applications and software providers are integrating AI technology into their products at a rapid pace. Here are some general guidelines so that you can use AI safely and responsibly in your work, education, or research.
This article applies to: Artificial Intelligence (AI)
Know What the AI Tool Does
Understand how the AI tool is used in the software or application. Find this information on the application’s AI overview page (for example, Zoom AI Companion or Copilot).
You Are Responsible for Protecting Your Data
All Cornell users are responsible for complying with university policies 4.12 (Data Stewardship and Custodianship), 5.10 (Information Security), and 4.21 (Research Data Retention).
Assess Data Risk First
Most AI tools are approved for low-risk data only. Check the Regulated Data Chart before exposing medium- or high-risk data to an AI tool.
Do not use an AI tool to record, share, or store recordings that involve discussion of confidential or sensitive information. For example:
- high-risk or regulated data, for example:
- Social Security numbers
- credit or debit card numbers
- driver’s license numbers
- visa or passport numbers
- bank account numbers
- personal health-related information
- personal financial information
- education records (FERPA)
- personally identifiable information (PII), such as birthdates
- privileged legal matters
- personnel matters
- security matters
- protected research information
- any other confidential or sensitive information
Most integrated AI tools cannot be used in:
- clinical, telemedicine, or healthcare settings (e.g., during any patient encounters)
- peer review meetings
- IRB meetings
Check for Accuracy
If you share content that has gone through AI processing (for example, meeting summaries or email drafts), review it for accuracy and for confidential or restricted information before you share it.
Respect Participant Preferences
In apps or situations involving multiple participants, you must inform all parties when an AI tool is in active use, and respect the wishes of any participant who asks that the tool be disabled.
Situations Not Covered Here
Some AI tools may have specific features not covered in this general guidance. Before engaging an AI tool within an application, check for any use cases or situations unique to that tool that you may need to consider.