The AI tools currently approved for use with public and confidential data are:
- Microsoft 365 Copilot & Copilot Chat 
- Zoom AI Companion
- Vertex AI
- Azure OpenAI
For a full list of applications and tools that are approved, and NOT approved, for university use, please visit the Reviewed University Applications page (requires SSO).
The University of Colorado has adopted data classification types, including Highly Confidential Information, Confidential Information, and Public Information. Review the CU Data Classification Table for more information about these data types.
Be cautious when using AI tools: do not upload or share personal, sensitive, or university data with unvetted AI systems, as they may not be secure. How an AI tool or assistant processes and uses the data entered into it is a key factor in determining its security.
In addition to using university-approved AI tools with public or confidential data, all users must abide by university policy and relevant state and federal law regarding the protection of data and information systems. These include but are not limited to:
Please note: Although Microsoft assures that your data and university data will not be used to train its AI models, it is still advisable to exercise caution when using Microsoft AI tools. When using Copilot Chat, be sure you are logged in to your university account and confirm that the "protected shield" is visible in the prompt area. Additional information about using AI securely is available on the ISIC webpages: Microsoft Copilot products, Zoom AI Companion features, and AI Security and Compliance.
Request an Applications Assessment for assistance vetting an AI tool prior to acquiring it, particularly if the AI is intended for clinical purposes or will use highly confidential data, such as data protected by FERPA or HIPAA.