Artificial Intelligence Usage Policy

Institutional and system-wide policies are under development.

At present, the default policy of the Morningside Graduate School of Biomedical Sciences is to permit the use of generative AI with attribution. Attribution includes documentation of the AI tool used, a statement describing how it was used, and a list of the prompts used.

Course directors set policy for their courses and are required to state their AI policy clearly in the syllabus. If AI is prohibited for specific assignments or course activities, course directors are responsible for enforcing the policy equitably and must include a statement in the syllabus describing how compliance will be monitored and the penalties for violations.

The use of AI detection tools is prohibited under our academic policies. These tools have been shown to produce unreliable and potentially biased results, leading to unfair or inaccurate assessments. Instead, we encourage faculty and students to evaluate the originality, quality, and academic integrity of the work through critical engagement and feedback.

What Should Not Be Shared with GenAI: Ensuring Data Privacy and Integrity

When using Generative AI tools, it’s crucial to avoid inputting any sensitive or proprietary information that could violate privacy laws or intellectual property rights. Specifically:

  • FERPA-Protected Student Information:
    Under the Family Educational Rights and Privacy Act (FERPA), students’ educational records are protected, and sharing personally identifiable information (PII) with AI tools may breach confidentiality. This includes:
    • Student grades
    • Assignments with identifying information
    • Any other data that can be linked to a specific student

Faculty and staff should never input student data or course management system information that contains PII into AI platforms.

  • HIPAA-Regulated Health Information:
    If you work in fields involving health-related research or education, be mindful of the Health Insurance Portability and Accountability Act (HIPAA) when interacting with AI tools. This means you should avoid sharing:
    • Any individually identifiable health information
    • Research data that contains patient records or health identifiers

Most publicly available AI tools are not HIPAA-compliant, and sharing this data could lead to violations of patient privacy.

  • Unpublished Research or Proprietary Information:
    AI platforms are not secure repositories for confidential research. Therefore, you should avoid sharing:
    • Unpublished research results
    • Novel hypotheses, data, or methodologies
    • Proprietary technologies or inventions under development

Anything entered into GenAI tools may be used for further model training or inadvertently exposed, which could compromise intellectual property and jeopardize future publication or patent rights.

  • Personally Identifiable Information (PII):
    Beyond FERPA and HIPAA, do not input any form of PII such as:
    • Social Security numbers
    • Home addresses, phone numbers, or other personal contact information
    • Passport or ID details

AI-use Policy Templates

As part of the Morningside GSBS commitment to academic integrity and clarity, faculty are required to include a statement in their syllabus outlining their course policy on the use of generative artificial intelligence (GenAI) tools. To support you in this effort, we have developed a set of example templates addressing a range of GenAI-use approaches, from full prohibition to conditional or permissive use with guidelines.

These templates are designed as starting points and can be freely adapted to fit the unique needs of your course. Please note that they are not prescriptive, and you are encouraged to customize them as appropriate. Feedback and suggestions for additional examples are always welcome as we continue to navigate the evolving role of AI in education.

View AI-use Policy Templates