With rapid product development in generative artificial intelligence (A.I.), universities and educational programs must keep pace, incorporating tools and platforms such as ChatGPT and the large language models (LLMs) behind them.
The University of Michigan has a course titled “Advanced Artificial Intelligence” that covers LLMs and generative A.I. Meanwhile, Google Cloud offers a microlearning course, “Introduction to Generative A.I.,” that explains the technology and how to develop generative A.I. apps. You can also find courses covering generative A.I. on e-learning services such as Coursera, which lists a course from Vanderbilt University called “Prompt Engineering for ChatGPT.”
What can we expect as universities adapt their curriculums to incorporate these fast-emerging areas of A.I.?
Generative A.I. and its LLMs will be taught across several disciplines, including the arts and sciences, engineering and medicine, said Nancy Wang, a member of the board of directors for Penn Engineering Online, as well as a director of product and engineering and general manager at Amazon Web Services.
“Linguistics is typically a major that's offered definitely through the arts and sciences college, but oftentimes linguistics also has a computer science component,” Wang noted. “We really see generative A.I. dispersing through academia in that way where there is going to be application in many different disciplines.”
In March, for example, the University of Southern California announced a Center for Generative A.I. and Society that would study the impact of A.I. on culture, education, media and society. The $10 million center will focus on ethical use and innovation in generative A.I. In fall 2024, USC is also opening a School of Advanced Computing within its School of Engineering that will help support the new Center for Generative A.I. and Society.
Wang sees universities teaching generative A.I. both in person and remotely: “Just like anything else, data structures or algorithms can be a hybrid experience.”
Generative A.I. at the University of Michigan
The University of Michigan teaches generative A.I. to three types of students: those who use generative A.I. tools in a straightforward manner, those who can fine-tune generative A.I. models for their own software tools, and those who develop their own generative A.I. tools, said Karthik Duraisamy, director of the Michigan Institute for Computational Discovery and Engineering as well as founder and chief scientist at Geminus A.I.
Universities will need a secure GPU infrastructure with raw compute power to run generative A.I. courses and train and fine-tune data models, he added: “That's one of the most important things—you need an infrastructure where certain types of data can be protected.”
A Focus on Foundation Models
Courses in generative A.I. at UM focus on foundation models, according to Duraisamy. “If you take a class on generative A.I., for instance, you would study the foundations… Then at some point you'd have your own generative A.I. model that does a particular task.”
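To make that idea concrete, here is a minimal, illustrative sketch, not drawn from any UM course material, of about the simplest possible “generative model that does a particular task”: a character-level bigram model that learns transition counts from a toy corpus and samples new text from them.

```python
# Illustrative toy only: a character-level bigram language model.
# It "trains" by counting which character follows which, then generates
# new text by sampling from those counts.
import random
from collections import Counter, defaultdict

def train_bigram(text):
    """Count how often each character follows each other character."""
    counts = defaultdict(Counter)
    for current, nxt in zip(text, text[1:]):
        counts[current][nxt] += 1
    return counts

def generate(counts, start, length=80):
    """Sample text by repeatedly drawing the next character in proportion
    to how often it followed the current one during training."""
    out = [start]
    for _ in range(length):
        followers = counts.get(out[-1])
        if not followers:
            break
        chars, weights = zip(*followers.items())
        out.append(random.choices(chars, weights=weights)[0])
    return "".join(out)

corpus = "the quick brown fox jumps over the lazy dog. " * 50  # toy data
model = train_bigram(corpus)
print(generate(model, start="t"))
```

Modern LLMs replace the bigram counts with billions of learned parameters, but the underlying task is the same: model the distribution of the next token and sample from it.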
Eventually, students will study how to use commercial tools that let them adapt models to their own applications. Students work from open-source models, building and fine-tuning them. “There are a lot of quite powerful tools out there that are open source, and people can take them, play with them and develop them, and fine-tune them further,” Duraisamy said.
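As a rough illustration of that workflow, the sketch below fine-tunes a small open-source model on a public text dataset using the Hugging Face transformers and datasets libraries; the choice of GPT-2, the wikitext corpus, and the hyperparameters are assumptions for demonstration, not specifics cited by Duraisamy.

```python
# Illustrative sketch: fine-tune an open-source causal language model
# (GPT-2) on a small slice of a public text corpus.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "gpt2"  # small open-source model, chosen for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# Any plain-text corpus works; wikitext-2 is a common toy example.
dataset = load_dataset("wikitext", "wikitext-2-raw-v1", split="train[:1%]")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="finetuned-gpt2",
        per_device_train_batch_size=4,
        num_train_epochs=1,
    ),
    train_dataset=tokenized,
    # mlm=False means standard next-token (causal) language modeling
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()  # continues training the pretrained model on the new text
```

Coursework would typically scale this basic loop of loading a pretrained checkpoint, tokenizing new data, and resuming training up to larger models and datasets, which is where the secure GPU infrastructure Duraisamy describes becomes essential.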
Fundamentals should take priority even over studying tools like ChatGPT itself. “I think the right approach to education is to have a very strong foundation in the fundamentals,” Duraisamy explained. “Whether that is ChatGPT or generative A.I., I think the foundation of statistical learning theory comes first, and then you build a layer of sophisticated applications in A.I. Then you have generative A.I. sitting on top.”