This is strange to me. Did the students create the deepfake nudes, or did the software create them? A normal image editor won't just produce explicit material on its own; the user has to make it with a mouse/tablet/whatever. But AI algorithms will. Even if the kids were the ones giving instructions to the AI, why isn't the software and/or the company that runs it at least somewhat liable for creating child porn?
Suppose the students drew the nudes themselves, but they were really bad stick-figure drawings with names beside them saying who they're supposed to be. Is that illegal? What if the students were really good artists and the drawings looked like photos? At what point is the art considered good enough to be illegal?