Friday, March 3, 2023

ChatGPT: "I think you know what the problem is just as well as I do."


Outsourcing empathy? What does it mean to have a machine craft or draft messages of sympathy, solidarity, equity and justice?


The apology email also acknowledged that despite adding messages of inclusivity, using an AI program to write a condolences note was inappropriate. “While we believe in the message of inclusivity expressed in the email, using ChatGPT to generate communications on behalf of our community in a time of sorrow and in response to a tragedy contradicts the values that characterize Peabody College,” the email read.



The first text output from ChatGPT reads like a convincing summary of Bloomberg’s post-electoral philanthropic activities – complete with a quote from Bloomberg, himself. But the I-Team could find no record of the former mayor ever uttering those words.  When the chatbot was reminded to include commentary from Bloomberg’s critics, ChatGPT seemed to make up entirely fabricated quotes from phony anonymous sources. And those fake sources appear to skewer the former mayor for using his wealth to influence public policy. 



What are the expectations for scholarship in the sciences and why?

from Science magazine:

For years, authors at the Science family of journals have signed a license certifying that “the Work is an original” (italics added). For the Science journals, the word “original” is enough to signal that text written by ChatGPT is not acceptable: It is, after all, plagiarized from ChatGPT. Further, our authors certify that they themselves are accountable for the research in the paper. Still, to make matters explicit, we are now updating our license and Editorial Policies to specify that text generated by ChatGPT (or any other AI tools) cannot be used in the work, nor can figures, images, or graphics be the products of such tools. And an AI program cannot be an author. A violation of these policies will constitute scientific misconduct no different from altered images or plagiarism of existing works. Of course, there are many legitimate data sets (not the text of a paper) that are intentionally generated by AI in research papers, and these are not covered by this change.

Most instances of scientific misconduct that the Science journals deal with occur because of an inadequate amount of human attention. Shortcuts are taken by using image manipulation programs such as Photoshop or by copying text from other sources. Altered images and copied text may go unnoticed because they receive too little scrutiny from each of the authors. On our end, errors happen when editors and reviewers don’t listen to their inner skeptic or when we fail to focus sharply on the details. At a time when trust in science is eroding, it’s important for scientists to recommit to careful and meticulous attention to details.

The scientific record is ultimately one of the human endeavor of struggling with important questions. [Emphasis all mine] Machines play an important role, but as tools for the people posing the hypotheses, designing the experiments, and making sense of the results. Ultimately the product must come from—and be expressed by—the wonderful computer in our heads.



