Opinion | ChatGPT and the Human Mind: How Do They Compare?

To the Editor:

Re “Noam Chomsky: The False Promise of ChatGPT,” by Noam Chomsky, Ian Roberts and Jeffrey Watumull (Opinion guest essay, nytimes.com, March 8):

Dr. Chomsky and his co-authors are correct that A.I. is nothing like the human mind, which took millions of years to evolve using the resources of the whole earth. A.I. developed over a few decades using a minuscule fraction of the earth’s riches.

The human brain is amazingly slow, inaccurate and forgetful. It is incapable of quick, high-precision floating-point arithmetic of the kind that solves equations to many decimal places. Computers are millions of times faster, with essentially infallible memory, perfect attention and limitless patience. The computer was a product of the human mind, which is truly wonderful.

Contrary to the writers’ assertions, there is no doubt that machines will eclipse and replace humans at science, math and engineering within this century. But future A.I. will exploit Bayesian algorithms rather than boring old deep learning like ChatGPT. (Bayesian methods require minimal training data, promise optimal accuracy and quantify uncertainty, capabilities that deep learning lacks.)

It is hard to imagine that computers would also eclipse humans in terms of evil.

Fred Daum
Carlisle, Mass.

To the Editor:

Noam Chomsky and his co-authors have explained from a linguistic perspective the unbridgeable chasm that separates A.I. and chatbots, remarkable products of language analysis and synthesis, from human intelligence and knowledge.

But there is a more fundamental difference than the ones mentioned. The intelligence that chatbots create is an abstraction of mind and knowledge, amputated from the primary human data of bodily feelings and emotions on the one hand, and from sensory-perceptual awareness of the external world on the other.

The only way technology can solve this problem would be to create hybrid humans with implanted robotic connections, a development I shudder to contemplate.

Michael Robbins
Amherst, Mass.
The writer is a psychoanalyst, a former professor of clinical psychiatry at Harvard University, and the author of “Consciousness, Language and Self.”

To the Editor:

In their thoughtful and clarifying article on the new breed of A.I. marvels, Noam Chomsky and his co-authors conclude that “we can only laugh or cry at their popularity.”

On balance, I fear that tears are in order, followed rapidly by hard work to circumvent the potentially destructive powers of artificial intelligence. The West’s lethal cocktail of judgmentalism, commodification and surveillance could all too conceivably lead to A.I. being employed primarily for the oppression of the individual.

Once that happens, we will be looking to Kafka, Bulgakov and Frost for lessons on how to say one thing but mean entirely another.

Fin Keegan
Newport, Ireland

To the Editor:

It’s been less than six months since ChatGPT exploded into public awareness, and it immediately became controversial. Some would outlaw it. Others embrace and applaud it.

ChatGPT is a top-notch new learning tool. It even has the potential to break writer’s block. Why are schools pushing back? Some fear cheating, as though rectitude were more important than learning.

Consider this. Assign students to have ChatGPT write a paper. Then, ask those students to critique the resulting essay by standards of logic, bias, scholarship, content, style and creative thinking. After that, ask the students to rewrite the paper to overcome the shortcomings that their critique has disclosed.

I can’t think of a better way to teach better thinking, better writing and better research than by having human students critique a machine-written essay.

What are we afraid of? Let’s have faith in our human species.

Jack Cumming
Carlsbad, Calif.

To the Editor:

Noam Chomsky and his co-authors are right on target. ChatGPT is fascinating, but the hype is way overblown.

My experiences in two areas of interest could not have been more different. In the data science arena, it performs very well when writing Python programs to my specifications, though its output requires some editing.

On the other hand, in my hobby area, history, it produces wildly inaccurate results yet delivers them with great confidence. The essay’s authors explain why.

Sorry, kids, I would not count on it to write term papers.

Roger Gates
Fort Worth

Transgender Men at Wellesley

To the Editor:

Re “Wellesley Students Vote to Open Admissions to Transgender Men” (news article, March 15):

Wellesley students pressuring the college to admit trans men have the issue exactly backward. They fail to make the appropriate distinction between sex and gender.

Sex is a biological category generally assigned at birth (or at some point in utero). Its various components may occasionally be at odds with one another. Gender is a cultural category that reflects how a person lives a life, which may at times be at odds with that person’s sex.

Women’s colleges are cultural/educational institutions devoted to women. They commonly admit trans women, as well they should. It is not in line with that mission to admit trans men or even those preferring to escape traditional gender categorization altogether.

Judith Shapiro
Bryn Mawr, Pa.
The writer was the president of Barnard College from 1994 to 2008 and is emerita professor of anthropology at Barnard and Bryn Mawr College.

Bias in Lending

To the Editor:

“‘Excuse After Excuse’: Black and Latino Developers Struggle to Expand” (Real Estate, March 5) points to lack of capital access as a key reason for the abysmal number of successful Black and Latino developers. This challenge is experienced by people of color across industries.

To fix this, we must reform lending’s most consequential step: underwriting. Traditionally, underwriters look unfavorably on factors like smaller down payments and higher debt-to-income ratios that are more prevalent among nonwhite borrowers because of longstanding systemic racism.

There are fairer methods to determine an applicant’s likelihood and ability to repay. Our Underwriting for Racial Justice working group includes lenders piloting different underwriting approaches, such as evaluating credit histories instead of using hard credit score cutoffs. The result is more racially diverse and high-performing portfolios.

The financial industry has an opportunity to replace underwriting standards that perpetuate the crisis of representation in the development industry and beyond. We can spread more equitable practices to make real, systemic change.

Erin Kilmer Neel
Oakland, Calif.
The writer is executive director of Beneficial State Foundation, which seeks a more equitable banking system.

Pay a Living Wage

To the Editor:

“How Tech Tips the Scales on Gratuities” (Business, March 2) shines a bright light on a systemic issue reflecting how this country values its workers. Rather than use tech to guilt customers into tipping, we should pay all workers a living wage that’s baked into the cost of goods and services, as in so many other nations.

Tom Salyers
Takoma Park, Md.
The writer is director of communications at the Center for Law and Social Policy.
