
EDITORIAL: ChatGPT may be on the rise — but there is something distinctively human missing

Technology, science and mathematics are practically the cornerstones of the American economy, but as these sectors overtake the public domain, are we beginning to lose our sense of the human touch entirely?

Chloe Patel | Senior Graphic Artist

The United States prides itself on its innovations in technological engineering, and indeed, some of our accomplishments have made everyday life more convenient and efficient. However, recent developments in artificial intelligence, such as the system ChatGPT, have brought drawbacks of their own.

This form of AI is unique. It can mimic distinctly human faculties such as critical thinking and intellect, and produce structured prose in formats like emails, scholarly articles and academic essays.

As a large language model, ChatGPT calculates the probability of the words (and other tokens) most likely to come next. It then builds long passages of text by stringing those likely words together into one coherent sequence. The model has also been optimized for dialogue, so humans can essentially tell it what to write.

The goal of these systems is to advance their command of natural language so that they can answer questions, respond with reasoning and avoid contradictory statements.
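For readers curious what that word-by-word prediction actually looks like, the toy sketch below (in Python) illustrates the idea in miniature. The hand-written probability table and the pick_next_word and generate functions are invented purely for illustration; real systems like ChatGPT learn their probabilities from vast amounts of text and operate at an enormously larger scale.

```python
import random

# Toy "language model": for each word, an invented table of which words
# tend to follow it and how likely they are. Real models learn these
# probabilities from enormous collections of text, not a hand-written dict.
NEXT_WORD_PROBS = {
    "the":      {"model": 0.5, "essay": 0.3, "email": 0.2},
    "model":    {"writes": 0.6, "predicts": 0.4},
    "writes":   {"the": 0.7, "quickly": 0.3},
    "predicts": {"the": 1.0},
    "essay":    {"writes": 0.5, "predicts": 0.5},
    "email":    {"writes": 1.0},
    "quickly":  {"the": 1.0},
}

def pick_next_word(word: str) -> str:
    """Sample the next word according to the toy probability table."""
    candidates = NEXT_WORD_PROBS[word]
    words = list(candidates)
    weights = list(candidates.values())
    return random.choices(words, weights=weights, k=1)[0]

def generate(start: str, length: int = 8) -> str:
    """Build a sequence by repeatedly choosing a likely next word."""
    sequence = [start]
    for _ in range(length):
        sequence.append(pick_next_word(sequence[-1]))
    return " ".join(sequence)

if __name__ == "__main__":
    print(generate("the"))
```

The output is grammatical-looking but empty of intent, which is precisely the editorial's point: the mechanism predicts likely words rather than reasons about them.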

In theory, the tool can offer a great deal of convenience to those looking to write an email in a pinch, or to knock out a quick essay they forgot about. The perceived benefits of this system have generated its recent publicity; after all, what stressed college student wouldn't love to have a perfectly structured essay written before their eyes?

But, alas, there are rightfully many concerns about trusting artificial intelligence to write on behalf of humans.

Often, these systems are unreliable in how they report and interpret human speech. Although the language they produce reflects real-world discourse, they are only working from what they have been fed, and they have no mechanism for actual reasoning.

There is also concern about their accuracy, as these systems contain virtually no fact-checking mechanisms to differentiate between true and false. Their output can also be automated at scale in a way that proliferates misinformation.

The primary worry, though, is ChatGPT's threat to authenticity. While AI's capacity for human mimicry is practically endless, and its ability to spit out grade-A essays is impressive, there is something distinctly personal missing from these generative outputs.

What is to become of human writing if it is constantly compared against a system designed to value perfectionism over novelty? This calls into question not only the quality of human knowledge, but also the need for genuine interaction with one another.

This $29 billion bot business may certainly have the complex capability to parrot human speech, but it lacks creativity of thought. There is more to writing than arranging words on the page to satisfy semantics; one must also consider the personal sentiments, intellectual thought and bursts of ingenuity that make a statement authentic.

The evolution of ChatGPT into the default tool for crafting essays and emails could erode basic communication skills and allow our sense of originality to atrophy.

Written and verbal communication skills are the foundation of all human relationships. Whether in the form of a simple email, or a face-to-face conversation, the natural ways in which we interact with one another are exclusive to our species and inimitable by technology.

While the resourcefulness of AI can provide a quality background for most basic research inquiries, the true process of writing and its end results are too important to leave to the bots.

This article was written by Co-Opinion Editor Analise Bruno
