Getting to be more literate than humans

Lucinda McKnight, a lecturer at Deakin University in Australia, wrote a February 9, 2021 essay about literacy in the coming age of artificial intelligence (AI) for The Conversation (Note 1: You can also find this essay as a February 10, 2021 news item on phys.org; Note 2: Links have been removed),

Students across Australia have started the new school year using pencils, pens and keyboards to learn to write.

In workplaces, machines are also learning to write, so effectively that within a few years they may write better than humans.

Sometimes they already do, as apps like Grammarly demonstrate. Certainly, much everyday writing humans now do may soon be done by machines with artificial intelligence (AI).

The predictive text commonly used by phone and email software is a form of AI writing that countless humans use every day.

According to the industry research organisation Gartner, AI and related technology will automate production of 30% of all content found on the internet by 2022.

Some prose, poetry, reports, newsletters, opinion articles, reviews, slogans and scripts are already being written by artificial intelligence.

Literacy increasingly means and includes interacting with and critically evaluating AI.

This means our children should no longer be taught just formulaic writing. [emphasis mine] Instead, writing education should encompass skills that go beyond the capacities of artificial intelligence.
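The predictive text McKnight mentions can be illustrated with a toy bigram model, which suggests whichever word most often followed the current word in its training text. This is a minimal, hypothetical sketch for illustration only, not how any commercial keyboard or phone software actually works:

```python
from collections import Counter, defaultdict

# Tiny training corpus (illustrative only).
corpus = "the cat sat on the mat and the cat ate".split()

# Count, for each word, which words follow it and how often.
follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def suggest(word):
    """Return the most frequent follower of `word`, or None if unseen."""
    counts = follows.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(suggest("the"))  # "cat" follows "the" most often in this corpus
```

The model can only replicate patterns already present in its data, which is precisely the limitation McKnight's essay goes on to contrast with human creativity.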

McKnight’s focus is on how Australian education should approach the coming AI writer ‘supremacy’, from her February 9, 2021 essay (Note: Links have been removed),

In 2019, the New Yorker magazine did an experiment to see if IT company OpenAI’s natural language generator GPT-2 could write an entire article in the magazine’s distinctive style. This attempt had limited success, with the generator making many errors.

But by 2020, GPT-3, the new version of the machine, trained on even more data, wrote an article for The Guardian newspaper with the headline “A robot wrote this entire article. Are you scared yet, human?”

This latest much improved generator has implications for the future of journalism, as the Elon Musk-funded OpenAI invests ever more in research and development.

AI writing is said to have voice but no soul. Human writers, as the New Yorker’s John Seabrook says, give “color, personality and emotion to writing by bending the rules”. Students, therefore, need to learn the rules and be encouraged to break them.

Creativity and co-creativity (with machines) should be fostered. Machines are trained on a finite amount of data, to predict and replicate, not to innovate in meaningful and deliberate ways.

AI cannot yet plan and does not have a purpose. Students need to hone skills in purposeful writing that achieves their communication goals.

AI is not yet as complex as the human brain. Humans detect humor and satire. They know words can have multiple and subtle meanings. Humans are capable of perception and insight; they can make advanced evaluative judgements about good and bad writing.

There are calls for humans to become expert in sophisticated forms of writing and in editing writing created by robots as vital future skills.

… OpenAI’s managers originally refused to release GPT-3, ostensibly because they were concerned about the generator being used to create fake material, such as reviews of products or election-related commentary.

AI writing bots have no conscience and may need to be eliminated by humans, as with Microsoft’s racist Twitter prototype, Tay.

Critical, compassionate and nuanced assessment of what AI produces, management and monitoring of content, and decision-making and empathy with readers are all part of the “writing” roles of a democratic future.

It’s an interesting line of thought, and McKnight’s ideas about writing education could be applicable beyond Australia, assuming you accept her basic premise.

I have a few other postings here about AI and writing:

Writing and AI or is a robot writing this blog? a July 16, 2014 posting

AI (artificial intelligence) text generator, too dangerous to release? a February 18, 2019 posting

Automated science writing? a September 16, 2019 posting

It seems I have a lot of questions* about the automation of any kind of writing.

*’question’ changed to ‘questions’ on November 25, 2021.
