
Google announces plans to develop tools to help journalists create stories

Fri 21 Jul 2023    
EcoBalance
| 3 min read

Google says it has discussed its plans with leaders in the news industry and is now developing artificial intelligence tools to help journalists write stories and headlines.

The plans are already raising concerns about the reliability of the fast-developing technology, and about the possibility that it could one day cost human journalists their jobs in an industry already under financial strain.

Google has already briefed multiple news organizations on the project.

In a prepared statement, Google said that AI-enabled tools could offer journalists options for headlines or different writing styles as they work on a piece, describing the technology as a way to enhance their work and productivity.

“These tools are not intended to, and cannot, replace the essential role journalists have in reporting, creating and fact-checking their articles,” Google said.

The discussion of how to apply the latest AI writing tools coincides with concerns from news organizations and other professions about whether technology companies are fairly compensating them for using their published works to train AI systems known as large language models.

To build AI systems that can produce human-like writing, tech companies have had to ingest vast troves of written works, such as news stories and digitized books. The companies do not always disclose what data, scraped from across the internet, they have used.

Last week, the AP and ChatGPT maker OpenAI signed a deal for the AI company to license the AP's archive of news stories going back to 1985. Financial terms were not disclosed.

Chatbots like ChatGPT and Google’s own Bard are part of a set of so-called generative AI tools that are getting better at emulating various writing styles, as well as visual art and other media. They are already widely used by many people to save time while writing letters and other common documents or when assisting with homework.

However, the systems are dangerous for uses like gathering news or delivering medical advice since they can easily spew out falsehoods that people who are unfamiliar with a subject might not recognize.

Google has historically been cautious about deploying its AI innovations, even in its flagship search engine, which users depend on for reliable information. But public enthusiasm for ChatGPT since its release late last year has pushed tech firms to show off new AI products and services.

“In an ideal world, technology like Google is discussing can add important information to the world,” said Kelly McBride, an expert in journalism ethics at the Poynter Institute. “It could document public meetings where there are no longer human journalists to attend and create narratives about what is going on.”

She added that the technology could advance faster than a new economic model for supporting local news can be found, creating a temptation to replace human journalists with AI tools.

News organizations also need to weigh the likelihood that the technology, especially in its early phases, will be responsible for mistakes; the reputational harm could outweigh any financial benefits.

“I don’t think there will be a single ethical explosion that will ruin everything,” McBride said. “Instead, I think it’s going to be more of an erosion of quality and a bunch of small things that erode confidence in the news media.”

She noted that news organizations are in a pivotal position: they hold resources that technology companies want, such as access to their archives, and could use that leverage to build a financial arrangement that does not tilt too far in favor of firms like Google. History, however, may not be on their side.

“This is a whole new level of threat,” she said, “and it’s not like we can turn back.”
