Geographic biases exist in ChatGPT, researchers reveal


New York, Dec 16 (IANS): Researchers have discovered limitations in ChatGPT’s capacity to provide location-specific information about environmental justice issues.

Their findings, published in the journal Telematics and Informatics, suggest that geographic biases may exist in current generative artificial intelligence (AI) models.

“As a geographer and geospatial data scientist, generative AI is a tool with powerful potential,” said Assistant Professor Junghwan Kim of the College of Natural Resources and Environment at Virginia Tech.

“At the same time, we need to investigate the limitations of the technology to ensure that future developers recognize the possibilities of biases. That was the driving motivation of this research,” Kim added.

Utilising a list of the 3,108 counties in the contiguous United States, the research group prompted the ChatGPT interface to describe the environmental justice issues in each county.

The researchers selected environmental justice as a topic to expand the range of questions typically used to test the performance of generative AI tools.

Asking questions by county allowed the researchers to measure ChatGPT responses against sociodemographic considerations such as population density and median household income.

ChatGPT was able to provide location-specific information about environmental justice issues for just 515 of the 3,108 counties entered, or about 17 per cent.
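As a rough illustration of the headline statistic (the figures come from the article; the variable names are ours), the reported response rate works out as follows:

```python
# Share of contiguous-US counties for which ChatGPT returned
# location-specific environmental-justice information.
total_counties = 3108      # counties queried
specific_responses = 515   # counties with location-specific answers

rate = specific_responses / total_counties * 100
print(f"{rate:.1f}%")  # roughly 16.6%, reported as 17 per cent
```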

With generative AI emerging as a new gateway tool for gaining information, the testing of potential biases in modeling outputs is an important part of improving programs such as ChatGPT.

“While more study is needed, our findings reveal that geographic biases currently exist in the ChatGPT model,” said Kim.

“This is a starting point to investigate how programmers and AI developers might be able to anticipate and mitigate the disparity of information between big and small cities, between urban and rural environments,” the authors noted.
