Researchers have discovered limitations in ChatGPT’s capacity to provide location-specific information about environmental justice issues.

Their findings, published in the journal Telematics and Informatics, suggest that geographic biases may exist in current generative artificial intelligence (AI) models.

“As a geographer and geospatial data scientist, generative AI is a tool with powerful potential,” said Assistant Professor Junghwan Kim of the College of Natural Resources and Environment at Virginia Tech.

“At the same time, we need to investigate the limitations of the technology to ensure that future developers recognize the possibilities of biases. That was the driving motivation of this research,” Kim added.

Using a list of the 3,108 counties in the contiguous United States, the research group prompted the ChatGPT interface with a question about the environmental justice issues in each county.

The researchers selected environmental justice as a topic to expand the range of questions typically used to test the performance of generative AI tools.


Asking questions by county allowed the researchers to measure ChatGPT responses against sociodemographic considerations such as population density and median household income.

ChatGPT was able to provide location-specific information about environmental justice issues for just 515 of the 3,108 counties entered, or about 17 percent.
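The paper does not publish its analysis code, but the tallying step described above can be sketched in a simplified form: assume each county's response has already been labeled as location-specific or generic, then compute the share of specific responses overall and across a sociodemographic split such as population density. All county names, labels, density figures, and the urban/rural cutoff below are invented for illustration only.

```python
# Hypothetical sketch: tally how many county-level ChatGPT responses were
# judged "location-specific", overall and by population-density group.
# The data and the 500/sq-mile threshold are invented for illustration.

# (county, is_location_specific, population_density_per_sq_mile)
responses = [
    ("County A", True, 1200.0),   # dense county, specific answer
    ("County B", False, 15.0),    # rural county, generic answer
    ("County C", True, 800.0),
    ("County D", False, 9.0),
    ("County E", False, 22.0),
]

def specificity_rate(rows):
    """Fraction of responses judged location-specific."""
    if not rows:
        return 0.0
    return sum(1 for _, specific, _ in rows if specific) / len(rows)

URBAN_DENSITY_CUTOFF = 500.0  # arbitrary illustrative threshold

urban = [r for r in responses if r[2] >= URBAN_DENSITY_CUTOFF]
rural = [r for r in responses if r[2] < URBAN_DENSITY_CUTOFF]

print(f"overall: {specificity_rate(responses):.0%}")
print(f"urban:   {specificity_rate(urban):.0%}")
print(f"rural:   {specificity_rate(rural):.0%}")
```

Comparing rates across such splits is what lets the researchers tie response quality to factors like population density and median household income.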

With generative AI emerging as a new gateway tool for gaining information, the testing of potential biases in modeling outputs is an important part of improving programs such as ChatGPT.

“While more study is needed, our findings reveal that geographic biases currently exist in the ChatGPT model,” said Kim.


“This is a starting point to investigate how programmers and AI developers might be able to anticipate and mitigate the disparity of information between big and small cities, between urban and rural environments,” the authors noted.