Abstract
Current artificial intelligence (AI) approaches to handling geographic information (GI) reveal a fatal blindness to the information practices of exactly those sciences whose methodological agendas they are taking over at earth-shattering speed. At the same time, there is an apparent inability to remove the human from the loop, despite repeated efforts. Even though there is no question that deep learning has great potential, for example for automating classification methods in remote sensing or the geocoding of text, current approaches to GeoAI frequently fail to deal with the pragmatic basis of spatial information, including the various practices of generating, conceptualizing and using data according to some purpose. We argue that this failure is a direct consequence of the predominance of structuralist ideas about information. Structuralism is inherently blind to the purposes of any spatial representation and therefore fails to account for the intelligence required to deal with geographic information. A pragmatic turn in GeoAI is required to overcome this problem.
| Original language | English |
|---|---|
| Pages (from-to) | 17-31 |
| Number of pages | 15 |
| Journal | KI - Künstliche Intelligenz |
| Volume | 37 |
| Issue number | 1 |
| Early online date | 20 Jan 2023 |
| DOIs | |
| Publication status | Published - Mar 2023 |
Bibliographical note
Funding Information: We are very thankful for the feedback from two anonymous reviewers. This article also owes a lot to the work of Peter Janich and to contemporary investigators of biases, such as Gerd Gigerenzer. Important ideas were developed in discussions at the COSIT 2022 conference, and within the QuAnGIS project, which is supported by the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement No. 803498).
Publisher Copyright:
© 2022, The Author(s).
Keywords
- AI for geographic information
- Explainable AI
- Practice of geographic information
- Purpose