# datascience
Is there a term for the fallacy of equating a video-game or board-game AI with something that has real-world applications or AGI capabilities? For example, an algorithm that wins at chess does not mean it can master strategy in corporate or military environments. Does anybody know what this term is? I ask because I'm considering writing about the fetishization of video games in AI research. https://deepmind.com/blog/alphastar-mastering-real-time-strategy-game-starcraft-ii/
I think the term is *incomprehension* 🙂. As for AlphaStar, the guys played Starcraft for big money. Epic win. But seriously, perhaps it could be applied to some real-world tasks via a cascade of AI agents: agents at the top level are responsible for strategy, agents at a lower level for tactics, and agents at the lowest level for orientation and action.
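The cascade idea above could be sketched roughly like this. Everything here is invented for illustration (the agent names, the toy decision rules, the goal/subgoal vocabulary); it is just a minimal sketch of the strategy → tactics → action layering, not a real system:

```python
class StrategyAgent:
    """Top level: picks a high-level goal from coarse world state."""
    def choose_goal(self, world_state):
        # Toy rule, purely illustrative
        return "expand" if world_state["resources"] > 50 else "defend"

class TacticsAgent:
    """Middle level: translates a goal into an ordered list of subgoals."""
    def plan(self, goal):
        plans = {
            "expand": ["scout", "build_base"],
            "defend": ["build_wall", "train_units"],
        }
        return plans[goal]

class ActionAgent:
    """Lowest level: maps each subgoal to a concrete primitive action."""
    def act(self, subgoal):
        return f"execute:{subgoal}"

def run_cascade(world_state):
    # Each layer only consumes the output of the layer above it
    goal = StrategyAgent().choose_goal(world_state)
    subgoals = TacticsAgent().plan(goal)
    return [ActionAgent().act(s) for s in subgoals]

print(run_cascade({"resources": 80}))
# prints ['execute:scout', 'execute:build_base']
```

The point of the layering is that each agent works at one level of abstraction and never needs to see the full problem, which is roughly how one might hope to reuse game-trained components elsewhere.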