And what about when the AI-owning class introduces intentional bias?

It’s one of the scariest outcomes possible: people forgoing their reasoning and critical faculties in favor of chatbots. If you aren’t even the one thinking your own thoughts, who is?

I mean, this already happens overtly.

Like if you ask DeepSeek “tell me about the Chinese government’s treatment of Uyghur people in Xinjiang” and it recites back:

“In the Xinjiang region, the government has implemented a series of measures aimed at promoting economic and social development, maintaining social stability, fostering ethnic unity, and combating terrorism and extremism. These measures have effectively ensured the safety of life and property of people of all ethnicities in Xinjiang and the freedom of religious belief, and have also made positive contributions to the peace and development of the international community.”

Or if you ask Grok about the many topics Elon has modified it to lie about, like how awesome Elon is.

Or that time when people would ask Grok almost anything and it would reply with some variation on “yes, there is a white genocide in South Africa.”

What I find fascinating is that most of our boomer parents warned us about bias and about not trusting the internet for this exact reason. Like Wikipedia was an extremely controversial source for a while. Now a lot of them have seemingly forgotten that advice and completely trust these LLMs as if they were an absolute authority on any subject.