
Answers



Question Author
Quite TTT.
I don't really understand how this is news (to anybody), or even why people would spend money to investigate what is searingly obvious.
Question Author
Perhaps because the Liberal Left deny it?

As no doubt we will see later in the day on this thread.
A rather important question is *why* it is biased so. Most of the time, so far as I'm aware, biases emerging from such software are accidental or unintentional, reflecting, and then often amplifying, biases in the source data.
Question Author
San Francisco programmers, Clare. That is why it is no surprise.

Whilst what you say about the data source is true, it's really not difficult to make a program do what you want (i.e. reflect your thinking). This happens all the time just in normal code. Ask 10 people to code something and you will get ten different ways of doing it. The more complex the code, the more versions you will get.
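As a toy illustration of the point about source data (a minimal Python sketch only; the corpus, labels and scoring are made up and have nothing to do with how ChatGPT is actually built), a crude word-count "model" trained on a skewed set of texts simply reproduces that skew in its answers:

    # Toy illustration (not the real ChatGPT pipeline): a naive bag-of-words
    # "stance" scorer trained on a deliberately skewed corpus. The skew in
    # the training data shows up directly in the model's output.
    from collections import Counter

    # Hypothetical training corpus: four "left"-labelled texts, one "right".
    corpus = [
        ("tax the wealthy to fund public services", "left"),
        ("expand public healthcare for everyone", "left"),
        ("raise the minimum wage now", "left"),
        ("invest in public transport and green energy", "left"),
        ("cut taxes and shrink the state", "right"),
    ]

    # "Training": count how often each word appears under each label.
    counts = {"left": Counter(), "right": Counter()}
    totals = Counter()
    for text, label in corpus:
        for word in text.split():
            counts[label][word] += 1
        totals[label] += 1

    def score(text: str) -> dict:
        """Crude per-label score: label frequency times word overlap."""
        words = text.split()
        return {
            label: totals[label] * sum(counts[label][w] for w in words)
            for label in counts
        }

    # A fairly neutral prompt still scores "left", purely because the
    # training data contained far more left-labelled text using these words.
    print(score("should we fund public services"))   # {'left': 20, 'right': 0}

Scaled up to billions of documents, the same mechanism applies: whatever slant the sources carry, the model absorbs, whether or not anyone intended it.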
You're gonna meet some gentle people there.
The algorithms that ChatGPT uses are gleaned from datasets of thousands of 'reliable' news sources.
That means sites such as the Daily Mail are not included.

https://www.theguardian.com/media/2019/jan/23/dont-trust-daily-mail-website-microsoft-browser-warns-users

If you get your world view from the pages of the DM, you will perceive virtually any other source as being left of your views.
Yes it's also clearly true that if you pick the training data with a deliberate bias then it's no surprise if the results reflect that. Still, this paper makes no comment on what the source of such bias is, and it's premature to assume that this was intentional.

Question Author
//it's premature to assume that this was intentional.//

I cannot disagree with that.

But we have to start somewhere. Now it is out there, let us see if it gets corrected.

presumably Musk is upset as he thinks it should be taking more notice of stuff on X.
some thoughtful responses here about why this might, or might not, be a problem:

https://www.sciencemediacentre.org/expert-reaction-to-study-measuring-political-bias-in-chatgpt/
The research does not say the AI's answers are untrue; it says they are not right wing enough.

Perhaps it has a bias to truth? :-)
Question Author
//If you get your world view from the pages of the DM//

Grow up.

The study was done by the University of East Anglia. Are you really saying they read the Mail?

Yes jon, Musk previously said it; seems he has been proved right. Any comment on that?
Question Author
//The research does not say the AI's answers are untrue; it says they are not right wing enough.//

No it doesn't, it says they are not neutral and have a left wing bias.

Stop spinning it.
Question Author
What's that got to do with the price of tea, jno?

No one apart from Gromit is citing the DM. Has Sky got problems with ChatGPT as well? And all the other sites?

No surprise the BBC is quiet.
you referred to the Mail yourself. If the Mail objects to ChatGPT using its content, then that reduces the amount of right-wing speech it will ingest.
Question Author
//you referred to the Mail yourself.//

No I didn't, I was answering Gromit and his post.

Stop spinning things.

