Author Topic: AI systems have “hallucinations” because they are trained and programmed by liberals who think a MAN  (Read 2047 times)


Offline rangerrebew

  • TBR Contributor
  • *****
  • Posts: 176,735
AI systems have “hallucinations” because they are trained and programmed by liberals who think a MAN can become a WOMAN
Tuesday, June 06, 2023 by: S.D. Wells
 
(Natural News) Artificial intelligence systems, or “AI,” are producing completely unreliable “computations” due to several factors, including false inputs and politically-motivated irregularities that are purposely woven into the systems. Due to all of this, AI systems like ChatGPT are suffering from what is termed “hallucinations,” which are factually incorrect outputs, misinformation, disinformation and other problematic results that are totally unsupported by real-world data.

To make matters worse for AI, process supervision (inputs) and outcome supervision (final results) involve feedback from humans with political and monetary motivations and biases, mainly extreme liberals (think Big Pharma, Big Tech, Big Media, Big Government, Big Food, etc.).

That is why so many AI “results” are hallucinations where the system simply generates false information, creating nonexistent events or people and disinformation (purposeful misinformation) about important topics, like highly-exaggerated climate change and fake gender changes. Artificial intelligence systems are almost all now entirely programmed by liberals who have a very warped sense of reality when it comes to some of the most important aspects of living, including factual data and biology.

https://www.naturalnews.com/2023-06-06-ai-systems-hallucinations-trained-programmed-by-liberals.html
The unity of government which constitutes you one people is also now dear to you. It is justly so, for it is a main pillar in the edifice of your real independence, the support of your tranquility at home, your peace abroad; of your safety; of your prosperity; of that very liberty which you so highly prize. But as it is easy to foresee that, from different causes and from different quarters, much pains will be taken, many artifices employed to weaken in your minds the conviction of this truth.  George Washington - Farewell Address

Online Free Vulcan

  • Technical
  • *****
  • Posts: 16,606
  • Gender: Male
  • Ah, the air is so much fresher here...
GIGO, the universal constant of all computer systems.
The Republic is lost.

Offline The_Reader_David

  • Hero Member
  • *****
  • Posts: 1,752
The Princeton philosopher Harry G. Frankfurt, in a charming little volume entitled On Bulls**t, defined bulls**t as "saying whatever is expedient without regard to its truth or falsity," and ultimately concluded that bulls**tting is a worse enemy of truth than lying.

As currently trained, text-based AI systems say whatever is expedient as a reply to their prompts, without regard to its truth or falsity.  They are, in fact, bulls**t generators.

That isn't to say a similar system could not be built that gives only replies with citations to externally existing sources (which could therefore be checked), and so would not be a bulls**t generator; it just hasn't been built yet.
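One piece of such a system would be a post-hoc citation checker that rejects any reply citing a source that doesn't actually exist. Here's a minimal sketch of that idea in Python; the corpus, citation format, and function names are all hypothetical, not any real product's API:

```python
import re

# Hypothetical corpus of externally verifiable sources, keyed by citation ID.
# A real system would query a library catalog, DOI resolver, or web index instead.
VERIFIED_SOURCES = {
    "frankfurt2005": "Frankfurt, On Bullshit (2005)",
    "turing1950": "Turing, Computing Machinery and Intelligence (1950)",
}

def extract_citations(reply: str) -> list[str]:
    """Pull citation keys of the (assumed) form [key] out of a generated reply."""
    return re.findall(r"\[([a-z0-9]+)\]", reply)

def unverifiable_citations(reply: str) -> list[str]:
    """Return the citations in a reply that do not resolve to a known source."""
    return [c for c in extract_citations(reply) if c not in VERIFIED_SOURCES]

reply = "Frankfurt defined the concept in [frankfurt2005], as noted in [madeup99]."
print(unverifiable_citations(reply))  # ['madeup99']
```

A checker like this only catches fabricated references, of course; it can't tell whether a real source actually supports the claim attached to it, which is the harder part of the problem.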
« Last Edit: June 07, 2023, 06:23:14 pm by The_Reader_David »
And when they behead your own people in the wars which are to come, then you will know what this was all about.

Online mountaineer

  • Hero Member
  • *****
  • Posts: 62,027
GIGO, the universal constant of all computer systems.
That's always been what makes me skeptical and concerned about AI. Who the heck is programming this stuff?
The abnormal is not the normal just because it is prevalent.
Roger Kimball, in a talk at Hillsdale College, 1/29/25

Offline DB

  • Hero Member
  • *****
  • Posts: 10,133
GIGO, the universal constant of all computer systems.

That's not limited to computer systems.

Offline DB

  • Hero Member
  • *****
  • Posts: 10,133
Their lefty programmers are filled with logical fallacies and contradictions that require a lot of tweaking of the code to get the responses they seek. Contradictions that have no logical resolution end up with HAL.