It is a great example of "objectivism": on the surface it seems neutral and rational, but it is actually shaped by intense motivations that corrupt the content. This is particularly true for anything related to politics, history, nation states, celebrity tech bros, and so on. An average article on "the history of bicycles" is not a problem and not what I am referring to, ofc.
For a different reason, it is also host to a wide range of superficial treatments of scientific and medical information, which is not ideal. See Scholarpedia for a better alternative for this kind of information (although it's not well populated).
For example, the page on adult neurogenesis relied heavily on early, limited evidence throughout the 2010s, with frequent edit wars among academics. It was eventually reframed as "controversial", which is better, but the process for handling new scientific results is not ideal.
Academics with diverging views simply edited Wikipedia to suit their positions, and the whole article was preliminary and inaccurate for a long time. Then it was shunted into a "controversial" section. I intentionally picked a mild example lol.
We should have access to a Borges/Thomas Pynchon-style dictionary of many possible paths to information. That kind of thinking should be encouraged, even if it can be confusing and controversial.
- I would avoid giving the impression that a simplified narrative is established (unless it won a Nobel or is really beyond any reasonable doubt).
- End the use of "controversy" sections where they are used to minimise the impact of perfectly acceptable science. This is clearly a tactic deployed in certain instances to downplay results.
- Allow for mixed expert, AI, and conventional editing. There is nothing wrong or elitist about including articles edited by academic experts as cut-outs of a particular topic, almost as if Scholarpedia and Wikipedia were merged.
- Try to combat omission biases. There are some pretty wild ones on Wikipedia.
- New results should be handled properly, with appropriate editorial care and provisional language.
- Produce multiple versions of the same text to show how it would read if certain hypotheses or experimental results held up. This is easy to do with LLMs.
Definitely not! It will be better on certain topics and worse on others. It all comes down to whether you think Grok is more thorough with controversial topics, scientific and medical information, etc., than the current human edit-a-thon on Wikipedia.
A Wikipedia AI could be the most balanced, ofc, with suitable changes and by actually taking major criticisms on board. One such criticism is omission bias, where very important information is simply left out of articles. Another is the lack of comparison between conflicting narratives (history, politics, science, etc.).
What's your opinion on adult neurogenesis? Do you think the progression of that article suggests that Wikipedia is the model to follow for scientific information?
What about important and controversial historical events? Do you think it is acceptable for Wikipedia to omit information?
Do you think that other models would insert "controversy" before making statements as a dark pattern to encourage investigation fatigue?
All of the above are also possible in an Elonpedia-style situation or with Grok. I never said it would be absolutely BETTER.
What topics, specifically, will Grokpedia be better on? Which race has the lowest IQs? Whether trans people are mentally ill and should be committed? Whether a CEO should be able to run 5 different companies while on ketamine? Was Hitler really all that bad?
> It will be better on certain topics and worse on others.
This sounds like a "both sides" kind of statement, and I don't think it fits immediately after you acknowledge that you don't expect Musk to be unbiased.