Eliezer typical-minding the world (see the April Fools post)
not enough instrumental ideas
None of the above are actually bad.
I haven't noticed a serious problem with any of these.
Not using terms already defined by other fields of research (reinventing the wheel)
Hard to explain briefly, but I think the understanding that is lacking is pretty well summed up in the quote "There are more things in heaven and earth, Horatio, than are dreamt of in your philosophy."
Seemed unwelcoming to outsiders.
Lack of a big blinking sign saying "start here, stupid"
Sequences are too long for someone starting out
Too much handwaving, not enough rigour
I never noticed any big problems
Too much eschewing of work done by experts not in LessWrong
complete ignorance and rejection of established philosophy of ethics and epistemology (but not decision theory)
None of the above.
Intense political association with US-style libertarianism
Focus on interpretation of quantum mechanics
Not enough emphasis on lifestyle and application.
Much apparent philosophical convergence was due to accepting positions which seem natural to those with the particular cognitive style common to Less Wrongers, not because the arguments for those positions were good.
too much invention of terms that already have established equivalents in some other discipline, so that people less familiar with LW have to find the LW version of terms they already know
Over-eagerness to draw practical conclusions from theoretical arguments.
Bayesianism as a dogma and unifying paradigm, when it isn't even computable over interesting, i.e. unbounded, domains.
EY lacking understanding of normies
Inappropriate attempts to apply "meta" and math
Too much of a personality cult
Most people on LessWrong weren't/aren't very good at presenting their thoughts in a socially effective manner, i.e. being able to get others to listen.
Too excited to jump down the throat of anyone who said something that could be seen as disagreeing with the sequences
UTILITARIANISM? REALLY? MOSQUITO NETS?
The core information about key claims was (and remains) too spread out over the sequences. We desperately need an 80/20'd version of AI-to-Zombies.
Over-reliance on nerd culture
Too self-centered - for example, making up in-group jargon for something that already has a technical term
LessWrong failed to acknowledge it reinvented the wheel, which I can only attribute to a willful ignorance of the history of philosophy. And then it went off the deep end by writing Harry Potter fanfiction.
People aren't comfortable with more casual interaction.
Too much focus on a specific sort of overly abstract analysis of AI, which ignores practical issues that make AI safety both more difficult and less important.
Eliezer's overconfidence about Many Worlds, and overconfidence in general
Personality cult of Yudkowsky
Lack of norms around what was useful discussion
Vague thinking on many-worlds (way better than Copenhagen of course, but information-theoretic still wins out imho)
Too dismissive of correction and criticism from beyond the community's orthodoxy
overly reverential attitude to Yudkowsky resulting in difficulty in rationally assessing the quality of his work and arguments.
Too many threads veering into fringe politics.
Way too big for its britches; rampant overconfidence and self absorption despite not being especially clever or insightful. Very poor engagement with existing literature. What was good wasn't new, what was new wasn't good.
Narrow-minded worldview, acceptance of one viewpoint without knowledge of alternatives; blanket disregard of most of philosophy
It's considered to have "peaked"?
The "point" of LW seems to be "bullshitting about rationality-adjacent topics" and it accomplishes that pretty well.
Projections of confidence/arrogance that attracted more people who projected confidence/arrogance. The original confidence/arrogance may or may not have been warranted, but subsequent levels of it often weren't.
Too much not-invented-here syndrome
Some members tend not to be very rational in their commenting, actively outgrouping members.
Failure to actually address foundational issues when they were brought up
Thinking that if you've read the sequences then you're more rational than other people, and so trusting your instinct more and listening to others less, ironically making you less rational.
Thought that we were smarter than everybody else, thus: Neglecting the Virtue of Scholarship, dismissing dissenting thought too quickly
Usual drift into a social club
Too much arrogance on Eliezer's part, not enough thinking outside his box / meta-thinking about what the community might be missing by largely being formed of INTJ quantifier etc types.
Obsession with cognitive biases and "rationality is about System 2 overriding System 1".
It wasn't the pure crystallized epiphany-juice that the Sequences were.
Failure to encourage/generate high quality content outside of the core authors
Taking EY seriously
Lack of focus on AI
Failure to understand the motivations of non-LW people; the straw man often applied to cultural/philosophical/religious communities outside LW.
Reinventing the wheel -- A lot of Less Wrong concepts were already concepts in academia outside Less Wrong
too harsh for newcomers; people legitimately trying to learn by asking 'valid' questions got burned and downvoted
I think there was a little too much focus on EY the person/author rather than the arguments; many of his arguments were good, but even his poor arguments were privileged a little too much
Human minds aren't capable of the analysis necessary to make the sort of broad claims that LW (or any ideological group) does, and so the group ends up taking a lot of ideological shortcuts that look like the shortcuts of any political group.
Intimidating, hard to fit in
Antipathy towards domain experts
Split in focus of the community between practical life improvement and philosophizing about Future AI Gods.
failure to understand that Not Taking Ideas Seriously has societal value in general/for many people
Rationalism doesn't work. You won't get better results in life by applying this method.
Not enough skepticism for the standard Progressive cultural beliefs.
Conflation of epistemic rationality with instrumental rationality.
Sequences were too tightly intertwined
"Worship" of effective gods in the community (i.e. Eliezer), which could have been circumvented by full anonymity
Focus on AI without any domain expertise
too much emphasis on what essentially amounted to "self-help"
Bad philosophy and history of science
Cesspool comment sections - they were a weird mix of insightful, logically-sound discussion and useless, didn't-read-the-article ranting.
Epiphany Addiction Shots
Extreme Self Congratulatory Tone. EXTREME. UGH. Sorry. Wait, is this a "Philosophical Issue"? Seems to fit the category about as much as "Too Much Jargon."
Too often overconfidently incorrect.
Too much reliance on Yudkowsky's sequences. There was a void left when he stopped writing.
Many Worlds as a litmus test for rationality.
Subtle stuff I don't feel like writing an issue about right now.
False dichotomy between rational discussion and personally liking or being friends with someone
Too focused on one person
Logical chains getting too long.
Weird treatment of feminism/women's stuff kept me from taking it too seriously
Not enough people contributing relative to consuming - needed to be more of a place of purpose, and without that most of the people who actually make things went elsewhere
Too focused on expanding community
Made it hard for good content to be generated and iterated on because people were reluctant to post
Excessively shallow, S-type thought processes.
If someone isn't practicing, then they're an armchair scholar, and they don't matter. Most Lesswrongers are armchair scholars (I think)
Not enough focus on actual interpersonal social lifestyle issues
Focus on Yudkowsky and his pet interests instead of rationality in general.
Had no problems at its peak
Failure to incorporate more topics for conversation, such as Literature, the History of Philosophy, and metaphysics.
No problems at peak.
crappy web technology
Came across as too insular
It's a sort of strange place. Like the Louvre it is a palace of wonders, but accessible to only those with a degree in Fine Arts and where all the staff are carrying machine-guns.
Seems to worship intelligence.
General dismissal of expertise as a valid thing by certain community leaders
Too closely associated with/uncritical of Eliezer Yudkowsky
I am unaware of any problems
People trying to drive out minority viewpoints
Lack of results, best summarized by Yvain's "Extreme Rationality: It's Not That Great" post.
Ignorant of / divorced from mainstream philosophy
Too confident that LessWrongers understood things better than outsiders
Insistence on utilitarianism as a simple mathematical calculation.
Mostly seemed to be a collection of smug jerks engaging in mental masturbation about how smart they were compared to others, rather than providing useful or interesting insights.
I feel that the decline started when "all the easy stuff had been done". The simple fact of the matter is that the sequences cover pretty much all of the interesting and important problems, and the vast bulk of recent posts have just been philosophical masturbation.
Not enough focus on intentionally building skills.
Too weird for most people.
Too many Epiphany Addiction baits
Not practical enough
It is turning into a cult
Too many of the arguments were cargo-cultish and not very good.
Focus on cryonics/AI is OK, but it wasn't organized into subsections enough to sort them.
Too scattershot, not enough cumulative progress on developing useful thinking skills.
Not enough iteration on jargon.
Insufficient effort put into establishing a canon beyond the sequences/a condensed form of a philosophical/decision-theoretic theory of everything; Bad formatting/layout: insightful comments take too much effort to find, insightful posts take too much effort to find
Too distracted by self-help and identity. Abstract junk was always LessWrong's strength.
No complaints really. LW at its peak was pretty great.
Overconfidence in specific viewpoints of scientific understanding. Specifically QM.
I don't think anyone rightly criticises the idea of mind uploading.
Insufficient discounting of long inferential chains, and even considering this to be a virtue
Not so effective EA
A lack of focus on cultural questions
Too much nit-picky in-fighting and navel-gazing as opposed to action.
Calling Less Wrong a cult of personality around Eliezer Yudkowsky is a defensible position.
Focus on effective altruism. While I agree with the premises and am going to participate, I'm not really interested in reading posts on it.
I believe LW focused too much on theory and long-term applications. There was a great deal of time spent on fantasies of beisutsukai, but very little advice for what we can do to improve right now. We need concrete steps now, otherwise we will not attain the hypothesized far-future awesomeness.
Too accepting of bullshit feelings and random people not feeling 'welcome' in the community.
Added to my neuroticism about self-value
"I am a member of this community, therefore my thinking is lightyears above everyone else, and whatever comes out of my mouth is superior to actual research."
Although I think these are the "biggest" philosophical problems, I don't consider them particularly large. They're the least small of our small problems.
I don't really think anything was wrong with it
People on Less Wrong argue about a lot of stuff. Sometimes I think a "LessWronger" is actually someone who criticizes LessWrong all the time. LessWrong is fine! It doesn't really have problems other than the tendency to attach problems to itself.
Focus on Eliezer's fundraising
Disorganized giant web of links and dependencies made it hard for newcomers
Uncritical hero worship by too many new members
Needs more instrumental-minded actions and posts
It's scary and abrasive and has culty vibes and I can't link other people to it
People underestimated the difficulty of writing good blog posts, even for a community of like-minded people. Just because you know a lot about AI/psych/stats/CS doesn't mean you can write a good blog post, especially in a reasonable amount of time.
Not sure what its "peak" was. Obviously it peaked at some point, but I'm not sure when that was.
Not enough focus on practical, non-extreme things we can do to help the world
Mistrust in academia
Eliezer does not know shit about quantum mechanics while pretending he does
Not enough of the nameless virtue
LW's core concerns around decision theory and AI were ones that only a few people could contribute to meaningfully. The rest of us could only lurk and nod along in vigorous agreement at our computer screens. The bit that interested everybody, the biases, rationality and self-improvement bits, were great, but not as focused. We learned a lot, but those issues are not things that will keep a community together forever. People will move on to more specific projects, which is exactly what happened.
The metaethics sequence was too unfocused and inconclusive.
Too confident that they know better than outsiders
Mild bias against Blue Tribe people with preference for Grey Tribe.
Elitism (dismissive/disvaluing of time spent with people or wisdom learned from people who weren't 120+ IQ protogeniuses).
Obsession with the Bayesian view of statistics to the exclusion of other alternatives.
There didn't seem to be an in-between place between those most invested in the community and those who were still in-progress with the sequences. The community on the site felt inaccessible.
(Slightly) reinventing the wheel
People not being able to SELF-apply the discourse ideals and then attempting to talk about topics above our sanity waterline, like politics and sex and sexual politics.
You need to link Facebook/Reddit more. Alternatively, I would create one board, like Hacker News, which has all the good stuff every day. Your main issue is discoverability.
The Sequences really need to be redone with A/B testing and maybe YouTube videos. Exercises and questions would be good.
Tackling hard problems in a public forum? That sounds pretty doomed to fail.
This is like the second time I've ever been to the site, so I don't know.
Dismissal of related work done in relevant mainstream fields, including philosophy and computer science
Coalesced around autodidacts with little formal training in math, which leads to a lot of verbiage and not a lot of rigor, e.g., Eliezer's comments about whether randomness is ever necessary for an algorithm.
A tendency to hyperfocus on the ridiculousness of Roko's Basilisk stopped most people (inside and outside of LessWrong) from thinking about more-plausible acausal-trade-based ideas.
To be fair, I wasn't around for the peak; however, I'd cite the aforementioned Basilisk. It suddenly presents a sort of 'Rational Devil' to a superintelligent AI's 'Rational Messiah'. It's silly and mildly off-putting. Thankfully the information on the site, especially the Sequences, is far too useful to just toss aside.
Normalizing behaviors that cement the community's low status from the perspective of the rest of society. This is a self-reinforcing participation filter.
Insufficient recruitment of academics
Widespread usage of some bullshit elements of the right-wing political culture, like "Maslow's hierarchy of needs", "Overton windows", Chesterton quotes, and the worship of awful people like Eric S. Raymond and Robin Hanson
Too little awareness of existing work/broader context; "NIH syndrome" and LW exceptionalism
Don't know when it was best.
I got the sense that certain pet explanatory frameworks (e.g. Bayesian probability) motivated the substance of, and positions taken by the core LW teachings, and not the other way around; meanwhile invocations of other frameworks (e.g. computational complexity theory) were absent or lacking. This is probably mostly a matter of knowledge gaps, though.
Very broad but shallow knowledge - LessWrong writers (and commenters) enjoy trotting out complex features of philosophy and science and citing poorly explained summaries so as to prove their points by hand-waving
The linking/interconnectedness was overwhelming going in for the first time, almost to the level of TVTropes.
I cannot say, I was not aware of this site back then.
Too much ingroup signaling
Loss of purpose/drive
Too much frequentism bashing from people who didn't know shit about what it is
Sometimes people who weren't Eliezer or Scott would make posts
Too much talk. Not enough action. / Too much focus on epistemic rationality and navel-gazing, not enough practical advice and communities.
Besides creating some (probably very fun) communities in nerd-dense areas in the US, LessWrong doesn't seem to have done enough to actually help people be more productive. As a matter of fact, it acts (and acted) as a productivity sink for people who feel like reading it helps, but who are just spending time with little benefit.