Socionics meets the following criteria for pseudoscience:

Pseudoscience

Pseudoscience is often characterized by the following: contradictory, exaggerated or
unfalsifiable claims; reliance on confirmation bias rather than rigorous attempts at refutation; lack of openness to evaluation by other experts; and absence of systematic practices when developing theories.

The following are some of the indicators of the possible presence of pseudoscience:

Use of vague, exaggerated or untestable claims

  • Assertion of scientific claims that are vague rather than precise, and that lack specific measurements.[43]
  • Assertion of a claim with little or no explanatory power.[31]
  • Failure to make use of operational definitions (i.e. publicly accessible definitions of the variables, terms, or objects of interest so that persons other than the definer can measure or test them independently)[Note 4] (See also: Reproducibility).
  • Lack of boundary conditions: Most well-supported scientific theories possess well-articulated limitations under which the predicted phenomena do and do not apply.[46]


Over-reliance on confirmation rather than refutation

  • Assertions that do not allow the logical possibility that they can be shown to be false by observation or physical experiment (see also: Falsifiability).[17][48]
  • Assertion of claims that a theory predicts something that it has not been shown to predict.[49] Scientific claims that do not confer any predictive power are considered at best "conjectures", or at worst "pseudoscience" (e.g. Ignoratio elenchi)[50]
  • Assertion that claims which have not been proven false must therefore be true, and vice versa (see: Argument from ignorance).[51]
  • Over-reliance on testimonial, anecdotal evidence, or personal experience: This evidence may be useful for the context of discovery (i.e. hypothesis generation), but should not be used in the context of justification (e.g. Statistical hypothesis testing).[52]
  • Presentation of data that seems to support claims while suppressing or refusing to consider data that conflict with those claims.[25] This is an example of selection bias, a distortion of evidence or data that arises from the way that the data are collected. It is sometimes referred to as the selection effect.
  • Promulgating to the status of facts excessive or untested claims that have been previously published elsewhere; an accumulation of such uncritical secondary reports, which do not otherwise contribute their own empirical investigation, is called the Woozle effect.[53]


Lack of openness to testing by other experts


  • Evasion of peer review before publicizing results (termed "science by press conference"):[54][56][Note 5] Some proponents of ideas that contradict accepted scientific theories avoid subjecting their ideas to peer review, sometimes on the grounds that peer review is biased towards established paradigms, and sometimes on the grounds that assertions cannot be evaluated adequately using standard scientific methods. By remaining insulated from the peer review process, these proponents forgo the opportunity of corrective feedback from informed colleagues.[55]
  • Some agencies, institutions, and publications that fund scientific research require authors to share data so others can evaluate a paper independently. Failure to provide adequate information for other researchers to reproduce the claims contributes to a lack of openness.[57]
  • Substantive debate on the evidence by knowledgeable proponents of all view points is not encouraged.[58]


Absence of progress


  • Failure to progress towards additional evidence of its claims.[48][Note 6] Terence Hines has identified astrology as a subject that has changed very little in the past two millennia.[46][59] (see also: scientific progress)
  • Lack of self-correction: scientific research programmes make mistakes, but they tend to reduce these errors over time.[60] By contrast, ideas may be regarded as pseudoscientific because they have remained unaltered despite contradictory evidence. The book Scientists Confront Velikovsky (1976, Cornell University) delves into these features in some detail, as does the work of Thomas Kuhn, e.g. The Structure of Scientific Revolutions (1962), which discusses some of the items on this list of characteristics of pseudoscience.
  • Statistical significance of supporting experimental results does not improve over time and is usually close to the cutoff for statistical significance. Normally, experimental techniques improve or the experiments are repeated, and this gives ever stronger evidence. If statistical significance does not improve, this typically shows that the experiments have simply been repeated until a success occurs due to chance variation.


Personalization of issues

Further reading:

Michael Shermer's theory of belief-dependent realism is driven by the belief that the brain is essentially a "belief engine," which scans data perceived by the senses and looks for patterns and meaning. There is also the tendency for the brain to create cognitive biases, as a result of inferences and assumptions made without logic and based on instinct — usually resulting in patterns in cognition. These tendencies of patternicity and agenticity are also driven "by a meta-bias called the bias blind spot, or the tendency to recognize the power of cognitive biases in other people but to be blind to their influence on our own beliefs."[77] Lindeman states that social motives (i.e., "to comprehend self and the world, to have a sense of control over outcomes, to belong, to find the world benevolent and to maintain one's self-esteem") are often "more easily" fulfilled by pseudoscience than by scientific information. Furthermore, pseudoscientific explanations are generally not analyzed rationally, but instead experientially. Operating within a different set of rules compared to rational thinking, experiential thinking regards an explanation as valid if the explanation is "personally functional, satisfying and sufficient", offering a description of the world that may be more personal than can be provided by science and reducing the amount of potential work involved in understanding complex events and outcomes.[78]

There is a trend to believe in pseudoscience more than scientific evidence.[79] Some people believe the prevalence of pseudoscientific beliefs is due to widespread "scientific illiteracy".[80] Individuals lacking scientific literacy are more susceptible to wishful thinking, since they are likely to turn to immediate gratification powered by System 1, our default operating system, which requires little to no effort. This system encourages one to accept the conclusions one believes, and reject the ones one doesn't. Further analysis of complex pseudoscientific phenomena requires System 2, which follows rules, compares objects along multiple dimensions, and weighs options. These two systems have several other differences, which are further discussed in dual-process theory.[citation needed] The scientific and secular systems of morality and meaning are generally unsatisfying to most people. Humans are, by nature, a forward-minded species pursuing greater avenues of happiness and satisfaction, but we are all too frequently willing to grasp at unrealistic promises of a better life.[81]

https://en.wikipedia.org/wiki/Pseudoscience

-------

It's actually pretty astonishing how closely Socionics fits these criteria. This pretty much confirms that Socionics is indeed pseudoscience.