Webster 1913 Edition


Organicism

Or-gan′i-cism, Noun. (Med.)
The doctrine of the localization of disease, or which refers it always to a material lesion of an organ.
Dunglison.

Definition 2024


organicism

English

Noun

organicism (countable and uncountable, plural organicisms)

  1. (philosophy) The treatment of society or the universe as if it were an organism.
  2. The theory that the total organization of an organism is more important than the functioning of its individual organs.
  3. (dated, medicine) The theory that disease is a result of structural alteration of organs.
