This is Everything2's science writing group, existing to encourage, facilitate and organise the writing and discussion of science on this site. Members are usually willing to give feedback on any writing on scientific topics, answer questions and correct mistakes.

The E2_Science joint user is here to make it easier to collectively edit and maintain the various indexes of science topics: Scientists and all its sub-indexes, physics, astronomy, biochemistry and protein. More may follow. It also collects various useful links on its home node.

Note that there is also a separate e^2 usergroup for the discussion of specifically mathematical topics.


Venerable members of this group:

Oolong@+, CapnTrippy, enth, Professor Pi, RainDropUp, Razhumikin, Anark, The Alchemist, tom f, charlie_b, ariels, esapersona, Siobhan, Tiefling, rdude, liveforever, Catchpole, Blush Response, Serjeant's Muse, pimephalis, BaronWR, abiessu, melknia, IWhoSawTheFace, 10998521, sloebertje, getha, siren, pjd, dgrnx, flyingroc, althorrat, elem_125, DoctorX, RPGeek, redbaker, unperson, Iguanaonastick, Taliesin's Muse, Zarkonnen, SharQ, Calast, idan, heppigirl, The Lush, ncc05, Lifix, Akchizar, Palpz, Two Sheds, Gorgonzola, SciPhi, SyntaxVorlon, Redalien, Berek, fallensparks, GunpowderGreen, dichotomyboi, sehrgut, cordyceps, maverickmath, eien_meru, museman, cpt_ahab, mcd, Pandeism Fish, corvus, decoy hunches, Stuart$+, raincomplex, Tem42@
This group of 71 members is led by Oolong@+

Pellagra is a disease resulting from a nutritional deficiency: the lack of niacin, one of the B vitamins. Since pellagra is caused by lack of niacin, which is present in many foods and can be synthesized in the body from the tryptophan in others, only a few types of diet lead to it. The disease was almost unknown to the world before the 1500s, because the most common cause of pellagra is a diet made up almost exclusively of corn (that is, maize or Indian corn -- the plant with the scientific name Zea mays). Since this plant is native to the Western Hemisphere, pellagra was not found in the Eastern Hemisphere until corn had been imported and become a major food crop.

However, pellagra seems to have been quite rare in the Western Hemisphere before the arrival of Europeans, even though corn was one of the major food crops. This seems to be because the other major plant foods, beans and squashes, provided sufficient niacin and also because the methods used to make corn tortillas involved soaking the kernels of corn in alkali or lime and water, processes which release what little niacin is present in corn from the chemical bonds that usually make it unavailable for the body's nutritional needs. The disease remained rare in areas where these foods continued to be the standard diet after European colonization.

However, in places where cornbread, corn mush/polenta, and other corn products became the major food for poor people who could not afford additional foods, pellagra became widespread among the poor. Doctors first recorded the disease in Europe in the early 1700s in northwestern Spain; it was then called "mal de la rosa," or the illness of redness. Certainly the red scaly sores were the first noticeable symptom of the three major ones (later called the three Ds: dermatitis, diarrhea, and dementia). Similar symptoms got different names in other areas, such as "mal del higado" in other parts of Spain and "mal de la teste" in parts of France. "Pellagra" is a regional Italian name for the disease which means "rough skin."

These different names made it difficult for doctors to realize that it was all the same disease, which probably hindered efforts to figure out its cause. French physicians in the 19th century often thought it was something caught from sheep, to which certain types of people had a hereditary susceptibility. Others realized that it only affected those who ate mostly corn, but thought the disease was a result of mold on the corn, in the same way that ergotism is caused by rye infected with the ergot fungus. Additional theories included bad water, excessive use of sea salt, sunstroke (since the skin lesions of the disease tend to recur in spring and summer) and simple heredity. Also, many cases of pellagra went unrecognized, mistaken for some other condition.

Pellagra was not much recorded in North America until the late 1800s, though examination of old records suggests it had been present for at least half a century before. It became widely known as a health problem in the early 1900s, particularly in the South. At the 1909 First National Conference on Pellagra, the fact that the disease seemed to occur mostly in people who subsisted on corn was emphasized, but the speakers seemed to support the idea that moldy corn or some problem with imported corn was the cause -- no one seemed to want to think that ordinary, locally grown corn could cause any problems.

Treatment ideas varied widely. Arsenic was popular for a while, as was Paul Ehrlich's arsenic-related anti-syphilis drug Salvarsan; other drugs administered included strychnine and quinine. And that's just the things that doctors prescribed -- the number of quack remedies was enormous.

Dr. Joseph Goldberger's observations and experiments in 1914-1916 with children in orphanages and volunteer prison inmates showed that pellagra could be cured and prevented by adding other kinds of foods to the diet, that it could not be transmitted from one person to another, and finally that it could be induced in previously healthy people by feeding them the limited diet observed in areas where pellagra was prevalent. (His conclusions took a while to be widely accepted, but are now recognized as the first proof that pellagra is a dietary deficiency disease.) At this time, though, it was still not known exactly what component was lacking in the diet.

A disease called blacktongue found in dogs was discovered to be the canine equivalent of pellagra, and this allowed Goldberger and others to experiment with exactly what foods and what chemicals caused and prevented the disease. Brewer's yeast was discovered to prevent and cure pellagra, but this only worked for pellagrins who could still eat -- those far into the course of the disease had loss of appetite, burning in the throat, nausea, and often delusions, and any of those could interfere with administering a curative by mouth. An extract of liver was found to work if injected in large doses, but this was very expensive and the poor people who got the disease could not usually afford it.

In 1935, Conrad A. Elvehjem and Carl J. Koehm, employees of the Department of Agricultural Chemistry at the University of Wisconsin, isolated two chemicals, nicotinic acid and nicotinamide, both of which cured blacktongue in dogs in a very short time. Nicotinic acid was eventually shown to be the factor needed to prevent the disease, but in 1941 the chemical was renamed "niacin" to avoid confusion with nicotine.

The discovery of the vitamin lacking in the pellagrin diet did not automatically make the disease disappear. However, in the early 1940s, U.S. bakeries and mills started enriching the flour they ground with some of the nutrients the wheat lost in the milling process. Some Southern states started to require the same be done with cornmeal. In 1943 during World War II, enrichment was made mandatory by the U.S. government. After the war, it was no longer mandatory but most bread producers continued to enrich their bread, and this essentially eliminated pellagra in the U.S. except as a side effect of other diseases (alcoholism, anorexia) in which little niacin is eaten or something prevents the proper absorption of niacin.

The same reduction of pellagra has happened in most other industrialized countries. The place it is most often seen now is portions of the Deccan Plateau of India, where the main food source is a type of millet/sorghum which, although it technically contains sufficient niacin, also contains high levels of the amino acid leucine, which interferes with the body's use of niacin. Parts of Egypt and South Africa also had a great deal of pellagra among the poor in the later 20th century.

The first visible symptoms of pellagra are usually the lesions on the skin and mucous membranes. It also causes weakness, weight loss, irritability, depression, and eventually mental confusion and memory loss; pellagra victims often ended up in insane asylums. Without treatment, pellagra leads to death, sometimes by malnutrition, since patients often lose the strength to eat, have mouth lesions that make eating painful, suffer enough intestinal damage that what they do eat cannot be absorbed, and lose nutrients through extreme diarrhea. Other causes of death can be blood loss from intestinal bleeding or the encephalopathic syndrome resulting from the disease's effects on the brain.

There's no chemical test for pellagra; it is diagnosed from the symptoms and from the patient's improvement when given niacin.

Sources:
Roe, Daphne A. A Plague of Corn: A Social History of Pellagra. Ithaca: Cornell University Press, 1973.
http://www.nlm.nih.gov/medlineplus/ency/article/000342.htm
http://www.emedicine.com/ped/topic1755.htm
http://www.mc.vanderbilt.edu/biolib/hc/nh7.html
http://www.healthatoz.com/healthatoz/Atoz/ency/pellagra.jsp
http://www.wrongdiagnosis.com/p/pellagra/symptoms.htm

An order of magnitude is, roughly, a factor of ten. So 1000 is an order of magnitude bigger than 100, which is an order of magnitude bigger than 10, and so on.

Order of magnitude notation (also known as scientific notation) is based on this concept, expressing it compactly in the form x×10^y. For example we can write the speed of light as 3×10^8 m/s, which means 3 with 8 zeros after it - 300,000,000, or three hundred million. This is indispensable in science, where almost every discipline needs to talk about numbers which are much, much bigger than other numbers, and nobody wants to have to write out that an electron weighs 0.000,000,000,000,000,000,000,000,000,000,91 kg when they could just write 9.1×10^-31 kg. The importance of things happening on mind-bogglingly different scales is explored to great effect in the short film Powers of Ten.
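
This notation maps directly onto the floating-point 'e' notation used in most programming languages. A quick Python sketch (the helper function here is mine, not a library one):

```python
import math

def order_of_magnitude(x):
    """The power of ten y in x = m * 10**y, with 1 <= m < 10."""
    return math.floor(math.log10(abs(x)))

c = 3e8        # speed of light, 3 x 10^8 m/s
m_e = 9.1e-31  # electron mass, 9.1 x 10^-31 kg

print(order_of_magnitude(c))    # 8
print(order_of_magnitude(m_e))  # -31
print(f"{300_000_000:.1e}")     # 3.0e+08
```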

Very big and very tiny numbers are always difficult for humans to get their heads round, and writing them down the way we write more familiar numbers is just disorienting - one reason that public discourse about science, finance and anything to do with statistics tends to be massively confused and often misleading. Switching to a logarithmic scale like order of magnitude is an incredibly useful trick for expressing enormous differences in quantity. It is still not altogether intuitive, but at least it makes their representation a tractable problem.

Logarithmic scales are also used on graphs, so that an order of magnitude increase is represented by a constant distance along one of the axes. This can be confusing initially, but it is invaluable when variations are important at both very small and very large scales, and especially when the variables you are interested in change exponentially. In a sense, a logarithmic scale helps us to compare like with like - a difference of 1g is big when you're talking about something that only weighs 10g altogether, but if you add 1g to something that already weighs 1000g, it is likely to be irrelevant. A linear scale obscures this by showing a 1g difference as equal wherever it occurs, whereas a logarithmic scale shows a 10% increase as equal whether it is 1g added to 10g or 100g added to 1000g - it brings out differences on the same order of magnitude as the thing that is changing.
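
The like-with-like point is easy to check numerically - a small illustrative sketch:

```python
import math

# On a logarithmic scale, equal ratios cover equal distances.
# Adding 1g to 10g and adding 100g to 1000g are both 10% increases,
# so both correspond to the same shift along a log axis.
d_small = math.log10(11) - math.log10(10)      # 1g added to 10g
d_large = math.log10(1100) - math.log10(1000)  # 100g added to 1000g
print(round(d_small, 6), round(d_large, 6))    # both 0.041393, i.e. log10(1.1)
```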

As an interesting aside, the human perceptual system also uses the same trick, presumably for the same reason of needing to meaningfully compare amounts that sometimes differ wildly. For example a repeated doubling of brightness, pitch or loudness is perceived as a steady increase, when in fact the rate of change is constantly increasing. This comes out in the way cameras and televisions are designed, the fact that musical scales are divided into octaves, and the way that an increase of ten decibels represents a ten-fold increase in the power of a sound.
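
That last relationship is just the definition of the decibel, a logarithmic unit of power ratio - a minimal sketch:

```python
import math

def decibels(power_ratio):
    """Decibel change corresponding to a ratio of sound powers."""
    return 10 * math.log10(power_ratio)

print(decibels(10))    # 10.0 -> a ten-fold power increase is +10 dB
print(decibels(100))   # 20.0 -> a hundred-fold increase is only +20 dB
print(decibels(2))     # ~3.01 -> doubling the power adds about 3 dB
```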

In science, it is quite often enough to know just the order of magnitude of a number, to get an idea of whether something is important, or a plausible candidate for a result. If you come up with an answer that is a hundred times bigger than you expected, you should probably go back to the drawing board, but if you can show that something is about a million times too small to make a noticeable difference, getting your answer wrong by a factor of four in either direction isn't going to be disastrous. Order of magnitude is often employed as a deliberately fuzzy concept, for this reason - when people talk about numbers being 'on the order of tens of millions', that means that they might well be ten times bigger or smaller, but we probably don't need to worry about it.
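
One way of making that deliberate fuzziness concrete (an illustrative helper, not any standard function):

```python
import math

def same_order(a, b):
    """True if a and b are within a factor of ten of each other."""
    return abs(math.log10(a / b)) < 1

print(same_order(3e7, 5e7))  # True  - both 'on the order of tens of millions'
print(same_order(1e2, 1e6))  # False - four orders of magnitude apart
```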

"I have two small embryos preserved in alcohol, that I forgot to label. At present I am unable to determine the genus to which they belong. They may be lizards, small birds, or even mammals."

Karl von Baer (1828)


The changes undergone by a developing embryo form an hourglass: early on, embryos from different species vary significantly; later they converge to look similar; and finally they diverge again, developing into very different animals. This middle stage - the phylotypic stage - is when animals from different branches of life, whether zebrafish or mice or chickens, all resemble each other.

Why?

Von Baer was a Christian believer, and saw the handiwork of God in the similarities. The elegance of having different species appear similar before sprouting variations suggested a proto-plan that each animal followed before going its own way. The nineteenth-century scientist was right to see a single plan maintained across the various species, but his explanation was too simple.

In late 2010 the science journal Nature published two studies that addressed this problem. Here's what they found:

1. Gene expression divergence recapitulates the developmental hourglass model.

The central dogma of biology says that DNA is the instruction manual: DNA is read off into RNA, which is then made into proteins. All cells have the same DNA, but they use it differently. The DNA that a cell is actively reading is said to be expressed.

Samples were taken from embryos at different time points and measured using microarray technology: the expressed genes from each sample were copied, labeled, and washed over a microarray carrying a pattern of DNA fragments. The microarray was then scanned for the label and the signal quantified, giving a readout of which genes were expressed.

Once they had measured gene expression over time for a single species, it was easy to repeat the procedure for the other species.

With this data the group could ask: how does the difference in gene expression between species change as the animals develop? And the answer: gene expression is very different at the beginning and end of embryonic development, but is most similar in the middle. The group also showed that the genes that best conformed to this hourglass model were genes known to be involved in embryonic development. Thus they could conclude that the phylotypic stage of development could be defined not just on the basis of morphology, but also on the basis of developmental genes' expression.

2. A phylogenetically based transcriptome age index mirrors ontogenetic divergence patterns.

Whereas the above study compared gene expression across species, this study looked at gene expression in a single animal.

This group had previously developed a technique called phylostratigraphy (phylum + stratigraphy). This is a bioinformatic technique that analyzes the genes in a genome and attributes ages to them. It achieves this by determining when a particular gene arose, on the basis of which forms of life have it and which don't. If, for example, a gene is present in both mouse and human, the gene was likely present in the common ancestor of mice and humans. If the gene appears in only one of the two, it likely arose after the lineages diverged.
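
A minimal sketch of that inference, using a hypothetical and drastically simplified set of taxa and strata - the real pipeline works on whole genomes, but at its heart is a presence/absence subset test:

```python
# Phylostrata ordered oldest -> youngest (illustrative labels only)
STRATA = ["cellular life", "animals", "vertebrates", "mammals", "human-specific"]

TAXA_BY_STRATUM = {
    "cellular life":  {"human", "mouse", "zebrafish", "fly", "yeast"},
    "animals":        {"human", "mouse", "zebrafish", "fly"},
    "vertebrates":    {"human", "mouse", "zebrafish"},
    "mammals":        {"human", "mouse"},
    "human-specific": {"human"},
}

def gene_age(present_in):
    """Oldest stratum whose taxa all carry the gene."""
    for stratum in STRATA:
        if TAXA_BY_STRATUM[stratum] <= present_in:  # subset test
            return stratum
    return STRATA[-1]

print(gene_age({"human", "mouse", "zebrafish", "fly", "yeast"}))  # cellular life
print(gene_age({"human", "mouse"}))                               # mammals
print(gene_age({"human"}))                                        # human-specific
```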

Here too, the group took samples at different stages of embryonic development, then analyzed them with microarrays. This told them which genes were expressed at different stages of development. They then fed this data to their phylostratigraphic algorithm.

The output of this was to tell them the evolutionary age of the genes expressed, and how this changes as the animal develops. Here too they found an hourglass pattern: The genes at early and late stages are the most recent, whereas those in the middle are the oldest. In other words, during the phylotypic stage of development, organisms tend towards using their older genes.
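
The index behind this result is, at heart, just an expression-weighted average of gene ages. A toy sketch with invented numbers (in the published index, lower values mean the transcriptome is dominated by older genes):

```python
def transcriptome_age_index(ages, expression):
    """Expression-weighted mean gene age (age rank 1 = most ancient)."""
    return sum(a * e for a, e in zip(ages, expression)) / sum(expression)

ages = [1, 1, 3, 5]  # hypothetical age rank per gene: 1 = ancient, 5 = recent

early = transcriptome_age_index(ages, [1, 1, 4, 4])  # young genes dominate
mid   = transcriptome_age_index(ages, [4, 4, 1, 1])  # old genes dominate
print(early, mid)  # 3.4 1.6 - the lower (older) value marks the phylotypic stage
```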

What does it all mean?

To restate, the two papers showed that, as compared to other stages, the genes expressed during the phylotypic stage:

  • Display higher similarity between species
  • Display older ancestry

This provides a genetic and evolutionary basis for the morphological similarities observed by old Karl von Baer, but it also revives another old question: why? There are two viable, popular, and somewhat overlapping hypotheses that could explain this phenomenon. Both assume that there is some feature so necessary to development that it is resistant to evolutionary change.

The first hypothesis predicts that the feature that is resistant to change in the phylotypic stage is the interconnectedness of the signaling networks. The second hypothesis predicts that the resistant feature is the connection between growth and patterning, as evidenced by the primordial Hox genes.

The first hypothesis interprets the phylotypic stage as being the last time in development when everything depends on everything else. Later development is significantly modular, so that for example each limb develops with reference to itself, and the effect of mutations may be limited to affecting discrete pathways. By depending on an increased level of pathway interconnectivity, the phylotypic stage tests and ensures the viability of the entire network.

The second hypothesis interprets the phylotypic stage as being the informational cornerstone of development. The formation of complex lifeforms depend on the construction of a form-neutral skeletal pattern, onto which specific instructions for form may be interpreted. This requirement for a base code onto which any and all instructions can be linked is provided during the phylotypic stage.


Whichever hypothesis is true, the phylotypic stage remains representative of one of evolution's quirks: great success is irreversible. Certain evolutionary products were only won once: think ribosomes and amino acids, think eukaryotes' dependence on mitochondria, and so forth. This is why, if only in potential, intelligent design and starting from scratch might be the best answer.


References (all from Nature 468, 2010): "Evolutionary biology: Genomic hourglass"; "Gene expression divergence recapitulates the developmental hourglass model"; "A phylogenetically based transcriptome age index mirrors ontogenetic divergence patterns".

Giardia lamblia is very unusual in evolutionary terms. It seems to be an intermediate between prokaryotes and more typical eukaryotic life. Specifically, it is a eukaryote, but does not have any mitochondria. Modern classification would make it a separate kingdom within the Eucarya, parallel to the common ancestor of all the other kingdoms.

Until very recently it was also thought to have neither endoplasmic reticulum nor Golgi apparatus. These have now been detected. This discovery confirms a theory that the cell nucleus evolved in the same evolutionary event as the endomembranes. All creatures that have one have the other. The Golgi apparatus is present throughout the life cycle but in functionally different forms, which had earlier prevented its identification.

It and some other protists lack mitochondria. This would normally be taken to mean that they are from a line of early eukaryotes that diverged before our ancestors fused with symbiotic bacteria (perhaps 1000 million years ago), as suggested in the endosymbiotic hypothesis. But Bohdan Soltys and Radhey Gupta, the scientists who found the endoplasmic reticulum in 1996, also report traces of a mitochondrial protein called hsp60, which may mean that the precursors of Giardia had mitochondria and lost them.

This protozoon has a trophozoite phase, one that adheres to the columnar cells of the villi of the intestine. It has four pairs of flagella and a ventral sucking disk for attaching to the surface of the villus. This form has two nuclei. In the cyst phase, found in the faeces, it is a roundish body with four nuclei. Presumably the cyst is the form it takes to leave one host and spread through water for others to pick up.

The diarrhoeic disease it causes is called giardiasis, or colloquially beaver fever, because it is often caught in waters where such animals have been excreting.

Giardia were among the "animalcules" observed by van Leeuwenhoek in the 1600s.

It is named for the French zoologist Alfred Giard (1846-1908), who was a Neo-Lamarckian, a professor at the Sorbonne, and the founder of the marine laboratory at Wimereux. (Some sources call him Belgian, but he was born in Valenciennes, in northern France.)