The New York Times articles “Ancient Man in Greenland Has Genome Decoded” (http://www.nytimes.com/2010/02/11/science/11genome.html?emc=eta1) and “Scientists Decode Genomes of Five Africans, Including Archbishop Tutu” (http://www.nytimes.com/2010/02/18/science/18genome.html?hpw) were interesting to me. The cost of sequencing individual genomes has dropped enough to make it possible to investigate individuals anthropologically. Questions once impossible to answer now have clear answers, and new questions are raised. It’s easy to imagine genome decoding becoming a reasonable possibility for the average person in the not-too-distant future.
Genes provide the starting material for every person by encoding mRNA that is translated into proteins that make up our cells, tissues and organs. Sequencing genes gives a baseline for each person. How tall could they be? Do they have a high risk of heart disease or cancer? Will they be able to metabolize certain medications? These questions are easily of interest to many people—and their doctors. Instead of wondering if a patient has inherited a familial risk of breast cancer, a doctor can look for markers that give information on the patient’s genetic risk. The patient can then be more proactive if the risks are there, or relax if they are not.
There are, as always, a few caveats. Genes do not give an absolute answer but a baseline, as mentioned earlier. Genetically you may be expected to top 6 feet, but if you have poor nutrition in your childhood that potential may not be reached. Likewise, you may not inherit the familial risk for breast cancer, but diet, environment or other factors may increase your risk again. And, of course, all this depends on whether we know what a segment of DNA does. The whole human genome is sequenced but not decoded: we don’t know what every gene does or how it is regulated. And even when all genes are assigned functions we still need to investigate how they all work together. People are complex!
So what’s all the hoopla about if genome decoding doesn’t answer any questions? Well, it does answer some questions. Years ago if a doctor wanted to administer warfarin, a blood thinner, he would assess the patient’s age, gender, weight and other physical attributes and pick a dosing range to try. This was potentially dangerous, but better than leaving a patient untreated. Now a simple genetic test can give additional information: is this person, in the best of health, able to metabolize warfarin or not? This one piece of information has greatly reduced the risk associated with treatment by providing a more complete picture of the patient, allowing faster treatment with less risk. Just as the genome of the ancient man allowed investigators to see his face for the first time, genome decoding in medicine can allow doctors to get a big picture of a patient without relying solely on specific diagnostic tests or general inferences based on age, ethnicity and basic health. So many illnesses have the same or overlapping symptoms that any indication of what is happening in an individual could make a world of difference. Diagnoses would occur sooner, after fewer tests (lower cost!) and with greater confidence.
This means a lot to anyone who’s been through a grueling diagnosis period but it isn’t what makes the news. I would personally love to know more about my ancestors. Where did my family originate? Where did my hazel eyes come from? My migraines? My laugh? As we uncover all that the genome encodes I can only see this field getting more and more exciting. What is genetic and what is environmental? All the “It’s in my DNA to eat this brownie” comments I’ve long suffered as a biologist could finally be settled. Is it in our DNA to crave certain foods to meet nutritional needs? That seems likely. Is it in our DNA to buy fabulous shoes? Perhaps that is less likely to be substantiated (I’ll have to find another excuse). Our understanding of ourselves will be broadened, impacting everything from casual comments in sitcoms to medical care and disease diagnosis.

I sometimes joke that I got ADHD in college, like a flu going around the dorm (CDC Diagnostic Criteria, Rosenberg, 1998, Angier, 1991, Kluger, 2005, Gilman, 2005). I became easily distracted, fidgety, sometimes forgetful or inattentive of simple, daily activities and occasionally very restless. Most likely, since the symptoms were few in number and not significantly impairing, I was simply noticing the effects of a high-stress, academically challenging environment navigated on little sleep. Although attention-deficit/hyperactivity disorder is typically a childhood condition, the symptoms may last into adulthood; for some people symptoms remain unnoticed until they are placed in a challenging environment with sudden independence, like college. Furthermore, some people with ADHD are not diagnosed until even later in life: retirement. The structured, scheduled work life they led for decades is taken away, exposing for the first time symptoms they later realize were themes throughout their lives. For many people the impetus to see a doctor about these symptoms was a diagnosis of ADHD in their grandchild, and then their child, as ADHD seems to have a genetic component.

According to the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) there are three types of ADHD (CDC Diagnostic Criteria, CDC What is ADHD, Angier, 1991, Angier, 1994, Moffitt, 2007). One, the predominantly inattentive type, comprises symptoms such as carelessness, distraction, forgetfulness, trouble organizing, following instructions or completing tasks that require attention or mental effort for long periods of time, and an inability to keep track of objects. Two, the predominantly hyperactive type, comprises symptoms such as trouble waiting one’s turn or keeping still, frequent interrupting and excessive talking, restlessness and difficulty remaining quiet. And three, the combined type, applies if criteria are met for both the inattentive and hyperactive types. Interestingly, the discrepancy between American and European diagnostic statistics is likely due to a difference in diagnostic practice. European children typically need to meet the criteria for type three, the combined type, before being diagnosed. When the same standards are applied to subjects the rates of diagnosis between the continents equalize, although there are still pockets of higher or lower diagnosis due to the awareness and diligence of parents and treating physicians. Although some tests, such as those for eye movement and coordination, can be more sensitive, the predominantly observational method of diagnosis illustrates the inherent subjectivity and the need for a biological method of diagnosis. Strep throat can be diagnosed definitively by a bacterial culture from a throat swab, but ADHD has no such concrete method. Personal and cultural standards of proper behavior and impairing behavior color diagnosis and even acceptance of the condition as a disability. For example, many Asian cultures turn a blind eye to mental disorders such as ADHD and instead provide harsher discipline to the afflicted child. On the other hand, the US has seen a trend of overdiagnosis, particularly in boys, because characteristic “boys will be boys” behavior is so similar to ADHD symptoms.

ADHD affects approximately 3-7% of children in the United States (three-quarters of whom are boys) (CDC Research Agenda, CDC What is ADHD, CDC A Public Health Perspective, CDC Injuries and ADHD, CDC Peer Relationships and ADHD, CDC Other Conditions Associated with ADHD, Hersey, 1996, Angier, 1991, Angier, 1994, Wallis, 2006, Kluger, 2005, Kluger, 2003, Harding, 2003, das Neves, 2006, Eisenberg, 2007). About half of these children have another behavioral or learning disorder that can complicate diagnosis and treatment. The ADHD child may have trouble making friends, be accident prone, and have trouble in school despite adequate intelligence. The ADHD child may also have more difficulty identifying dangerous situations, like crossing the street, and may need to have distractions such as music or TV removed in order to do homework. Adults with ADHD may have trouble concentrating while driving or keeping a job, and may be more susceptible to drug addiction. On top of this, families must decide with their physician whether or not to put the child on medication (2.5 million children and 1.5 million adults are medicated for ADHD), rely on behavioral therapy and counseling alone, or substitute alternative treatments like the Feingold diet (artificial additives are removed and certain minerals, amino acids and other supplements are added) for medication. Oddly, a good night’s sleep may be enough to alleviate ADHD symptoms and help the child stay off medication. Approximately twice as many children with ADHD have sleeping disorders, compared to children without ADHD. Poor sleep not only exacerbates ADHD symptoms but also presents some of the same symptoms, much as I saw in college, that could be misdiagnosed as ADHD.

It is unlikely that the apparent rise in ADHD diagnoses is due to a new cause of ADHD (or lax parenting); more likely it reflects better diagnosis and establishment of the condition itself (Eisenberg, 2007, Hersey, 1996, Angier, 1994, Harding, 2003). The major debates surrounding ADHD, is it overdiagnosed and is it overmedicated, are based on one fact: the cause of ADHD is unknown. With no physical explanation for ADHD, no absolute diagnostic criteria can be established and no mechanism of treatment can be explained. The current treatment with stimulants, such as Ritalin, is based on seven decades of observed behavioral change but little else. Similarly, the Feingold diet is based on observed behavioral changes and has been shown by some studies to be as effective as Ritalin in some children, although no cellular explanation can be given for either treatment. Both are based on the theory that brain cells are not signaling properly, and that the addition of a stimulant (Ritalin) or the removal of preservatives and artificial additives together with specific supplements (the Feingold diet) restores correct brain function or bypasses the missing signaling component. Until a definitive cause of ADHD, and a resulting treatment, are found there will continue to be underdiagnosed and overdiagnosed (and overmedicated) children, simply because of differing standards of behavior and unruliness and the limited spread of more sensitive tests.

CDC ADHD Diagnostic Criteria: http://www.cdc.gov/ncbddd/adhd/symptom.htm
CDC ADHD Research Agenda: http://www.cdc.gov/ncbddd/adhd/dadagenda.htm
CDC What is ADHD?: http://www.cdc.gov/ncbddd/adhd/what.htm
CDC ADHD A Public Health Perspective: http://www.cdc.gov/ncbddd/adhd/publichealth.htm
CDC Injuries and ADHD: http://www.cdc.gov/ncbddd/adhd/injury.htm
CDC ADHD Peer Relationships and ADHD: http://www.cdc.gov/ncbddd/adhd/peer.htm
CDC ADHD Other Conditions Associated with ADHD: http://www.cdc.gov/ncbddd/adhd/otherconditions.htm
Hersey, Jane. Diets and Drugs for Disruptive Children. The New York Times. 1996.
Rosenberg, Merri. Strategies to Manage a Disorder. The New York Times. 1998.
Angier, Natalie. Kids Who Can’t Sit Still. The New York Times. 1991.
Angier, Natalie. The Nation; The Debilitating Malady Called Boyhood. The New York Times. 1994.
Wallis, Claudia. Getting Hyper about Ritalin. Time Magazine. 2006.
Kluger, Jeffrey. Medicating Young Minds. Time Magazine. 2003.
Kluger, Jeffrey. Sleep Deprivation and ADHD. Time Magazine. 2005.
Gilman, Lois. All Jumbled Up. Time Magazine. 2005.
Moffitt, Terrie E. and Maria Melchior. Why does the worldwide prevalence of childhood attention deficit hyperactivity disorder matter? American Journal of Psychiatry. Vol. 164, No. 6. 2007.
Harding, Karen L. et al. Outcome-based comparison of Ritalin versus food-supplement treated children with ADHD. Alternative Medicine Review. Vol. 8, No. 3. 2003.
das Neves, Sergio and Rubens Reimao. Sleep disturbances in 50 children with attention-deficit hyperactivity disorder. Arquivos de Neuro-Psiquiatria. Vol. 65, No. 2-A. 228-233. 2007.
Eisenberg, Leon. Commentary with a historical perspective by a child psychiatrist: When “ADHD” was the “brain-damaged child.” Journal of Child and Adolescent Psychopharmacology. Vol. 17, No. 3. 2007.

Many Americans may know the name Louis Pasteur because of the notice of pasteurization on their milk, but few may know that pasteurization was first applied to wine and beer. Fewer still may realize the impact Pasteur had on germ theory, the idea that specific diseases are caused by specific microorganisms, which is the central tenet of today’s medicine (1-5).

Louis Pasteur was born in France in 1822. By the age of 32 Pasteur had earned a doctorate in chemistry and a faculty position at the University of Lille. The faculty were asked to use their expertise to solve practical problems, and when the father of one of Pasteur’s students, a local distiller, asked for help, Pasteur jumped at the opportunity. At the time it was understood that sugar fermentation produced alcohol, but fermentation was thought to be a purely chemical process, and the yeast identified in beer and wine merely a product or catalytic agent of it. The problem posed to Pasteur was to find the cause of sour wine and beer. Pasteur confirmed the presence of yeast in these cultures, but also of bacteria. Using his expertise on crystal structures, Pasteur was able to determine that the yeast were responsible for the fermentation process and that the bacteria present caused the souring of the alcohol. The solution was simple: boil the liquid for a few minutes and then add a pure culture of yeast to start the fermentation process (1-5).

Not only did this finding save the wine industry from seemingly sporadic losses of batches, but it proved to Pasteur that the theory of spontaneous generation was wrong. Since ancient times it had been believed that insects and (later) microbes arose from rotting matter as a byproduct. It was clear to Pasteur that microbes were present in the air and on the surface of grapes in the vineyards, and thus present in distilling cultures. In an exploratory experiment Pasteur found that city air grew more cultures than air from high altitudes. A simple, yet elegant, experiment followed that proved germ theory’s dominance over spontaneous generation. Pasteur sterilized a fermentable liquid in a flask. The neck of the flask was heated, bent into an s-shape and sealed. The solution remained sterile. After the neck tip was broken off the solution still remained sterile, as the particles in the air were trapped on the beads of water sticking to the glass neck. Only when the flask was tipped to allow the solution to contact the trapped particles in the neck did a culture grow (1-5).

The surgeon Joseph Lister, in England, embraced Pasteur’s findings and sterilized the equipment and air in his operating room. This dramatically reduced fatalities and led Lister to champion germ theory as well. And yet most scientists clung to spontaneous generation until the 1870s and 80s (1-5).

Pasteur continued his research, identifying the diseases of silkworms and the conditions that contributed to them, saving the French silk industry. He also used Robert Koch’s discovery of the spores that cause anthrax to develop a vaccine for farm animals. A public challenge in 1881 by the veterinarian Rossignol was a success, with Pasteur showing that 25 vaccinated sheep survived anthrax infection while 25 unvaccinated sheep perished. The circus-like atmosphere only helped the case of germ theory; dispatches were sent around Europe detailing his success. The vaccine not only reduced the mortality rate of livestock to 1% but also saved France an estimated 7 million francs over the next 10 years (1-5).

Pasteur’s swan song a few years later was developing a vaccine for rabies. Initial experiments were in rabbits and dogs and involved drying and mincing infected animals’ spinal cords and injecting the material, on a graduated scale of least to most potent, under the skin of healthy animals over several days. At the end of the trial the treated animals were immune to rabies. The news traveled quickly, and in 1885 Pasteur found himself in an ethical quandary. A young boy, Joseph Meister, had been bitten and mauled by a rabid dog. Although Pasteur had had great success with animal vaccination, he was not completely confident in the vaccine’s use for humans. The boy’s mother pleaded and Pasteur began the vaccine trial. Meister recovered completely (1-5).

This success led to the establishment of the famed Pasteur Institute in Paris, initially a hospital to treat human rabies (1-5). Pasteur died in 1895, having suffered a series of strokes since his 40s. He was given a state funeral, having contributed immeasurably to the health and industries of France and pioneered germ theory. Joseph Meister grew up to work as a gatekeeper at the Pasteur Institute (5). In 1940, when Nazi invaders ordered Meister to open Pasteur’s crypt, he refused and, tragically, committed suicide.

1. http://www.bbc.co.uk/historic_figures/pasteur_louis.shtml
2. http://louisville.edu/library/ekstrom/special/pasteur/cohn.html
3. http://encarta.msn.com/text_761568595___0/Louis_Pasteur.html
4. http://www.fordham.edu/halsall/mod/1878pasteur-germ.html
5. http://elane.stanford.edu/wilson/Text/5f.html

Undoubtedly one of the greatest scientific advancements of mankind occurred in 1953, when Francis Crick and James Watson discovered that deoxyribonucleic acid, or DNA, is a double helix (Crick Papers, Franklin Papers). Last month, James Watson received the entire sequence of his genome. How is it possible that a mere 50 years separate the era when proteins were theorized to be the keepers of hereditary information from the era of personal genome sequencing? The comparison is not unlike the technological advancement from the first brief flight taken by the Wright brothers to the moon landing over 60 years later. Arguably, the impact of determining the structure of DNA is even greater than that of space travel.

In 1953, although Oswald Avery had shown 9 years earlier that DNA carries hereditary information in bacteria, most scientists believed proteins had to carry this information (Crick Papers, Franklin Papers). Proteins are composed of up to 20 different amino acids while DNA is composed of only 4 nucleotide bases. Proteins could be a variety of sizes, shapes and compositions while DNA seemed to be too simple. The structure of DNA was not known, although several top scientists, including Linus Pauling, a two-time Nobel laureate, were competing for that discovery. Nor was it known how proteins could retain hereditary information and transfer that information during cell division. Watson and Crick combined their knowledge of genetics, chemistry, X-ray crystallography and biochemistry and set out to solve the structure of DNA without manipulating any actual DNA. Pauling had earlier discovered that proteins can form alpha helices, leaving helical forms in the ether as a structural possibility. Indeed, Pauling and Watson and Crick all developed three-stranded helical models of DNA, although Watson and Crick revised theirs before publishing. Previously, Alexander Todd had found that the DNA backbone was composed of alternating phosphates and deoxyribose sugars. Separately, Erwin Chargaff had found that the bases adenine (A) and thymine (T) were always present in equal proportion in DNA, as were the bases guanine (G) and cytosine (C). The final piece of the puzzle, X-ray crystallography done primarily by Rosalind Franklin with the assistance of Maurice Wilkins, was given to Watson and Crick by Wilkins without Franklin’s knowledge and showed Crick that DNA had a helical structure much like a corkscrew. With a tip from visiting colleague Jerry Donohue that the commonly accepted structures of thymine and guanine were actually incorrect, Watson was able to assemble the first model of DNA. An A paired with a T was about the same size as a G paired with a C. This not only paired the DNA strands in a double helix but put the bases on the inside of the helix. To fit appropriately, Crick theorized, one strand would have to run antiparallel to the other. Crick’s wife sketched the first diagram:

[Image: crick-helix.jpg, the first sketch of the double helix]

The implications of this finding are extraordinary and were outlined in a paper published a month after the initial announcement of DNA’s structure (Crick Papers, Franklin Papers). DNA is able to copy itself exactly, never losing the starting information, because each strand can serve as a template for the other. In a dividing cell, each of the two new cells would have one strand of the original DNA paired with a new strand of DNA. No hereditary information would be lost in cell division, since A and T are always paired, as are G and C.
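As a toy illustration of that templating logic, here is a short Python sketch (my own, not anything from the papers); each strand fully determines its antiparallel partner, and complementing twice returns the original strand:

# Watson-Crick pairing: A<->T, G<->C.
PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def partner_strand(strand):
    # The new strand runs antiparallel, so read the template in reverse.
    return "".join(PAIR[base] for base in reversed(strand))

template = "ATGCGT"
copy = partner_strand(template)
print(copy)                  # ACGCAT
print(partner_strand(copy))  # ATGCGT: the original information is preserved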

Although Watson and Crick did not do their own DNA manipulations, they were able to create a model by piecing together data that had seemed insignificant or unrelated (Crick Papers, Franklin Papers). Although the scientific climate at the time led them, like most men, to be dismissive of Franklin, Watson and Crick have since admitted fault for their attitude. Franklin herself held no grudge against them, since they were unaware of Wilkins’ deceit. Watson, Crick and Wilkins later received a Nobel Prize for this work, after the community accepted that DNA was the carrier of genetic information. Franklin, however, had died and was not eligible for the Nobel Prize, which rested largely on her data.

How does Watson feel about this year’s advancement? “I am thrilled to see my own genome,” he recently told The New York Times (Wade, The New York Times). He has opted not to know the status of apolipoprotein E, a gene known to contribute to Alzheimer’s disease, however. That choice is the epitome of personal genome sequencing: instead of talking to genetic counselors about the risk of some forms of heart disease or cancer, for instance, a person could find out the status of genes known to contribute to those diseases and choose to be more or less vigilant about diet and exercise. Additionally, potential parents could learn if they are carriers for diseases like cystic fibrosis and hemophilia before conceiving.

Is this actually possible for real people right now (Harmon, The New York Times)? Not quite. Watson’s entire genome was sequenced at a cost of $1 million. Currently, Stephen Hawking, Larry King and others are having their genomes sequenced for a tenth of that cost. The goal is a complete genome for $1,000 within a few years; today, $1,000 buys the sequence of the most useful 1% of the genome. Scientists contend that the more people who have a complete genome sequenced, the more information will be available to compare ancestry, health, personal success, appearance and preferences with genes and gene combinations. Does everyone who likes the color blue have a gene dictating so? Likely not. However, risk-takers may have similar gene sets. As may mathematicians or actors. More importantly, collections of personal genomes along with health histories can help researchers pin down disease-causing genes.

Decoding the entire genome and taking advantage of gene therapy, a process that hopes to switch a faulty gene for a correct, or wild type, one in specific organs, may be far off, however (Human Genome Project Information). The last 50 years have brought us the discovery of unique genes, of ‘junk’ DNA, and of the promoters, or regulatory areas, of genes (necessary so pancreatic cells can respond to glucose to produce insulin and neurons can respond to acetylcholine to fire a synapse, or so a pancreatic cell can function as such and not as a neuron [since both cells contain a complete genome]). Genes can be spliced (or stuck) into vectors and popped into cells in culture to see how their sudden presence changes the cells’ behavior (a technique used frequently in cancer research). We now know how many genes are regulated. Global regulation of the genome, such as methylation, has been discovered. Methylation binds DNA tightly to prevent gene expression and is frequently decreased overall in tumors to allow unregulated gene expression, although tumor suppressor genes are frequently methylated more to allow tumor progression. Chromosomes can be stained to determine if breakage and rejoining has occurred. And the amplification of specific regions of DNA is possible. This technique, polymerase chain reaction (PCR), is indispensable in modern molecular biology. Paternity tests and forensics are the most widely known applications of PCR, but researchers can use PCR to determine if a gene is present or deleted, if expression of the gene is increased or decreased, and if a person has a specific mutation. Today we know that the human genome contains about 30,000 genes, half of which have unknown functions. We know that only 2% of the genome codes for genes and that about 50% is ‘junk’ or structural regions. We know now that DNA can copy itself or serve as a template for mRNA, and that mRNA can then leave the nucleus of the cell to direct protein production. We know that our genome has many similarities to those of flies and plants, but that the regulation of our genome is more complex and allows for our unique existence.
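To get a sense of why PCR is so indispensable, here is a minimal sketch assuming ideal doubling of the target region each cycle (real reactions only approximate this):

# PCR ideally doubles the number of copies of the target DNA each cycle.
copies = 1               # start from a single template molecule
for cycle in range(30):  # a typical run is roughly 25-35 cycles
    copies *= 2
print(copies)            # 2**30 = 1,073,741,824, about a billion copies

This is why even the trace amounts of DNA left at a crime scene can, in principle, be amplified into enough material to analyze.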

Finding the structure of DNA has led to an information boon that has revolutionized, really created, modern molecular biology (Harmon, The New York Times; Wade, The New York Times). Sequencing the genome has made genetic work much easier, and comparative genomics, the comparison of thousands of genomes complete with biographical information, could give us answers to why some people are shy and some people are outgoing. The argument of nature versus nurture may finally be put to rest, although I suspect science will find it is as suspected all along: nature and nurture both contribute to our personalities, much as diet and genes both affect the risk of heart disease or cancer.

The Francis Crick Papers, National Library of Medicine. http://profiles.nlm.nih.gov/SC/Views/Exhibit/narrative/doublehelix.html
The Rosalind Franklin Papers, National Library of Medicine. http://profiles.nlm.nih.gov/KR/Views/Exhibit/narrative/dna.html
Wade, Nicholas. “Genome of DNA Discoverer is Deciphered.” The New York Times. June 1, 2007.
Harmon, Amy. “6 Billion Bits of Data About Me, Me, Me!” The New York Times. June 3, 2007.
“The Science Behind the Human Genome Project.” Human Genome Project Information. http://www.ornl.gov/sci/techresources/Human_Genome/project/info.shtml

Although published observations of autistic behavior date back to the 18th century, it was not until 1943 that the disorder was named (CDC). At that time Dr. Leo Kanner conducted a study of 11 children, noting “autistic disturbances of affective contact”. The term autistic had been coined about 30 years before to describe a condition marked by “a tendency to view life in terms of one’s own needs and desires” (Random House Unabridged Dictionary). Around the same time Dr. Hans Asperger completed a study of 400 children noting similar behavior (CDC). The result was the classification of Autistic Spectrum Disorders (ASDs), including autistic disorder, Asperger’s Syndrome, and pervasive developmental disorder-not otherwise specified. ASDs affect approximately 1 in 150 children across all races, ethnicities and socioeconomic groups equally (CDC, Beaudet, Maimburg, Cantor, Nagarajan). Autism is four times more likely to affect males than females.

Autism is a developmental disability that is not related to intellectual capacity (CDC, Beaudet, Maimburg, Cantor, Nagarajan). Children affected with autism have difficulty with social interaction, communication, relating to and understanding feelings and sensations, and paying attention. Autistic children may also have different ways of learning and may accomplish harder tasks, such as multiplication or reading words, before easier tasks such as number identification or letter pronunciation. Children with autism may be uncomfortable being touched and prefer solitude, undeviating routines and repetitive behaviors. Difficulty identifying the appropriate feelings for a situation, personal space, body language and tone of voice are also symptoms common in ASDs. Verbal skills range from no language skills to relatively normal language skills, although a person with an ASD may not recognize the natural ebb and flow of a conversation and may stick to one personal topic for long periods of time. ASD symptoms are limited to social interaction and communication skills, although they may be present along with another disorder such as mental retardation, epilepsy, Fragile X syndrome, tuberous sclerosis, congenital rubella syndrome or untreated phenylketonuria. Asperger’s syndrome is usually differentiated as a mild version of autism, although the range of abilities and spectrum of symptoms in autistic disorder can be quite broad.

Since the symptoms of ASDs are so wide-ranging it is important for infants and toddlers to complete standard screening tests (CDC). One-third to one-half of parents of autistic children notice symptoms by the child’s first birthday, with 80-90% noticing symptoms by the second birthday. The child may not have any difficulty with walking or other motor skills and may show an ability to complete puzzles or other intellectual activities on par with their age group, but they may not be able to play pretend or focus on objects or people. The child may also stop gaining skills, or lose skills, as a toddler. Early screening tests will help to identify those symptoms that may suggest autism (most children are diagnosed by age 4-5) in time to start behavioral therapies, currently the only treatment for ASDs. Some doctors may suggest talking to a nutritionist about diet changes that can help control symptoms and behavior. Similarly, some benefits have been seen with massage therapy, homeopathy, dance, or meditation. It is important to discuss alternative therapies with a doctor to prevent malnutrition or potentially harmful therapies. Currently, about one-third of autistic patients receive an alternative therapy, although up to 10% of those therapies may be harmful. Medication may also be given to control hyperactive energy levels, depression, seizures, attention deficiencies, or self-injurious behavior that frequently occur with autism or linked disorders.

ASDs are described as highly heritable although the genetic basis is unknown (CDC, Beaudet, Nagarajan). Simply put, a person with an autistic disorder may pass the condition to their child but may not have inherited the disease from a parent. De novo, or new, mutations occur in a number of genes that have been shown to be linked to autism (about 10-20% of cases). These mutations may be caused by environmental factors, although those are largely unknown. Another way of looking at ASD inheritance is through twin studies. With identical twins, one twin affected with an ASD means the other has a 75% chance of being affected. A fraternal twin, however, has only a 3% chance of being affected if their twin is affected. Similarly, if a family has one child with an ASD there is a 2-8% chance of a second pregnancy producing a child with autism. Finding a genetic cause can be complicated. The cause may or may not be inherited. It may or may not be the result of a genetic mutation or an environmental exposure in the womb (thalidomide, for example). What other factors can contribute to ASDs if genetics and environment do not account for all the cases? Some cases can be explained by a different kind of change to DNA, what is called an epigenetic change. Genetic changes to DNA are changes to the sequence of DNA by deletion, duplication or base-pair change. Epigenetic changes cover the way DNA is packaged in a cell’s nucleus. Each cell contains about 6 feet of DNA, clearly too much to fit in a cell without a highly regulated packaging method (Hypertextbook). (Since the body has about 10 trillion cells, the total DNA in every person could make about 70 trips to the sun and back.) One method is called methylation and basically winds the DNA so tightly that the genes cannot be expressed (CDC, Beaudet, Nagarajan). Methylation is common in cancer cells, where most genes are hypomethylated, or turned on, while tumor suppressor genes (the brakes on tumor development) are hypermethylated. What this means for autism and other psychiatric disorders is that some genes the brain cells need for normal function are perfectly normal but locked away. One such gene can cause enough disruption to the brain to cause autism, schizophrenia or other psychiatric disorders. Altered methylation patterns have been linked to autistic patients with older fathers (Cantor) and to some cases of assisted reproduction, such as IVF (Maimburg), although one study shows a decreased chance of autism in children born after assisted conception, even accounting for cases where assisted reproduction altered methylation. While it is not always known how methylation patterns in brain cells are altered, the change is thought to occur early in development. Diagnostic tests are in development to identify these patients (Nagarajan). Likewise, drugs to undo methylation are in development and may be used to treat this subset of patients.
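As a back-of-the-envelope check on that parenthetical, here is the arithmetic in Python (the per-cell DNA length, cell count and Earth-sun distance are round-number assumptions of mine, not figures from the sources above):

# Rough numbers behind the "70 trips to the sun" figure.
dna_per_cell_m = 2.0    # about 6 feet of DNA per cell, in meters
cells_per_body = 1e13   # about 10 trillion cells
earth_sun_m = 1.496e11  # average Earth-sun distance in meters

total_dna_m = dna_per_cell_m * cells_per_body  # ~2e13 m of DNA per person
print(total_dna_m / (2 * earth_sun_m))         # ~67 round trips, close to the ~70 quoted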

Not all cases of autism can be explained and no one explanation covers all cases. The broad range of symptoms and abilities in autistic patients is reflected in the many cellular causes of autism. Currently the best treatment for children with an ASD is an early diagnosis and behavioral intervention with medical treatment for those symptoms that require it.

Centers for Disease Control, http://www.cdc.gov/ncbddd/autism
Random House Unabridged Dictionary, Random House, Inc, 2006
Beaudet, Arthur L. Autism: highly heritable but not inherited. Nature Medicine. Vol. 13 No. 5, p. 534-6, 2007
Maimburg, Rikke D. and M. Vaeth. Do children born after assisted conception have less risk of developing infantile autism? European Society of Human Reproduction and Embryology. p. 1-3, 2007
Cantor, RM. et al. Paternal age and autism are associated in a family-based sample. Molecular Psychiatry. Vol. 12. p. 419-423, 2007
Nagarajan, R.P. et al. Reduced MeCP2 expression is frequent in autism frontal cortex and correlates with aberrant MECP2 promoter methylation. Epigenetics. Vol. 1, No. 4, p. 172-182, 2006.
Hypertextbook, http://hypertextbook.com/facts/1998/StevenChen.shtml

Most people do not think of the industrial revolution when they think of the breast feeding-formula feeding debates, but it is arguably the reason the discussion exists today (National Academy of Sciences, foodtimeline.org, wikipedia.com). In the early 19th century breast feeding was the norm around the world. Women who had difficulties with breast feeding found wet nurses for their infants or created homemade formulas. The 1845 advent of the rubber nipple was the first step toward the formula-feeding boom of the 20th century. Enterprising budding industrialists introduced the first manufactured formulas in 1867, advertising them as equal or superior to breast milk (though later generations would repudiate these claims). Yet formula and the bottle would have remained the recourse of women who could not nurse or find a wet nurse had not two other major changes taken place. Perhaps the most critical change was in the minds of newly industrialized countries: the discovery and acceptance of germ theory. The desire to avoid potentially tainted milk combined with medicine’s endorsement of formula to change the tide against breast feeding. Yet more than half of the infants in the United States were still breast fed until the late 1920s and early 1930s, when refrigeration became more common. By the 1950s breast feeding was summarily dethroned as the preferred method of infant feeding with the rise of commercialized formulas.

Breast feeding provides infants with more than energy (Fomon, Anderson et al.). Proteins, vitamins and minerals are used by the infant to grow and mature. Antibodies, growth factors and enzymes provided by the mother aid immunity, growth and development until the infant’s own systems mature. Lipids and neurotransmitters aid in brain development while other components of breast milk sharpen vision. Formulas have developed remarkably in the last 150 years, but they do not provide all the beneficial components of breast milk, nor do they necessarily provide them in a form usable by the infant; while the infant’s systems are developing, some components of breast milk rely on other components for activation or digestion. Furthermore, while some doctors may prescribe, or at least be aware of a potential need for, vitamin K and iron for breast-fed infants (likely dependent on the mother’s own levels), most components of breast milk are present in the ratios needed by the infant at that stage of development, ratios that are often difficult to reproduce with formula ingredients.

The superiority of breast milk to formula is advocated by doctors today. Debates exist, however, on the duration and exclusivity of breast feeding. The World Health Organization (WHO) and American Academy of Pediatrics (AAP) both recommend exclusive breast feeding for the first 6 months (except in rare circumstances, discussed below). Exclusive means not only without formula supplementation but also without supplementation with other milks, liquids or solid foods. At 6 months of age solid foods can be introduced and fed along with breast milk until at least age 12 months (AAP) or 24 months (WHO). Neither organization makes overt recommendations on when to stop breast feeding, i.e., when breast milk no longer provides what solid foods and the toddler’s body cannot, but simply recommends continuing as long as both mother and child desire it. Mothers in developing countries may find extra benefit in prolonged breast feeding due to the high risk of contaminated water and food, which would more harshly affect toddlers than older children and adults with more mature organs.

Certainly some women today may find, as they have throughout history, extreme difficulty in nursing. For these women, adoptive parents, and others, the formula options are quite good compared to earlier forms (Owen, Fomon, Kramer et al., Anderson et al.). Nutrients have been made more bioavailable and mineral contaminants such as lead have been removed. Developed countries see little difference in formula-fed infants compared with breast-fed infants in the long run. Certainly breast-fed infants are ill less frequently, and less seriously when they are ill (mostly applicable to respiratory and gastrointestinal disease), while being breast fed. They may also have a slight advantage in mental acuity (one study suggests an average of 3 points on an IQ test) and visual acuity in the long term. Although formula-fed infants are by no means doomed to a life of poor health and average achievement, formula is simply not a complete replacement for breast milk. Advantages to formula feeding over breast feeding exist for particular situations as well. For women who are on drugs, chemotherapy or radiation, are infected with HIV or TB, or have infants with a rare genetic disorder that prevents the digestion of breast milk, the AAP recommends formula over breast milk. The WHO has similar recommendations but does point out that while HIV can be transmitted by breast feeding (from a 5-20% transmission rate for some breast feeding to a 30-45% transmission rate for 18-24 months of breast feeding), other immunological (including HIV) and nutritional benefits from breast milk will be lost. In the United States and other developed countries, as stated above, choosing formula may not present a long-term observable effect, but in developing countries it could mean gambling with the child’s life as much as the risk of HIV infection does.

Breast feeding is best for mother and child (Ruowei et al., Fomon, Anderson et al., Hediger et al.). It is the natural process both bodies desire and should be promoted over formula feeding in almost all cases in the US. Multiple studies have shown, however, that only approximately 71% of infants in the US are ever breast fed, and only approximately 14% of those infants are breast fed exclusively for 6 months. The percentage of mothers who nurse has been increasing since the late 1960s. Most women who breast feed exclusively are white, at least 25 years of age, well-educated and living in the western United States. Mothers who are black, young and without a high school diploma are the least likely to breast feed. While infants in low socioeconomic groups could benefit greatly from breast feeding over formula feeding, the education and resources to breast feed, especially in single-parent homes, are lacking.

The emotional, biosocial, nutritional and developmental benefits of breast feeding over formula feeding are proven (Ruowei et al., Fomon). What remains is to provide viable options for women and infants for whom breast feeding is not an option, and resources for working mothers to breast feed their infants for at least the first 6 months to a year. Widespread education and cultural shifts may be required, as they were to make the shift to formula in the 19th and 20th centuries (Montagu).

National Academy of Sciences. Evaluating the Safety of New Ingredients, 2004.
WHO:  http://www.who.int/topics/breastfeeding/en/
 http://www.who.int/nutrition/publications/HIV_IF_guide_for_healthcare.pdf
AAP: http://aappolicy.aappublications.org/cgi/content/full/pediatrics%3b100/6/1035
 http://pediatrics.aappublications.org/cgi/content/abstract/112/5/1196
Foodtimeline.org
Wikipedia.com
Ruowei, L. et al. Changes in Public Attitudes toward Breastfeeding in the United States, 1999-2003. Journal of the American Dietetic Association. Vol. 107, No. 1, 2007.
Montagu, M. Nature, Nurture and Nutrition. The American Journal of Clinical Nutrition. Vol. 5, No. 3, 1957.
Forman, M. et al. Exclusive breastfeeding of newborns among married women in the United States: the National Natality Surveys of 1969 and 1980. The American Journal of Clinical Nutrition. Vol. 42, 864-869, 1985.
Owen, G. Interaction of the infant formula industry with the academic community. The American Journal of Clinical Nutrition. Vol. 46, 221-225, 1987.
Fomon, S. Reflections on infant feeding in the 1970s and 1980s. The American Journal of Clinical Nutrition. Vol. 46, 171-182, 1987.
Anderson, J. et al. Breast-feeding and cognitive development: a meta-analysis. The American Journal of Clinical Nutrition. Vol. 70, 525-535, 1999.
Hediger, M. et al. Early infant feeding and growth status of US-born infants and children aged 4-71 mo: analyses from the third National Health and Nutrition Examination Survey, 1988-1994. The American Journal of Clinical Nutrition. Vol. 72, 159-167, 2000.
Kramer, M. et al. Infant growth and health outcomes associated with 3 compared with 6 mo of exclusive breastfeeding. The American Journal of Clinical Nutrition. Vol. 78, 291-295, 2003.

Thinking back, I seem to remember that the sequencing of the human genome slightly predates the obsession with DNA in pop culture. I’m not thinking of the actual scientific work related to DNA that pops up in newspapers every day but of the trend to use DNA as a scapegoat. Statements like: I can’t help that I’m a shopaholic, it’s in my DNA. As someone who actually knows a thing or two about DNA, I find this tends to make my eye twitch. The human genome may be sequenced, but not all of our genes have been identified. Even so, I can say with fair certainty that there has not been sufficient evolutionary time for a shoe gene to develop.

I can ignore these statements and take them as they’re meant to be: a statement of a trait that the person perceives to be innate and unchangeable. The real problem I find is in movies and TV. Misrepresentation in film is by no means limited to biology, but it does pull me out of the moment. A second after the shudder runs through my body I wonder who actually learns these mistakes. I mean, I learned the definition of a score from the recitation of the Gettysburg Address in Bill and Ted’s Excellent Adventure, so someone could believe that genes can skip generations or disappear completely as claimed in 28 Weeks Later, right? I know what they were getting at: that gene *expression* might be recessive and not shown in the offspring, or that the gene may have been mutated or lost in an individual’s somatic cells and not their gametes (sex cells), much like the mutations that cause a melanoma after too many summers at the beach, and hence not heritable.

I don’t mean to pick on one film; it is certainly not the only, or even the most blatant, offender. Red Planet, with Val Kilmer and Benjamin Bratt, had several references to nematodes. Unfortunately for them, nematodes are a type of worm. The critters in question were not. Clearly a checkable fact. CSI: Miami pushes the boundaries of believability at times, all the more dangerous given its otherwise fairly accurate portrayal of the biology of forensics. As an example, in one episode epithelial skin cells were removed from rough upholstery on a car seat and used to identify an assailant. While it is technically possible to get enough DNA from a few cells to identify a person, it is difficult. Moreover, the outermost skin cells are dead and do not contain a nucleus. The chances that the cells that remained on the upholstery after the car crashed were from a deeper, nucleated skin layer are, well, the kind of odds Hollywood makes good on.

The fact is that errors are everywhere in movies and the only people who usually care are those who know they’re mistakes. Errors may happen even after fact checking (I think I’d forgive the incorrect phase of the moon during the moon landing in Apollo 13) but films that are not specifically relying on a fantasy element should make sure that their basic facts are sound. Have all the bubbling beakers you want if that’s your aesthetic but gravity doesn’t let up on Tuesday afternoons, Amelia Earhart wasn’t born in 1987 and *individuals* don’t evolve. Just ask Wikipedia.

I suppose I should take the stance of some historians on the movie Troy: while factually inaccurate in many ways, the movie was viscerally correct and, more importantly, inspired many college students to register for history courses. I just don’t think 28 Weeks Later is going to cause a surge in students registering for virology.

The greatest accomplishment in the history of public health initiatives, or so your dentist will tell you, was the addition of fluoride to public water supplies—a small amount of fluoride in drinking water dramatically reduces the incidence of cavities. The reasons for this success are simple: Fluoride-treated water is inexpensive and requires no effort on the part of the public, save to drink tap water.

Likewise, Papanicolaou (Pap) screening has greatly reduced the mortality rate attributed to cervical cancer in the US and the industrialized world (Hanna et al., Calloway et al., Hymel). Although Pap screening is fairly expensive, it is less expensive than treating cervical cancer. Patients whose cervical cancer is caught at an early stage through Pap screening have a 100% survival rate over 5 years. In the US approximately 12,000 women a year are diagnosed with cervical cancer, with about 4,000 women succumbing to the disease. The benefits of Pap screening are put into stark clarity, however, when worldwide figures are considered. As the second-leading cause of cancer death among women worldwide, cervical cancer affects about 500,000 women a year, half of whom may die of the disease. The lack of regular, or any, access to Pap screening and other medical treatment in the developing world accounts for this disparity.

Nearly all cases of cervical cancer result from a chronic human papilloma virus (HPV) infection. HPV strains 16 and 18 cause 50-60% and 10-20% of cervical cancers, respectively, with 5 other strains making up most of the remaining 30% of cases (Hanna et al., Hymel). Additionally, strains 6 and 11 are known to cause genital warts. Altogether approximately 40 HPV strains, or genotypes, affect genital tissue, with cervical cancer being the most common resulting cancer. Statistics vary on the prevalence of HPV in the US, but some estimate 6.2 million newly infected persons each year. Studies suggest up to 70% of adults have had an HPV infection; most infections are cleared by the body and do not result in a chronic condition that may progress to cancer. Although a young age at first sexual experience, a larger number of sexual partners and lack of condom use correlate with increased risk of HPV infection, infection often occurs soon after becoming sexually active.

Currently two HPV vaccines, Gardasil from Merck and Cervarix from GlaxoSmithKline, are on the market or soon to be released (Calloway et al.). Gardasil covers both major genotypes involved in cervical cancer and both major genotypes causing genital warts. Cervarix will be used to prevent HPV-16 and -18 infections only, without vaccinating against the strains causing genital warts. Both vaccines were determined by the federal Food and Drug Administration to be safe and effective. Current recommendations are a series of 3 shots given to women aged 9-26, preferably before first sexual contact (ACS, Calloway et al., Roden et al.). Studies in women older than 26, or in women who have had sex and may have had an HPV infection, are ongoing, and thus it is unknown whether the vaccine will be as effective in these women. Until a treatment for HPV infection is found it remains important for the preventative vaccination to be done before women have any sexual contact. Studies in men have not been prioritized, as the incidences of penile and anal cancers are much lower than that of cervical cancer (Dunne et al., Hymel). While women may benefit from men being vaccinated as well, it is not clear whether this benefit exceeds that from routine vaccination of women.

Although Pap screening has proven a tremendous benefit to women’s health, the high prevalence of HPV infection suggests that a relatively inexpensive vaccine series would be extremely valuable (Hanna et al., Calloway et al., Roden et al.). Vaccinations can be given with other childhood vaccinations before sexual contact, hopefully giving lifelong protection. It is not yet known when or if booster shots will be needed. It is known, however, that Pap screening should be continued as long as the vaccines do not cover all cancer-causing strains and questions remain about how long the vaccine’s protection lasts, although screening could then be done every three years instead of annually. It is much easier and less expensive to complete a series of shots once than it is to have annual exams. The peace of mind that comes with protection from cancer cannot be quantified.

It is hoped, especially for developing countries, that the vaccinations can eventually be given in one dose or through oral administration. Lack of access to medical care is a problem in many areas of the world; the observation that 80% of women with cervical cancer are in developing countries clearly illustrates this fact (Hanna et al., Roden et al.). Even in the US many people are forced to go without medical insurance at some point in their lives. HPV vaccination, especially in childhood when other vaccinations are given, could eventually lead to the elimination of cervical cancer. While this is not as inexpensive or easy as a glass of water a day, half a million women a year could avoid the pain of cervical cancer.

American Cancer Society (ACS). Vol. 57, No. 1, January 2007.

Hanna, E. and G. Bachman. HPV vaccination with Gardasil a breakthrough in women’s health. Informa Healthcare, 2006.

Calloway, C. et al. A Content Analysis of News Coverage of the HPV Vaccine by US Newspapers, January 2002-June 2005. Journal of Women’s Health. Vol. 15 No. 7, 2006.

Dunne, E. et al. Prevalence of HPV Infection among Men: A Systematic Review of the Literature. The Journal of Infectious Diseases. Vol. 194, p1044-1057, 2006.

Hymel, P. A. Decreasing Risk: Impact of HPV Vaccination on Outcomes. The American Journal of Managed Care. Vol. 12, No. 17.

Roden, R. and T.C. Wu. How will HPV vaccines affect cervical cancer? Nature Reviews. Vol. 6, 2006.