Monday 27 May 2013

TOP 20 SCIENCE/SCIENTIST MISTAKES

Take a deep breath: Believe it or not, scientists are not always right. We really put them up on a pedestal, though, don't we? We quote scientists as experts, buy things if they're "scientifically proven" to work better … but scientists are human, too. It's just not fair to expect perfection out of them, is it? But come on, can't we at least ask for a reasonable level of competency?

1. The Circulatory System

You don't have to be a doctor to know how important the heart is…but back in ancient Greece, you could be a doctor and STILL have no idea how important the heart is.
Back then, doctors like second-century Greek physician Galen believed (no kidding) that the liver (not the heart) circulated blood (along with some bile and phlegm), while the heart (really) circulated "vital spirit" (whatever that is).
How could they be so wrong? It gets worse.
Galen hypothesized that the blood moved in a back-and-forth motion and was consumed by the organs as fuel. What's more, these ideas stuck around for a very long time. How long?
It wasn't until 1628 that English physician William Harvey let us in on our heart's big secret. His "An Anatomical Study of the Motion of the Heart and of the Blood in Animals" took a while to catch on, but a few hundred years later, it seems like plain common sense -- perhaps the ultimate compliment for a scientific idea.

2. The Earth Is the Center of the Universe

Chalk it up to humanity's collective ego. Second-century astronomer Ptolemy's (blatantly wrong) Earth-centered model of the solar system didn't just stay in vogue for 20 or 30 years; it stuck around for a millennium and then some.
It wasn't until almost 1,400 years later that Copernicus published his heliocentric (sun-centered) model in 1543. Copernicus wasn't the first to suggest that we orbited the sun, but his theory was the first to gain traction.
Ninety years after its publication, the Catholic Church was still clinging to the idea that we were at the center of it all and duking it out with Galileo over his defense of the Copernican view. Old habits die hard.

3. Germs in Surgery

Laugh or cry (take your pick), but up until the late 19th century, doctors didn't really see the need to wash their hands before picking up a scalpel.
The result? A lot of gangrene. Most early-19th century doctors tended to attribute contagion to "bad air" and blamed disease on imbalances of the "four humors" (that's blood, phlegm, yellow bile and black bile, in case you weren't familiar).
"Germ theory" (the revolutionary idea that germs cause disease) had been around for a while, but it wasn't till Louis Pasteur got behind it in the 1860s that people started listening. It took a while, but doctors like Joseph Lister eventually connected the dots and realized that hospitals and doctors had the potential to pass on life-threatening germs to patients.
Lister went on to pioneer the idea of actually cleaning wounds and using disinfectant. Remember him next time you reach for the Purell.

4. DNA: Not So Important

DNA was discovered in 1869, but for a long time, it was kind of the unappreciated assistant: doing all the work with none of the credit, always overshadowed by its flashier protein counterparts.
Even after experiments in the middle part of the 20th century offered proof that DNA was indeed the genetic material, many scientists held firmly that proteins, not DNA, were the key to heredity. DNA, they thought, was just too simple to carry so much information.
It wasn't until Watson and Crick published their all-important double-helical model of the structure of DNA in 1953 that biologists finally started to understand how such a simple molecule could do so much. Perhaps they were confusing simplicity with elegance.

5. The Atom Is the Smallest Particle in Existence

Believe it or not, we weren't actually all that stupid in ancient times. The idea that matter was composed of smaller, individual units (atoms) has been around for thousands of years -- but the idea that there was something smaller than that was a bit harder to come by.
It wasn't until the early 20th century, when physicists like J.J. Thomson, Ernest Rutherford, James Chadwick and Niels Bohr came along, that we started to sort out the basics of particle physics: protons, neutrons and electrons, and how they make an atom what it is. Since then, we've come a long way: on to charm quarks and Higgs bosons, anti-electrons and muon neutrinos. Let's hope it doesn't get too much more complicated than that.

6. The Earth Is Only 6,000 Years Old

Once upon a time, the Bible was considered a scientific work. Really. People just kind of assumed it was accurate, even when it didn't make much sense.
Take the age of the planet, for example.
Back in the 17th century, a religious scholar (Archbishop James Ussher) took a hard look at the Bible and estimated that creation happened around 4004 B.C. (you know, approximately). Add nearly 2,000 more years to get to the 18th century, when Western, Bible-reading geologists started to realize that the Earth was constantly shifting and changing, and you get about 6,000 years.
Hmm ... those biblical scholars may have been a bit off. Current estimates, based on radioactive dating, place the age of the planet at around, oh, 4.5 BILLION years.
By the 19th century, geologists started putting the pieces together to realize that if geologic change was happening as slowly as they thought it was, and if this Darwin guy was at all right about evolution (which was also a slow process), the Earth had to be WAY older than they had thought. The emergence of radioactive dating in the early 20th century would eventually prove them right.
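For a feel of how radiometric dating actually works, here's a minimal Python sketch of the standard decay arithmetic. The half-life of uranium-238 is a textbook value; the daughter-to-parent ratio is made up purely for illustration, not a real measurement.

    import math

    # Uranium-238 decays (through a chain of steps) to lead-206.
    HALF_LIFE_U238 = 4.47e9  # years, standard textbook value

    # Hypothetical measurement: atoms of daughter (lead-206) per atom
    # of surviving parent (uranium-238) in a mineral sample.
    daughter_to_parent = 1.0  # illustrative value only

    # Standard decay arithmetic: t = (half-life / ln 2) * ln(1 + D/P)
    age = (HALF_LIFE_U238 / math.log(2)) * math.log(1 + daughter_to_parent)
    print(f"Estimated age: {age:.2e} years")  # ~4.47e9 years when D/P = 1

A ratio of 1 means half the uranium has decayed, so the sample is exactly one half-life old -- roughly the 4.5-billion-year figure quoted above.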

7. The Rain Follows the Plow

If only it were so easy. It's actually kind of shocking that humanity held on for so long to the idea that plowing arid land would bring rain and make it fertile. Didn't anyone look around and see that all this farming of arid land wasn't doing much?
So much for observation.
In reality, this quite erroneous theory (popular during the American and Australian expansions) may have stayed alive in part because it did sometimes work -- or at least it seemed to work.
What we know now is that the plow wasn't actually bringing the rain; long-term weather patterns were. Arid regions (like the American West, for example) go through long-term cyclical droughts, followed by cycles of wetter years. Wait long enough and you'll get a few wet ones.
There's just one problem: wait a few more years and all the rain just goes away -- only now, you've got a civilization to support.

8. Phlogiston

What? You've never heard of phlogiston? Well, don't beat yourself up about it, because it's not real.
Phlogiston, proposed in 1667 by Johann Joachim Becher, was another element to add to the list (earth, water, air, fire and sometimes ether); it wasn't fire itself, but the stuff fire was made of. All combustible objects contained this stuff, Becher insisted, and they released it when they burned.
Scientists bought into the theory and used it to explain a few things about fire and burning: why things burned out (must have run out of phlogiston), why fire needed air to burn (air must absorb phlogiston), why we breathe (to get rid of phlogiston in the body).
Today, we know that we breathe to get oxygen to support cellular respiration, that objects need oxygen (or an oxidizing agent) to burn and that phlogiston just doesn't exist.
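For the curious, the modern replacement for phlogiston fits on one line. Burning glucose in respiration (and burning fuel in a fire) consumes oxygen rather than releasing any fire-stuff:

    C6H12O6 + 6 O2 -> 6 CO2 + 6 H2O + energy

Same idea, opposite direction: something is taken from the air, not given off into it.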

9. Heavier Objects Fall Faster

OK, trick question: do heavier objects fall faster than lighter ones? Today, we all know that they don't, but it's understandable how Aristotle could've gotten this one wrong.
It wasn't until Galileo came along in the late 16th century that anyone really tested this out. Though he most likely did not, as legend holds, drop weights from the Tower of Pisa, Galileo did perform experiments to back up his theory that gravity accelerated all objects at the same rate. In the 17th century, Isaac Newton took us a step further, describing gravity as the attraction between two objects -- on Earth, the most important attraction being the one between a very massive object (our planet) and everything on it.
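Newton's law also shows, in one line of algebra, why Galileo was right: the force on a falling object is F = G*M*m/r^2, its acceleration is a = F/m, and the object's own mass m cancels out. Here's that arithmetic as a minimal Python sketch, using standard textbook constants; nothing here is specific to any real experiment:

    G = 6.674e-11       # gravitational constant, m^3 / (kg * s^2)
    M_EARTH = 5.972e24  # mass of Earth, kg
    R_EARTH = 6.371e6   # mean radius of Earth, m

    # a = G * M / r^2 -- note the falling object's mass appears nowhere.
    a = G * M_EARTH / R_EARTH**2
    print(f"Acceleration at Earth's surface: {a:.2f} m/s^2")  # ~9.82

A cannonball and a pebble both come out at about 9.8 m/s^2, which is exactly what Galileo's experiments suggested (air resistance aside).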
A couple of hundred years later, Albert Einstein's work would take us in a whole new direction, viewing gravity as the curvature that objects cause in space-time. And it's not over. To this day, physicists are ironing out the kinks and trying to find a theory that works equally well for the macroscopic, microscopic and even subatomic. Good luck with that.

10. Alchemy

The idea of morphing lead into gold may seem a little crazy these days, but take a step back and pretend you live in ancient or medieval times.
Pretend you never took high-school chemistry and know nothing about elements or atomic numbers or the periodic table. What you do know is that you've seen chemical reactions that seemed pretty impressive: substances change colors, spark, explode, evaporate, grow, shrink, make strange smells -- all before your eyes.
Now, if chemistry can do all that, it seems pretty reasonable that it might be able to turn a dull, drab, gray metal into a bright, shiny yellow one, right? In the hopes of getting that job done, alchemists sought out the mythical "philosopher's stone," a substance that they believed would amplify their alchemical powers.
They also spent a lot of time looking for the "elixir of life." Never found that, either.

11. Don’t sap the very life out of the story.

The world of science is filled with researchers working on particles a fraction the size of an atom and studying cosmic distances that are incomprehensible to the average person. And barely a day goes by when a researcher doesn’t come up with insights or a discovery once thought to be impossible. It’s a world filled with wonder and awe. Don’t get bogged down in numbers and minutiae. Find the passion and excitement of the story — then share them.

12. Don’t leave out the science.

Some ongoing stories have significant science components. Two examples come immediately to mind: hydraulic fracturing (hydrofracking) and global warming. It's not enough to write that the majority of scientists agree that the earth's temperatures are increasing and that human activity is to blame. If you mention how scientists take their readings and what they're specifically finding, the public will acquire a deeper respect for the actual work involved and be in a better position to appreciate your stories. It may not be practical to include the science in every update, but consider doing so periodically.

13. Don’t get the science wrong.

Science is pretty complicated, whether it involves subatomic particles, chemical bonds, or DNA repair. It's always better to take the time to write the story well than to rush it for that day's deadline. (Of course, that may mean negotiating with your editor for more time to do the story justice.) Get on good terms with a science press officer at a college who can put you in touch with an expert capable of explaining concepts in a simple, straightforward manner.

14. Don’t get stuck in the weeds.

The goal is to help people understand and appreciate the science in the story, not prepare for a physics mid-term. Every answer in science can lead to another “how” or “why” question. It may be enough to state that the waste product of hydrogen fuel cells is water, without discussing how hydrogen ions bond with hydroxide ions.

15. It’s OK to challenge an expert.

Scientists don’t always get it right (Can you say “cold fusion”?), and sometimes experts don’t explain things clearly. While you need to respect a scientist’s expertise, it’s important to maintain your skepticism and not relent when you find something to be confusing. Your loyalty lies with the public, not the scientist.

16. Make sure you get a second opinion.

My dad used to tell me that the same number of doctors finished in the bottom 10 percent of their class as finished in the top 10 percent. While rank does not always indicate the quality of a doctor or scientist, his point remains -- not all experts are created equal. Talk to a second scientist to verify what you've been told or to get a different perspective.

17. Don’t keep saying how dumb you are.

There are few things more ridiculous in journalism than having a broadcast host or reporter shake his or her head and say, “Golly, I’m lucky if I can tell Isaac Newton from a Fig Newton.”
Acting dumb does nothing to instill confidence in a science reporter. Journalists don’t take that approach in their political and economic reporting, so why do it with science? Reporters routinely go into interviews needing to learn about the subject at hand; science is no different. Do your homework and ask smart questions.

18. Don’t oversell research outcomes.

Scientific progress rarely amounts to a breakthrough, and a discovery is not the same thing as a life-changing cure or an innovative new product. Research developments can be newsworthy without raising the public's expectations.

19. There may not be an “other side” to the story.

There are people who believe the world is flat, that astronauts never landed on the moon, and that Elvis is still alive — but few journalists would consider including those angles in their stories. Learn which experts and theories are credible and take a stand for good science.

20. Don’t rely on inadequate experts.

Don’t get confused by credentials. A meteorologist is not the same as a climate scientist, and even a distinguished particle physicist is not necessarily an expert in quantum optics. Make sure an expert has the appropriate expertise.

Nasiru Loaded Blog News: if you spot a mistake or have a suggestion, we welcome your opinions in the comments.
