
10/8 Wired: Science

Wednesday, October 7, 2009


Nobel Prize for the Chemistry of Protein Production
October 7, 2009 at 4:48 pm


This year’s Nobel Prize in Chemistry went to three molecular biologists who study ribosomes, the protein factories within cells.

Ribosomes were discovered in the 1950s by George Palade, who went on to win the Nobel Prize in Physiology or Medicine for his work on the makeup of cells, but scientists weren’t able to take a close look at those organelles until the end of the century. Thomas Steitz, Venkatraman Ramakrishnan, and Ada Yonath developed tricks for examining the tiny structures with X-rays and electron beams. The high-resolution 3D images they acquired will help chemists develop a host of better medications.

"Scientists around the world are using the winners’ research to develop new antibiotics that can be used in the ongoing battle against antibiotic-resistant microbes that cause so much illness, suffering and death.” said Thomas Lane, president of the American Chemical Society, in a press release.

Dozens of antibiotics — including tetracycline and clindamycin — work by gumming up the ribosomes inside bacteria. Each of those medications is made up of relatively small molecules that can wedge themselves into crevices in the ribosome, destroying the microbes’ ability to make protein, and thus rendering them helpless.

Armed with 3D images of antibiotic molecules wedged into ribosomes, medicinal chemists can refine their strategy for fighting bacteria. They can find new weak spots in bacterial ribosomes.

That approach is a lot like the way the Rebel Alliance destroyed the first Death Star: by looking at its blueprint and finding a weak spot. Except in this case, the researchers are looking for vulnerable nooks and crannies in a blob of RNA and protein, rather than a thermal exhaust port.

Dozens of 3D images that show antibiotics sticking to ribosomes are available in the Protein Data Bank, and you can look at them yourself with a tool called FirstGlance.

Just type the Protein Data Bank ID for the ribosome that you want to look at, and then start exploring.

Here are some of the best structures:

Ribosome with Azithromycin: 1Z1K
Ribosome with Chloramphenicol: 1NJ1
Ribosome with Clindamycin: 1YJN
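
If you would rather script it than click around, the structure files behind those IDs can be downloaded directly. The short Python sketch below pulls one of the entries listed above; the download URL and file format are assumptions about the Protein Data Bank's current file service, not something from the article.

```python
# A minimal sketch (not from the article): download one of the structures
# listed above from the Protein Data Bank by its four-character ID.
# Assumes the files.rcsb.org download service; very large entries such as
# ribosomes are typically distributed as .cif rather than legacy .pdb files.
import urllib.request

def fetch_structure(pdb_id, out_path=None):
    url = f"https://files.rcsb.org/download/{pdb_id}.cif"
    out_path = out_path or f"{pdb_id}.cif"
    urllib.request.urlretrieve(url, out_path)
    return out_path

# The 1Z1K entry from the list above: a ribosomal subunit with azithromycin bound.
print(fetch_structure("1Z1K"))
```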

Image: A ribosome reads an mRNA sequence and produces protein according to its genetic code. Credit: Lawrence Berkeley National Laboratory

Alligator Swamps Are Lousy With Monogamy
October 7, 2009 at 4:25 pm


Alligators don’t seem to be the promiscuous, indiscriminate reptiles scientists once thought they were. A new 10-year study of alligator mating habits shows that most female crocodilians prefer to mate over and over with the same male, despite encountering a vast array of eligible alligator bachelors each year.

As the only surviving members of a class of reptiles called archosaurs, which included dinosaurs and the ancient ancestors of birds, alligators are in a unique position to help scientists understand the mating patterns of dinosaurs and birds. For the past 10 years, ecologists have been tracking female alligators at the Rockefeller Wildlife Refuge in Louisiana and recording their mate preferences by looking at the DNA of their young. The data, published today in Molecular Ecology, reveals that up to 70 percent of female alligators choose the same partner year after year.

“Given how incredibly open and dense the alligator population is at RWR, we didn’t expect to find fidelity,” biologist Stacey Lance of the Savannah River Ecology Laboratory in South Carolina said in a press release. “To actually find that 70 percent of our re-trapped females showed mate fidelity was really incredible. I don’t think any of us expected that the same pair of alligators that bred together in 1997 would still be breeding together in 2005 and may still be producing nests together to this day.”

Finding mate fidelity in alligators is surprising because most reptiles are polygamous, often mating with multiple partners during the same breeding year and producing young from multiple fathers. Alligators do exhibit multiple paternity — in this study, roughly 50 percent of nests contained eggs from more than one father — but surprisingly, females appeared to pick the same male (or males) year after year.

Because of the dense population of alligators at the wildlife refuge, the researchers don’t think the repeat pairings were a result of chance. Instead, it appears that female alligators are actively choosing specific males that they’ve mated with in the past. Only a few other reptilian species exhibit this type of mate preference, and this is the first time anyone has shown fidelity in alligators.

The researchers are still trying to understand what drives alligator mate choice, and how picking the same mate might benefit future generations. Unlike most other reptiles, female alligators spend significant energy nurturing their young, both by sitting on their nests and defending their babies once they hatch. It’s possible that a successful pairing in the past means a higher chance of successful breeding in the future, but the scientists say further study is necessary to prove what makes an alligator stay faithful.

“In this study, by combining molecular techniques with field studies we were able to figure something out about a species that we never would have known otherwise,” Lance said. “Hopefully future studies will also lead to some unexpected and equally fascinating results.”

Images: Phillip "Scooter" Trosclair





More Than Meets the Eye: How the CCD Transformed Science
October 7, 2009 at 3:45 pm

[Image: the first charge-coupled device prototype]

The 2009 Nobel Prize in Physics went this week, in part, to George Smith and Willard Boyle, the inventors of the charge-coupled device. Their innovation, sketched out in 1969, is now the imager in millions of digital cameras and telescopes.

The very first prototype, pieced together months after Smith and Boyle laid out its working principles, is pictured above.

A charge-coupled device, in most applications, translates light into an electronic signal. Photons striking an array of capacitors build up an electrical charge proportional to the light’s intensity; the device then shifts that charge across the array and reads it out as a voltage. That signal can be digitized and transformed by the dull magic of high-performance computing into Hubble’s images.
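
As a rough cartoon of what “charge-coupled” means in that description (an illustration only, not the article’s own explanation or a real device model), the Python sketch below exposes a single row of pixel “buckets” to light and then shifts the stored charge packets, one pixel at a time, toward a readout node where each packet becomes a number.

```python
# Toy model of a single CCD row: light deposits charge in pixel "buckets",
# then the charge packets are clocked along the row to a readout node and
# measured one by one. Purely illustrative -- no noise, no device physics.

def expose(photons_per_pixel, quantum_efficiency=0.9):
    """Turn incident photons into stored charge (electrons) per pixel."""
    return [int(round(n * quantum_efficiency)) for n in photons_per_pixel]

def read_out(charges):
    """Shift charge packets toward the end of the row, measuring each packet
    as it reaches the readout node. Returns the signal in pixel order."""
    row = list(charges)
    signal = []
    while row:
        signal.append(row.pop())   # packet at the readout node is measured
        # every remaining packet has now moved one pixel closer to the node
    return signal[::-1]

row = expose([120, 80, 300, 5, 0])   # incident photons per pixel
print(read_out(row))                 # e.g. [108, 72, 270, 4, 0]
```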

Millions of CCDs are made each year for mass-market cameras, but they also proved a transformational technology in science, providing a far more sensitive light sensor than anything that came before. After the invention was overlooked for decades, the Nobel win was a mild surprise, but a well-deserved one.

“There wasn’t anything that could compete in scientific imaging,” said Tony Tyson, an astronomer at the University of California, Davis, who built the first CCD camera for scientific applications in the late 1970s. “You’re interested in getting very high signal to noise ratios. There’s nothing that really competes with CCDs.”

For the really dim things astronomers look at, the number of photons of light coming from a source is so small that each one counts. Out of every 100 photons, a CCD can record more than 90 of them. Photographic plates can barely reach 10 percent. And your eyes? Their quantum efficiency is in the 1 to 4 percent range.
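
To put rough numbers on that comparison (a back-of-envelope illustration using the efficiencies quoted above, with a standard shot-noise approximation that is not from the article), the snippet below shows how detector efficiency feeds straight into signal-to-noise for a faint source.

```python
import math

# Quantum efficiencies quoted above: a CCD records more than 90 of every
# 100 photons, a photographic plate about 10, the eye roughly 1 to 4.
# In the photon-counting (shot-noise) limit, SNR ~ sqrt(photons detected).
detectors = {"CCD": 0.90, "photographic plate": 0.10, "human eye": 0.02}

incident = 100   # photons arriving from a faint source
for name, qe in detectors.items():
    detected = qe * incident
    snr = math.sqrt(detected)
    print(f"{name:19s} detects ~{detected:4.0f} photons, SNR ~ {snr:.1f}")
```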

[Image: an early CCD camera from the 1970s]

According to lore, Smith and Boyle sketched out the design for the ubiquitous imaging device in an hour, over lunch at Bell Labs in October 1969. Working under the intense pressure applied by their taskmaster of a boss, Jack Morton, the pair had the device fabricated within a couple of months. George Smith took a photo of it, which you can see at the top of the page.

The road from the creation of the prototype to a technology that scientists and photographers could actually use, though, was long and hard. Though CCDs would come to dominate astronomy, the device as invented had nowhere near high enough resolution to be worthwhile. With its poor signal-to-noise ratio, it was not immediately clear that the CCD was destined for greatness.

“I joined the company in 1969, the very year that Dick Boyle and George Smith invented this thing,” Tyson said. “I actually, frankly viewed it as a toy. It was so small and awfully noisy.”

Historians Robert W. Smith and Joseph N. Tatarewicz note that “astronomers could not simply procure a CCD ‘off the shelf’ soon after the device’s invention at Bell Labs.” In fact, a number of other imaging systems were suggested for what became the Hubble Space Telescope, including a panoply of image tubes.

Some astronomers, though, saw the potential of CCDs down the road. They were, in Smith and Tatarewicz’s terms, “counting on invention.”

In the face of budget cuts in 1974 that threatened plans to put the still high-end CCD technology on the Large Space Telescope (Hubble), an astronomer delivered an impassioned defense.

“To decide now, eight years before the LST can possibly fly, on what is already an out-dated detector, less than state-of-the-art, will be regarded in the future, I think, as a poor choice of the options which are conceivable in other directions for cutting the cost of the LST,” Margaret Burbidge, a prominent astrophysicist, wrote. “It is like deciding to treat a sick patient by cutting out his heart on the grounds he would be saved the energy used by the heart muscles in pumping the blood around the body.”

Hard work by hundreds of scientists and engineers pushed CCDs closer to reality over the next few years. Companies like Fairchild, Kodak, and Tektronix, rather than Bell Labs, developed the technology into usable form. Military, scientific, and consumer applications all benefited from the money being thrown at the CCD’s problems from different directions, but it was still tough going.

“It was a very painful development,” Tyson said. “There were all these problems making really large cameras and getting uniform CCDs out of companies that were already pushing the envelope.”

Still, scientists like Tyson persevered. After nearly a decade, he put his latest camera on the 40-inch telescope at Mt. Palomar Observatory and was able to measure the distribution of faint blue galaxies. That work became an important piece of evidence that dark energy, the mysterious force driving the accelerating expansion of the universe, actually exists.

Now, nearly every major astronomical observatory uses CCDs. They also remain the gold standard for medical imaging, or really any type of science that needs to capture photons. Though CMOS imagers are making inroads in consumer devices, there’s “still nothing like a huge CCD,” Tyson said, for high-end science.

Tyson’s latest project is the Large Synoptic Survey Telescope, which will incorporate a 3,200-megapixel camera. Plotted against time, CCD performance, measured in pixels, has grown at close to the same dizzying exponential rate as computing power (see the plot below).

Clearly, that hour of lunch at Bell Labs opened up a technological development path that was as broad and deep as nearly any in the twentieth century. And after decades of being overlooked for the biggest prize in science, the inventors of the CCD are finally getting their due.

“Back, say, 30 years ago when I was at Bell Labs, we thought that CCDs could very well be a Nobel Prize,” said Cherry Murray, dean of engineering and physical science at Harvard University and a one-time colleague of Smith and Boyle at Bell Labs. “It had been overlooked for so long… It’s nice to see.”

[Figure: CCD pixel counts plotted against time]


Images: Sent by Tony Tyson to Wired.com. 1. George Smith. 2. Tony Tyson. 3. Tony Tyson.





Beyond the Genome
October 7, 2009 at 1:00 pm


When scientists finished sequencing the human genome, the answers to diseases were supposed to follow. Six years later, that promise has gone unfulfilled. Genetics just isn’t that useful for predicting who gets sick, and why. The blueprint of life turned out to be an intriguing parts list.

“It’s much more complex than we had thought. There aren’t going to be easy answers,” said Teri Manolio, director of the National Human Genome Research Institute’s Office of Population Genomics. “The genome is constantly surprising us. There’s so much that we don’t know about it.”

Manolio is the lead author of a Nature article entitled “Finding the missing heritability of complex diseases.” Published Wednesday, it’s part of a major change in how scientists see the genome.

In April, several articles in the New England Journal of Medicine featured researchers arguing over why genome-wide association studies — in which thousands of genomes are compared in a hunt for disease-linked patterns — had found so little. Several months later, a massive hunt for schizophrenia genes was described as the field’s “Pearl Harbor.” At a conference this summer at the Jackson Laboratory, the shortcomings of gene-centered explanations were a starting point for talks by some of the world’s most prominent geneticists.

It’s not that genes are suddenly unimportant. Researchers are just acknowledging their variations as pieces of an extraordinarily complicated puzzle, along with how genes are turned on, how many copies are made of each, the shape of the genome itself, and how all of the genome’s protein products mix and interact.

Wired.com talked to Manolio about the future of genomics research.

Wired.com: What do you mean by “missing heritability”?

Teri Manolio: We know that diseases cluster in families. In some diseases, the risk might be two or three times higher than normal, or 30 times higher, for a relative of someone with a disease. But when we do these genomic studies, we find maybe a 50 percent increase in risk. That gap is what’s missing.

Wired.com: The numbers can get tricky. If you’ve found that someone with a certain genetic variant has double the risk of developing a disease, but the heritable risk is a hundred-fold, then we’ve only connected two percent of the heritability to genetics?

Manolio: That’s a fair way of putting it. The gap varies. In some diseases, we’re describing half of the genetic heritability. But that’s unusual. Only macular degeneration has numbers that high. In many diseases, it’s around five percent.
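
As a back-of-envelope version of the arithmetic in the exchange above (and only that; the heritability estimates in the Nature paper are computed on a formal statistical scale, not by this simple ratio), the snippet below divides the risk increase traced to known variants by the familial risk increase.

```python
def naive_explained_share(variant_fold_risk, familial_fold_risk):
    """Crude share of familial risk accounted for by known variants,
    mirroring the interview's example (2-fold found vs. 100-fold inherited
    gives roughly 2 percent). NOT the formal heritability calculation."""
    return variant_fold_risk / familial_fold_risk

print(f"{naive_explained_share(2, 100):.0%}")    # 2%
print(f"{naive_explained_share(1.5, 30):.0%}")   # 5% (illustrative inputs only)
```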

Wired.com: How much of the gap is caused by our inability to link genetics to conditions, and how much has non-genetic causes?

Manolio: There’s a lot of thought that this might be DNA and environment together. If you’re not exposed to adverse environmental factors, then you may never develop a given disease. With a bad enough environmental exposure, you may get a disease regardless of your genetic makeup.

Wired.com: What about aspects of our DNA that we’re just starting to study, like variations in the number of copies we have of each gene, or how genes are activated or physically arranged inside a cell?

Manolio: All of those have been suggested. At least so far, it doesn’t look like copy numbers explain a huge amount of this. But there are other places to look, and I suspect that the answer is going to be, “all of the above.”

Wired.com: How does all this fit with what the public expected of genomics? It seems we had different expectations than the scientific community.

Manolio: Well, to be honest, I think we were a bit naive about things, too. We’d hoped that when we identified where all the genes are, and all the coding regions and all the variations one could have, then that would explain everything. Those were the hopes, and then reality came crashing in.

Wired.com: What about personalized genomics testing? That’s been the big consumer application of genomics so far.

Manolio: Since we’re not explaining a huge amount of the inherited tendencies between people, the information you get from a genotyping company may not be very useful for predicting your risk of disease in the future. That’s what emerges from many of these studies: There are likely many other factors that increase your risks, and these factors are known and explain more than genomics does now. Genomics is a promising research tool, but right now it’s really a research tool.

Wired.com: How do we find the missing heritability?

Manolio: We’ll follow multiple avenues of research. We have to be humble about how this works.

Wired.com: Do we have the tools?

Manolio: Our sequencing is in good shape — the costs are coming down, we can get everyone’s base pairs read — but interpreting them is a real challenge. Technologies for epigenetics research are still developing. And there will be other needs coming down the pipeline.

Wired.com: Want to put a timetable on the research?

Manolio: I don’t think we can. In the next few years, we’ll see lots of variants associated with diseases. Many will be further investigated, and their functions determined. That’s one of the missing links here: what’s the function of all these things? We have over 400 variants identified in a whole variety of traits, but only in a few do we understand how they change a gene’s function, and how that may change biology. But these are great clues to biology.

Wired.com: Is that a better way of thinking about genetics — not in terms of answers, but clues?

Manolio: Absolutely. And if you’re a glass half-full person, then four years ago, we had practically no associations that we could replicate in multiple populations. Now there are hundreds. All of these are clues, and that’s wonderful. We just need to be patient in figuring out what they mean.

Image: From “Circos: an Information Aesthetic for Comparative Genomics.”

Citation: “Finding the missing heritability of complex diseases.” By Teri A. Manolio, Francis S. Collins, Nancy J. Cox, David B. Goldstein, Lucia A. Hindorff, David J. Hunter, Mark I. McCarthy, Erin M. Ramos, Lon R. Cardon, Aravinda Chakravarti, Judy H. Cho, Alan E. Guttmacher, Augustine Kong, Leonid Kruglyak, Elaine Mardis, Charles N. Rotimi, Montgomery Slatkin, David Valle, Alice S. Whittemore, Michael Boehnke, Andrew G. Clark, Evan E. Eichler, Greg Gibson, Jonathan L. Haines, Trudy F. C. Mackay, Steven A. McCarroll & Peter M. Visscher. Nature, Vol. 461, No. 7265. October 8, 2009.





Supermassive Black Holes Collide to Become Even More Super and Massive
October 7, 2009 at 10:01 am

[Image: composite X-ray and optical image of the two black holes in NGC 6240]

New X-ray data from NASA’s Chandra X-ray Observatory, added to an image previously captured by the Hubble Space Telescope, created this amazing composite image of two black holes on the verge of colliding.

The two supermassive black holes, which show up as two points of light in the center of the galaxy NGC 6240, are only 3,000 light-years apart. Astronomers think the two will eventually combine into a single, larger black hole.

Also combining to make a whole greater than the sum of its parts are the two pieces of this image, shown below. Space photos are often a combination of multiple images and sets of data, designed to bring out the details and beauty of the subject. In this case, Chandra’s X-ray data and Hubble’s optical data come together to create an image so stunning that it looks like it must be an artist’s rendering.
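
As a rough sketch of that kind of compositing (in spirit only; the real Chandra and Hubble processing is far more involved, and the arrays and color choices below are illustrative assumptions), the snippet maps two aligned data sets into different color channels of a single RGB frame.

```python
import numpy as np

def composite(xray, optical):
    """Combine two aligned, normalized (0-1) grayscale frames into one RGB image:
    the optical data forms the grayscale base, and the X-ray data is overlaid in
    red/orange so bright X-ray point sources stand out against the galaxy."""
    rgb = np.stack([optical, optical, optical], axis=-1)        # grayscale base
    rgb[..., 0] = np.clip(rgb[..., 0] + xray, 0.0, 1.0)         # X-rays into red
    rgb[..., 1] = np.clip(rgb[..., 1] + 0.4 * xray, 0.0, 1.0)   # touch of green -> orange
    return rgb

# Toy stand-ins for the aligned Hubble (optical) and Chandra (X-ray) frames.
rng = np.random.default_rng(0)
optical = rng.uniform(0.0, 0.3, size=(256, 256))
xray = np.zeros((256, 256))
xray[120:124, 120:124] = 1.0   # two bright X-ray point sources, standing in
xray[132:136, 140:144] = 1.0   # for the pair of supermassive black holes

print(composite(xray, optical).shape)   # (256, 256, 3)
```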

[Image: the Chandra X-ray and Hubble optical components shown separately]

Images: X-ray: NASA/CXC/MIT/C.Canizares, M.Nowak. Optical: NASA/STScI.




 
