
Photography and Astronomy - a Marriage Made in Heaven

Text and photography Copyright Denis Crawford
All rights reserved.

Introduction

Did you know that the first recognisable image of the moon was made in 1840, just one year after the introduction of the daguerreotype, the first practical photographic process? How? Photography was recognised by astronomers as a potential asset immediately after Louis Daguerre introduced his invention in 1839. Astronomers found that the photographic process could accumulate light over long periods, meaning that long exposures could gather faint light from objects too dim or distant for the human eye to see. However, it was not until the 1880s that photography had advanced enough to reveal many more stars than had ever been known before. Photography then revolutionised astronomy, made astrophysics possible and eventually revealed the immense size and great age of the universe itself.

Some History


Infrared view of Saturn. Image courtesy of NASA.

The links between the early development of photography and astronomy are profound. We owe the word "photography" to Sir John Herschel, a well-known astronomer and son of the distinguished astronomer Sir William Herschel. Sir John first used the word publicly at a lecture before the Royal Society on March 14, 1839. The word “photography” is derived from the Greek words meaning “writing with light”. Herschel wrote extensively on photography and encouraged its scientific use. He discovered the photographic fixative sodium thiosulphate (hypo), coined the word “snapshot” and was the first to use the words “positive” and “negative” as photographic terms.

It was the French astronomer François Arago who encouraged Louis Daguerre to attempt to photograph the moon - unfortunately Daguerre’s experiment was a failure. That first image of the moon was made by an American physiologist, Dr John William Draper. In fact, scientists and academics who tried to improve the early daguerreotypes often used the moon as their subject. But daguerreotypes were far from ideal - there was no way to make duplicate photos, and they had very low sensitivity (Draper’s first daguerreotype of the moon required a 20-minute exposure). A breakthrough came with the development of dry gelatine plate photography, allowing for the first sharp deep space images to be produced. It was Dr Draper’s son Henry who made the first clear exposure of the Orion Nebula in 1880.

The photographic innovations of the 20th century, such as celluloid film, materials sensitive to infrared and ultraviolet light, and emulsions with improved sensitivity and resolution, greatly extended the range of astrophotography. Gas hypersensitising of film, a technique developed by Kodak in the 1970s, allowed ever fainter and more distant objects to be photographed.

The 1980s saw the ground-breaking work of Dr David Malin, which made full-colour astrophotography of deep space possible. Colour film can’t produce accurate full-colour images of space because very long exposures change the colour balance and speed of the film - a photographic problem known as “reciprocity failure”. Malin employed a technique from the earliest days of colour photography, exposing three separate black and white film plates through filters designed to record blue, green and red light. The three black and white images were then combined photographically to produce the final colour image. These pictures show deep space objects much as they might appear in the telescope if our eyes were more sensitive to the colour of very faint light.

Computers now assemble information from different colour channels to form a full colour image digitally, just as Dr Malin did photographically. The beginning of the 21st century has seen silver-based photography give way to digital (CCD) capture of deep space images. CCDs are more sensitive than film (offering broader spectral sensitivity as well as greater speed), don’t suffer reciprocity failure, and their output can be read directly into a computer.
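The digital version of this tri-colour technique is simple in principle: three monochrome exposures, each taken through a different coloured filter, are scaled and stacked as the red, green and blue channels of one image. The short Python sketch below illustrates the idea; it assumes the numpy and imageio libraries, and the file names are purely hypothetical.

# A minimal sketch of digital tri-colour compositing: three monochrome
# exposures, taken through red, green and blue filters, are normalised
# and stacked into a single RGB image. File names are placeholders.
import numpy as np
import imageio.v2 as imageio

def load_channel(path):
    """Read one monochrome exposure and scale it to the 0-1 range."""
    data = imageio.imread(path).astype(np.float64)
    data -= data.min()
    if data.max() > 0:
        data /= data.max()
    return data

# One black and white frame per filter (all assumed to be the same size).
red   = load_channel("m42_red.tif")
green = load_channel("m42_green.tif")
blue  = load_channel("m42_blue.tif")

# Stack the channels into an RGB image and save the result.
rgb = np.dstack([red, green, blue])
imageio.imwrite("m42_colour.png", (rgb * 255).astype(np.uint8))

Real astronomical processing adds calibration, alignment and contrast stretching, but the channel-combination step is essentially this.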

Invisible Light

What we see of the universe with our eyes is within the visible spectrum. However, much of the light the universe emits is infrared light that cannot be seen by the naked eye. Infrared astronomy is extremely useful for viewing objects in space that would otherwise be invisible to us.


Jupiter. Image courtesy of NASA.

In an amazing coincidence, it was Sir William Herschel who, in 1800, discovered the presence of infrared. He set up thermometers in each of the rainbow colours of sunlight spread out by a prism. He found that the thermometers registered a higher temperature near the red end of the spectrum, and that the temperature continued to rise beyond the red, into an invisible part of the spectrum. This radiation came to be called infrared, which means “below red”. Infrared was not used photographically for more than a century because film was not sensitive to it, but today infrared pictures can be taken with an ordinary camera.

The first infrared telescope to be sent into space increased the number of catalogued astronomical sources by about 70 percent! It detected over 350,000 infrared sources. Observing the universe with an infrared telescope is so successful because many objects in space emit their strongest radiation in the infrared range. Infrared radiation also passes through dusty regions of space, making it possible to see regions hidden from the view of an optical telescope by dust and gas. Infrared observing can also be done during the day, since it isn’t affected by the visible light of the sun.

In 1801, Johann Wilhelm Ritter, a German physicist, found infrared’s counterpart, ultraviolet (UV) radiation, by observing that silver chloride darkened most rapidly just beyond the violet end of the solar spectrum. Ritter was motivated by Herschel’s discovery of infrared and by his own belief in the balance of nature - invisible radiation beyond the visible red at one end of the spectrum simply had to be paired with invisible radiation beyond the violet at the other end.


Mars. Image courtesy of NASA.

Ultraviolet observations have recently revealed features in Jupiter's stratosphere that are transparent in visible light. NASA’s Cassini spacecraft recorded a dark vortex of hydrocarbon haze forming and then dissipating in the upper atmosphere of the gas giant in late 2000. This feature resembled the development of ozone holes in Earth's stratosphere. The phenomenon appears to occur only within confined masses of high-altitude polar atmosphere on both planets. This similarity may help scientists understand both processes better.

Haze

When you look up at the night sky, you will see twinkling stars. But what looks like a twinkling star to our eyes is actually steady light that has been distorted by the Earth’s atmosphere. Telescopes on Earth are equally vulnerable to this distortion.

That's why astronomers around the world dreamed of having an observatory in space - a concept first proposed by astronomer Lyman Spitzer in the 1940s. From a position above Earth's atmosphere, a telescope would be able to detect light from stars, galaxies, and other objects in space before that light is absorbed or distorted. The view would be much sharper than that from even the largest telescopes on Earth.

The Hubble Space Telescope was deployed in 1990 and, after a few teething problems and a repair mission or two, it began to yield images of the stars with a clarity previously only dreamed of. The Hubble has quite an array of sophisticated instruments on board, but it is the cameras and the images they yield which have captured most people’s interest. The most famous Hubble images come from the Wide Field and Planetary Camera 2 (WFPC2) - until now, the telescope's main camera. The camera can sense a range of wavelengths from ultraviolet to near-infrared light. The WFPC2 doesn't use film to record images; instead, information is collected on CCDs and transmitted to ground-based computers, which assemble the data into images.

An example of the clarity of Hubble’s view is the detection of faint red stars. Previous observations from ground-based telescopes were uncertain because the light from these faint objects is blurred by Earth's atmosphere, making red stars indistinguishable from the far more distant, diffuse-looking galaxies. Hubble's capabilities make it possible to observe red stars that are 100 times dimmer than those detectable from the ground, and its extremely high resolution shows red stars as distinct points of light, as opposed to the "fuzzy" look of a remote galaxy.

But the best images are yet to come! The new Advanced Camera for Surveys (ACS), successfully installed on March 7, 2002, has increased the efficiency of Hubble by a factor of 10. This new workhorse gives Hubble twice the field of view and five times the sensitivity, allowing astronomers to study the formation and evolution of galaxies and the distribution of dark matter. Quite an advance since that first image of the Moon captured by Dr J.W. Draper!

Conclusion

It is the ability of photographic processes to record what the human eye can’t see that has made photography an essential element of astronomy. We owe much of our knowledge of the universe to the innovative people who developed this indispensable tool for astronomers.

Editor's note - Denis Crawford works out of Melbourne under his business name Graphic Science. The business combines his lifelong interest in science and photography. His photography can be seen online at www.graphicscience.com.au.

Denis Crawford - NPN 437

Comments on NPN astrophotography articles? Send them to the editor.


