What Are You Really Looking at When You’re Looking at a Black Hole?

A simulated view of a black hole from the 2014 film ‘Interstellar’. Image: YouTube


  • Black holes swallow light – yet their strong gravitational pull exerts other effects on their neighbourhoods that give away their presence.
  • These cosmic objects twist spacetime itself around themselves, trap information and warp the path of light. To see a black hole, then, is to see all these things at once.
  • Imaging a black hole in turn is no small feat, requiring advances in multiple fields, including radio astronomy, spaceflight, GPS systems and computing.

On May 12, an international team of scientists released a picture of the black hole at the centre of the Milky Way galaxy. Named Sagittarius A*, it weighs about four-million-times as much as the Sun and is 27,000 light years away.

Imaging Sagittarius A* was a feat of two parts – physics and technology – and delving into each provides a glimpse of distinct facets of imaging an object so bizarre that it seems to be able to bend light around itself.

I

Physics

Image: Casey Horner/Unsplash

To see something is to see the rays of light coming from the direction of the object. If you’re reading this on your smartphone, its screen is emitting light that reaches your eyes and then your brain, which makes sense of the information encoded in it. If you look just beyond, at another object nearby – your desk, say, or the wall in front of you – it isn’t emitting its own light but scattering light from another source, like the Sun or a lamp.

The light from near a black hole is light from nearby sources that it has distorted. By accurately tracking and studying these distortions, we can catch sight of the black hole itself.

The defining feature of a black hole is its prodigious gravitational pull. It is also what defines all four parts that typically make up a black hole. One is the singularity itself – the point at the black hole’s centre towards which all objects within the black hole move, the point where the gravitational pull is infinite. Second is the event horizon, known colloquially as the black hole’s surface: it marks the distance within which anything falling towards the singularity can no longer escape. The third is the ergosphere, the region of space near the event horizon where objects are swung around the black hole but don’t necessarily fall inwards into the singularity. The fourth is the accretion disc, a ring of objects orbiting the black hole, like planets around a star.
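For a sense of scale, here is a minimal back-of-the-envelope sketch of the event horizon’s size, using the Schwarzschild-radius formula for a non-rotating black hole – only an approximation, since Sagittarius A* spins – and the roughly four-million-solar-mass figure quoted above:

```python
# Rough sketch: size of the event horizon of a non-rotating black hole,
# using the Schwarzschild radius r_s = 2GM/c^2. Sagittarius A* spins,
# so this is only an order-of-magnitude estimate.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg

M = 4e6 * M_sun      # ~4 million solar masses, as quoted for Sagittarius A*

r_s = 2 * G * M / c**2
print(f"Schwarzschild radius: {r_s/1e9:.1f} million km")
# ~11.8 million km – roughly 17 times the radius of the Sun, yet vanishingly
# small compared to the 27,000 light years separating us from it.
```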

The international team that imaged Sagittarius A* – called the Event Horizon Telescope (EHT) collaboration – has captured the event horizon, the ergosphere and the accretion disc, but not the singularity. This is obvious: light from inside the event horizon can never escape into space, so no information from the singularity can ever reach us.

The event horizon is itself visible only as a dark region surrounded by a bright orange-yellow halo. This dark patch is also called the shadow (and so the headlines about EHT having captured a “black hole’s shadow”). It’s the volume of space around which light has been bent – the physics of which we can understand using Albert Einstein’s general theory of relativity published 107 years ago.

According to general relativity, mass curls spacetime around itself. The greater the density of mass in a region, the greater the curvature. When light moves along such curved spacetime, it appears to us to follow a curved path. (The force that objects feel due to the curved path is what we know as gravity. Put another way, an object’s gravitational pull doesn’t directly bend the light’s path. Instead, it warps spacetime, and light simply flows along its new shape.)

Massive objects distort space (denoted by the grid) and the passage of time (notice the clocks at the nodes). Animation: Lucas Vieira Barbosa/Wikimedia Commons, CC BY-SA 2.0

Black holes are so dense – i.e. pack so much mass into a relatively small volume – that they curve spacetime completely. Imagine spacetime to be a tablecloth. If you place a ball underneath, the way the sheet flows around the ball on top is how spacetime curves around a mass. A black hole, however, would wrap the sheet completely around itself – so light that’s flowing on the sheet will just go round and round in circles, trapped on the surface of the sphere. Thus the name ‘event horizon’: an event is a point on the spacetime continuum, and no signal from an event inside the horizon can ever cross over to the outside.

Light that’s flowing under the sheet will go straight in and be lost forever. Light that’s flowing in the ergosphere, however, will be bent strongly, and could possibly escape in a different direction. In fact, a black hole’s warping effect is so complete that if you flashed some light at such an angle that it enters the ergosphere, the light could follow a path that goes completely around the black hole and then comes back to you. This is why you see a thin ring of light around a black hole. It sounds like a paradox but that’s the magic of the ergosphere.

The ergosphere also drags the light around: light that enters the ergosphere must co-rotate with the black hole, so it is effectively dragged along. This is called the Lense-Thirring effect, or frame-dragging. It has two particularly interesting consequences. One, light that enters the ergosphere in the direction opposite to that of the black hole’s rotation will be forced to turn around and start moving along the rotation. Two, nothing can remain stationary in the ergosphere, because here the black hole’s gravity is actively twisting spacetime itself around it.

Then there is the accretion disc, where millions upon millions of tonnes of interstellar matter – gas, dust and rocks – orbit the black hole. (The ergosphere and the disc can overlap.) Objects in this belt are accelerated and pushed together, and heated up by friction. As a result they emit high-energy electromagnetic radiation like X-rays.

Now consider all of these effects together and you may be able to get a sense of what you’re really looking at when you’re looking at a black hole. There’s a thin ring of light around it. There’s also more light on the side where the black hole is rotating towards you. There’s a bright halo emitting high-energy radiation. And there’s a spheroid patch of obsidian black vaguely near the centre.

In all, it’s a site of intense activity – but the same activity, together with our distance from it, makes it very difficult to actually see these things. This brings us to the technologies we need in order to image a black hole.

II

Technology

The locations of the participating telescopes of the EHT and the Global mm-VLBI Array (GMVA). Image: ESO/O. Furtak, CC BY 4.0

The EHT is not a single telescope but a collection of radio telescopes that function in unison.

Each radio telescope treats the radio signals coming from outer space as waves – which is why the telescope itself doesn’t ‘see’ the signal the way our eyes see light. Instead, a radio telescope is an antenna. It consists of a large dish-like structure that ‘senses’ the radio waves and, with the help of computers, reconstructs an image of their source.

The smallest level of detail in this image is determined by the antenna’s angular resolution: the finer the angular resolution, the finer the details the image will have. The way to improve it is simple: the wider the dish, the finer the angular resolution. (To be more exact, the smallest angle a dish can resolve is proportional to the wavelength of the radio waves divided by the diameter of the dish – so a wider dish, or a shorter wavelength, resolves finer detail.) The resolution of the FAST radio observatory in China – one of the best single-dish radio antennae in the world today because of its size – is 174 arcseconds (as).
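As a rough illustration of that relationship – a minimal sketch, not the observatory’s own specification – here is the diffraction-limit estimate for a FAST-like dish. The 21 cm observing wavelength and the ~300 m illuminated patch of the 500-metre dish are assumptions made for this example, not figures from the text:

```python
import math

# A minimal sketch of the diffraction limit for a single dish:
# smallest resolvable angle ≈ 1.22 * wavelength / diameter.

RAD_TO_ARCSEC = 180 / math.pi * 3600

def angular_resolution_arcsec(wavelength_m, diameter_m):
    """Smallest resolvable angle of a dish, in arcseconds."""
    return 1.22 * wavelength_m / diameter_m * RAD_TO_ARCSEC

# Assumed numbers: FAST observes at wavelengths around 21 cm and illuminates
# roughly a 300 m patch of its 500 m dish at a time.
print(angular_resolution_arcsec(0.21, 300))   # ≈ 176 arcseconds
```

The estimate lands close to the 174 as figure quoted above.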

Even though signals of many frequencies come from near the black hole, radio waves are long enough that most of them aren’t blocked by objects in the black hole’s accretion disc – or by any other matter the waves may encounter as they pass through space. X-rays, infrared radiation and visible light, on the other hand, are absorbed or scattered away.

However, the radio waves coming from near Sagittarius A* are so weak, having travelled 27,000 light years, and spread over so wide a beam, that a single antenna won’t be able to ‘collect’ enough of them to put together a meaningful picture of a black hole in a reasonable amount of time – if at all. We could, if we built a radio telescope with ultra-high angular resolution – but that would require a dish as wide as Earth itself (12,742 km). This is impossible: the gigantic FAST observatory itself is only 500 metres wide. Instead, physicists and engineers have come up with a clever alternative.
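To see why an Earth-sized dish is the right ballpark, here is the same back-of-the-envelope estimate again, this time in the wavelength-over-baseline form used for interferometers. The 1.3 mm observing wavelength is the EHT’s, an assumption not stated above:

```python
import math

# Back-of-the-envelope: what an Earth-sized aperture could resolve.
# For an interferometer, resolution ≈ wavelength / baseline (no 1.22 factor).

RAD_TO_MICROARCSEC = 180 / math.pi * 3600 * 1e6

wavelength = 1.3e-3      # metres (~230 GHz) – the EHT's observing wavelength (assumed)
baseline = 12_742e3      # metres – Earth's diameter, from the text

theta = wavelength / baseline * RAD_TO_MICROARCSEC
print(f"{theta:.0f} microarcseconds")   # ≈ 21 µas – the EHT's ballpark resolution
```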

When multiple telescopes work together, the maximum distance between telescopes is called the baseline. In a single radio antenna, the baseline is equal to the diameter of the dish. But the Submillimeter Array (SMA) in Hawaii, for example, consists of eight radio telescopes, each with a six-metre-wide dish, and a total baseline of up to 508 metres. When the SMA works in such a way that the eight antennae each behave as if they were one panel in a larger dish, the angular resolution is determined not by the diameter of each dish but by the overall baseline. As a result, the SMA has more than 80-times better angular resolution than if it were to use each of the antennae separately.
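That ‘more than 80-times’ figure is simply the ratio of the longest baseline to a single dish’s diameter – a quick check using the numbers quoted above:

```python
# Aperture synthesis improves resolution by roughly baseline / single-dish diameter.
sma_baseline = 508   # metres, maximum SMA baseline (from the text)
dish = 6             # metres, the width of each SMA dish

print(sma_baseline / dish)   # ≈ 84.7 – the "more than 80-times" figure above
```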

The Submillimeter Array (SMA) of radio telescopes at night, lit by flash, January 2016. Photo: Steven Keys/keysphotography.com

Coordinating multiple telescopes like this is a very complicated task. It requires several computers to ensure they’re pointing at the same part of the sky at the same time and tracking the right wave frequencies. It requires atomic clocks to record the exact time at which signals from the same source reach each of the telescopes. And it requires GPS satellites to keep track of the antennae’s position relative to the source as Earth rotates.
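A hedged illustration of why the clocks have to be atomic: a wavefront from the source reaches widely separated antennae at slightly different times, and the recorded signals must later be aligned to within a small fraction of a single wave period. The Earth-scale baseline and the 230 GHz observing frequency below are assumed round figures, not values from the text:

```python
# Illustration (not the EHT's actual correlation code): the geometric delay
# between two stations is at most baseline / c, and the recorded signals must
# be aligned to a small fraction of one wave period – hence atomic clocks.

c = 2.998e8              # speed of light, m/s
baseline = 1.0e7         # metres – an Earth-scale baseline (assumed round figure)
frequency = 230e9        # Hz – roughly the EHT's observing frequency (assumed)

max_delay = baseline / c          # largest possible arrival-time difference
wave_period = 1 / frequency       # duration of one cycle of the observed wave

print(f"max geometric delay: {max_delay*1e3:.0f} ms")   # ~33 ms
print(f"one wave period:     {wave_period*1e12:.1f} ps") # ~4.3 ps
# Timestamps good to picoseconds across a ~33 ms spread is why hydrogen-maser
# atomic clocks sit at every station.
```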

Once all the data has been collected (at the rate of 2 million GB per day), physicists use grid computers or supercomputers to put it together, using algorithms based on the general theory of relativity and other concepts, and filter out the noise, to ultimately produce the image.

Antenna arrays that work thus are called interferometers, and the ‘thus’ is called very-long-baseline interferometry. The Giant Metrewave Radio Telescope (GMRT) in Pune is an interferometer consisting of 30 antennae and has a resolution of 2 as (at 1.7 GHz). The EHT is an interferometer that consists of eight radio observatories on four continents and has the stunning resolution of 20 µas – 100,000 times better than the GMRT and nearly nine million times better than FAST. This is the sort of resolution required if we’re to image the black hole at the centre of the Milky Way galaxy.
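Those comparisons are just ratios of the angles quoted above – the smaller the angle, the finer the detail:

```python
# Comparing the resolutions quoted above (smaller angle = finer detail):
gmrt = 2.0            # arcseconds
fast = 174.0          # arcseconds
eht = 20e-6           # arcseconds (20 microarcseconds)

print(gmrt / eht)     # ≈ 100,000   – EHT vs GMRT
print(fast / eht)     # ≈ 8.7 million – EHT vs FAST
```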

In fact, the EHT collaboration chose Sagittarius A* and M87* as its targets because of its technological limitations. With its stated angular resolution, the EHT interferometer could access the former because it’s the closest such object to us, and the latter because – while 54 million light years away – it’s one of the largest of its kind known. If the EHT is to image more distant, and likely also smaller, black holes, it will have to increase its baseline further. There are some ideas to do this by adding radio telescopes in space.
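A rough sketch of why these two targets, and hardly any others, fit the bill: the apparent size of a black hole’s shadow scales with its mass divided by its distance. The shadow diameter of roughly 5.2 Schwarzschild radii and M87*’s mass of about 6.5 billion Suns are assumptions made for this illustration, not figures from the article:

```python
# Rough estimate of how large each black hole's shadow appears on the sky.
G, c = 6.674e-11, 2.998e8
M_sun = 1.989e30
LY = 9.461e15                      # metres per light year
RAD_TO_MICROARCSEC = 2.0626e11     # microarcseconds per radian

def shadow_microarcsec(mass_suns, distance_ly):
    r_s = 2 * G * mass_suns * M_sun / c**2   # Schwarzschild radius
    shadow = 5.2 * r_s                       # approximate shadow diameter (assumed factor)
    return shadow / (distance_ly * LY) * RAD_TO_MICROARCSEC

print(shadow_microarcsec(4e6, 27_000))          # Sgr A*: ≈ 50 µas
print(shadow_microarcsec(6.5e9, 54_000_000))    # M87*:   ≈ 40 µas
```

Both come out to a few tens of microarcseconds – just above the EHT’s ~20 µas resolution, which is why these two, and almost no other black holes, were viable targets.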

§

Gravity is one of the four fundamental forces of nature but it’s extremely unlike the other three, which are the domain of quantum mechanics. This hasn’t stopped nature from giving rise to a curious similarity between the tools that we use to probe gravitational and quantum-mechanical anomalies.

Machines like the accelerators and detectors used to discover the Higgs boson are so large because they need to produce, and track, particles packed with extremely large amounts of energy. The more energy these particles have, the smaller the distances in spacetime they can probe. Yet the gravitational anomalies that are black holes also require colossal assemblages of glass, metal and fire.

Is it the nature of anomalies to exact such costs?
