Infrared radiation, like all electromagnetic radiation, travels in straight lines in space. This is clearly evident with visible light, since shadows are formed where the light path is interrupted by an object.
Infrared radiation is emitted from its source in all directions, so that if the source is relatively small the radiation spreads out equally in all directions and becomes progressively less concentrated as it travels. Consequently, the energy passing through a unit area per unit time (the intensity) decreases with distance.
The relationship between the distance from the source and the intensity of the radiation is expressed by the inverse square law, which states that the intensity of the radiation from a point source is inversely proportional to the square of the distance from the source:
I ∝ 1/d²
where I is the intensity and d is the distance between the source and the point of calculation.
So doubling the distance from an infrared source decreases the intensity to (1/2)² = 1/4 of its original value.
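This relationship is simple to put into code. The sketch below (in Python; the function name, the default reference distance of 1 unit, and the printed check are illustrative assumptions, not from the text) returns the intensity at a given distance relative to its value at a reference distance:

```python
def relative_intensity(d, d_ref=1.0):
    """Intensity at distance d relative to its value at distance d_ref,
    assuming an idealized point source with no absorption or scattering,
    so that the intensity is proportional to 1/d**2."""
    return (d_ref / d) ** 2

# Doubling the distance reduces the intensity to one-quarter of its value:
print(relative_intensity(2.0))  # 0.25
```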
This is, of course, only strictly true if there is no scattering or absorption of the radiation, but for most radiation it is effectively true in air and is of great practical importance.
The consequence is that small changes in distance will cause large changes in intensity. Doubling the distance between the source and the irradiated surface will reduce the intensity to one-quarter; tripling the distance will reduce it to one-ninth, and so on. Similarly, halving the distance will quadruple the intensity.
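The same geometric spreading can also be written in absolute terms: for a small (point-like) source the radiated power passes through spheres of area 4πd², so the intensity falls off as 1/d². A minimal sketch, assuming a hypothetical 100 W source radiating equally in all directions and neglecting absorption and scattering:

```python
import math

def point_source_intensity(power_w, d_m):
    """Intensity in W/m² at distance d_m (metres) from an idealized point
    source radiating power_w (watts) equally in all directions, with no
    absorption or scattering: the power spreads over a sphere of area
    4 * pi * d**2."""
    return power_w / (4.0 * math.pi * d_m ** 2)

# Hypothetical 100 W source: doubling the distance from 1 m to 2 m
# reduces the intensity to one-quarter of its value.
print(point_source_intensity(100.0, 1.0))  # ~7.96 W/m²
print(point_source_intensity(100.0, 2.0))  # ~1.99 W/m²
```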