In computing, lithography is the process of imprinting patterns onto semiconductor wafers for use in circuits.
Photolithography is used to transfer a pattern from a photomask to the surface of a substrate.
In the first stage, light is passed through a photomask, which imposes the pattern on the beam, and the patterned light is then projected onto the silicon wafer.
The ‘resist’, a coating on the wafer, undergoes chemical changes when exposed to light: its solubility is modified, so that applying a solvent (the developer) washes away part of the resist and leaves behind a stencil of the pattern.
The substrate is then exposed to chemicals that react to produce a solid material, which condenses onto its surface.
Finally, etching takes place, chemically removing layers from the surface of the substrate and leaving only the wanted pattern.
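The sequence above — exposure through a mask, development of the resist, then etching — can be sketched as a toy simulation. This is a minimal illustration with made-up values (the 0/1 mask encoding, thicknesses, and a positive resist are all assumptions), not a model of a real process:

```python
import numpy as np

# Toy model of positive-resist photolithography on a 1-D wafer cross-section.
mask = np.array([1, 1, 0, 0, 1, 0, 1, 1])  # 1 = opaque, 0 = transparent

# Exposure: light reaches the resist only where the mask is transparent.
exposed = (mask == 0)

# Development: with a positive resist, exposed regions become soluble and
# are washed away by the developer, leaving a stencil of resist behind.
resist = ~exposed  # True where resist remains

# Etching: the substrate is attacked only where the resist does not protect it.
substrate = np.full(mask.shape, 5)      # uniform starting thickness
etch_depth = 2
substrate = substrate - etch_depth * (~resist)

print(substrate)  # the mask pattern is now transferred into the substrate
```

Running this prints `[5 5 3 3 5 3 5 5]`: the substrate is thinned exactly where the mask was transparent, which is the essence of pattern transfer.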
More generally, lithography is a term for printing onto a flat surface by coating it in a substance that repels ink everywhere except where the pattern is supposed to be.
The word comes from the Ancient Greek lithos (‘stone’) and graphein (‘to write’).