How much computation does a brain use? According to a new estimate, it's about 10^14 to 10^17 FLOP/s.
A FLOP is a 'Floating Point Operation' (basically adding, subtracting, multiplying or dividing two numbers). Doing lots of these per second is a good way of measuring computing power, and can (arguably) be a good way of measuring brain power.
For reference, an iPhone can produce ~10^10 FLOP/s and an Nvidia V100 GPU (costing $10,000) ~10^14 FLOP/s. So according to this, a brain's raw compute could be matched by somewhere between 1 and 1,000 GPUs, or around a million iPhones. As the article admits, matching the brain would also depend on working out the software, data and architecture.
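The device counts above are just the ratio of the brain estimate to each device's throughput. A quick back-of-envelope sketch (using the figures quoted above as assumptions, not measurements):

```python
# Rough Fermi estimate: how many devices would it take to match a brain's
# estimated compute? All figures are the estimates quoted in the text.
BRAIN_FLOPS_LOW = 1e14    # low end of the brain estimate
BRAIN_FLOPS_HIGH = 1e17   # high end of the brain estimate
GPU_FLOPS = 1e14          # Nvidia V100
IPHONE_FLOPS = 1e10       # iPhone

gpus_low = BRAIN_FLOPS_LOW / GPU_FLOPS     # -> 1 GPU
gpus_high = BRAIN_FLOPS_HIGH / GPU_FLOPS   # -> 1,000 GPUs
iphones_mid = 1e16 / IPHONE_FLOPS          # -> 1,000,000 iPhones (mid-range brain estimate)

print(f"{gpus_low:.0f} to {gpus_high:.0f} GPUs, or ~{iphones_mid:.0f} iPhones")
```

The 'million iPhones' figure corresponds to a mid-range brain estimate of ~10^16 FLOP/s; at the extremes it would be anywhere from ten thousand to ten million.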
I'm not sure how accurate this is, but it's an interesting exercise - a bit like one of those 'how many piano tuners are there in Birmingham' interview questions. The latest language models are pushing the limits of computational power, but whether they are doing anything quite like a human brain is an open question.