The answer is "not very much".
I purchased a "TriField" EMF Meter Model TF2 out of general curiosity.
Near the floor of my ground-floor room, the RF background reads 0.02 mW/m^2. I could detect RF radiation from a Raspberry Pi 3B+ up to about 3 inches from its plastic case; beyond that it was indistinguishable from background. For comparison, an Ethernet cable was only detectable at 2 inches. With the meter held against the Pi's case I got a much higher reading, 10 mW/m^2.
My conclusion from these measurements, and from thinking about the physics, is that the field decreases much faster than the "inverse square law" we learn about in school. That law applies to a single point source; an Ethernet cable, by contrast, carries many small current loops of opposite orientation whose fields cancel at larger distances, so the total decays faster than 1/r^2. Presumably a well-designed single-board computer would try to avoid large high-frequency current loops, even if only to save power, so its assorted clock radiation should have similar decay properties to an Ethernet cable. This is consistent with what I found.
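As a rough sanity check of the decay-law idea, here is a sketch under my own assumptions (that the against-the-case reading corresponds to roughly 1 inch from the actual source, and that the decay is a pure power law): how far out would a 10 mW/m^2 source stay above the 0.02 mW/m^2 background under inverse-square versus a faster 1/r^4 falloff?

```python
# At what distance does a source reading 10 mW/m^2 at ~1 inch
# fall below the 0.02 mW/m^2 background, under 1/r^2 vs 1/r^4?
# (Assumption: the against-the-case reading is ~1 inch from the source.)

s0 = 10.0          # mW/m^2 at the reference distance r0
r0 = 1.0           # inches (assumed meter-to-source distance)
background = 0.02  # mW/m^2

for n in (2, 4):
    # s(r) = s0 * (r0 / r)^n  =>  r = r0 * (s0 / background)^(1/n)
    r = r0 * (s0 / background) ** (1.0 / n)
    print(f"1/r^{n}: detectable out to about {r:.1f} inches")
# 1/r^2 gives ~22 inches; 1/r^4 gives ~4.7 inches
```

The roughly 3-inch detection range I actually observed is much closer to the 1/r^4 prediction than to inverse-square, though the 1-inch reference distance is a guess.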
By contrast, when measuring the Pi's WiFi antenna (while running ping -f -I wlan0), it was difficult to detect any decay at all. At first the values even seemed to increase with distance; then I decided to use a wooden yardstick rather than a metal measuring tape, which turned out to be changing the readings. Eventually, by twisting and turning the meter and looking for areas of local maximum (the wavelength at 2.4 GHz is about 5 inches), I came up with some tentative, plausible-looking numbers.
RF radiation from WiFi:
1ft 1.1 mW/m^2
2ft 0.6-0.7
3ft 0.5
4ft 0.3
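Fitting a power law to these rough numbers (just a least-squares line in log-log space; take the exponent with a grain of salt given how much the readings jumped around, and note I'm using 0.65 for the 0.6-0.7 reading):

```python
import math

# Measured WiFi power density vs distance, from the list above.
dist_ft = [1, 2, 3, 4]
mw_m2 = [1.1, 0.65, 0.5, 0.3]

# Least-squares slope in log-log space: s ~ r^slope.
xs = [math.log(r) for r in dist_ft]
ys = [math.log(s) for s in mw_m2]
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
print(f"fitted exponent: {slope:.2f}")  # roughly -0.9, far from -2
```

An exponent near -1 rather than -2 need not be exotic: in a small room, reflections and multipath can easily flatten the apparent falloff.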
I imagine these could be somewhat off, since the numbers jump around a lot, but the gist is clear: the WiFi signal decays much more slowly than the small "unintentional" radiation from the Pi's unshielded processor.
I hope the meter I got is good; it is rated to measure from 20 MHz to 6 GHz. For some background, one of the first things I noticed when it arrived was that cell towers produce much more radiation than cell phones. Standing outside or by a window I get a 5 mW/m^2 signal from the cell tower a few blocks away; it increases to 15 or 20 near the tower itself. But I'm lucky to pick up 0.1 mW/m^2 a foot away from my phone when it's making a call. I read somewhere that this is because towers have better reception thanks to their larger antennas: phones can whisper to towers, while towers have to shout to phones. I can't find the link at the moment. Most of the information I find online states that the opposite should be true; for example, the American Cancer Society states, without any references, "The amount of exposure from living near a cell phone tower is typically many times lower than the exposure from using a cell phone."
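One way to make the whisper/shout picture concrete is to turn each reading back into an effective radiated power with the free-space relation P = S * 4*pi*r^2. The distances here are my guesses ("a few blocks" taken as ~300 m for the tower, a foot for the phone), real propagation is nothing like free space, and antennas aren't isotropic, so this is order-of-magnitude at best:

```python
import math

def effective_power_w(s_mw_m2, r_m):
    """Isotropic radiated power implied by power density s (mW/m^2)
    at range r, assuming free-space spreading: P = S * 4*pi*r^2."""
    return (s_mw_m2 * 1e-3) * 4 * math.pi * r_m ** 2

# Guessed distances: "a few blocks" ~ 300 m; "a foot" ~ 0.3 m.
tower_w = effective_power_w(5.0, 300.0)   # ~5.7 kW effective
phone_w = effective_power_w(0.1, 0.3)     # ~0.11 mW effective
print(f"tower: ~{tower_w:.0f} W, phone: ~{phone_w * 1000:.2f} mW")
```

Kilowatt-scale effective power is not crazy for a high-gain sector antenna, and sub-milliwatt average output is plausible for a phone whose power control has throttled it way down near a tower, so the asymmetry in my readings at least hangs together.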
It could be that this meter is peculiar; it is definitely consumer-grade. But it is instantly responsive, and the fact that the signal increases when I stand near a window, or near a WiFi router, is unmistakable.
The values I measured are much smaller than the 1.5 W/kg exposure levels used in the 2016 National Toxicology Program study on rats - perhaps a thousand times smaller, though I'm not sure how to convert the units - although according to another meter I borrowed, with a higher maximum reading, you could approach those levels sitting a couple of feet from a WiFi router.
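For the unit conversion, here is the crudest possible geometric estimate, under assumptions that are entirely mine: a ~0.3 kg rat presenting a ~0.01 m^2 cross-section that absorbs everything hitting it. This ignores resonance, reflection, and partial absorption, so it is order-of-magnitude at best:

```python
# Crude plane-wave estimate: whole-body SAR ~= S * A / m, where
# S is incident power density (W/m^2), A the absorbing cross-section,
# and m the body mass. Rat numbers below are assumed, not measured.
mass_kg = 0.3
area_m2 = 0.01

target_sar = 1.5                           # W/kg, the NTP study level
s_needed = target_sar * mass_kg / area_m2  # incident W/m^2 required
print(f"~{s_needed:.0f} W/m^2 of incident power needed")  # ~45 W/m^2

wifi_s = 1.1e-3                            # W/m^2, my reading at 1 ft
print(f"ratio: ~{s_needed / wifi_s:.0f}x")
```

By this crude estimate my 1-foot WiFi reading is more like tens of thousands of times below the NTP level rather than a thousand, though again the absorption assumptions are very rough.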
I'm not sure why, then, two decades ago when my housemate owned a TV, I noticed TV-signal interference when operating a PC with no case. In America, VHF is 54-216 MHz and UHF is 470-890 MHz. The CPU of my computer, an iMac, would have run at 200-700 MHz. My current laptop produces 1 mW/m^2 at several inches, which is quite a bit more than the Pi. It seems possible that the differences have a lot to do with the RAM, which must run at high frequencies and which is located quite close to the CPU on a Pi, but not so close on the iMac or on my current laptop.
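A square-ish digital clock carries most of its energy at odd harmonics of the fundamental, so one way to see how a late-90s CPU could splatter into the TV bands is just to list which harmonics of a given clock land in VHF or UHF. The 233 MHz figure below is my guess at a plausible iMac-era clock, not a measured value:

```python
# Which odd harmonics of a clock fall in the US analog TV bands?
# (Square-ish digital clocks carry most energy at odd harmonics.)
VHF = (54e6, 216e6)
UHF = (470e6, 890e6)

def tv_harmonics(clock_hz, max_harmonic=15):
    hits = []
    for n in range(1, max_harmonic + 1, 2):  # odd harmonics only
        f = n * clock_hz
        band = ("VHF" if VHF[0] <= f <= VHF[1] else
                "UHF" if UHF[0] <= f <= UHF[1] else None)
        if band:
            hits.append((n, f / 1e6, band))
    return hits

# Hypothetical 233 MHz clock, a plausible iMac-era CPU speed:
for n, mhz, band in tv_harmonics(233e6):
    print(f"harmonic {n}: {mhz:.0f} MHz ({band})")
# harmonic 3: 699 MHz (UHF)
```

Even when the fundamental misses both bands, the third harmonic can land squarely in UHF, which would be enough to bother a nearby TV with no case shielding the board.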