Wired network vs. Wi-Fi network

It's the same reason Wi-Fi power is limited in routers: it has to be kept at a level the federal government considers safe to be next to 24 hours a day. That's why, when you buy a router, a well-made $100 unit can end up with the same range as a $350 one.

WiFi power is limited in routers because the FCC aims to prevent interference. It's the same reason CB radios are limited to 4 watts.
 
Yeah, well, I do wonder. Your statement rests on current medical knowledge, like cigarettes once did.
It is a known fact that when you use a cell phone, part of your brain is warmed by "non-ionizing radiation," just as a microwave oven (which also uses non-ionizing radiation) warms food; that's something visible light doesn't do.
So I don't know about ionizing radiation being the only way to be hurt by using a device over a period of decades or less. I don't think about it much, but there is something to be said for not wanting a cell phone or Wi-Fi device glued to my head 24 hours a day.

It's the same reason Wi-Fi power is limited in routers: it has to be kept at a level the federal government considers safe to be next to 24 hours a day. That's why, when you buy a router, a well-made $100 unit can end up with the same range as a $350 one.

(BTW, just my thoughts. I don't give much thought to being exposed to Wi-Fi, though I will admit I would not place a router near the headboard of my master bedroom bed, where I lie in front of it 8 hours a night.)

I recommend reading some high school physics before making judgments like that. Let's start with power and distance. Assume the antenna is 2D (a pole, not a point), radiating the same power across the circumference of a circle; the power density at distance R from the antenna then falls off as 1/(2 x Pi x R). The Wi-Fi you use on your phone and laptop is mostly download from the internet rather than upload; you are not running a server or live-streaming your TikTok 24/7, so most of your traffic these days (assuming you don't have unlimited data on your phone plan) comes from the Wi-Fi router fed by cable / fiber / DSL. Those mostly emit only when you are downloading something. The amount of power needed to broadcast the "hey, there is a router here" beacon is very low and very brief, like 1/1000 of full-on downloading, or even less.

So, comparing your downstream traffic to your upstream traffic, you will realize that unless you are talking on a 2G/3G network all day, your phone is hardly emitting anything. How many minutes do you talk a month? 100? Even with video conferencing I don't do that much; I'd say I used only about 20 minutes last month, with far more texting and Zoom / WebEx / Teams conferences (like 3 hours a day on average). The radiation from the phone glued to your head (very short distance, like 2 inches) is very low.

Now compare the router. Assume the router is 10 feet away and you use your phone 2 feet from your eyes, and that downstream traffic to your phone is about 20x the upstream traffic from your phone. With the 1/(2 x Pi x R) falloff, 5x the distance means 1/5 the power, so the 100 mW from your router is equivalent to 20 mW at the phone's distance. Some napkin math: 20 mW from the router for 20 minutes of actual transmission time per day (most of the time you are just looking at the screen and nothing is transmitting), versus 100 mW from your phone to your router for 1 minute of actual transmission per day (that uplink is mostly acknowledging packets to the server so it sends new data instead of retrying; this is how TCP/IP works). That gives 20 mW x 20 min + 100 mW x 1 min = 400 mW-min + 100 mW-min = 500 mW-min delivered toward your head per day. This is like having a Wi-Fi router stuck to your head downloading for one full minute. So using a smartphone isn't dangerous compared to other things, like sunbathing. Living next to a cell tower would likely be a much higher risk, but not from your phone; those antennas are 20 W each, from what I remember.
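The napkin math above can be written out as a short sketch. The numbers (100 mW router, 20-minute vs. 1-minute daily duty cycles, 10 ft vs. 2 ft distances) and the simplified 1/(2 x Pi x R) cylindrical falloff are all the post's assumptions, not measured values:

```python
def equivalent_power_mw(p_mw, distance_ft, ref_distance_ft):
    """Scale a transmitter's power to an equivalent power at ref_distance_ft,
    using the post's simplified cylindrical falloff: density ~ 1/(2*pi*R),
    so power scales linearly with the distance ratio."""
    return p_mw * (ref_distance_ft / distance_ft)

# Router: 100 mW at 10 ft, scaled to the phone's 2 ft reference distance.
router_equiv_mw = equivalent_power_mw(100, 10, 2)   # 20 mW equivalent
phone_mw = 100                                      # phone held at 2 ft

# Duty cycles: ~20 min/day of actual router transmission vs ~1 min/day
# of phone uplink (mostly TCP ACKs, per the post).
exposure_mw_min = router_equiv_mw * 20 + phone_mw * 1
print(exposure_mw_min)  # 500, matching the post's 500 mW-min/day estimate
```

The point of the sketch is just that duty cycle dominates: the router transmits at lower equivalent power but for far longer than the phone's uplink.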

I am not sure if Wi-Fi on phones does this, but on cellular, at least in LTE, if you have good reception from the cell tower your phone will reduce its transmit power, because it doesn't need to spend that much battery to transmit and the tower will still receive it. The positive side effect is that you get less radiation from your phone, despite more radiation from the tower.

In highly congested areas, when a carrier puts up a new tower it will reduce the power of nearby towers to make smaller cells and reuse the same band. Everyone then gets less radiation, because the distance to the antenna is shorter and transmit power can be reduced.

Power kills: your microwave oven is typically 1000 W, not 100 mW. And you shouldn't keep a Wi-Fi router near your headboard anyway, because that's bad for reception to begin with.
 
On the subject of Starlink: I think it is about time we had rural coverage in the US and a way to break the cable internet monopoly. People around the world pay 1/5 of what we pay for home internet, and our phone plans were the biggest rip-off second only to the drug cartels before MVNOs and no-contract plans came along. Nobody else in the world pays $70 USD a month for 200 Mbps cable internet with a 1.2 TB data cap, or for an LTE phone plan (even with the fake "unlimited" data advertised here, or the fake 5G AT&T was advertising).
 
WiFi power is limited in routers because the FCC aims to prevent interference. It's the same reason CB radios are limited to 4 watts.
Well, at least we agree Wi-Fi power is limited, whatever the reason, which brings me back to my original post on the subject. You can spend $100 on a well-made router and it can equal the performance, as far as distance goes, of a $300 router, because router power is limited by the FCC.

By the way, I'm not dismissing your comment that it's to prevent interference. You may be entirely correct, but I do see the subject of radiation brought up in some articles too. This one addresses the interference you spoke of: https://www.extremetech.com/computi...pabilities-tp-link-blocks-open-source-updates

As far as the reason goes, I have only read lightly on it, but here is one of dozens of articles from a quick search.
https://www.networkworld.com/articl...elines-for-wi-fi-need-to-be-re-evaluated.html

"Since 1996, the FCC has required that all wireless communications devices sold in the United States meet its minimum guidelines for safe human exposure to radio frequency (RF) energy. The FCC’s guidelines and rules regarding RF exposure are based upon standards developed by IEEE and NCRP and input from other federal agencies. These guidelines specify exposure limits for hand-held wireless devices in terms of the Specific Absorption Rate (SAR). The SAR is a measure of the rate that RF energy is absorbed by the body. For exposure to RF energy from wireless devices, the allowable FCC SAR limit is 1.6 watts per kilogram (W/kg), as averaged over one gram of tissue. All wireless devices sold in the U.S. go through a formal FCC approval process to ensure that they do not exceed the maximum allowable SAR level when operating at the device’s highest possible power level."

https://www.air802.com/fcc-rules-and-regulations.html
So no matter what router you buy, it cannot exceed these limits.
 
On the subject of Starlink: I think it is about time we had rural coverage in the US and a way to break the cable internet monopoly. People around the world pay 1/5 of what we pay for home internet, and our phone plans were the biggest rip-off second only to the drug cartels before MVNOs and no-contract plans came along. Nobody else in the world pays $70 USD a month for 200 Mbps cable internet with a 1.2 TB data cap, or for an LTE phone plan (even with the fake "unlimited" data advertised here, or the fake 5G AT&T was advertising).
T-Mobile Home Internet is already doing this in rural America.
 
WiFi power is limited in routers because the FCC aims to prevent interference. It's the same reason CB radios are limited to 4 watts.
Last but not least, straight from the FCC:
"Since 1996, the FCC has required that all wireless communications devices sold in the United States meet its minimum guidelines for safe human exposure to radiofrequency (RF) energy. The FCC’s guidelines and rules regarding RF exposure are based upon standards developed by IEEE and NCRP and input from other federal agencies, such as those listed above."

Source = https://www.fcc.gov/sites/default/files/wireless_devices_and_health_concerns.pdf
 
Wired or optical media will always be "better" than wireless, just as changing your engine oil every 500 miles is "better" than using the OLM. I do networking for a living and have both an Ethernet switch and a good 802.11ac access point in my office. I don't bother plugging my wired-capable devices into the Ethernet switch anymore; Wi-Fi has become that good.
 
Last but not least, straight from the FCC:
"Since 1996, the FCC has required that all wireless communications devices sold in the United States meet its minimum guidelines for safe human exposure to radiofrequency (RF) energy. The FCC’s guidelines and rules regarding RF exposure are based upon standards developed by IEEE and NCRP and input from other federal agencies, such as those listed above."

Source = https://www.fcc.gov/sites/default/files/wireless_devices_and_health_concerns.pdf

That doesn't mean the maximum transmit power for Wi-Fi devices is set lower (than it otherwise would be) because of exposure; Wi-Fi power is capped because these are unlicensed Part 15 devices.
 
Same speedtest, sitting about 15' from a wireless router below me. Guess which one is Wi-Fi vs. Ethernet... 🤔


[speedtest screenshots]
 
Same speedtest, sitting about 15' from a wireless router below me. Guess which one is Wi-Fi vs. Ethernet... 🤔


[speedtest screenshots]
With those speeds and latency, it really doesn't matter, until you start running lots of bidirectional applications like voice and video conferencing. Then the half-duplex nature of Wi-Fi will rear its ugly head. Plus, 1G Ethernet is always full duplex and will fall off the cliff at MUCH higher throughputs than even 80 MHz 802.11ac channels. For most people, though, it just doesn't matter.
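A toy illustration of the full- vs. half-duplex point above: on a shared half-duplex medium, upstream and downstream traffic contend for the same airtime, while full-duplex Ethernet carries both directions at once. The demand figures (400 Mbps each way) are made up for the example:

```python
def bidirectional_capacity(link_mbps, full_duplex):
    """Achievable down/up throughput for a symmetric 400/400 Mbps demand."""
    down_demand, up_demand = 400, 400  # Mbps, e.g. heavy video conferencing
    if full_duplex:
        # Each direction gets the full link independently.
        return (min(down_demand, link_mbps), min(up_demand, link_mbps))
    # Half duplex: both directions split the same shared airtime.
    total = min(down_demand + up_demand, link_mbps)
    share = total / (down_demand + up_demand)
    return (down_demand * share, up_demand * share)

print(bidirectional_capacity(1000, True))   # (400, 400): both fit at once
print(bidirectional_capacity(600, False))   # (300.0, 300.0): airtime is shared
```

This ignores Wi-Fi contention overhead and retries, which in practice make the half-duplex case worse than the simple airtime split shown here.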
 
That doesn't mean the maximum transmit power for Wi-Fi devices is set lower (than it otherwise would be) because of exposure; Wi-Fi power is capped because these are unlicensed Part 15 devices.
I'm not understanding your post, but in fairness, maybe you're not understanding mine, or my reason for posting.
I think things are being taken differently than what I posted; you may have to go back to my original post to understand. I know it happens in forums.
My post said, in essence, that the power output of Wi-Fi routers (and really any wireless device) is decided by the FCC, not by the manufacturer.
Meaning Wi-Fi routers pretty much all put out the same low signal power, in essence to limit exposure to human beings AND, yes, interference. It would cost literally pennies (OK, maybe a dollar?) for a manufacturer to greatly increase a router's power output, but the FCC has a limit and they are all pretty much at it.

Soooo... a well-made $99 router can be just as effective as a $300 router (IN DISTANCE), since they transmit the same amount of power; it's really about how well the device makes do with what it has.
Granted, with the more expensive models you have a better chance of range and better circuitry, but it's not a slam dunk. A well-designed $99 device can equal or outperform on distance, because the broadcast power is limited no matter what the router costs.

I'm not sure about all the other stuff being brought up, or why. I was just making the case for a $99 Motorola router that, centrally located, covers pretty much every part of our 3,000 sq ft home as well as or better than more expensive units. The reason is that it's a well-made unit minus the frills, with a focus on power/distance. Granted, the GUI is very basic, but it does what's important.
 
Same speedtest, sitting about 15' from a wireless router below me. Guess which one is Wi-Fi vs. Ethernet... 🤔


[speedtest screenshots]
One sample vs. one sample, within rounding error for ping. Upload is identical. Download should in theory be the same for gigabit Ethernet, but I can't tell what you connected or how; if you are not getting at least the Wi-Fi speed over your wired connection, you are doing your Ethernet wrong.

Just curious, how did you connect the Ethernet? USB 2.0 to gigabit Ethernet? Forgot to turn on jumbo frames?
 
Forgot to turn on jumbo frames?
You're not going to get jumbo frames over a residential internet connection, and at these speeds jumbo frames aren't worth the effort and expense. Most carriers support jumbo frames, but on expensive ports at high throughputs, usually >= 10 Gb/s. Virtually all link aggregation at 10 Gb/s, 100 Gb/s, and 400 Gb/s supports jumbos.
 
You're not going to get jumbo frames over a residential internet connection, and at these speeds jumbo frames aren't worth the effort and expense. Most carriers support jumbo frames, but on expensive ports at high throughputs, usually >= 10 Gb/s. Virtually all link aggregation at 10 Gb/s, 100 Gb/s, and 400 Gb/s supports jumbos.

My understanding is that newer network cards no longer require jumbo frames for max throughput at 1 Gb. In the early 2000s, the network cards then available could benefit from having jumbo frames turned on.
 
My understanding is that newer network cards no longer require jumbo frames for max throughput at 1 Gb. In the early 2000s, the network cards then available could benefit from having jumbo frames turned on.
Throughput with jumbo frames is about header overhead: with jumbo frames you send fewer headers, so you have less overhead. It's a bit more complex than that, but you get the idea. There are a couple of problems with implementing them. ALL interfaces along the path MUST support jumbo frames up to the largest frame that will be used, and you get zero benefit with small packets, like VoIP and many other applications that don't fill a 1500-byte packet. Supporting jumbo frames requires a lot of administrative overhead, hardware that supports them, and the right type of traffic. Most companies don't bother; a few do.
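The header-overhead arithmetic behind that point is easy to sketch. This counts only the IP, TCP, and Ethernet headers (ignoring preamble, FCS, and inter-frame gap for simplicity), so the real gap is slightly larger than shown:

```python
def payload_efficiency(mtu):
    """Fraction of on-wire bytes that are TCP payload for a given MTU."""
    payload = mtu - 20 - 20   # MTU minus 20-byte IP and 20-byte TCP headers
    frame = mtu + 14          # plus the 14-byte Ethernet header on the wire
    return payload / frame

print(f"{payload_efficiency(1500):.1%}")  # ~96.4% for standard 1500-byte frames
print(f"{payload_efficiency(9000):.1%}")  # ~99.4% for 9000-byte jumbo frames
```

A roughly 3-point efficiency gain is why jumbos mattered much more when NICs also paid a per-packet interrupt cost, and why they matter less now.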
 
I guess I'm outdated then.

I found this explanation here: https://blog.codinghorror.com/the-promise-and-peril-of-jumbo-frames/

Also, a number of years ago jumbo frames provided a much bigger boost; going from 1.5K to 9K regularly doubled performance or more. What has happened since is smarter Ethernet NICs: they routinely coalesce interrupts, steer packets from the same flow to the same CPU, and sometimes even reassemble the payloads of 1.5K frames back into larger units. The resistance to standardizing jumbo frames spurred innovation elsewhere to compensate.
 
The faster download is 1000 Mbps Ethernet through a USB-C dongle into a 2020 MacBook Pro. The Wi-Fi speed varies; in this case it's better than half the Ethernet speed, which is consistent.

I do a ton of builds (Docker) locally and notice the download speed difference between Ethernet and Wi-Fi. Thankfully I'm paid by the hour :)
 