I just got the ePMP 4500 AP delivered today and I was not expecting something this large and heavy. It is not a complaint, but I honestly expected something around the size of the Mimosa A6 which is also an AX 8x8 90deg AP. This thing is easily 50% larger and 2x as heavy.
Mimosa says they have spent a lot of effort to make their antenna technology super small, and to build everything from the customer & installer standpoint. If their small antenna performs as they claim, that’s amazing.
Cambium also has a good reputation for extra shielding, extra isolation, better F/B, and limiting side lobes. So Cambium’s larger antenna may be well worth it too.
So if the Mimosa and Cambium are right around the same gain and beamwidth, then besides the radio itself, what necessitates the Cambium antenna being so much larger? Is Mimosa just that good at antenna design, or is it a Ubiquiti-style situation where the advertised antenna gain is generous and not really accurate?
The Cambium has higher gain, so that will result in a slightly larger antenna. It also supports a higher max TX power, which means additional power amplification circuitry, so slightly more internals. Also, the band that the 4500 operates in extends slightly lower than the A6's, which again will result in a slightly larger panel. Aside from those three things, I don't know what else would be causing the huge size difference. I've heard that Mimosa fudges their antenna specs, but I have no proof and have never used Mimosa, so I can't say for certain.
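For intuition on the gain and frequency points: gain ties directly to effective aperture via G = 4*pi*A_e / lambda^2, so higher gain or a lower band both force a bigger panel. A quick sketch with made-up numbers (not from either datasheet):

```python
import math

C = 3e8  # speed of light, m/s

def effective_aperture_cm2(gain_dbi: float, freq_ghz: float) -> float:
    """Effective aperture (cm^2) needed for a given gain at a given
    frequency, from G = 4*pi*A_e / lambda^2."""
    lam = C / (freq_ghz * 1e9)            # wavelength, m
    g_lin = 10 ** (gain_dbi / 10.0)       # dBi -> linear
    a_e = g_lin * lam ** 2 / (4 * math.pi)  # m^2
    return a_e * 1e4                      # -> cm^2

# Illustrative only: the same 18 dBi of gain at the bottom vs. the top
# of the 5 GHz band needs a noticeably larger aperture at the bottom.
low = effective_aperture_cm2(18.0, 4.9)
high = effective_aperture_cm2(18.0, 6.0)
print(f"A_e at 4.9 GHz: {low:.0f} cm^2, at 6.0 GHz: {high:.0f} cm^2")
```

The aperture scales with wavelength squared, so even a modestly lower band edge (plus a dB or two more gain) compounds into a visibly larger panel.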
It’s interesting that Cambium does include antenna patterns for their 4500 8x8 on the datasheet, but Mimosa does not. That being said, I am a little puzzled by the patterns I’m seeing for the 4500.
Maybe someone more educated in reading antenna patterns can correct me, but the coverage appears pretty tight: the 3 dB rolloff looks to be around 45 deg and the 6 dB rolloff around 55 deg, with overall coverage that looks best suited for around 70 deg?
If I am reading the patterns right, the 3 dB cutoff, depending on band (5.1, 5.5, or 5.8 GHz), is between 20-30 deg per side and the 6 dB cutoff is between 35-45 deg per side, giving the antenna somewhere between a 40-60 deg (3 dB) or 70-90 deg (6 dB) beamwidth.
Now looking at the datasheet, Cambium gives 8 channels of patterns, which I assume corresponds to the 8x8 nature of the system. Each pattern is slightly offset from 0 deg, with Ch 4 centered right on 0 deg.
What confuses me is that, for good MU-MIMO grouping, I would have assumed your 90 deg AP would be made up of 8 narrower antenna patterns with slight overlap, but this looks to have multiple large beamwidths that greatly overlap.
I would think that in non-MU-MIMO situations the AP would use the Ch 4 pattern, as that looks to be a traditional 90 deg pattern centered on 0 deg, and then change shape into multiple narrower beams when doing MU-MIMO operations, instead of the multiple 90 deg patterns that the datasheet seems to show.
The antennas used in a 90-degree beamwidth massive MIMO AP are 8 x 90-degree antennas housed under one radome. They are grouped 2x2, so 4 streams can be transmitted or received at the same time. With 8 antennas in one assembly, the interaction between the individual elements tends to soften the beamwidth edge cutoff - a pure vertical 90-degree antenna may have 20 dB less gain at 120 degrees, while a multiple-polarity, multiple-element antenna may have only 10 dB less gain at 120 degrees. Using this phenomenon, antenna manufacturers may build individual 65-degree antennas under the radome, and the influence of the multiple elements puts the actual 3 dB point at close to 90 degrees. Antenna science is very cool!
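A toy array-factor calculation shows the basic narrowing effect of combining elements. This is the textbook uniform-linear-array formula and deliberately ignores the mutual coupling described above, which is what softens the skirts on a real multi-element panel:

```python
import numpy as np

def array_factor_db(theta_deg, n_elem, spacing_wl):
    """Normalized broadside uniform-linear-array factor in dB.
    Textbook formula; mutual coupling between elements is ignored."""
    psi = 2 * np.pi * spacing_wl * np.sin(np.radians(theta_deg))
    num = np.sin(n_elem * psi / 2)
    den = n_elem * np.sin(psi / 2)
    # limit of sin(N*psi/2) / (N*sin(psi/2)) is 1 as psi -> 0
    af = np.where(np.abs(den) < 1e-12, 1.0,
                  num / np.where(np.abs(den) < 1e-12, 1.0, den))
    return 20 * np.log10(np.abs(af) + 1e-12)

theta = np.linspace(-90, 90, 3601)
for n in (1, 2, 4):
    g = array_factor_db(theta, n, 0.5)   # half-wavelength spacing
    main = theta[g >= -3.0]              # samples within 3 dB of peak
    print(f"{n} element(s): 3 dB width ~{main.max() - main.min():.1f} deg")
```

With half-wavelength spacing the 3 dB width drops from 180 deg for a single isotropic element to roughly 60 deg for 2 elements and 26 deg for 4; multiplying by a realistic element pattern and adding coupling then shifts the edges back out the way the post describes.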
Beamforming occurs by subtle phase changes in the streams. The gain occurs when the beam is 'aimed' at an SM; at the same time, the gain is reduced in the other directions the antenna covers. The width of the beam is related to the degree of phase change - the phase cannot be changed infinitely finely, only in steps, which is why the system seems to behave as though it has multiple 15-degree antennas. The RF path is assumed to be reciprocal: the proper setting for the downlink is determined by 'sounding', a particular frame sent to the SM, to which the SM responds with a signal quality indication of some sort. The stream phase required for the uplink is assumed to be the same as for the downlink, since sounding happens only in the downlink, not the uplink.
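The phase-steering idea can be sketched with a toy array. The numbers here (8 elements, half-wavelength spacing, the optional phase quantization) are illustrative only, not the 4500's actual design:

```python
import numpy as np

def steered_gain_db(look_deg, steer_deg, n_elem=8, spacing_wl=0.5,
                    phase_step_deg=None):
    """Gain (dB, relative to the full coherent array) toward `look_deg`
    when per-element phases are set to steer the beam toward `steer_deg`.
    If phase_step_deg is given, phases are rounded to that step,
    mimicking quantized phase shifters."""
    k = 2 * np.pi * spacing_wl                      # rad per element spacing
    elems = np.arange(n_elem)
    phases = -k * elems * np.sin(np.radians(steer_deg))
    if phase_step_deg is not None:
        step = np.radians(phase_step_deg)
        phases = np.round(phases / step) * step     # stepped phase shifter
    af = np.sum(np.exp(1j * (k * elems * np.sin(np.radians(look_deg))
                             + phases)))
    return float(20 * np.log10(np.abs(af) / n_elem))

print(steered_gain_db(20, 0))    # un-steered array, looking 20 deg off axis
print(steered_gain_db(20, 20))   # steered at the SM: 0 dB (full array gain)
```

Coarsening phase_step_deg costs a little gain and, more importantly, limits the beam to a discrete set of pointing angles - the 'multiple 15-degree antennas' behavior described above.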
Back in the LTE and WiMAX days, beamforming needed 4 antennas to change the phase.
The only way to test beamforming gain is to run a link budget and estimate what RSSI and SNR you should get at a particular SM site, then look at the actual performance you are achieving. (If you had the ability to turn beamforming off, test, then turn it back on and test again, that would work too.) The difference in SNR (calculated vs. actual, or non-beamformed vs. beamformed) is the effective beamforming gain. The real difference shows up in the modulation level: if you achieve higher MCS values with beamforming than the MCS you estimate you would get at that location without it, you see the real effect of beamforming - the goals are higher data throughput and better isolation from undesired signals.
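A minimal version of that link-budget check, using free-space path loss. Every number below (TX power, antenna gains, noise floor, and the 'measured' SNR) is a made-up placeholder:

```python
import math

def fspl_db(freq_ghz: float, dist_km: float) -> float:
    """Free-space path loss in dB."""
    return 20 * math.log10(dist_km) + 20 * math.log10(freq_ghz) + 92.45

def expected_snr_db(tx_dbm, tx_gain_dbi, rx_gain_dbi, freq_ghz, dist_km,
                    noise_floor_dbm=-96.0):
    """Expected SNR at the SM from a simple free-space link budget.
    noise_floor_dbm is an assumed value (kTB + noise figure, ~20 MHz)."""
    rssi = tx_dbm + tx_gain_dbi - fspl_db(freq_ghz, dist_km) + rx_gain_dbi
    return rssi - noise_floor_dbm

# Placeholder link: 27 dBm TX, 18 dBi AP / 20 dBi SM, 5.6 GHz, 5 km.
predicted = expected_snr_db(tx_dbm=27, tx_gain_dbi=18, rx_gain_dbi=20,
                            freq_ghz=5.6, dist_km=5.0)
measured = 45.0  # hypothetical field reading with beamforming active
print(f"predicted {predicted:.1f} dB SNR, "
      f"effective BF gain ~{measured - predicted:.1f} dB")
```

The gap between the measured SNR and the non-beamformed prediction is the effective beamforming gain at that SM, exactly the calculated-vs-actual comparison described above.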
Forgive me if my explanation is not technical enough. The future of broadband is beamforming, plus interference cancellation or mitigation on everything. Higher antenna diversity means higher orders of MU-MIMO. Eventually, my prediction is that both APs and SMs will independently beamform in both DL and UL. Who knows what happens after that? I'm 64 years old this year, and hopefully I can ride out my career in wireless with today's technology!