Has anyone had problems with Ethernet packets being dropped when sending over Canopy? We have a requirement to link two MPEG4 encoder/decoders via Canopy, and I have been assured the bandwidth required is suitable, yet we get very jittery video output. With a straight crossover cable, everything works as expected.
The encoder/decoder modules are not able to set the bit required to utilise the high priority channel.
The encoder/decoder have a nominal packet size of 1024 bytes, which should be fine for Canopy.
Does anyone know of a way to override the high priority bit?
Thanks for your help.
I have done a similar thing before, and the requirements were more stringent.
You need to combine the two IP signals from the Codecs first before putting them into the Canopy. This ensures there is no data collision at the input side of the Canopy.
In the past, I have put three MPEG2 video streams in multicast mode into one pair of Canopy 20M BH units without any data loss over a long period (about 10 hours continuously, even with the GOP of the Codec set to 1 I-frame per 30 frames). There is no need to use the high priority channel if no other applications are being used alongside the video streaming.
I have also tried putting MPEG4 into Canopy, and the performance is even better as the bandwidth requirement is lower. I do not have enough Codecs to test the limit, but I believe a single pair of Canopy units can carry up to 5 or 6 MPEG4 video streams simultaneously.
Hope you can solve your problem.
Thanks for the reply Wirelessman,
The MPEG4 encoder is set to stream the data at 4096Kbps across Canopy.
When checking network utilisation, we get a very consistent 3300Kbps.
With a crossover cable connecting the encoder to the PC, the utilisation jumps to around 4000Kbps. These figures are confirmed by the video monitoring software.
The Canopy link test shows download rates of 4.7Mbps average and peaks of 4.9Mbps, so my conclusion was that bandwidth is not the issue, but that the one-way latency of around 16ms (pings to the AP were 32ms round trip) is.
The video is around 800x600, 16-bit colour @ 25 frames/sec, and we can’t slow it down due to the customer’s requirements.
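For a sense of how much work the encoder is doing, here is a rough back-of-the-envelope calculation using the figures above (the raw rate is just uncompressed pixel data; it ignores blanking, audio and container overhead):

```python
# Raw bitrate of the source video vs. the encoder's 4096 kbps stream.
width, height = 800, 600        # pixels
bits_per_pixel = 16
fps = 25

raw_bps = width * height * bits_per_pixel * fps
stream_bps = 4096 * 1000        # encoder data rate, 4096 kbps

print(f"raw video rate: {raw_bps / 1e6:.0f} Mbps")        # 192 Mbps
print(f"encoded rate:   {stream_bps / 1e6:.3f} Mbps")     # 4.096 Mbps
print(f"compression:    {raw_bps / stream_bps:.0f}:1")    # ~47:1
```

So the encoder is already compressing roughly 47:1, which is why slowing the stream down further costs visible quality.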
Does this sound reasonable or have I missed the point?
Thanks again for your help
What type of Canopy are you using?
I used 20M BH units. I needed to transmit two live video streams from the Operating Theatre at the hospital to the Conference Centre about 2 km away, and one live video stream back from the Conference Centre to the Operating Theatre. The bandwidth required for each video stream from the O.T. is 4Mbps at MPEG2 in multicast mode, and the channel back from the Conference Centre is 3Mbps (also in multicast mode). The total bandwidth is about 11Mbps. So I used the 20M BH units with the downlink rate at about 75% and the uplink rate at 25%. The split depends on which unit is the Master. In my setup the bandwidth may vary, so I needed to allocate about 20% extra. The overall performance is excellent, with no packet loss encountered for more than 10 hours.
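The bandwidth budget described in that setup can be sketched as follows (all figures taken from the post; the 20% headroom is the poster's rule of thumb for rate variation):

```python
# Link budget for the 20M BH setup (all values in Mbps).
bh_capacity = 20.0
downlink_share, uplink_share = 0.75, 0.25   # Master-side split

downlink_needed = 2 * 4.0    # two O.T. -> Conference Centre streams
uplink_needed = 3.0          # one return stream
headroom = 1.20              # ~20% extra for bandwidth variation

down_cap = bh_capacity * downlink_share     # 15 Mbps
up_cap = bh_capacity * uplink_share         #  5 Mbps

# Both directions fit even with headroom applied.
assert downlink_needed * headroom <= down_cap
assert uplink_needed * headroom <= up_cap
print(f"downlink: {downlink_needed * headroom:.1f} of {down_cap:.0f} Mbps")
print(f"uplink:   {uplink_needed * headroom:.1f} of {up_cap:.0f} Mbps")
```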
So, please provide more details on your setup including the configuration of your Codec and other equipment if any.
For the purpose of the test, we are using a 5.7 AP and SM.
This is the layout.
Analogue Camera ==> MPEG4 Encoder ==> Canopy AP - Radio Link - Canopy SM ==> MPEG4 Decoder ==> Analogue Video System
Even with a crossover cable between the Encoder & Decoder, the system is very susceptible to collisions etc. They had a Netgear switch which was confusing the link negotiation on the MPEG4 modules and causing many collisions on the switch.
The encoder is set for 4SIF at a data rate of 4096Kbps. It transmits each frame as it is produced rather than storing frames and sending them as a group. I don’t know much more about the units, as we have just been asked to link the modules.
In your configuration, I assume there is no other equipment between the Codec and the Canopy. If that is true, you can do this:
1. Check the data rate percentage as you may need 90% from Encoder to Decoder side.
2. Check the link efficiency, as the overall data rate may drop below 4Mbps if the efficiency falls below 70%. To be on the safe side, it should be maintained at 80% or above.
3. Change the picture management of your Codec to reduce the bandwidth but maintain the picture quality.
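Point 2 above is worth checking against the link-test figure from earlier in the thread. A quick sketch (the efficiency percentages are the poster's rules of thumb, not measured values):

```python
# What the 4.7 Mbps link-test rate yields at various link efficiencies,
# compared against the 4.096 Mbps stream the encoder produces.
link_test_mbps = 4.7
stream_mbps = 4.096

for efficiency in (0.70, 0.80, 0.90):
    usable = link_test_mbps * efficiency
    verdict = "OK" if usable >= stream_mbps else "too low"
    print(f"{efficiency:.0%}: {usable:.2f} Mbps usable -> {verdict}")
```

Notably, even at 80% efficiency the usable rate falls just short of the 4.096Mbps stream, which is consistent with how tight this link turns out to be later in the thread.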
Normally, for live MPEG video, buffering is not allowed, in order to minimise latency. For this reason, the video stream suffers more from packet loss and data collisions. Therefore, it is necessary to understand all the equipment employed in your configuration, including the radio equipment, the Codec, and any other network equipment.
By the way, what type of Codec are you using? I have used Indigo and Victor MPEG4 Codecs and a Dallmeier MPEG2 Codec. All of them can perform satisfactorily with Canopy (our clients required it in the O.T. of a hospital, transmitting real-time medical images and video to other medical consultants for tele-surgery). Some units may require minor modification work, and the performance can reach 4CIF at 25 frames/second with Dolby surround sound, without any interruption or disturbance.
If latency is an issue, and in your case it is, you have to use a BH, which does NOT poll the remote(s) and has a constant <3ms latency.
The new Advantage APs also bring the latency down to around 5-7ms now. You might want to look into that.
Purely for the purpose of being thorough, we have done further diagnostics on this issue.
We used Ethereal to packet-sniff between the Encoder and the Canopy AP. We discovered that the packet size coming from the encoder was 1046 bytes, and it is well documented in the Canopy training manual that Canopy is most efficient with packets that are 1518 bytes in size. This indicates to me that unused space in the Canopy data slot is reducing our EFFECTIVE throughput to about 2/3 of the maximum throughput displayed by the Link Test. Evidence for this comes from the video software, which shows the data rate being received is around 3.2Mbps, almost exactly 2/3 of the 4.7Mbps the Link Test shows.
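The slot-fill arithmetic behind that conclusion works out like this (a sketch of the reasoning in the post, not a statement of Canopy internals):

```python
# If Canopy slots are sized for 1518-byte frames but our packets are
# only 1046 bytes, the fill ratio caps the effective throughput.
slot_bytes = 1518        # frame size Canopy is most efficient with
packet_bytes = 1046      # size seen in the Ethereal capture
link_test_mbps = 4.7     # Link Test download rate

fill_ratio = packet_bytes / slot_bytes    # ~0.69, close to 2/3
effective = link_test_mbps * fill_ratio

print(f"slot fill ratio:      {fill_ratio:.2f}")
print(f"effective throughput: {effective:.1f} Mbps")   # ~3.2 Mbps
```

The ~3.2Mbps result matches the figure reported by the video software, which supports the unused-slot-space theory.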
We switched to UDP, which does not use acknowledgements (the packet analysis confirmed this), and so rules out latency in the system as the cause of the problem. This is my understanding at the moment, so it could be incorrect! (Any comments?)
I have made the system work by reducing the stream data rate, so if I were to use a higher-bandwidth product (like a 20Mbps BH unit), would this allow us to use the required data rate from the encoder? (Any comments?)
If you reduce the data rate of your encoder, it will definitely improve the output video stream, but the quality will be degraded. With regard to the packet size you mentioned, you could try increasing it at the encoder side to a multiple of 1518 bytes (or near that value) and check the performance. However, in my experience, it does not help a lot. I had set the packet size of my encoder to 4096 and could use up to 90% of the Canopy throughput. More important is to avoid data collision and packet loss in MPEG video streaming, because even a single bit error can corrupt the output picture until the next reference (I) frame is received. Normally, to get better picture quality at limited bandwidth, the number of reference (I) frames is reduced as much as possible. In my applications, I only transmit one I-frame in every 30 frames sent.
Once we got hold of a VERY Nice (although expensive) layer 3 switch, some interesting things became apparent.
The encoder was set to a data rate of 4096kbps, but this is the effective data rate and does NOT include overheads! The actual data rate was 6Mbps!
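That gap is worth quantifying. The thread doesn't break down where the extra traffic comes from (framing, transport headers, FEC, etc.), but the observed overhead ratio is easy to compute:

```python
# The gap the layer-3 switch revealed: 4096 kbps is the encoder's
# payload rate, but the measured wire rate was ~6 Mbps.
payload_mbps = 4.096
wire_mbps = 6.0

overhead = wire_mbps / payload_mbps - 1
print(f"overhead on top of payload: {overhead:.0%}")   # ~46%
```

At ~46% overhead, the real demand on the link was 6Mbps against a measured Link Test capacity of 4.7Mbps, so the standard AP/SM link never stood a chance.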
No wonder a standard AP/SM link couldn’t do it.
A 10Mbps BH set to 85% gave us a sustainable link with sufficient bandwidth for full-motion video at 25 fps, with the same quality as watching TV.
Thank you for your help on this issue.
Which switch did you use? How much?
The switch was a D-Link DGS-3324SR and was around $4000 AUD.