throughput and tx rate in marginal signal conditions
Don Magee
Fri Jun 6 05:28:03 PDT 2003
On Wed, 2003-06-04 at 03:26, Jean Tourrilhes wrote:
>
> <Shameless plug>
> You may want to read the paper I wrote titled "Fragment
> Adaptive Reduction", it covers some of this subject.
> </Shameless plug>
>
> Have fun...
>
> Jean
The paper was very interesting. It's a big subject and I seem to have
underestimated the complexity. But now I know why the signal quality is
worst at lunch time.
I've done a hack that at least partly resolves the problem. It tests
the number of good packets that were sent between errors before
reducing the tx rate. Fewer than 10 is the criterion used at present.
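
For anyone interested, the core of the hack boils down to something
like this. This is just a sketch, not the actual patch; the names
(good_since_error, GOOD_PKT_THRESHOLD, lower_tx_rate, rate_tx_status)
are made up for illustration:

    /* Minimal sketch of the good-packet-count heuristic. */

    #define GOOD_PKT_THRESHOLD 10   /* the "fewer than 10" criterion */

    static unsigned int good_since_error;

    extern void lower_tx_rate(void); /* the driver's rate-drop hook */

    /* Called from the TX completion path for every transmitted frame. */
    static void rate_tx_status(int tx_ok)
    {
            if (tx_ok) {
                    good_since_error++;
                    return;
            }

            /* On a TX error, only drop the rate if fewer than
             * GOOD_PKT_THRESHOLD frames succeeded since the last
             * error; an isolated loss after a long good run keeps
             * the current rate. */
            if (good_since_error < GOOD_PKT_THRESHOLD)
                    lower_tx_rate();

            good_since_error = 0;
    }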
However, the basic problem is that under the link conditions I have,
the higher rates get fewer errors. Dropping the rate can make things
worse. So I've locked the rates so that only 5.5 and 11 are used. Under
frequent error conditions the rate oscillates between the two, but the
throughput is around 300K bytes/sec.
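
The rate lock itself is trivial; something along these lines (again
just a sketch with made-up names, the real driver keeps its own rate
table):

    /* Sketch of locking the usable rates to 5.5 and 11 Mbit/s.
     * The table and index handling are made up; only the clamping
     * idea matches what my hack does. */
    static const int rate_table_kbps[] = { 1000, 2000, 5500, 11000 };

    #define MIN_RATE_IDX 2   /* 5.5 Mbit/s -- never fall below this */
    #define MAX_RATE_IDX 3   /* 11 Mbit/s */

    static int rate_idx = MAX_RATE_IDX;

    void lower_tx_rate(void)
    {
            if (rate_idx > MIN_RATE_IDX)
                    rate_idx--;      /* 11 -> 5.5, but never 2 or 1 */
    }

    void raise_tx_rate(void)
    {
            if (rate_idx < MAX_RATE_IDX)
                    rate_idx++;      /* 5.5 -> 11 */
    }

With the floor at 5.5 the two routines just oscillate between the two
rates, which is exactly the behaviour I see.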
If the rate is allowed to fall to 1 meg, then the error rate is high
and there are never enough consecutive good packets to trigger a try
at the higher rates. The rate just stays at 1 meg, which gives a
throughput of around 30K bytes/sec. I suspect that rate reduction
driven only by a test for consecutive errors would get stuck the same
way, although it would undoubtedly be better than dropping after a
single error.
Obviously my hack only improves things in this specific instance, and
a more sophisticated solution is needed for the general case. But I
hope this information is useful.
Thanks to all who have taken an interest in the problem.
Don Magee