Duplicate accounting log entries
dskinner at bluefrog.com
Wed Apr 4 10:55:37 CEST 2007
> I am getting duplicate updates for that user from the NAS, where
> everything is identical including the input and output octets, which
> leads me to believe that the traffic is being combined and I actually
> only need 1 of the records.
> If I then make my unique_id column unique I will prevent this duplication.
I can't comment on DSL, but just as some general knowledge...
RADIUS runs over UDP, so if the reply packets from your server are lost,
the NAS will retransmit the accounting request and you will end up with
two copies. The same thing happens if your RADIUS server responds too
slowly (perhaps because of slow SQL inserts) and the NAS times out
before the reply arrives.
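A UNIQUE constraint on the accounting unique-id column, as you suggest, makes those retransmits harmless. Here is a minimal sketch using Python's sqlite3 in place of MySQL; the table and column names are assumptions for illustration, not the stock FreeRADIUS schema (on MySQL you would use INSERT IGNORE or INSERT ... ON DUPLICATE KEY UPDATE instead of SQLite's INSERT OR IGNORE):

```python
import sqlite3

# Assumed, simplified radacct table with a UNIQUE index on acctuniqueid,
# so a retransmitted accounting packet cannot create a second row.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE radacct (
        radacctid        INTEGER PRIMARY KEY,
        acctuniqueid     TEXT UNIQUE,
        username         TEXT,
        acctinputoctets  INTEGER,
        acctoutputoctets INTEGER
    )
""")

row = ("3f6a9c0d", "someuser", 1024, 4096)

# Simulate the NAS sending the same accounting record twice: the second
# INSERT OR IGNORE hits the UNIQUE index and is silently dropped.
for _ in range(2):
    conn.execute(
        "INSERT OR IGNORE INTO radacct "
        "(acctuniqueid, username, acctinputoctets, acctoutputoctets) "
        "VALUES (?, ?, ?, ?)", row)

count = conn.execute("SELECT COUNT(*) FROM radacct").fetchone()[0]
print(count)  # only one row survives the duplicate insert
```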
You should first try to optimize your database tables for performance
(with MySQL, you probably want the InnoDB storage engine, at least for
the radacct table). You will also want to archive that table on a
regular basis: on our system, I found a significant slowdown on inserts
once the table grew past about 5 million records.
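One common way to archive is to move closed sessions older than some cutoff into a separate table inside a single transaction, so the live table stays small. A hedged sketch, again with sqlite3 and assumed table/column names rather than the real FreeRADIUS schema; the 90-day cutoff is an arbitrary example:

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE radacct         (acctuniqueid TEXT UNIQUE, acctstoptime INTEGER);
    CREATE TABLE radacct_archive (acctuniqueid TEXT UNIQUE, acctstoptime INTEGER);
""")

now = int(time.time())
cutoff = now - 90 * 86400  # keep 90 days of records live (example value)
conn.executemany("INSERT INTO radacct VALUES (?, ?)",
                 [("old-session", cutoff - 1000),   # should be archived
                  ("new-session", now)])            # should stay live

# Copy then delete inside one transaction, so a crash mid-archive
# cannot lose rows; sqlite3's context manager commits on success.
with conn:
    conn.execute("INSERT INTO radacct_archive "
                 "SELECT * FROM radacct "
                 "WHERE acctstoptime IS NOT NULL AND acctstoptime < ?",
                 (cutoff,))
    conn.execute("DELETE FROM radacct "
                 "WHERE acctstoptime IS NOT NULL AND acctstoptime < ?",
                 (cutoff,))

live = conn.execute("SELECT COUNT(*) FROM radacct").fetchone()[0]
kept = conn.execute("SELECT COUNT(*) FROM radacct_archive").fetchone()[0]
print(live, kept)  # one live row, one archived row
```

On MySQL you would run the equivalent INSERT ... SELECT / DELETE pair from a cron job, or simply rotate the table with RENAME TABLE.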
If you are still getting a lot of duplicates, then you may want to work
with the people who own the NASes to adjust the retransmit timeouts.
They may have set them too low for some reason.