buffered_sql problem
Ricardo Larrañaga
ricardo.larranaga at gmail.com
Mon Aug 3 17:40:09 CEST 2015
Hi Alan:
Actually, I was about to answer my own question.
So, it looks like the server keeps processing the file, even if a
packet's query failed.
Somehow, my server started falling behind on processing packets from the
detail file (I need to investigate why).
So what I did just now was increase the load parameter on buffered SQL,
and after a period of high CPU load, the server caught up, deleted the
file, and showed me sessions from today.
What version are you using? I don't really have anything configured to
nullify the duplicated records, so I am wondering now whether continuing
to process the file even when an insert fails is a new feature.
Thanks.
Regards
On Mon, Aug 3, 2015 at 12:29 PM, <A.L.M.Buxey at lboro.ac.uk> wrote:
> Hi,
>
> > that is already in the db, so those fail, but my question is, will the
> > server keep trying to insert that record and not try any others in the
> > detail.work file?
>
> yep. any nastiness and it gets stuck... and won't proceed. you need
> to nullify it... either stop the server and edit the .work file to remove
> the dodgy record, or configure the server to walk over it... probably
> something hacky like this
>
>
> sql {
>     invalid = 2
>     fail = 2
> }
> if (fail || noop || invalid) {
>     ok
> }
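>
> i.e. something along these lines in the accounting section of the
> virtual server that reads the detail file (a sketch only - I'm
> assuming the stock sites-available/buffered-sql layout here, so
> adjust names to match your config):
>
>     server buffered-sql {
>         accounting {
>             # treat a failed/invalid insert (e.g. duplicate key)
>             # as rcode 2 so it doesn't stop detail processing
>             sql {
>                 invalid = 2
>                 fail = 2
>             }
>             # swallow the error so the reader moves past the record
>             if (fail || noop || invalid) {
>                 ok
>             }
>         }
>     }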
>
> alan
> -
> List info/subscribe/unsubscribe? See
> http://www.freeradius.org/list/users.html