sql - MySQL: bulk updating in a table -
I'm using MySQL 5.6 and have an issue: I'm trying to improve my bulk-update strategy for this case.
I have a table called reserved_ids, provided by an external company, which assigns unique IDs to invoices. There is no other way to do this; I can't use auto_increment fields or simulated sequences.
I have this PL pseudocode to make the assignment:
START TRANSACTION;
OPEN invoice_cursor;
read_loop: LOOP
    FETCH invoice_cursor INTO internalId;
    IF done THEN
        LEAVE read_loop;
    END IF;
    SELECT MIN(secuencial) INTO v_secuencial
    FROM reserved_ids
    WHERE country_code = p_country_id AND invoice_type = p_invoice_type;
    DELETE FROM reserved_ids WHERE secuencial = v_secuencial;
    UPDATE my_invoice SET reserved_id = v_secuencial WHERE invoice_id = internalId;
END LOOP read_loop;
CLOSE invoice_cursor;
COMMIT;
So it takes one ID, removes it, assigns it; takes the next, removes it, assigns it... and so on.
This works, but it's very slow. I don't know if there is an approach that makes the assignment faster. I'm looking for an UPDATE (or INSERT ... SELECT) statement that assigns 1000 or 2000 IDs directly, not one by one.
Any suggestion would be very helpful to me. Thanks a lot.
Edit 1: I have added the WHERE clause details, because user @vmachan requested them. In the UPDATE ... my_invoice statement I don't filter on any other criteria, because I have the direct, indexed invoice ID I want to update. Thanks.
Finally, I have a solution. It's much faster than the initial approach.
The update query is:
SET @a = 0;
SET @b = 0;
UPDATE my_invoice
INNER JOIN (
    SELECT f.internal_id,
           i.secuencial AS reserved_id,
           CONCAT_WS(/* format final invoice id */) AS final_my_invoice_number
    FROM (
        SELECT IF(@a, @a := @a + 1, @a := 1) AS current_row, internal_id
        FROM my_invoice
        WHERE reserved_id IS NULL
        ORDER BY internal_id ASC
        LIMIT 2000
    ) f
    INNER JOIN (
        SELECT IF(@b, @b := @b + 1, @b := 1) AS current_row, secuencial
        FROM reserved_ids
        ORDER BY secuencial ASC
        LIMIT 2000
    ) i USING (current_row)
) temp ON my_invoice.internal_id = temp.internal_id
SET my_invoice.reserved_id = temp.reserved_id,
    my_invoice.final_my_invoice_number = temp.final_my_invoice_number;
So, with the autogenerated, correlated sequential numbers @a and @b, I can join the two different, unrelated tables my_invoice and reserved_ids.
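As a side note, on MySQL 8.0+ the same row pairing can be expressed with ROW_NUMBER() window functions instead of user variables; this is only a sketch under that assumption, reusing the table and column names above and omitting the final-number formatting:

```sql
-- Sketch only, assuming MySQL 8.0+: pair the first 2000 unassigned invoices
-- with the first 2000 reserved IDs by row number, then update in one statement.
UPDATE my_invoice
INNER JOIN (
    SELECT f.internal_id, i.secuencial AS reserved_id
    FROM (
        SELECT internal_id,
               ROW_NUMBER() OVER (ORDER BY internal_id) AS current_row
        FROM my_invoice
        WHERE reserved_id IS NULL
        LIMIT 2000
    ) f
    INNER JOIN (
        SELECT secuencial,
               ROW_NUMBER() OVER (ORDER BY secuencial) AS current_row
        FROM reserved_ids
        LIMIT 2000
    ) i USING (current_row)
) temp ON my_invoice.internal_id = temp.internal_id
SET my_invoice.reserved_id = temp.reserved_id;
```

On 5.6 the user-variable version above is the way to go, since window functions are not available there.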
If you want to check this solution, please execute the tricky update following these steps:
- Execute SET @a = 0; and the first inner SELECT in an isolated way:
select if(@a, @a:=@a+1, ...
- Execute SET @b = 0; and the second inner SELECT in an isolated way:
select if(@b, @b:=@b+1, ...
- Execute SET @a = 0;, SET @b = 0; and the big SELECT that builds the temp auxiliary table:
select f.invoice_id, ...
- Execute the UPDATE.
- Finally, remove the assigned IDs from the reserved_ids table.
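That last step isn't spelled out above; a minimal sketch, assuming every secuencial value now stored in my_invoice.reserved_id should be removed from reserved_ids, using MySQL's multi-table DELETE syntax:

```sql
-- Sketch: drop the reserved IDs that were just consumed by the UPDATE.
DELETE reserved_ids
FROM reserved_ids
INNER JOIN my_invoice
    ON my_invoice.reserved_id = reserved_ids.secuencial;
```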
The assignment time was reduced drastically. The initial solution was one by one; with this, I assign 2000 (or more) IDs in one single (OK, and a little tricky) UPDATE.
Hope this helps.