I created a MySQL procedure using a cursor, but it runs too slowly: it only inserts between 40 and 60 rows per second. See:
DELIMITER $$
CREATE PROCEDURE sp_create(IN v_idsorteio INT, OUT afetados INT)
BEGIN
    DECLARE done INT DEFAULT 0;
    DECLARE vc_idsocio INT;
    DECLARE z INT;
    DECLARE cur1 CURSOR FOR
        SELECT IdSocio
          FROM socios
         WHERE Sorteio = 1
           AND Finalizado = 'S'
           AND CodClientes IS NOT NULL;
    DECLARE CONTINUE HANDLER FOR SQLSTATE '02000' SET done = 1;
    SET z = 1;
    OPEN cur1;
    FETCH cur1 INTO vc_idsocio;
    WHILE done = 0 DO
        -- SELECT register as t;
        INSERT INTO socios_numeros_sorteio (IdSocio, IdSorteio, NumerodeSorteio)
            VALUES (vc_idsocio, v_idsorteio, z);
        FETCH cur1 INTO vc_idsocio;
        SET z = z + 1;
    END WHILE;
    CLOSE cur1;
    SELECT z - 1 AS total INTO afetados;
END$$
DELIMITER ;
How can I improve this?
Another simple solution is to change the engine of the table to MyISAM by running the query below, then CALL the procedure again. Note: MyISAM makes the insertion process very fast.
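Assuming the destination table is socios_numeros_sorteio (adjust the name to whichever table you are inserting into), the engine change would look roughly like this:

ALTER TABLE socios_numeros_sorteio ENGINE = MyISAM;

Keep in mind that MyISAM does not support transactions or foreign keys, so only do this if you can live without them.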
This is slow because you are looping through a result set, row by row, and performing an individual INSERT statement for each row returned.
Let's briefly summarize what you are doing. First, you are running a query:
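SELECT IdSocio
  FROM socios
 WHERE Sorteio = 1
   AND Finalizado = 'S'
   AND CodClientes IS NOT NULL;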
(Apparently the order these rows are returned in is not important.)
Then for each row returned from that query, you want to insert a row into another table.
The value for the first column (IdSocio) comes from the query, the value for the second column (IdSorteio) from the v_idsorteio argument passed to the procedure, and the value for the third column (NumerodeSorteio) from a counter that starts at 1 and is incremented by 1 for each row.
MySQL is optimized to perform an operation like this. But it's NOT optimized to do this using a stored procedure that loops through a cursor row by row.
If you are looking to get some reasonable performance, you need to SIGNIFICANTLY REDUCE the number of individual INSERT statements you run, and instead think in terms of processing data in "sets" rather than individual rows. One approach is to batch the rows up into "extended insert" statements, which insert multiple rows at a time. (The number of rows you can insert in one statement is effectively limited by max_allowed_packet.)
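For illustration, an extended insert carries several rows in a single statement (the values below are just placeholders):

INSERT INTO socios_numeros_sorteio (IdSocio, IdSorteio, NumerodeSorteio)
VALUES (101, 5, 1),
       (102, 5, 2),
       (103, 5, 3);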
That approach will significantly improve performance, but it doesn't avoid the overhead of the cursor, fetching each row into procedure variables.
Something like this (in the body of your procedure) is likely to perform much, much better, because it takes the result set from your select and inserts all of the rows into the destination table in one fell swoop, without bothering to mess with updating the values of variables in the procedure.
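One way to write it, assuming a user variable (@z here) as the running counter and ROW_COUNT() to populate the OUT parameter; on MySQL 8.0+ a ROW_NUMBER() window function would do the same job:

INSERT INTO socios_numeros_sorteio (IdSocio, IdSorteio, NumerodeSorteio)
SELECT s.IdSocio
     , v_idsorteio
     , @z := @z + 1
  FROM (SELECT @z := 0) i
 CROSS JOIN socios s
 WHERE s.Sorteio = 1
   AND s.Finalizado = 'S'
   AND s.CodClientes IS NOT NULL;

SET afetados = ROW_COUNT();

Since the order the rows get numbered in doesn't matter here, no ORDER BY is needed, and the cursor, the WHILE loop, and the z counter variable can all be dropped from the procedure.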