I am currently using the following PL/SQL block to insert 50 million rows:
DECLARE
  cnt NUMBER;
BEGIN
  SELECT MAX(run_ver_issue_id) INTO cnt FROM "ABHINAV"."MV_RUN_VER_ISSUE";
  FOR x IN 1 .. 50000000 LOOP
    INSERT INTO "ABHINAV"."MV_RUN_VER_ISSUE" NOLOGGING
      ("RUN_VER_ISSUE_ID","CUST_ISSUE_ID","TITLE","ORIGINAL_ISSUE_ID","SOURCE_FILE_LINE","DISPLAY_TEXT","RUN_VER_ID","SANDBOX_SRC_FILE_ID","ACCOUNT_ID","ISSUE_STATE","CURRENT_SEVERITY","INSERT_TS","MODIFIED_TS","CWE_ID","TEMPLATE_FILE_ID","TEMPLATE_FILE_LINE","CURRENT_EXPLOIT_LEVEL","CNT","SOURCE_FUNCTION_PROTOTYPE","SOURCE_RELATIVE_LOCATION","SOURCE_SCOPE","PROCEDURE_NAME","PROCEDURE_HASH","PROTOTYPE_HASH","STATEMENT_HASH","STATEMENT_HASH_COUNT","STATEMENT_HASH_ORDINAL","SUBMODULE_PATH")
    VALUES (x + cnt, 2, ..);
  END LOOP;
END;
It takes almost 6 hours to finish. The table does have an index on RUN_VER_ISSUE_ID, but that doesn't help. I also tried FORALL, but the code below fails with PLS-00430: FORALL iteration variable X is not allowed in this context:
DECLARE
  cnt NUMBER;
BEGIN
  SELECT MAX(run_ver_issue_id) INTO cnt FROM "ABHINAV"."MV_RUN_VER_ISSUE";
  FORALL x IN 1 .. 5
    INSERT INTO "ABHINAV"."MV_RUN_VER_ISSUE" NOLOGGING
      ("RUN_VER_ISSUE_ID","CUST_ISSUE_ID","TITLE","ORIGINAL_ISSUE_ID","SOURCE_FILE_LINE","DISPLAY_TEXT","RUN_VER_ID","SANDBOX_SRC_FILE_ID","ACCOUNT_ID","ISSUE_STATE","CURRENT_SEVERITY","INSERT_TS","MODIFIED_TS","CWE_ID","TEMPLATE_FILE_ID","TEMPLATE_FILE_LINE","CURRENT_EXPLOIT_LEVEL","CNT","SOURCE_FUNCTION_PROTOTYPE","SOURCE_RELATIVE_LOCATION","SOURCE_SCOPE","PROCEDURE_NAME","PROCEDURE_HASH","PROTOTYPE_HASH","STATEMENT_HASH","STATEMENT_HASH_COUNT","STATEMENT_HASH_ORDINAL","SUBMODULE_PATH")
    VALUES (x + cnt, .....);
END;
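If I read the docs correctly, the FORALL index variable may only appear as a collection subscript inside the DML statement, which would explain the error: x + cnt uses x in an expression. A sketch of the shape FORALL apparently expects, driven by a collection (column lists elided as above; the 100000 chunk size is arbitrary):

DECLARE
  cnt NUMBER;
  TYPE t_ids IS TABLE OF NUMBER;
  ids t_ids := t_ids();
BEGIN
  SELECT MAX(run_ver_issue_id) INTO cnt FROM "ABHINAV"."MV_RUN_VER_ISSUE";
  -- Build the ids in a collection first; x below is then used only as a
  -- subscript of ids, which is the one place FORALL allows it.
  ids.EXTEND(100000);
  FOR i IN 1 .. ids.COUNT LOOP
    ids(i) := cnt + i;
  END LOOP;
  FORALL x IN 1 .. ids.COUNT
    INSERT INTO "ABHINAV"."MV_RUN_VER_ISSUE"
      ("RUN_VER_ISSUE_ID","CUST_ISSUE_ID" /* , ... remaining columns */)
    VALUES (ids(x), 2 /* , ... remaining values */);
  COMMIT;
END;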
How can I speed this up? Suggestions and comments are welcome.
You can avoid PL/SQL looping entirely: a single set-based INSERT ... SELECT over a row generator loads all 50 million rows in one statement. A minimal sketch, with placeholders where your code elides column values (the cross-joined generators and the NVL default are my assumptions):
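DECLARE
  cnt NUMBER;
BEGIN
  SELECT NVL(MAX(run_ver_issue_id), 0) INTO cnt
    FROM "ABHINAV"."MV_RUN_VER_ISSUE";

  -- /*+ APPEND */ requests a direct-path insert. Note: the NOLOGGING written
  -- after the table name in your INSERT is parsed as a table alias and is
  -- silently ignored; to skip redo generation, the table itself must be
  -- altered to NOLOGGING (and the database must allow it).
  INSERT /*+ APPEND */ INTO "ABHINAV"."MV_RUN_VER_ISSUE"
    ("RUN_VER_ISSUE_ID","CUST_ISSUE_ID" /* , ... remaining columns */)
  SELECT cnt + ROWNUM, 2 /* , ... remaining values */
    -- 50,000 x 1,000 = 50,000,000 rows; cross-joining two small generators
    -- avoids the memory cost of a single huge CONNECT BY.
    FROM (SELECT LEVEL n FROM dual CONNECT BY LEVEL <= 50000),
         (SELECT LEVEL n FROM dual CONNECT BY LEVEL <= 1000);

  COMMIT;
END;

Two notes on the sketch: the index on RUN_VER_ISSUE_ID is maintained once per inserted row, so it slows the load down rather than helping; dropping it before the load and rebuilding it afterwards is usually much faster at this scale. And after a direct-path insert the session must COMMIT before it can read the table again (ORA-12838).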