Migrate from Oracle to MySQL

Posted 2019-01-03 03:01

We ran into serious performance problems with our Oracle database, and we would like to try migrating to a MySQL-based database (either MySQL itself or, preferably, Infobright).

The thing is, we need to let the old and the new system overlap for at least a few weeks, if not months, before we actually know whether all the features of the new database match our needs.

So, here is our situation:

The Oracle database consists of multiple tables, each with millions of rows. During the day, there are literally thousands of statements, which we cannot stop for the migration.

Every morning, new data is imported into the Oracle database, replacing several thousand rows. Duplicating this process is not a problem, so in theory we could import into both databases in parallel.

But, and here lies the challenge: for this to work, we need an export from the Oracle database with a consistent state from a single day. (We cannot export some tables on Monday and others on Tuesday, etc.) This means that the export alone must finish in less than one day.

Our first thought was to dump the schema, but I wasn't able to find a tool that imports an Oracle dump file into MySQL. Exporting the tables to CSV files might work, but I'm afraid it could take too long.
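
For reference, the kind of export I have in mind would pin every table to a single SCN via a flashback query, so the whole dump is consistent even if it runs for hours. A minimal sketch (assuming the python-oracledb driver, placeholder table names and credentials, and enough undo retention to cover the export window):

    import csv
    import oracledb  # assumption: the python-oracledb (ex-cx_Oracle) driver

    TABLES = ["ORDERS", "CUSTOMERS", "LINE_ITEMS"]  # placeholder names

    conn = oracledb.connect(user="app", password="secret", dsn="dbhost/ORCL")
    cur = conn.cursor()

    # Pin one system change number; every query below then sees the database
    # exactly as it was at that instant (needs EXECUTE on DBMS_FLASHBACK).
    scn = cur.execute(
        "SELECT dbms_flashback.get_system_change_number FROM dual"
    ).fetchone()[0]

    for table in TABLES:
        cur.execute(f"SELECT * FROM {table} AS OF SCN :scn", scn=scn)
        with open(f"{table}.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow([d[0] for d in cur.description])  # header row
            while rows := cur.fetchmany(10_000):              # stream in batches
                writer.writerows(rows)

Whether something like that finishes inside a day on our data volumes is exactly what I don't know.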

So my question now is:

What should I do? Is there any tool to import Oracle dump files into MySQL? Does anybody have any experience with such a large-scale migration?

PS: Please don't suggest performance optimization techniques for Oracle; we already tried a lot :-)

Edit: We already tried some ETL tools, only to find out that they were not fast enough: exporting just one table already took more than 4 hours ...

2nd Edit: Come on, folks ... has nobody ever tried to export a whole database as fast as possible and convert the data so that it can be imported into another database system?

8 Answers
爷、活的狠高调 · 2019-01-03 03:35

I've used Pentaho Data Integration to migrate from Oracle to MySQL (I also migrated the same data to PostgreSQL, which was about 50% quicker; I guess that was largely due to the different JDBC drivers being used). I followed Roland Bouman's instructions here, almost to the letter, and was very pleasantly surprised at how easy it was:

Copy Table data from one DB to another

I don't know whether it will be appropriate for your data load, but it's worth a shot.

兄弟一词,经得起流年. · 2019-01-03 03:37

I have built a C# application that can read an Oracle dump (.dmp) file and pump its tables of data into a SQL Server database.

This application is used nightly, on a production basis, to migrate a PeopleSoft database to SQL Server. The PeopleSoft database has 1,100+ tables, and the Oracle dump file is greater than 4.5 GB in size.

This application creates the SQL Server database and tables and then loads all 4.5 GB of data in less than 55 minutes, running on a dual-core Intel server.

I don't believe it would be too difficult to modify this application to work with other databases, provided they have an ADO.NET provider.

Ridiculous、 · 2019-01-03 03:43

We had the same issue: we needed to get the tables and data from an Oracle DBMS into a MySQL DBMS.

We used a tool we found online, and it worked well:

http://www.sqlines.com/download

This tool basically helps you:

  1. Connect to your source DBMS (Oracle)
  2. Connect to your destination DBMS (MySQL)
  3. Specify the schema and tables in the Oracle DBMS that you want to migrate
  4. Press a "Transfer" button to run the migration process (which runs the tool's built-in migration queries)
  5. Get a transfer log that tells you how many records were read from the source and written to the destination database, and which queries failed.

Hope this helps others who land on this question.

Melony? · 2019-01-03 03:46

Oracle does not supply an out-of-the-box unload utility.

Keep in mind that without comprehensive information about your environment (Oracle version? Server platform? How much data? What datatypes?), everything here is YMMV, and you will want to try it on your own system for performance and timing.

My points 1-3 are just generic data movement ideas. Point 4 is a method that will reduce downtime or interruption to minutes or seconds.

1) There are 3rd party utilities available. I have used a few of these, but it's best for you to check them out yourself for your intended purpose. A few 3rd party products are listed here: OraFaq. Unfortunately, a lot of them run on Windows, which would slow down the data unload process unless your DB server is on Windows and you can run the load utility directly on the server.

2) If you don't have any complex datatypes like LOBs, then you can roll your own with SQL*Plus. If you do it one table at a time, you can easily parallelize it. The topic has been visited on this site probably more than once; here is an example: Linky
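
A rough sketch of that one-table-per-worker idea, using Python instead of SQL*Plus (python-oracledb assumed; table names and credentials are placeholders):

    import csv
    from multiprocessing import Pool

    import oracledb  # assumption: python-oracledb driver

    def export_table(table):
        # Each worker gets its own connection and spools one table to CSV.
        conn = oracledb.connect(user="app", password="secret", dsn="dbhost/ORCL")
        cur = conn.cursor()
        cur.execute(f"SELECT * FROM {table}")
        with open(f"{table}.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow([d[0] for d in cur.description])  # header row
            while rows := cur.fetchmany(50_000):
                writer.writerows(rows)
        conn.close()
        return table

    if __name__ == "__main__":
        tables = ["ORDERS", "CUSTOMERS", "LINE_ITEMS", "EVENTS"]  # placeholders
        with Pool(processes=4) as pool:  # degree of parallelism to taste
            for done in pool.imap_unordered(export_table, tables):
                print("exported", done)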

3) If you are on 10g+, then external tables might be a performant way to accomplish this task. If you create some blank external tables with the same structure as your current tables and copy the data into them, the data will be converted into the external table driver's file format. Once again, OraFAQ to the rescue.
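
The unload side of point 3 can be a single CTAS statement. A hedged sketch (assumes 10g+, an existing DUMP_DIR directory object, and placeholder names; note that the ORACLE_DATAPUMP driver writes its own binary format, not plain text):

    import oracledb  # assumption: python-oracledb driver

    conn = oracledb.connect(user="app", password="secret", dsn="dbhost/ORCL")
    cur = conn.cursor()

    # CREATE TABLE ... ORGANIZATION EXTERNAL ... AS SELECT unloads the rows
    # into a file in the directory DUMP_DIR points to (DDL auto-commits).
    cur.execute("""
        CREATE TABLE orders_unload
        ORGANIZATION EXTERNAL (
            TYPE ORACLE_DATAPUMP
            DEFAULT DIRECTORY dump_dir
            LOCATION ('orders_unload.dmp')
        )
        AS SELECT * FROM orders
    """)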

4) If you must keep the systems in parallel for days/weeks/months, then use a change data capture/apply tool for near-zero downtime. Be prepared to pay $$$. I have used Golden Gate Software's tool, which can mine the Oracle redo logs and supply insert/update statements to a MySQL database. You can migrate the bulk of the data with no downtime the week before go-live. Then, during your go-live period, shut down the source database, have Golden Gate catch up the last remaining transactions, and then open up access to your new target database. I have used this for upgrades, and the catch-up period was only a few minutes. We already had a site license for Golden Gate, so it wasn't anything out of pocket for us.

And I'll play the role of Cranky DBA here and say that if you can't get Oracle performing well, I would love to see a write-up of how MySQL fixed your particular issues. If you have an application where you can't touch the SQL, there are still lots of possible ways to tune Oracle. /soapbox

我欲成王,谁敢阻挡 · 2019-01-03 03:50

Yeah, Oracle is pretty slow. :)

You can use any number of ETL tools to move data from Oracle into MySQL. My favourite is SQL Server Integration Services.

If you have Oracle 9i or higher, you can implement Change Data Capture. Read more here: http://download-east.oracle.com/docs/cd/B14117_01/server.101/b10736/cdc.htm

Then you can take a delta of changes from Oracle to your MySQL or Infobright instance using any ETL technology.
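
This isn't Oracle's CDC packages, just a hypothetical illustration of the delta step: pull rows changed since the last sync (assuming the source table carries an indexed UPDATED_AT column) and upsert them into MySQL. Libraries, credentials, and column names are all placeholders:

    import mysql.connector  # assumption: mysql-connector-python
    import oracledb         # assumption: python-oracledb

    last_sync = "2019-01-02 06:00:00"  # would normally be persisted between runs

    ora = oracledb.connect(user="app", password="secret", dsn="orahost/ORCL")
    my = mysql.connector.connect(user="app", password="secret",
                                 host="mysqlhost", database="warehouse")

    src = ora.cursor()
    src.execute(
        "SELECT id, status, amount FROM orders "
        "WHERE updated_at > TO_TIMESTAMP(:ts, 'YYYY-MM-DD HH24:MI:SS')",
        ts=last_sync)

    dst = my.cursor()
    while rows := src.fetchmany(1000):
        # REPLACE INTO makes the delta apply idempotent on the MySQL side.
        dst.executemany(
            "REPLACE INTO orders (id, status, amount) VALUES (%s, %s, %s)",
            rows)
        my.commit()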

smile是对你的礼貌 · 2019-01-03 03:52

I am used to transferring large amounts of data between different databases, anywhere from 10 to 250 million records. For example, when I used Pentaho, Talend, Java, and Ruby to transfer 30 million records, my transfers always took over 5 hours. When I tried Perl, the transfer time was dramatically reduced, to 20 minutes.

The reason behind Perl's exceptional performance for data transfer might be that Perl is not an object-oriented programming language and treats all variables as strings. Perl does not have to do any type conversion or type checking, or create objects for each batch of records. Perl just queries, say, 1,000 records as strings, moves the data as strings along the wire, and leaves the conversion to the appropriate data types to the destination database server, via a SQL statement that has 1,000 insert statements within it.
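
A rough Python translation of that string-batching idea (python-oracledb and mysql-connector-python assumed; table and column names are made up). As I understand it, mysql-connector rewrites executemany() on a plain INSERT into a single multi-row statement, which is essentially the 1,000-inserts-in-one-statement trick described above:

    import mysql.connector  # assumption: mysql-connector-python
    import oracledb         # assumption: python-oracledb

    ora = oracledb.connect(user="app", password="secret", dsn="orahost/ORCL")
    my = mysql.connector.connect(user="app", password="secret",
                                 host="mysqlhost", database="warehouse")

    src = ora.cursor()
    src.arraysize = 1000  # fetch from Oracle in 1,000-row batches
    src.execute("SELECT id, name, created_at FROM customers")

    dst = my.cursor()
    sql = "INSERT INTO customers (id, name, created_at) VALUES (%s, %s, %s)"
    while rows := src.fetchmany(1000):
        dst.executemany(sql, rows)  # sent to MySQL as one multi-row INSERT
        my.commit()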

Pentaho, Talend, Ruby, and Java do too much data type checking and type conversion and create too many objects, which puts memory demands on the OS and makes the garbage collector go crazy. That is where the slowness begins when I am dealing with millions of records.

I usually spawn 8 Perl processes on an 8-CPU server that share the position of the last retrieved record, and there you go. I end up with a MONSTER of an ugly Perl ETL that nobody can beat on performance. At that point, the performance depends only on the source and destination databases: how many records you can query and insert per second.
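
A hypothetical sketch of that shared-position setup, splitting the primary-key range into slices so the eight workers never overlap (all names, credentials, and bounds are placeholders):

    from multiprocessing import Pool

    import mysql.connector  # assumption: mysql-connector-python
    import oracledb         # assumption: python-oracledb

    def copy_slice(bounds):
        # Each worker copies one disjoint key range with its own connections.
        lo, hi = bounds
        ora = oracledb.connect(user="app", password="secret", dsn="orahost/ORCL")
        my = mysql.connector.connect(user="app", password="secret",
                                     host="mysqlhost", database="warehouse")
        src = ora.cursor()
        src.execute("SELECT id, name, created_at FROM customers "
                    "WHERE id >= :lo AND id < :hi", lo=lo, hi=hi)
        dst = my.cursor()
        sql = "INSERT INTO customers (id, name, created_at) VALUES (%s, %s, %s)"
        while rows := src.fetchmany(1000):
            dst.executemany(sql, rows)
            my.commit()

    if __name__ == "__main__":
        max_id = 80_000_000            # placeholder; query MAX(id) in practice
        step = max_id // 8             # one slice per process/CPU
        slices = [(i, i + step) for i in range(0, max_id, step)]
        with Pool(processes=8) as pool:
            pool.map(copy_slice, slices)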

Because Perl takes very few CPU instruction cycles to process each request and insert, and sucks up data from Oracle so fast, Oracle often thinks it is under a denial-of-service attack and stops accepting further requests. Then I have to increase the process and session limits on the Oracle database to continue.

I am a Java developer, but sometimes even the ugliness of Perl can be used in places where no other modern programming language can compete. If you would like to see some of my own work along the lines of what I was talking about, you can visit my two search engines, which hold almost 500 million records in a sharded MySQL database, and feel free to search for your name.

http://find1friend.com/
http://myhealthcare.com/