Converting a database from MySQL to MongoDB

Is there an easy way to convert a database from MySQL to MongoDB?

Or, better yet, can someone point me to a good tutorial for doing this?

+53
database mysql mongodb
Jun 06
14 answers

Is there an easy way to convert a database from MySQL to MongoDB?

Method #1: export from MySQL in CSV format, then use the mongoimport tool. However, this does not always work well, particularly when it comes to handling binary data and dates.

Method #2: write a migration script in your language of choice. Basically, you write a program that reads records from MySQL one at a time and inserts them into MongoDB.

Method #2 is better than #1, but even that is not enough on its own (a minimal sketch is shown below).
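
For illustration, here is a minimal sketch of Method #2 in Node.js, assuming the mysql npm package and the older callback-style mongodb driver (the same API used in the Node.js answer further down). The connection settings and the users table are made-up placeholders:

var mysql = require('mysql');
var MongoClient = require('mongodb').MongoClient;

// Placeholder connection settings (replace with your own).
var mysqlCon = mysql.createConnection({
    host: 'localhost',
    user: 'root',
    password: 'secret',
    database: 'mydb'
});

MongoClient.connect('mongodb://localhost:27017/mydb', function (error, db) {
    if (error) throw error;
    mysqlCon.connect();
    // Read every row of one table and insert each row as a document.
    mysqlCon.query('SELECT * FROM users', function (error, rows) {
        if (error) throw error;
        if (rows.length === 0) { // nothing to copy
            mysqlCon.end();
            db.close();
            return;
        }
        db.collection('users').insertMany(rows, function (error) {
            if (error) throw error;
            console.log('copied ' + rows.length + ' rows');
            mysqlCon.end();
            db.close();
        });
    });
});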

MongoDB uses collections instead of tables, and it does not support joins. For every database I have seen, this means that the data structure you end up with in MongoDB is different from the structure you had in MySQL.

Because of this, there is no “universal tool” for porting SQL to MongoDB. Your data will need to be converted before it reaches MongoDB.
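
To make that concrete, here is a purely illustrative sketch (the table and field names are invented) of how two normalized MySQL tables related by a foreign key typically collapse into a single MongoDB document with an embedded array:

// Two MySQL tables joined through a foreign key (hypothetical schema):
//   orders:       id | customer_id | total
//   order_items:  id | order_id | sku | qty

// The usual MongoDB shape embeds the child rows inside the parent document:
var orderDocument = {
    _id: 7,
    customer_id: 3,
    total: 25.50,
    items: [
        { sku: 'A-100', qty: 2 },
        { sku: 'B-200', qty: 1 }
    ]
};

Deciding which tables become embedded arrays and which remain separate collections is exactly the part no generic tool can do for you.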

+48
Jun 07

If you are using Ruby, you can also try: Mongify

It is a super easy way to convert your data from an RDBMS to MongoDB without losing anything.

Mongify will read your MySQL database and create a translation file for you; all you have to do is describe how you want the data converted.

It supports:

  • Updating IDs (to BSON ObjectIDs)
  • Updating reference IDs
  • Type casting of values
  • Embedding tables into other documents
  • Before-save filters (to manually modify the data)
  • and much more...

Read more about this at http://mongify.com/getting_started.html

There is also a short 5-minute video on the homepage that shows you how easy it is.

+22
Jul 26

The free version of MongoVUE can do this automatically for you.

It can connect to both databases and import the data.

+5
Dec 12 '12 at 13:38

I am somewhat partial to Talend Open Studio for migration jobs like this. It is an Eclipse-based solution for visually building "data transfer scripts." I am not a fan of visual programming in general, but for this kind of problem I make an exception.

Adrien Mogenet has created a MongoDBConnection plugin for MongoDB.

It is probably overkill for a "simple" migration, but it is a cool tool.

Keep in mind, however, that Nix's suggestion will likely save you time if this is a one-off migration.

+1
Jun 06 '11 at 12:05

You could use the QCubed framework ( http://qcu.be ). The procedure would be something like this:

  • Install QCubed ( http://www.thetrozone.com/qcubed-installation )
  • Run the code generation against your database ( http://www.thetrozone.com/php-code-generation-qcubed-eliminating-sql-hassle )
  • Take your database offline from the rest of the world so that only this one operation is running.
  • Now write a script that reads all rows from all of the database's tables and calls getJson on each object to get JSON. You can then convert that data to arrays and insert it into MongoDB.

+1
Jun 26 '12 at 4:11

Here is what I did with Node.js for this purpose:

var mysql = require('mysql');
var MongoClient = require('mongodb').MongoClient;

// Lists the base tables of the connected MySQL database.
function getMysqlTables(mysqlConnection, callback) {
    mysqlConnection.query("show full tables where Table_Type = 'BASE TABLE';", function (error, results, fields) {
        if (error) {
            callback(error);
        } else {
            var tables = [];
            results.forEach(function (row) {
                for (var key in row) {
                    if (row.hasOwnProperty(key) && key.startsWith('Tables_in')) {
                        tables.push(row[key]);
                    }
                }
            });
            callback(null, tables);
        }
    });
}

// Copies every row of one MySQL table into the given MongoDB collection.
function tableToCollection(mysqlConnection, tableName, mongoCollection, callback) {
    var sql = 'SELECT * FROM ' + tableName + ';';
    mysqlConnection.query(sql, function (error, results, fields) {
        if (error) {
            callback(error);
        } else if (results.length > 0) {
            mongoCollection.insertMany(results, {}, function (error) {
                callback(error || null);
            });
        } else {
            callback(null);
        }
    });
}

MongoClient.connect("mongodb://localhost:27017/importedDb", function (error, db) {
    if (error) throw error;

    var MysqlCon = mysql.createConnection({
        host: 'localhost',
        user: 'root',
        password: 'root',
        port: 8889,
        database: 'dbToExport'
    });
    MysqlCon.connect();

    var jobs = 0;
    getMysqlTables(MysqlCon, function (error, tables) {
        if (error) throw error;
        tables.forEach(function (table) {
            var collection = db.collection(table);
            ++jobs;
            tableToCollection(MysqlCon, table, collection, function (error) {
                if (error) throw error;
                --jobs;
            });
        });

        // Wait for all jobs to complete before closing the database connections.
        // (Started only after the jobs have been queued, so the counter cannot
        // reach zero before the copies have even begun.)
        var interval = setInterval(function () {
            if (jobs <= 0) {
                clearInterval(interval);
                console.log('done!');
                db.close();
                MysqlCon.end();
            }
        }, 300);
    });
});
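
If you want to try a script like this, note that it needs Node.js plus the mysql package and the MongoDB Node.js driver (for example via npm install mysql mongodb), and that the MongoClient.connect(url, function (error, db) { ... }) callback style used here matches the older 2.x driver API; newer driver versions hand you a client object instead of a db handle. The connection settings are, of course, specific to the author's setup.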
+1
Mar 10 '17 at 22:41

If someone else is looking for a solution: I found that the easiest way is to write a PHP script that connects to your SQL database, fetches the information you need with ordinary SELECT statements, converts it to JSON using PHP's JSON encoding functions, and simply writes the result to a file or directly into MongoDB. It is actually quite simple and straightforward; the only thing you need to do is double-check your result with a JSON validator, and you may have to use functions such as explode to deal with certain characters to make it valid. I have done this before, and although I no longer have the script, from what I remember it was literally half a page of code.

Oh, and remember that Mongo is a document store, so some data remapping is needed for it to fit MongoDB well.
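
The answer above describes a PHP script that is not shown; as a rough sketch of the same idea in Node.js (the table name and file name are invented), the following dumps a query result as newline-delimited JSON, which mongoimport can read without the --jsonArray option:

var fs = require('fs');
var mysql = require('mysql');

// Placeholder connection settings.
var con = mysql.createConnection({
    host: 'localhost',
    user: 'root',
    password: 'secret',
    database: 'mydb'
});
con.connect();

// Write the result of an ordinary SELECT as one JSON document per line.
con.query('SELECT * FROM products', function (error, rows) {
    if (error) throw error;
    var lines = rows.map(function (row) {
        return JSON.stringify(row);
    }).join('\n');
    fs.writeFileSync('products.json', lines);
    con.end();
    // Then import it with: mongoimport -d mydb -c products --file products.json
});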

+1
Aug 14 '17 at 14:01

For anyone coming to this with the same problem, you can check out this GitHub project. It is under active development and will help you transfer data from a MySQL database to MongoDB by running a single simple command.

It will generate MongoDB schemas in TypeScript, so you can use them later in your project. Each MySQL table becomes a MongoDB collection, and the data types are converted to their MongoDB-compatible equivalents.

Documentation can be found in the project's README.md. Feel free to jump in and contribute; I would be happy to help if needed.

+1
Oct 03 '18 at 10:45

If you are looking for a ready-made tool to do this, good luck.

My suggestion is simply to pick your language of choice, read from one database, and write to the other.

0
Jun 06

If I may quote Matt Briggs (he solved my problem once):

The driver route is by FAR the most straightforward. The import/export tools are fantastic, but only if you use them as a pair. You are in for a wild ride if your table includes dates and you try to export from one db and import into mongo.

You are also lucky to be in C#. We are using Ruby and have a 32 million row table that we migrated to Mongo. Our final solution was to craft a crazy SQL statement in Postgres that outputs JSON (including some pretty funny bits to get the dates handled correctly) and pipe the output of that query on the command line into mongoimport. It took an incredible day to write, and it is not something that can really be changed afterwards.

So if you can manage it, use ado.net with the mongo driver. If not, I wish you the best of luck :-)

(note that this is coming from a big mongo fan)

MySQL is very similar to other SQL databases, so I will point you to this topic: Convert SQL table to mongoDB document

0
Jun 06

You can use the following project. It requires a Solr-like configuration file to be written, and it is very simple and straightforward.

http://code.google.com/p/sql-to-mongo-importer/

0
Jul 23 '11 at 6:10

Try the following: Automatically convert a MySQL dump to Mongo using simple r2n mappings. https://github.com/virtimus/mysql2mongo

0
Feb 13 '13 at 23:35

I think one of the easiest ways is to export the MySQL database to JSON and then use mongoimport to bring it into the MongoDB database.

Step 1: Export MySQL Database to JSON

Load the MySQL dump file into a MySQL database first, if necessary

Open MySQL Workbench and connect to MySQL database

In the schema viewer, select the database, expand Tables, and right-click the name of the table to export

Select "Table Data Export Wizard"

Set the file format to .json and enter a file name, for example tablename.json.

Note. All tables must be exported separately.

Step 2: Import the JSON files into MongoDB using the mongoimport command

The mongoimport command should be run from the server command line (not the mongo shell)

Note that you may need to provide authentication information, as well as the --jsonArray parameter, see the mongorestore documentation for more information.

 mongoimport -d dbname -u ${MONGO_USERNAME} -p ${MONGO_PASSWORD} --authenticationDatabase admin -c collectionname --jsonArray --file tablename.json 

Note. This method will not work if the source MySQL database contains BLOB / binary data.

0
May 24 '19 at 17:24

I used Studio 3T after I gave up fighting some configuration problems with mongify. They have a good tutorial: https://www.youtube.com/watch?time_continue=17&v=jeS81XyNSYc

0
Jul 16 '19 at 17:43


