C# project with different databases, storing the same data in different formats

OK, the title is long, but here is my problem:

I have about 10-12 different objects that my software has to work with. That part is simple: I just create classes and do whatever I need. But all of these classes represent database rows, and the main problem is that my software must be able to read and write these objects from/to different databases (Oracle, MySQL, InterBase, PostgreSQL, Firebird). The objects are stored in different databases, even in different tables with different fields (names and types), and all of this means the databases are not synchronized with each other.

So I run into a problem writing the code to cover all of this: think about it, 5 databases * 10 objects = 50 classes, and it still has to be extensible in the future...

Can someone suggest the right approach or best practice for building this?

+4
4 answers

First of all, write your own class to wrap all of these calls for the different databases. You will probably use the data providers supplied by each database vendor to connect to them.

Each provider supplies a set of classes for working with its database through ADO.NET. Use those classes in your transfer class to import/export data. Typically, these data providers include a way to map database types to .NET CLR types.

You also need to find the key points for each database, for example: where the database stores information about tables, table fields, indexes, and keys.

EDIT:

  • Read the table in which the database stores its metadata, plus the field information for your table (field names, types, lengths, etc.).
  • In the class you write to hold this table information, add a field for the CLR type too (as mentioned, these data providers offer a way to map each database type to a .NET CLR type).
  • Now you have table information that matches the specifics of each database.
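The steps above can be sketched roughly like this, assuming a generic ADO.NET provider. The `ColumnInfo` and `SchemaReader` names and the type-mapping table are illustrative, not from the answer, and the schema column names (`COLUMN_NAME`, `DATA_TYPE`) vary between providers:

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.Common;

// Holds the per-column metadata the answer describes: the database-specific
// name and type, plus the mapped CLR type.
public class ColumnInfo
{
    public string Name;
    public string DbTypeName; // provider-specific type name, e.g. NUMBER, VARCHAR2
    public Type ClrType;      // mapped .NET CLR type
}

public static class SchemaReader
{
    // Reads column metadata for one table through the provider's GetSchema
    // support (most ADO.NET providers implement the "Columns" collection).
    public static List<ColumnInfo> ReadColumns(DbConnection connection, string tableName)
    {
        var columns = new List<ColumnInfo>();
        // Restrictions array: catalog, schema, table, column (provider-specific).
        DataTable schema = connection.GetSchema("Columns",
            new string[] { null, null, tableName, null });
        foreach (DataRow row in schema.Rows)
        {
            string dbType = row["DATA_TYPE"].ToString();
            columns.Add(new ColumnInfo
            {
                Name = row["COLUMN_NAME"].ToString(),
                DbTypeName = dbType,
                ClrType = MapToClrType(dbType)
            });
        }
        return columns;
    }

    // Deliberately tiny illustrative mapping; a real one would cover the full
    // type system of each provider.
    public static Type MapToClrType(string dbTypeName)
    {
        switch (dbTypeName.ToUpperInvariant())
        {
            case "NUMBER": case "INT": case "INTEGER": return typeof(int);
            case "VARCHAR": case "VARCHAR2": case "NVARCHAR": case "TEXT": return typeof(string);
            case "DATE": case "DATETIME": case "TIMESTAMP": return typeof(DateTime);
            default: return typeof(object);
        }
    }
}
```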

Now you can start import / export operations.

For example, suppose you read from Oracle and write to SQL Server.

  • Read the table information for that table.
  • Read the data from the table (SELECT) and store it in a DataTable.
  • Now all you have to do is INSERT into the SQL Server table, using the data read from Oracle.
  • And remember the table information you collected? You must respect each database's type mapping.
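A minimal sketch of that flow, assuming the Oracle.ManagedDataAccess provider and System.Data.SqlClient; the connection strings, table name, and `TableCopier` class are placeholders, and `SqlBulkCopy` assumes compatible column names and types on the SQL Server side:

```csharp
using System.Data;
using System.Data.SqlClient;
using Oracle.ManagedDataAccess.Client; // assumed Oracle ADO.NET provider

public static class TableCopier
{
    // Reads one table from Oracle into a DataTable, then bulk-inserts it
    // into a same-named SQL Server table.
    public static void CopyTable(string oracleConnString, string sqlConnString, string tableName)
    {
        var data = new DataTable();

        // 1) SELECT from Oracle into a DataTable; the adapter applies the
        //    provider's database-type-to-CLR-type mapping while filling.
        using (var oracle = new OracleConnection(oracleConnString))
        using (var adapter = new OracleDataAdapter("SELECT * FROM " + tableName, oracle))
        {
            adapter.Fill(data);
        }

        // 2) INSERT into SQL Server; columns are mapped by name here.
        using (var sql = new SqlConnection(sqlConnString))
        {
            sql.Open();
            using (var bulk = new SqlBulkCopy(sql))
            {
                bulk.DestinationTableName = tableName;
                foreach (DataColumn column in data.Columns)
                    bulk.ColumnMappings.Add(column.ColumnName, column.ColumnName);
                bulk.WriteToServer(data);
            }
        }
    }
}
```

This is where the type mapping from the previous step matters: if an Oracle NUMBER column maps to a CLR type that the destination SQL Server column cannot accept, the bulk copy will fail.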

See:

http://technet.microsoft.com/en-us/library/ms151817.aspx

+3

Objects are stored in different databases, even in different tables in different fields (names and types)

If the field names differ, you need to map the fields onto your objects at the data layer. There is no need to create 50 classes; 10 are enough. When you read data from a database, map it into the class and use that class throughout the application. Similarly, when you save data, read the fields from one database and write them to the other database according to the same mapping.
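For illustration, one way to keep that mapping in a single place; the `Customer` class, database keys, and column names below are invented for the example:

```csharp
using System;
using System.Collections.Generic;
using System.Data;

// The single application-level class (one of the ~10, not 50).
public class Customer
{
    public int Id;
    public string Name;
}

public static class CustomerMapper
{
    // Per-database column-name maps: property name -> column name.
    // The databases store the same data under different field names.
    static readonly Dictionary<string, Dictionary<string, string>> FieldMaps =
        new Dictionary<string, Dictionary<string, string>>
        {
            ["oracle"] = new Dictionary<string, string> { ["Id"] = "CUST_ID", ["Name"] = "CUST_NAME" },
            ["mysql"]  = new Dictionary<string, string> { ["Id"] = "id",      ["Name"] = "name" }
        };

    // Builds one Customer from any row, using the map for that database.
    public static Customer FromRow(DataRow row, string database)
    {
        var map = FieldMaps[database];
        return new Customer
        {
            Id = Convert.ToInt32(row[map["Id"]]),
            Name = Convert.ToString(row[map["Name"]])
        };
    }
}
```

Saving works the same way in reverse: look up the column names for the target database and write each property into the mapped field.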

+2

I would implement your requirements as follows:

First, I would add several DBML files (LINQ to SQL), one for each database. You could use ADO.NET or something else if you prefer. You will have to name your generated data objects differently... something like MySqlDataObject, OracleDataObject, FirebirdDataObject, etc.

In your user interface, you only want one set of data objects. Therefore, at your data layer, you will need methods that populate each database-specific data object from the common one... for example CopyToMySqlDataObject(ClrDataObject clrDataObject), CopyToOracleDataObject(ClrDataObject clrDataObject), CopyToFirebirdDataObject(ClrDataObject clrDataObject).

Finally, I would use a DataProvider class that references all of your data sources. This class exposes every available action, and each of its methods calls every data source. Take a look at this example:

    public int UpdateDataObject(ClrDataObject clrDataObject)
    {
        using (MySqlDataContext dataContext = new MySqlDataContext())
        {
            MySqlDataObject mySqlDataObject = dataContext.MySqlDataObjects
                .Where(d => d.Id == clrDataObject.Id).FirstOrDefault();
            CopyToMySqlDataObject(clrDataObject, mySqlDataObject);
            dataContext.SubmitChanges(ConflictMode.FailOnFirstConflict);
        }

        // ... the same pattern for the other databases ...

        using (OracleDataContext dataContext = new OracleDataContext())
        {
            OracleDataObject oracleDataObject = dataContext.OracleDataObjects
                .Where(d => d.Id == clrDataObject.Id).FirstOrDefault();
            CopyToOracleDataObject(clrDataObject, oracleDataObject);
            dataContext.SubmitChanges(ConflictMode.FailOnFirstConflict);
        }

        return 0;
    }
+1

Use the repository pattern, with a single responsibility for each repository, and possibly a common repository that calls each of the specific repository classes.

Each database gets a repository class responsible for working with that database.

So if you had a client class (probably a DTO at the repository level) with a Name property, each repository is responsible for reading the "name" field from its corresponding table and mapping it onto the DTO property. You should not have a separate domain-level client class for each database; that's just nuts...
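A sketch of that layout; the interface and class names are illustrative, and the Oracle repository is shown with an in-memory store instead of real data access, since the ADO.NET details are covered in the other answers:

```csharp
using System.Collections.Generic;

// DTO used at the repository level; the rest of the app only sees this.
public class Client
{
    public int Id;
    public string Name;
}

// Single-responsibility contract: one implementation per database.
public interface IClientRepository
{
    Client GetById(int id);
    void Save(Client client);
}

// One concrete repository per database hides that database's table and
// field names. In-memory stand-in shown here.
public class OracleClientRepository : IClientRepository
{
    readonly Dictionary<int, Client> store = new Dictionary<int, Client>();
    public Client GetById(int id) { Client c; store.TryGetValue(id, out c); return c; }
    public void Save(Client client) { store[client.Id] = client; }
}

// The "common repository" that fans out to every database-specific one.
public class CompositeClientRepository : IClientRepository
{
    readonly List<IClientRepository> repositories;
    public CompositeClientRepository(List<IClientRepository> repositories)
    {
        this.repositories = repositories;
    }

    // Reads from the first repository that has the client.
    public Client GetById(int id)
    {
        foreach (var repository in repositories)
        {
            var client = repository.GetById(id);
            if (client != null) return client;
        }
        return null;
    }

    // Writes to every database, since they are not synchronized.
    public void Save(Client client)
    {
        foreach (var repository in repositories)
            repository.Save(client);
    }
}
```

Adding a sixth database then means adding one new repository class and registering it, not touching the existing ones.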

If you use dependency injection, you can inject these repositories and dispose of them safely when they are no longer needed...

If you need transaction support, you can add another layer on top of the repositories and roll your own transaction coordinator, etc.
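If the providers you use can enlist in System.Transactions (not all ADO.NET providers for the databases listed can, and a transaction spanning several databases may escalate to a distributed one), a `TransactionScope` is one way to sketch such a coordinator. The `saveToOracle`/`saveToMySql` delegates below are placeholders for the per-database repository calls:

```csharp
using System;
using System.Transactions;

public static class SaveCoordinator
{
    // Wraps the per-database saves in one ambient transaction: connections
    // opened inside the scope enlist in it automatically (when the provider
    // supports System.Transactions).
    public static void SaveEverywhere(Action saveToOracle, Action saveToMySql)
    {
        using (var scope = new TransactionScope())
        {
            saveToOracle();
            saveToMySql();
            scope.Complete(); // commit both; an exception before this rolls both back
        }
    }
}
```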

The problem with @Sheridan's example above is that the DataProvider class keeps changing as you add and remove databases; it is not closed for modification...

+1

Source: https://habr.com/ru/post/1499137/

