Using Dapper to replace a full-fledged ORM

I'm really impressed with the Dapper micro-ORM. I would really like to use it side by side with a full-fledged ORM, and in fact I would rather use it instead of one. What I don't understand at all is whether there is any strategy for deserializing a class hierarchy from the database: for example, the concrete type returned for a recordset row may depend on the value of a field (the so-called "discriminator" in NHibernate). The hierarchy may also be split across several tables joined together, so the type representing a row may depend on whether a record exists in another table. A hierarchy represented by a mixture of the above strategies is something that NHibernate, for example, does not support, yet it does occur in "relational life." (A hand-rolled sketch of the discriminator case follows the questions below.) So the questions are:

  • Does Dapper help with this scenario?
  • Will supporting this scenario undermine Dapper's focus on performance?
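
To make the first question concrete, here is a minimal hand-rolled sketch of the discriminator case as I would write it today, without any special support. The Post, Question and Answer classes, the Discriminator column and the "Q" value are all made-up names, and cnn is an open IDbConnection:

    using System.Collections.Generic;
    using System.Data;
    using Dapper;

    abstract class Post { public int Id { get; set; } }
    class Question : Post { public string Title { get; set; } }
    class Answer : Post { public int QuestionId { get; set; } }

    static class PostLoader
    {
        public static IEnumerable<Post> LoadPosts(IDbConnection cnn)
        {
            // The non-generic Query() returns dynamic rows, so the
            // discriminator column can be inspected row by row.
            foreach (var row in cnn.Query("select * from Posts"))
            {
                string discriminator = row.Discriminator;
                int id = (int)row.Id;

                if (discriminator == "Q")
                    yield return new Question { Id = id, Title = (string)row.Title };
                else
                    yield return new Answer { Id = id, QuestionId = (int)row.QuestionId };
            }
        }
    }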

Another topic is caching. Dapper's query cache is somewhat aggressive; wouldn't it be better to have some kind of "session context" and a per-session query cache, or would that again go against Dapper's core design goals?

1 answer

Dapper does not currently support custom object-construction (factory) logic; I assume you are asking for something like this:

    class Post { }
    class Question : Post { ... }
    class Answer : Post { ... }

    Func<IDataReader, Func<IDataReader, Post>> factoryLocator = ... my magic factory locator ...;

    cnn.Query<Post>(@"
        select * from Posts p
        left join Questions q on q.Id = p.Id
        left join Answers a on a.Id = p.Id",
        factoryLocator: factoryLocator);

We decided not to implement that kind of logic because we have never had to solve this problem in real life. It would also introduce a fair amount of internal complexity, and some external complexity as well (since consuming code would need to test post is Question).
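
For illustration, that external complexity would show up in calling code roughly like this; the factoryLocator parameter is the hypothetical API from the snippet above and does not exist in Dapper today:

    // Hypothetical call: Dapper has no factoryLocator parameter.
    var posts = cnn.Query<Post>(sql, factoryLocator: factoryLocator);

    foreach (var post in posts)
    {
        if (post is Question)
        {
            var question = (Question)post;
            // question-specific handling
        }
        else if (post is Answer)
        {
            var answer = (Answer)post;
            // answer-specific handling
        }
    }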

I am not categorically against including this feature, provided you can make a good argument for it and the patch is simple. I would also like to add hooks to Dapper so that you can plug in this kind of functionality yourself.

As for the caching strategy: we find that in general the cache never bloats; bloat only happens if you abuse Dapper, say by generating non-parameterized SQL. I would fully support adding a hook that lets you specify your own cache provider in place of the ConcurrentDictionary that Dapper uses.
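
To illustrate the cache point with a small sketch (it assumes an open IDbConnection cnn, an int id, a User POCO, and using Dapper / using System.Linq): the query cache is keyed on the SQL text, so concatenating values into the SQL produces a new cache entry per distinct value, while parameterized SQL reuses a single entry.

    // Cache-hostile: every distinct id yields a different SQL string,
    // so each call adds another entry to the query cache.
    var bad = cnn.Query<User>("select * from Users where Id = " + id).FirstOrDefault();

    // Cache-friendly: one SQL string, one cache entry; the value is
    // passed as a parameter.
    var good = cnn.Query<User>(
        "select * from Users where Id = @Id",
        new { Id = id }).FirstOrDefault();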


Source: https://habr.com/ru/post/914658/

