Azure table storage query in .NET with property names unknown at design time

Is there a T-SQL equivalent for querying Azure tables? I want to issue ad hoc queries from .NET when property names are unknown at design time.

From my understanding of LINQ, you need to reference existing public properties.

 var selectedOrders = from o in context.Orders
                      where o.Freight > 30
                      orderby o.ShippedDate descending
                      select o;

Freight and ShippedDate must be publicly available, defined at design time. I have no structured properties (or even structured classes).

What if I do not know the names of the properties at design time? Azure tables let you add new property names very conveniently, but how can those properties then be queried?

You can define a dynamic query through the REST API:

  Request Syntax:
  GET /myaccount/Customers()?$filter=(Rating%20ge%203)%20and%20(Rating%20le%206)&$select=PartitionKey,RowKey,Address,CustomerSince HTTP/1.1

Are there tools for using REST in .NET (dynamically)?

From the REST API documentation: Use the logical operators defined by the .NET Client Library for the ADO.NET Data Services Framework to compare a property with a value. Note that it is not possible to compare a property to a dynamic value; one side of the expression must be a constant. http://msdn.microsoft.com/en-us/library/windowsazure/dd894031.aspx
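As a rough illustration of issuing such a query from .NET by hand, the sketch below only builds the request URI; the account and table names are placeholders, and a real request would also need the SharedKey `Authorization` and `x-ms-date` headers, which are omitted here.

```csharp
using System;

// Sketch: build the $filter/$select query URI shown above from .NET.
// "myaccount" and "Customers" are placeholders; authentication headers
// (SharedKey, x-ms-date) are deliberately left out of this sketch.
class TableQueryUriDemo
{
    public static Uri BuildQueryUri(string account, string table,
                                    string filter, string select)
    {
        // $filter needs percent-encoding for spaces; $select is a plain
        // comma-separated property list
        var query = "$filter=" + Uri.EscapeDataString(filter) +
                    "&$select=" + select;
        return new Uri($"https://{account}.table.core.windows.net/{table}()?{query}");
    }

    static void Main()
    {
        var uri = BuildQueryUri("myaccount", "Customers",
            "(Rating ge 3) and (Rating le 6)",
            "PartitionKey,RowKey,Address,CustomerSince");
        Console.WriteLine(uri.Query.Contains("%20ge%203")); // True
    }
}
```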

If you really need SQL, if you really need T-SQL-style queries, then fair enough.

I think I'm learning that Table Storage is designed to serialize classes (especially if you have many, many instances to serialize). From this link: http://msdn.microsoft.com/en-us/library/windowsazure/dd179423.aspx — the schema for a table is defined as a C# class. This is the model used by ADO.NET Data Services. The schema is known only to the client application and simplifies data access. The server does not enforce this schema.

  [DataServiceKey("PartitionKey", "RowKey")]
  public class Blog
  {
      // ChannelName
      public string PartitionKey { get; set; }

      // PostedDate
      public string RowKey { get; set; }

      // User defined properties
      public string Text { get; set; }
      public int Rating { get; set; }
      public string RatingAsString { get; }
      protected string Id { get; set; }
  }

The user will upload a file, which will go to blob storage, plus string fields describing the file. It must be able to scale to millions of records. Searches will only be on two required fields (properties): custID and batch. There is no need to search the other fields, but I need to store them and let the user simply add new fields with each batch. Because it needs to scale to millions of records, blob storage is a good fit for the files. What keeps me on Table Storage is the ability to use REST from the client to upload files and fields. It needs to handle up to 100,000 uploads at a time and support restarting. The uploads will come in relatively small batches, and they probably won't go over REST directly, since I need to do some server-side validation.

What I'm going to do is use two tables, where the second holds the dynamic data.

  Master
      PartitionKey    CustID
      RowKey          GUID
      string          batch
      string          filename

  Fields
      PartitionKey    CustID + GUID
      RowKey          fieldName
      string          value

Field names must be unique. Queries against Master will be by CustID, or by CustID and batch. Queries against Fields will be by PartitionKey only. Comments please.
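One way the two-table design might look as C# entity classes. This is only an illustrative sketch: the class and property names are mine, and with the real SDK these would derive from TableServiceEntity rather than being plain POCOs.

```csharp
using System;

// Illustrative sketch of the Master/Fields design. In the actual storage
// client these classes would derive from TableServiceEntity; plain POCOs
// are used here so the key layout stays visible.
class MasterEntity
{
    public string PartitionKey { get; set; }  // CustID
    public string RowKey       { get; set; }  // GUID
    public string Batch        { get; set; }
    public string Filename     { get; set; }
}

class FieldEntity
{
    public string PartitionKey { get; set; }  // CustID + GUID
    public string RowKey       { get; set; }  // fieldName (unique per entity)
    public string Value        { get; set; }
}

class TwoTableDemo
{
    static void Main()
    {
        var id = Guid.NewGuid().ToString("N");
        var master = new MasterEntity { PartitionKey = "cust42", RowKey = id,
                                        Batch = "2012-01", Filename = "report.pdf" };
        // each user-defined field becomes one row in the Fields table,
        // partitioned by the parent record's combined key
        var field = new FieldEntity { PartitionKey = master.PartitionKey + master.RowKey,
                                      RowKey = "color", Value = "red" };
        Console.WriteLine(field.PartitionKey == "cust42" + id); // True
    }
}
```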

+4
7 answers

I also created a library for using dynamic types with table storage:

To use it, first create a context object:

 var context = new DynamicTableContext("TableName", credentials); 

Then insert the record easily:

 context.Insert(new { PartitionKey="1", RowKey="1", Value1="Hello", Value2="World" }); 

You can do the same with the dictionary:

 var dictionary = new Dictionary<string, object>();
 dictionary["PartitionKey"] = "2";
 dictionary["RowKey"] = "2";
 dictionary["Value3"] = "FooBar";
 context.Insert(dictionary);

To get an entity directly, just pass the values for the partition key and row key:

 dynamic entity = context.Get("1", "1");

You can also issue a query:

 foreach (dynamic item in context.Query("Value1 eq 'Hello'"))
 {
     Console.WriteLine(item.RowKey);
 }

It is available on GitHub here: https://github.com/richorama/AzureSugar

+3

This is not really a Table Storage question but a LINQ one. You can write dynamic LINQ (unfortunately losing much of what makes LINQ so nice) using expression trees. LINQ is really just expression trees under the hood.

So, three answers to your question:

  • Here's how to write dynamic LINQ

  • Here is info on the Dynamic LINQ library, to make things less ugly

  • And finally, you cannot use orderby in Table Storage queries :)
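The expression-tree approach from the first bullet can be sketched in a self-contained way. This builds the predicate `o => o.Freight > 30` at runtime from the property name as a string, which is the core trick when property names are only known at run time; the `Order` type and the in-memory list are stand-ins for illustration.

```csharp
using System;
using System.Linq;
using System.Linq.Expressions;

// Sketch: construct "o => o.Freight > 30" dynamically from the string
// "Freight", then apply it to an in-memory list. Order is a stand-in type;
// the same compiled predicate idea underlies dynamic LINQ queries.
class Order
{
    public double Freight { get; set; }
}

class DynamicWhereDemo
{
    public static Func<T, bool> GreaterThan<T>(string propertyName, double value)
    {
        var p = Expression.Parameter(typeof(T), "o");       // o
        var body = Expression.GreaterThan(
            Expression.Property(p, propertyName),           // o.<propertyName>
            Expression.Constant(value));                    // > value
        return Expression.Lambda<Func<T, bool>>(body, p).Compile();
    }

    static void Main()
    {
        var orders = new[] { new Order { Freight = 10 }, new Order { Freight = 45 } };
        var filtered = orders.Where(GreaterThan<Order>("Freight", 30)).ToArray();
        Console.WriteLine(filtered.Length); // 1
    }
}
```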

+1

I am working on a dynamic client for the REST API. It is called Cyan; you can find it on CodePlex or on NuGet by searching for "Cyan".

Your request:

 var cyan = new CyanTableClient("account", "password");
 var items = cyan.Query("Customers",
     filter: "(Rating ge 3) and (Rating le 6)",
     fields: new[] { "PartitionKey", "RowKey", "Address", "CustomerSince" });

Then you can access the fields as follows:

 var item = items.First();

 // the address
 string address = item.Address;

 // or you can cast to CyanEntity and access fields from a Dictionary
 var entity = (CyanEntity)item;
 string address2 = (string)entity.Fields["Address"];

This is still a work in progress; please send me your feedback and feel free to contribute!

+1

I wrote an Azure table storage client that supports both static and late binding (via a dictionary). Any table property not covered by the entity type gets written into the dictionary. It also supports arrays, enums, large data, serialization, and more besides!

You can get it at http://www.lucifure.com .

+1

I had a similar problem, and I initially approached it with a table much like the "Fields" table above: lots of very small key/value entries. I ran into two problems: one is that the Nagle algorithm will ruin your inserts unless you work around it (up to 500 ms per insert), and the other is that the Azure scalability targets limit you to 20,000 entities per second per storage account.
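The standard workaround for the Nagle problem in the classic .NET HTTP stack is a few process-wide ServicePointManager settings, applied before any requests go out. The specific values below are illustrative defaults, not from the answer itself.

```csharp
using System.Net;

// Sketch: the usual knobs for small-payload table inserts. These are
// process-wide settings in the classic .NET HTTP stack and must be set
// before the first request is issued. The numbers are illustrative.
class TuneHttpStack
{
    static void Main()
    {
        ServicePointManager.UseNagleAlgorithm = false;    // avoid ~500 ms delays on small writes
        ServicePointManager.Expect100Continue = false;    // skip an extra round trip per POST
        ServicePointManager.DefaultConnectionLimit = 100; // allow many parallel requests
    }
}
```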

In the end, I solved it by saving a Dictionary<string, string> using custom read/write methods, which I wrote up in "How to save arbitrary key/value pairs in an Azure storage table?"
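The idea can be sketched minimally: flatten the dictionary into a single string property on the entity and parse it back on read. The answer's actual read/write methods are not shown in this thread; the delimiter scheme below is a naive stand-in that assumes keys and values are non-empty and contain neither '|' nor '='.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Sketch only: round-trip a Dictionary<string, string> through one string
// property. Naive delimiters; keys/values must be non-empty and must not
// contain '|' or '='. A real implementation would escape or use JSON.
static class DictSerializer
{
    public static string Write(Dictionary<string, string> d) =>
        string.Join("|", d.Select(kv => kv.Key + "=" + kv.Value));

    public static Dictionary<string, string> Read(string s) =>
        s.Split('|')
         .Select(p => p.Split('='))
         .ToDictionary(a => a[0], a => a[1]);
}

class RoundTripDemo
{
    static void Main()
    {
        var original = new Dictionary<string, string>
        {
            ["color"] = "red",
            ["size"] = "XL"
        };
        var restored = DictSerializer.Read(DictSerializer.Write(original));
        Console.WriteLine(restored["color"]); // red
    }
}
```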

0

The Azure Table service does not support ordering in queries; it only supports Where, Select, First, FirstOrDefault, From, and Take. You will have to write two separate queries.

0

It is much easier to do this now, using the new WindowsAzure.Storage client library's table service layer. This library also has significant performance and latency improvements over the older WCF Data Services implementation.

You can retrieve entities with unknown schemas from table storage this way using DynamicTableEntity; all properties come back in its Properties dictionary.

See relevant examples here.
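As a sketch of what this looks like with the WindowsAzure.Storage NuGet package: the connection string and table name below are placeholders, and the code will not run without a reachable storage account (or the local storage emulator), so no output is shown.

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// Sketch against the WindowsAzure.Storage client library. The connection
// string and table name are placeholders; this needs a real storage
// account or the local emulator to actually run.
class DynamicEntityDemo
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
        var table = account.CreateCloudTableClient().GetTableReference("Customers");

        var query = new TableQuery<DynamicTableEntity>().Where(
            TableQuery.GenerateFilterConditionForInt(
                "Rating", QueryComparisons.GreaterThanOrEqual, 3));

        foreach (DynamicTableEntity entity in table.ExecuteQuery(query))
        {
            // every property, known at design time or not, is in Properties
            foreach (var kv in entity.Properties)
                Console.WriteLine("{0} = {1}", kv.Key, kv.Value.PropertyAsObject);
        }
    }
}
```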

0

Source: https://habr.com/ru/post/1390257/

