Sharepoint performance while limiting load and capacity

I am working on a new storage system for a business solutions package that consists of about 40 applications. Some of these applications generate documents (mainly docx, some pdf) that are currently saved and organized in a shared folder.

Applications generate about 150,000-200,000 documents per year on average, and these documents should be stored in a more consistent and reliable form (i.e., a separate SQL database).

Sharepoint is a leading candidate, as we plan to use it in the future, and then - the capabilities of DMS. I read about the limitations of the document library, i.e. 2000 files in a folder with a volume of up to 1,000,000 files in all folders of the document library. I also read that the 2000 limit can be circumvented, but this affects performance. What I did not find is a real world experience with so many files in one library. And what happens if I increase the folder limit to 50,000, for example, what impact will it affect performance (slower requests to read / edit / write documents via web services, especially writing if it checks for duplicate file names, indexing, searching etc..).

One important note: we will not use the sharepoint web portal at all if we do not, but instead do everything through our applications through web services, so viewing data more slowly is not a problem.

+3
source share
2 answers

You can have as many elements in the document library as you want if your last paragraph is right (you won’t get access to the information through the portal itself)

DMS 7 . , , SPWeb.GetFile(guid), , , SQL ( GUID )

+6

2000 , , .

2000 , . , 2000 ( ), .

, . , sharepoint ( sql-), , , .

: . sql (), .

+3

Source: https://habr.com/ru/post/1724617/


All Articles