This has very little to do with PowerShell and everything to do with your data and how important it is that it stay up to date. A simple caching scheme would be a timeout: after N minutes, the next request against your backing store pulls a fresh copy and resets the timer (a sketch follows below). It sounds like you already have an idea of what your specific rules should be. I wouldn't think two consecutive "dir" commands should always result in two pulls from the backing store, but you may feel differently about this system. So be it.
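For illustration, here is a minimal sketch of that timeout scheme in C#; the `TimedCache` name and the fetch delegate are made up, and the delegate stands in for your real backing-store query:

```csharp
using System;

public class TimedCache<T>
{
    private readonly TimeSpan ttl;
    private readonly Func<T> fetch;
    private T data;
    private DateTime lastRefresh = DateTime.MinValue;

    public TimedCache(TimeSpan ttl, Func<T> fetch)
    {
        this.ttl = ttl;
        this.fetch = fetch;
    }

    // After N minutes, the next request pulls a new copy and resets the timer.
    public T Data
    {
        get
        {
            if (DateTime.UtcNow - lastRefresh > ttl)
            {
                data = fetch();
                lastRefresh = DateTime.UtcNow;
            }
            return data;
        }
    }
}
```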
UPDATE
Perhaps a simple guideline would be that you should refresh your data at most once per provider cmdlet invocation (see the sketch after these lists). The built-in cmdlets that work with provider items are:
- Clear-Item
- Copy-Item
- Get-Item
- Invoke-Item
- Move-Item
- New-Item
- Remove-Item
- Rename-Item
- Set-Item
In addition, the built-in cmdlets that work with the properties of a provider item are:
- Clear-ItemProperty
- Copy-ItemProperty
- Get-ItemProperty
- Move-ItemProperty
- New-ItemProperty
- Remove-ItemProperty
- Rename-ItemProperty
- Set-ItemProperty
And finally, for reading and writing content there are:
- Add-Content
- Clear-Content
- Get-Content
- Set-Content
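To make the once-per-cmdlet guideline concrete, here is a minimal sketch, not a complete provider; the `MyStore` name, `StoreItem`, `FetchAllFromBackingStore`, and the five-minute TTL are all hypothetical stand-ins for your real store:

```csharp
using System;
using System.Collections.Generic;
using System.Management.Automation;
using System.Management.Automation.Provider;

// Hypothetical item shape: a value plus a version stamp from the store.
public class StoreItem
{
    public object Value;
    public long Version;
}

[CmdletProvider("MyStore", ProviderCapabilities.None)]
public class MyStoreProvider : NavigationCmdletProvider
{
    // Static, because PowerShell may create several instances of the
    // provider; they should share one cache rather than one copy each.
    private static readonly object Gate = new object();
    private static Dictionary<string, StoreItem> items = new Dictionary<string, StoreItem>();
    private static DateTime lastRefresh = DateTime.MinValue;
    private static readonly TimeSpan Ttl = TimeSpan.FromMinutes(5);

    private static void RefreshIfStale()
    {
        lock (Gate)
        {
            if (DateTime.UtcNow - lastRefresh > Ttl)
            {
                items = FetchAllFromBackingStore(); // hypothetical query
                lastRefresh = DateTime.UtcNow;
            }
        }
    }

    // Stub standing in for the real backing-store query.
    private static Dictionary<string, StoreItem> FetchAllFromBackingStore()
    {
        return new Dictionary<string, StoreItem>();
    }

    // Called for Get-Item: refresh at most once, then answer from the cache.
    protected override void GetItem(string path)
    {
        RefreshIfStale();
        WriteItemObject(items[path].Value, path, false);
    }

    // Called by Test-Path and by most other cmdlets before they act.
    protected override bool ItemExists(string path)
    {
        RefreshIfStale();
        return items.ContainsKey(path);
    }
}
```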
Each of these cmdlets has a corresponding method on NavigationCmdletProvider (for hierarchical data stores), and those methods are where you can refresh your data. When you implement the New/Move/Rename/Remove/Set/Clear methods and the other data-modifying methods, you should use some form of optimistic concurrency, because provider instances in PowerShell are not singletons: at any given time there may be more than one instance alive. A sketch of such a write follows.
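Continuing the hypothetical MyStoreProvider above, an optimistic write might look like this; `TryWriteToBackingStore` stands in for whatever conditional-update mechanism your store actually offers (a version column, an ETag, a timestamp check):

```csharp
protected override void SetItem(string path, object value)
{
    if (!ShouldProcess(path))   // honor -WhatIf / -Confirm
        return;

    RefreshIfStale();
    StoreItem cached;
    lock (Gate)
    {
        if (!items.TryGetValue(path, out cached))
        {
            WriteError(new ErrorRecord(new ItemNotFoundException(path),
                "PathNotFound", ErrorCategory.ObjectNotFound, path));
            return;
        }
    }

    // Only write if the item still carries the version we cached;
    // otherwise another provider instance (or process) got there first.
    if (!TryWriteToBackingStore(path, value, cached.Version))
    {
        WriteError(new ErrorRecord(
            new InvalidOperationException(path + " was changed by another writer."),
            "WriteConflict", ErrorCategory.WriteError, path));
        return;
    }

    // Invalidate the cache so the next cmdlet sees the new state.
    lock (Gate) { lastRefresh = DateTime.MinValue; }
}

// Stub for a compare-and-swap style write against the real store.
private static bool TryWriteToBackingStore(string path, object value, long expectedVersion)
{
    return true;
}
```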
I've written a provider that gets its implementation from a script, which may be a simpler way to prototype this sort of thing. See http://psprovider.codeplex.com/
Hope this helps.