Edit: The answer below did not ultimately solve my case, because some of Spark's temp subfolders (or those of its dependencies) were created but not all of them, and having to create such paths by hand on every run would make any project unworkable. Launching Spark (PySpark in my case) as an administrator solved the problem, so in the end this is most likely a permissions issue.
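If running as administrator is undesirable, a common workaround (not part of my original answer, just a hedged sketch) is to point Spark's scratch space at a directory the current user can definitely write to, using the standard spark.local.dir property; the blockmgr-* folders are created under that location. The directory name below is hypothetical, and the setting must be applied before the session starts:

```python
import os
import tempfile

from pyspark.sql import SparkSession

# Hypothetical user-writable scratch location; any path the current
# user can write to should work.
scratch_dir = os.path.join(tempfile.gettempdir(), "spark-scratch")
os.makedirs(scratch_dir, exist_ok=True)

spark = (
    SparkSession.builder
    .appName("local-temp-dir-example")
    # spark.local.dir controls where Spark writes shuffle and
    # block-manager (blockmgr-*) files.
    .config("spark.local.dir", scratch_dir)
    .getOrCreate()
)
```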
Original answer:
I ran into the same problem on my local Windows machine (not in a cluster). Since there were no permission problems there, I simply created, as a local user, the directory that Spark could not create itself (the folder below), and I did not need to change any permissions on it.
C:\Users\<username>\AppData\Local\Temp\blockmgr-97439a5f-45b0-4257-a773-2b7650d17142