Scrapy is still worth considering.
Speed / Performance / Efficiency
Scrapy is built on Twisted, a popular event-driven networking framework for Python. As a result, it uses non-blocking (asynchronous) I/O for concurrency.
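Because requests are non-blocking, a single Scrapy process can keep many downloads in flight at once, and you control the level of concurrency declaratively. Below is a sketch of the relevant options in a project's settings.py; the setting names are real Scrapy settings, but the values shown are illustrative, not recommendations.

```python
# Sketch of concurrency-related options in a Scrapy project's settings.py.

# How many requests Scrapy may have in flight at once, overall and
# per domain. Asynchronous I/O lets one process wait on all of them
# concurrently instead of blocking on each download in turn.
CONCURRENT_REQUESTS = 16
CONCURRENT_REQUESTS_PER_DOMAIN = 8

# A small delay between requests to the same site is polite and
# reduces the chance of being throttled.
DOWNLOAD_DELAY = 0.25
```

You tune throughput by changing these numbers; no thread or process management is involved.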
Database pipelining
You mentioned that you want your data pipelined to the database; as you may know, Scrapy has an Item Pipeline feature:
After an item has been scraped by a spider, it is sent to the Item Pipeline, which processes it through several components that are executed sequentially.
Thus, each page can be written to the database immediately after it has been downloaded.
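A minimal sketch of such a pipeline, writing each item to SQLite as soon as it arrives. The method names (open_spider, process_item, close_spider) are the real Scrapy pipeline hooks; the table and field names ("pages", "url", "title") and the db_path default are made up for illustration.

```python
import sqlite3

class SQLitePipeline:
    def __init__(self, db_path="pages.db"):  # hypothetical default path
        self.db_path = db_path

    def open_spider(self, spider):
        # Called once when the spider starts.
        self.conn = sqlite3.connect(self.db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS pages (url TEXT, title TEXT)"
        )

    def process_item(self, item, spider):
        # Called for every scraped item; committing immediately means
        # the row is persisted even if the crawl is interrupted.
        self.conn.execute(
            "INSERT INTO pages (url, title) VALUES (?, ?)",
            (item["url"], item["title"]),
        )
        self.conn.commit()
        return item  # pass the item on to any later pipeline stages

    def close_spider(self, spider):
        self.conn.close()
```

You would then enable it in settings.py via ITEM_PIPELINES, pointing at wherever the class lives in your project (the module path is yours, not fixed by Scrapy).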
Code Organization
Scrapy gives you a good, clear project structure, with logical places for settings, spiders, items, pipelines, etc. This alone keeps your code simpler.
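For reference, this is roughly the layout that `scrapy startproject myproject` generates ("myproject" is a placeholder name):

```text
myproject/
    scrapy.cfg            # deploy configuration
    myproject/
        settings.py       # project settings
        items.py          # item definitions
        pipelines.py      # item pipelines
        middlewares.py    # spider/downloader middlewares
        spiders/
            example.py    # your spiders live here
```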
Time to code
Scrapy does a lot of work for you behind the scenes. This lets you focus on the code and logic themselves, rather than on the low-level plumbing: creating processes, threads, etc.
But, at the same time, Scrapy can be overkill. Remember that Scrapy was designed for (and is great at) crawling and scraping data from web pages. If you just want to download a bunch of pages without parsing them, then yes, grequests is a good alternative.