There are two obvious sides to this. First, how you store the recipes — that's your models. Models obviously won't scrape other sites themselves, because they have one responsibility: maintaining persistent data. Your controller(s), which initiate the scrape-and-store process, shouldn't contain scraping code either (although they will call it).
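As a rough sketch of that separation of concerns — all names here (`Recipe`, `RecipesController`, `RecipeScraper`) are hypothetical, and the Rails plumbing is stubbed out with plain Ruby so the structure stands on its own:

```ruby
# Hypothetical sketch: the model only holds and persists data.
class Recipe
  attr_reader :title, :ingredients

  def initialize(title:, ingredients:)
    @title = title
    @ingredients = ingredients
  end
end

# The scraping logic lives in its own class, outside models and controllers.
class RecipeScraper
  def scrape(url)
    # Real code would fetch and parse the page; stubbed here.
    { title: "Stubbed recipe from #{url}", ingredients: ["flour", "water"] }
  end
end

# The controller only *initiates* scraping and storage;
# it contains no scraping code itself.
class RecipesController
  def create(url)
    data = RecipeScraper.new.scrape(url)
    Recipe.new(**data)
  end
end
```

The point is just the shape: each class has one job, and the controller merely wires them together.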
In Ruby we don't deal with abstract classes and interfaces — it's duck-typed — so it's enough for your scrapers to implement a well-known method or set of methods. Your scraping engines should look alike, especially in the public methods they expose.
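A minimal illustration of that duck typing, with made-up scraper names: neither class inherits from a shared base or implements a formal interface; they just respond to the same public method, so callers can use them interchangeably.

```ruby
# Two hypothetical scraper engines. No common superclass, no interface —
# each simply exposes the same public method, #scrape.
class AllrecipesScraper
  def scrape(url)
    { source: "allrecipes", url: url }  # real parsing stubbed out
  end
end

class EpicuriousScraper
  def scrape(url)
    { source: "epicurious", url: url }  # real parsing stubbed out
  end
end

# Any object that quacks (responds to #scrape) works here.
def import(scraper, url)
  scraper.scrape(url)
end
```

Adding a new site then means adding one class with a matching `scrape` method — no other code changes.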
Where do you put your scrapers? The lame answer is: anywhere. lib is fine, and if you want to extract them into a plugin, that's an option too. See my question here — with a terrific answer from well-known Rails developer Yehuda Katz — for some other ideas. Overall there is no single right answer, though there are some wrong ones.